Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "So they ruin the jobs of truckers through big money and lobbying, saturate the m…" (ytc_UgwxhEm8A…)
- "there is a fine line between stealing and borrowing....just as A.I. can steal /b…" (ytc_UgwoQkNVT…)
- "They're the same guys! Guys who funded MIRI? That's right, Peter Thiel. Guess wh…" (ytr_UgwiwFFnB…)
- "Dumb question maybe, but why tf would you even need to pay for an AI to calculat…" (ytc_Ugx1jAOP9…)
- "Making art isn't about combining everything together based on the average of eve…" (ytc_UgxHxD5nr…)
- "How many people are killed everyday on the road in the US without using Tesla Au…" (ytc_Ugwu2WyOU…)
- "The premise at the start is wrong. This is pretty well studied at this point. LL…" (ytc_Ugz6zOGD2…)
- "AI is taking over THOSE JOBS #Worldwide & ROBOTS along with DRONEs are gonna Tak…" (ytc_UgwMMTWc_…)
Comment

> People who genuinelly get fooled by AI and think it can feel should really read up on how living organisms actually feel. "Love is just chemicals" is a funny meme, but it's not wrong. Every single emotion you feel is triggered by a chemical. So ask yourself: does this computer secrete chemicals its nervous system then catches and reacts to? No. So it's not feeling sht. Disconnect that clanker.

youtube · AI Harm Incident · 2025-11-17T21:3… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz02VUPjUz7ze0twSZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgylAubg9ejSaIdS2v94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzE-kHq17oG5uMt-6V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTMeG3KQ4HnBQTPjx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzoin09xVN854RvZF54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzMkR4xSDYuXU1STi94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFzTG3wR-QYHXcOV94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyj1agzqS_Fo8Msr7J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyQTnrwCmIMEhaUvX94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyDVl5GZdifqPczoid4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
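A batch response in this shape can be parsed and sanity-checked before its codings are stored. The sketch below is a hypothetical validator, not part of the tool shown here; the allowed value sets are inferred from the values visible in this page's output and may be incomplete.

```python
import json

# Value sets observed in the raw response above (assumption: these are
# the full code books; extend as needed).
ALLOWED = {
    "responsibility": {"user", "company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Raises ValueError on an unexpected dimension value so a bad batch
    is rejected instead of silently written to the database.
    """
    out = {}
    for row in json.loads(raw):
        coding = {dim: row[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
        out[row["id"]] = coding
    return out
```

With a table like this in hand, the "look up by comment ID" view is a plain dictionary access, e.g. `parse_codings(raw)["ytc_Ugzoin09xVN854RvZF54AaABAg"]`.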