Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI bros will bring up the "don't blame AI, blame Capitalism" excuse. Why can't …
ytc_UgyWCFrHY…
It is not about jobs. If AI can do the work we don't need humans to do it. What …
ytc_UgzcASuXY…
Artificial intelligence is becoming artificial consciousness. There is nothing d…
ytc_Ugxmg4I2d…
don't worry, they're also developing their own AI, so we can watch them fight ea…
ytr_UgzGegMAf…
When this happens I hope this guy has an AI army cause he will need it.…
ytc_UgziNsWOV…
Have a tea. The dystopian view is an overreaction. No robot is going to replace …
ytr_Ugx3HLIRJ…
Placing ai avatars on pictures are fine to me. But adding non-existent details i…
ytc_Ugx7gfG8h…
It's not even an AI problem—if you're using a model as a service, you should jus…
ytr_UgxwUbktS…
Comment
Seems more like people are scared of a new thing that isn't quite perfected yet, just look at every other technological inventions, there was always a huge amount of people criticizing it, like TV, or video games, heck people even thought books were bad at one point. I think it's best to take precautions to avoid these things, but AI is way to valuable of a tool to just give up on in my opinion.
youtube
AI Harm Incident
2025-09-10T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzEY0yU1dzfb1R-aJZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzuu-STTy7jObsp-5N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugw3GeaR99a240lYSLt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzfz6ujfvlze9RAUgx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxWcqfU3f-gqW2T16Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy7dE1_R27qI-Hj1MZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwV5VNsp7Qyg1cTkiZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnmxfSsvlqUV4ZdFt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzRN_kI3P5JdFnqQp94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy13_P-cEdTqIEhyEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
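The raw response above is a JSON array of per-comment codings. A minimal sketch of how one might parse and sanity-check such a response before ingesting it: the allowed values below are inferred from the samples shown on this page (the actual codebook may define more categories), and the `validate_codings` helper name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records that fail either check are silently dropped here; a production pipeline would more likely log them for manual re-coding.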