Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Why do you think they're all trying to start new wars? Population reduction --> … (`ytc_Ugx1K2Rhe…`)
- I like messing around with AI art, but I like real artists countering it with wh… (`ytc_UgzWprfux…`)
- Must be an Indian guy has written best coding well someone too away his coding … (`ytc_UgwjV2Q2J…`)
- I’m interested what happens when a company replaces staff with AI. It’s not abou… (`ytc_UgyRYZfG4…`)
- @veuder7701 It's not just Hank. It's most of the top AI researchers.... Like the… (`ytr_UgzsFSPFi…`)
- While all that is bad, this video does not show ANY statistics on acciadents per… (`ytc_Ugx77lh1a…`)
- Did that automated garage actually pick the Tesla up on a sled-like platform and… (`ytc_UgxvKa7U3…`)
- Well they finally did it, too smart for their own good. Mark my words some exter… (`ytc_UgxzBB0GD…`)
Comment
Is it not like the people that own theese AI can talk with each other? This is a brilliant example of how greed works, everyone wants to be followed, everyone wants to be the best with their new product but they just won't cooperate. And by that they'll allow bad things to be created because: "The other guys will create harmful AI anyway. So we might as well do it so we're first"
Can I blame them? Well, duh. But it just seems so cruel. I guess that's how evolution works, let's just not tear each other too fast.
We can make protests so those things don't happen or happen much slower I guess.
Source: youtube · Topic: AI Harm Incident · Posted: 2025-09-11T21:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzDMMRDNvsBtejogHR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzOqDH6PMhswNLsCg94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzbICxFzpW8X4PuwXZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsMSdgrTuiufqDhup4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugzhnkpay8KSui8iYo14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxD8P3MNVABIh6O4MN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxlzfCN6fxIwfr-Yap4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwxVjWvSguYAKeoLj94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzQQefIswBoZhKfxjl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzeNSG0SBvnoykwIkJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
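A raw response like the one above can be parsed and sanity-checked before the per-comment codings are stored. The sketch below is a minimal validator: the allowed value sets for each dimension are inferred from the examples shown here and are an assumption — the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample rows above
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values are all allowed."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and hold a value from its allowed set.
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

if __name__ == "__main__":
    sample = (
        '[{"id":"ytc_example","responsibility":"company",'
        '"reasoning":"virtue","policy":"regulate","emotion":"outrage"}]'
    )
    print(len(validate_codings(sample)))  # → 1
```

Rows with out-of-vocabulary values are dropped rather than coerced, so a malformed model output never reaches the coded dataset silently.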