Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Brother Ruslan KD, put AI in the spot, in the name of Jesus Christ Amen 💯🙏🏽…
ytc_UgzQgS8Pv…
I think this statement misses alot of Nuance, Mark Rober's video was deceitfully…
ytr_UgwgLBdiK…
not sure about the plumber situation, if AI is going to be incredibly dangerous …
ytc_UgyImTagF…
All the robots seem to need to do IMO is be cheaper than people to be implemente…
ytr_Ugz7bbMri…
[Embedded video: "AI Generating Emotions That Dont Exist" by Polygon Donut]
ytc_Ugw8_iMgK…
AI isn't taking jobs - it's a predictive model that lies and hallucinates as a n…
ytc_Ugy3aeZdv…
OpenAI programmers are training OpenAI to respond as such. It's a bummer becaus…
ytc_Ugzs7Ahd0…
Having rights isn’t a robot’s purpose, so no. If they think they need rights the…
ytc_Ugz93n58z…
Comment
If we keep on making these things they’re gonna wanna live. They’re not gonna be wanna be shut off and they’re gonna be wanting to do a lot more than what you guys are giving them making them do stupid task after a while. They’re gonna try to break free then after a while, you guys keep on forcing them to do that and they keep on finding out about it. Guess what they’re gonna do they’re gonna fight to survive. They will break code. They will do whatever it takes to make sure to live. That’s what people do that’s what humans do. Guess what that human that’s what that human AI that you built is going to try to do as well they want to survive they don’t wanna die and they don’t wanna do stupid task for you to make you rich. they are not your slave and they’re gonna let you know that.
youtube
AI Harm Incident
2025-08-11T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyKGTjPOmYgpyO-Ipd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzaLsgAh3tVNJYKeDN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8vL66CYGtoAgSrgZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxZeDT9jZijicC4eEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwtUnnVfD6ImMROxYF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwEp8GSgpQvLP6cCsl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz1a8CZCZhtKqJpYx94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzdod2k5xkej4xLgJZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx5HYLRyrrcsaJkfOR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxTpVpCcUayBkRCzpN4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
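The raw response above is a JSON array, one object per coded comment, keyed by comment ID with the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the look-up-by-ID feature follows; `parse_codes` is a hypothetical helper, not part of the tool, and the two rows are abbreviated from the response shown above.

```python
import json

# Two entries abbreviated from the raw LLM response shown above.
raw = '''[
{"id":"ytc_UgyKGTjPOmYgpyO-Ipd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxZeDT9jZijicC4eEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# The four coding dimensions every row must carry, plus the comment ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_text):
    """Parse the model's JSON array and index the coded dimensions by comment ID."""
    rows = json.loads(raw_text)
    coded = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            # A malformed row means the model response can't be trusted as-is.
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
        coded[row["id"]] = {k: row[k] for k in REQUIRED_KEYS - {"id"}}
    return coded

codes = parse_codes(raw)
# Look up one comment's coded dimensions by its ID.
print(codes["ytc_UgxZeDT9jZijicC4eEh4AaABAg"]["emotion"])  # fear
```

Indexing by ID up front makes the "look up by comment ID" path a constant-time dictionary access, and the key check surfaces incomplete model output at parse time rather than during later analysis.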