Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "we were begging it for mercy, but AI did it again and again 28 stab wounds…" (ytr_UgzO8oLg7…)
- "I agree it is an arms race we can’t stop. But I don’t agree that AI needs to gai…" (ytc_Ugy5LSHIA…)
- "I use the extension that brings back the dislike counter and see 2,668 DISLIKES …" (ytc_UgynEq7EJ…)
- "This guy sounds nihilistic. AI will never replace human usefulness. Technical ta…" (ytc_Ugzz7jC1L…)
- "As an CS engineering student I can say : no our job is not at \"risk\" it's just s…" (ytc_UgzBBi0Yo…)
- "Anonymous if you want to go be a robot like their citizens go ahead. I would nev…" (ytr_UgxYAMxSE…)
- "I am a PhD scientist. I have used ChatGPT several times, asking it questions ab…" (ytc_UgxpuNkHC…)
- "This time she hasn't convinced me. \"If you are working with a character, and you…" (ytc_Ugzxbaz6M…)
Comment
I think people just have difficulty grasping what thinking differently from humans is. It's not like just a disagreement. You can't talk it out. It's like trying to talk to the sun so it doesn't release solar flares, or to the atmosphere so it doesn't make hurricanes. They just don't care, but it's not that they don't care like a desensitized or egoistical human don't care. They work by a completely different set of rules, such that we can't even say they think or want something without using these words in a very literary way. That's what AI is to us, and while we evolved to survive to the atmosphere and the sun, we did not evolve to survive AI. And if it gets enough resources it will do whatever it does at a rate and pace we will likely not survive. And we don't know what it does.
youtube · AI Governance · 2025-12-02T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwfuJldpu13N5yIjgJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwQtPiShjExt_Mm1Vp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzd-yLBfi9WMHa4g0p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnLFQDQY47429ZVih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyENebu1tHpusFjJtd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz0siumGK2Szqinj4x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDEYCalO3RoZEuB_J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzdMVHYcOIlKj399Gl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsQj-ugSeNf558p_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-wjLHTVGqJ0RSS-t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
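The raw response above is a plain JSON array, so a coded record can be retrieved by comment ID with a few lines of standard-library Python. This is a minimal sketch, not part of the tool shown; `lookup_coding` is a hypothetical helper name, and the two records in `raw_response` are copied from the output above.

```python
import json

# A trimmed copy of the raw LLM response shown above:
# a JSON array of per-comment coding records.
raw_response = """
[
  {"id": "ytc_UgwfuJldpu13N5yIjgJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzsQj-ugSeNf558p_d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for `comment_id`, or None if it is absent."""
    records = json.loads(raw)
    return next((rec for rec in records if rec["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgzsQj-ugSeNf558p_d4AaABAg")
print(coding["emotion"])  # -> resignation
```

Note that the IDs in the sample list above are truncated for display ("ytc_UgzsQj-…"), so a lookup needs the full ID as it appears in the JSON, not the shortened preview form.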