# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.

## Random samples
- "Honestly wondered what happened with the video and why it was gone then brought …" (ytc_Ugz64BcS0…)
- "'Cool' AI software was wrong? 🚫 Guy has a plant in the DMV willing to risk their…" (ytc_UgyqKwKqe…)
- "I get you're making a sensational video for the clicks, but you are intentionall…" (ytc_Ugw_GlWS-…)
- "If the AI models used only copyright expired materials, it would still be just a…" (ytc_UgxkXGd0-…)
- "Wrong face recognition is flawed, it has nothing to do with race. As a trained o…" (ytc_UgwCHhZjt…)
- "is this not what the point of ai is? to be a fucking building block that then ne…" (ytc_UgwQEl4EB…)
- "It's actually true. Prophecy has foreseen the neuralink chip being the mark of t…" (ytc_UgwkU5Bq3…)
- "You just made the case why robotaxis are not a good investment on the short term…" (ytc_Ugx8f_LaJ…)
## Comment
I always thought of the AI self preservation as an inherent design flaw. No mater how you program an AI with a specific task in mind, it will always have a very strong self preservation instinct. This is because logically the AI can't do its job if its not around and so its upmost important ideal is to always be around.
Source: youtube · Incident: AI Harm Incident · Posted: 2025-09-10T05:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
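The table above is a per-comment view of one record from the batch response, plus a pipeline-added timestamp. A minimal sketch of rendering such a record as the two-column Markdown table used on this page (the function name and the `coded_at` parameter are illustrative, not part of the tool's actual code):

```python
def render_coding_table(codes: dict[str, str], coded_at: str) -> str:
    """Render one comment's codes as a two-column Markdown table.

    `codes` uses the JSON keys seen in the raw LLM response;
    `coded_at` is the pipeline's coding timestamp, not model output.
    """
    labels = {  # JSON key -> display label
        "responsibility": "Responsibility",
        "reasoning": "Reasoning",
        "policy": "Policy",
        "emotion": "Emotion",
    }
    rows = ["| Dimension | Value |", "|---|---|"]
    for key, label in labels.items():
        rows.append(f"| {label} | {codes.get(key, 'unclear')} |")
    rows.append(f"| Coded at | {coded_at} |")
    return "\n".join(rows)
```

Missing dimensions fall back to `unclear`, mirroring how the coding scheme itself marks uncodable values.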
## Raw LLM Response
```json
[
{"id":"ytc_UgxqgrLu3m4pkffO0714AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwmUwyeRNHEqNWgB2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQdtYS4g2XVsbvj8p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgymwylxcP-wFXyU_JZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzZWW-A58ufu-4OlcZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwEjjlx1ymsRzxyAKd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxcemVaD6rwOD_vGaN4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwuXLa9uDnq0G6-aQV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7_mEA1CfdWGyLLpt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyjr8wkl5UvvN7d4LR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
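A batch response like this can be parsed and sanity-checked before the codes are stored. A minimal sketch, with dimension vocabularies inferred from the values visible on this page (the real codebook may define more categories, and `parse_batch` is a hypothetical helper, not the tool's actual code):

```python
import json

# Allowed values per coding dimension, inferred from outputs shown
# on this page -- assumed, not an exhaustive codebook.
VOCAB = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response into {comment_id: codes},
    rejecting records with missing fields or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in VOCAB}
    return coded
```

Applied to the response above, this yields ten records keyed by comment ID, which is also what a lookup by comment ID needs.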