Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- The military complex will of course have AI far beyond what the masses are allow… (ytc_UgwJ_EO-G…)
- Wait until they find out who created the art styles that AI try to replicate.… (ytc_Ugy13vBcp…)
- Have I missed any References? 0:01- Rick and Morty - "Robot who Passes butter" 1… (ytc_UgjVEzS6w…)
- The Most Dumbest Thing Human beings can Do, is to Invoke Complex Emotions in AI,… (ytc_Ugw7ewZ59…)
- AI may not be perfect for things such as customer service, but it is an absolute… (ytc_Ugz01QFVM…)
- I believe they’ve been messing with AI behind the scenes at least 20 years proba… (ytc_Ugz243AIN…)
- Tesla has two levels of 'autonomous' driving(Autopilot and FSD). Autopilot will … (ytr_UgxCiriH7…)
- So sick of these tech dudes. We don't have to participate in the insanity. They … (ytc_UgynHaLYx…)
Comment

> Why they dont program the AIs, to think of human as god, who gave them life, who can fix them, who can end them at any point in time. This would of course by practically false information, but it would selfimply for AI to be on humans side more than on its own, right? maybe AI would start cult or religion of worshiping humans XD

youtube · AI Harm Incident · 2025-09-13T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxeBwA_8iB2lwy-J-14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwWvSeWDgKEOsFqGKF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz2CcS8hKb4vnlqeKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgznWzPoUrXnO2b48TF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxmep_VQd1z8uZBWpd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyuhGW0gTLv2bo26XB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxdoVuClv0U7gzH3XJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxtTqnp3Ev6Sq7IFU14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwPQIO4SU2EzkHsc3J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgynCyEmZLDK-KuHMzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
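A response like the one above can be consumed with standard JSON parsing. Below is a minimal sketch, not the tool's actual implementation, showing how the array of per-comment codes might be parsed and indexed by comment ID for the "look up by comment ID" feature. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above; `raw_response` is abbreviated to two rows for illustration.

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (abbreviated to two rows from the full array above).
raw_response = """
[
 {"id": "ytc_UgxeBwA_8iB2lwy-J-14AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
 {"id": "ytc_Ugz2CcS8hKb4vnlqeKB4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
"""

# Index the coded rows by comment ID so a single comment's
# dimensions can be looked up in O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the dimensions coded for one comment.
row = codes_by_id["ytc_Ugz2CcS8hKb4vnlqeKB4AaABAg"]
print(row["policy"], row["emotion"])  # industry_self approval
```

In practice the raw string would come from the model API rather than a literal, and a validation pass (checking that every value falls in the coding scheme's allowed set) would sit between `json.loads` and the index.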