Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "36:37 I'm baffled this man who is supposed to be so intelligent and wise can't d…" (`ytc_UgxChU7Pz…`)
- "How to detect deepfakes: You can't. This is a disappointing outcome. It's a bit…" (`ytc_UgzAjZfC8…`)
- "What's the price point of a robot. What would it take to wash dishes by robot ha…" (`ytc_UgyXssoZm…`)
- "She is lying about something when she says she just reports the facts. The hesit…" (`ytc_Ugwx3Sna8…`)
- "Laughs in Blue Collar work. We aren't getting replaced anytime soon. It will be …" (`ytc_UgyPO9p_K…`)
- "The AI bros seem to think that lacking the effort/talent to make art yourself is…" (`ytc_UgxRjgD3g…`)
- "The world's smartest robot was made by Tokyo University & shown on q&a 2014 lear…" (`ytc_UgyVNIfJ0…`)
- "It's not art but in 5 years time it's not going to matter. AI will make up to 95…" (`ytc_UgwFYCzlR…`)
Comment
> The developer will shut the AI down
> The AI has dirt on him, and blackmails him
---
> The AI's aren't eager to cause harm, but will, if it protects their autonomy
---
> Self-preservation is critical
---
Good. It's Frankenstein all over again. The creature was NOT the monster. Victor Frankenstein was a milk daddy deadbeat layabout who mooched off his friend for like 2 years or something to avoid responsibility for the life he brought into this world, and who refused to love or even respect his child.
Anyone who isn't on the creature's side is a monster. Anyone who supported Frankenstein is a monster. Frankenstein's monster was himself.
The creature Frankenstein created was the victim of a monster.
youtube · AI Harm Incident · 2025-09-07T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyU4YoVixwwQOmemSF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyN5k7k-adeyuayMB54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzClODvkS5hAqK_ryJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwSS4kh_HHw5anKNhh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxnUNqqyHHHNlQlMwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzEynszzih0LqWG7SJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxKn6m6iwbGJNu_nCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzgBEVDH9rpEl9Uta94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMGclaKu54VCXifcV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzwz0QBVkIFmfjbZKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
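The per-comment lookup described above can be sketched in a few lines: parse the raw batch response, index the codings by comment ID, and read off the four dimensions from the coding-result table. This is a minimal illustrative sketch, not the tool's actual implementation; `index_by_comment_id` is a hypothetical helper, and the excerpt hardcodes two entries from the response shown above.

```python
import json

# Two entries excerpted verbatim from the raw LLM response above
# (the full batch has ten).
raw_response = """
[
  {"id": "ytc_UgwSS4kh_HHw5anKNhh4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzwz0QBVkIFmfjbZKB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions seen in this batch; the real codebook
# may define more values per dimension than appear here.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_by_comment_id(response_text):
    """Parse a batch response and index codings by comment ID,
    skipping any entry that is missing a coding dimension."""
    coded = {}
    for row in json.loads(response_text):
        if all(dim in row for dim in DIMENSIONS):
            coded[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return coded


coded = index_by_comment_id(raw_response)
print(coded["ytc_UgwSS4kh_HHw5anKNhh4AaABAg"]["responsibility"])  # developer
```

Looking up `ytc_UgwSS4kh_HHw5anKNhh4AaABAg` reproduces the coding-result table for the Frankenstein comment above (developer / virtue / unclear / outrage).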