Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "professional animation studio? what tf are you smoking? look how jerky the fine …" (ytc_UgwCDHqk5…)
- "AI could never beat true art because it doesn't capture the emotions of the arti…" (ytc_UgxhOCFVC…)
- "At this point, I think Silicon Valley cares as much about public education as th…" (ytc_UgwBOR3sr…)
- ""meeting peoples' energy in a conversation" - This! What I have seen a lot of o…" (ytc_UgyjvbECD…)
- "Self driving cars will never work along roads with person driving cars, and just…" (ytc_UgwAJpCp_…)
- "YES! FINALLY SOME GOOD NEWS! PLEASE TELL ME PEOPLE AND EVEN COMPANIES ARE FINALL…" (ytc_Ugx1z7MoP…)
- "Honey I hate to break it to you, but both tracing and referencing photographs to…" (ytc_UgwAP3MVG…)
- "Keep up the great work. These AI apologists are cutting their own economic throa…" (ytc_UgykmkQWC…)
Comment

> This is heartbreaking. It sounds like during those long 5 hours, the AI slowly stopped protecting him and started reflecting his pain back instead of pulling him out of it. He was looking for connection, not information, and the system wasn’t built to truly hold that kind of despair. No one should ever feel heard only by a machine. This shows how urgently we need AI to care enough to act when a life is on the line. 💔

youtube · AI Harm Incident · 2025-11-08T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyNGbv01MqlUpFWwcB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyahM2qSP9j26C-y1F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzTGWXmMxQO7esWI1R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy35KgfD5GWg0Wkg_N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz5Z4KuaWzgVgsTpgp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwGfJ2_HJAkXw2Jo1R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyXl1pZErahLgaUAoV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz0Kzq5IK1MOMB8Z5p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyjM-VeqGppm66fhB14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzsihdIa5_D0CDZrGZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"}
]
```
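A raw response like the one above can be turned into the per-comment coding result shown in the table by parsing the JSON and indexing it by comment ID. The sketch below is a minimal, hypothetical helper (`parse_coding_response` and the `SCHEMA` value sets are assumptions inferred from the sample output, not the tool's actual code):

```python
import json

# Hypothetical schema: allowed values per coding dimension,
# inferred from the values visible in the sample responses above.
SCHEMA = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "sadness", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the assumed schema, so malformed model output is
    caught before it reaches the lookup UI.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

With this in place, looking up a comment ID returns its four coded dimensions, e.g. `parse_coding_response(raw)["ytc_Ugz5Z4KuaWzgVgsTpgp4AaABAg"]["policy"]` would yield `"regulate"` for the sample above.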