Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- "But you will be replaced by a Roomba, while the AI still requires real art to be…" (ytr_UgztLheAh…)
- "I tried it and its really good! I mean good enough at least for generative AI.…" (rdc_m9g2f5o)
- "For 1:41 - how about these for human preservation to start: Isaac Asimov's Three…" (ytc_UgyucO0Em…)
- "Sounds like certain people have been put up to getting driverless trucks. Au…" (ytc_Ugz7v4FTG…)
- "Allah is the best of Creators, the whole world can't even create a fly nor an an…" (ytc_Ugw3zJIUX…)
- "I find ai things just look fake ,some is very obvious, but even the subtle just …" (ytc_UgyizaPRC…)
- "Asking the AI to make it doesn’t mean you are an artist. A bit of a writer, and …" (ytr_UgyBGwRc4…)
- "i am studying and learning art because i like drawing.. and if i ever feel like …" (ytc_Ugz0MCjLD…)
Comment
I'm gonna blame the billionaire who doesn't care about human life. Also computers cant make a moral decision, so they shouldn't be allowed to. No self-driving anything, ever. If they actually cared about road safety and people's safety, they would really beef up public transit and massively improve it. But that will not happen because ✨money✨
youtube · AI Harm Incident · 2025-08-15T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxWWyBn6n6G3LSZWm94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgSbnhGhn5jljvGz94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwOIY9-DDAfetIgUC94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz4Wv9FgboFZm_4QAZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxmLQTnVEWeEVulOt14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxvqUU-rfxHzSPR-R54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxFtU7bSXsjMDIctMl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAntYtc1sV3QQm8Ih4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-6ID5ni9LkCFqgxN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxZWlEUFoWuYLVY7ix4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
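The coding result table above corresponds to one entry of this JSON array. A minimal sketch of how such a response might be parsed and looked up by comment ID (the function name and variable names here are illustrative, not part of the tool itself):

```python
import json

# Hypothetical excerpt mirroring the raw LLM response structure shown above:
# a JSON array with one coding object per comment.
raw_response = """
[
  {"id": "ytc_Ugz4Wv9FgboFZm_4QAZ4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxmLQTnVEWeEVulOt14AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding object by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugz4Wv9FgboFZm_4QAZ4AaABAg"]
print(coding["policy"])   # "ban"
print(coding["emotion"])  # "outrage"
```

Looking up by ID rather than by list position keeps the display robust if the model returns the codings in a different order than the comments were sent.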