Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
"Why people hate AI:
Short term: It drives up their electricity and pc part bi…
ytc_UgxzUhm0i…
Yeah exactly. There is no real effort in AI. You are doing 0 work, AI is doing a…
ytr_UgzrrFKO-…
[Translated from French] Don't worry: if an all-powerful minority wishes to control humanit…
ytc_UgwUqfVxp…
That's true, they could. However, Ai artists and doing actual work and putting e…
ytr_UgzdDnqe3…
Thank you for your observation! Sophia, the AI robot, might not have a physical …
ytr_UgxfZY8ev…
That was the longest “I don’t know” I’ve ever heard. He knows that AI will take …
ytc_UgzdlYwKI…
sooner you'll believe AI are creating invisibility cloaks already or apps we did…
ytc_UgxJfmuQq…
this is extremely late, but tbh, you can support both ai and real artists, n…
ytc_Ugyhe1FtF…
Comment
The issue with the premise is there isn't a manual override for such a situation. I don't know if self-driving cars will be able to react to this situation ever, and even then, it wouldn't be premeditated homicide if it was a deliberate decision to take the path that will save the driver. You can't blame them for the consequences of that, the car did it's job. If anything, the truck driver would be charged with manslaughter and reckless driving for not ensuring his cargo was secure.
youtube
AI Harm Incident
2017-07-09T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
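The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a batch could be parsed and checked against the coding dimensions (the allowed vocabularies below are inferred from the sample rows; the actual codebook may define additional categories):

```python
import json

# Allowed values per coding dimension, inferred from the sample
# rows shown above -- the real codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "approval", "mixed", "outrage"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    fall inside the known vocabulary for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Hypothetical example row for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
valid_rows = parse_coded_batch(raw)
print(len(valid_rows))
```

Rows with out-of-schema values (e.g. a hallucinated emotion label) are silently dropped here; a production pipeline would more likely log them for re-coding.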