Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "They are doing the same in Dublin Ohio. Driverless trucks travel through I-270 …" (`ytc_UgwUlmpRg…`)
- "We already have the software to stop any evil AI from taking over - it is called…" (`ytc_UgzCbvqG6…`)
- "In the perfect world of fully automated luxury space communism™, (i.e. Star Trek…" (`ytc_Ugyjf3qaw…`)
- "This is proof that today’s teens and kids have the brain size of a actual bottle…" (`ytc_UgyRVMlAx…`)
- "I questioned AI about a vehicle shituation on ice and a hill. AI advice (disabli…" (`ytc_Ugx67vqJm…`)
- "Hinton is saying something inherently contradictory: AI will make health claims …" (`ytc_Ugz2l7-5G…`)
- "AI is not evil, nor is it a bad thing / Like guns, it is those who use them selfi…" (`ytc_UgwK_4NdU…`)
- "The real question, no one is asking, is why are power holding humans so hell ben…" (`ytc_UgznZJFHX…`)
Comment

> Tesla Autopilot did not kill anyone. The responsible driver being assisted with Autopilot killed them. Since autopilot is still learning, this is why drivers are still responsible for any actions taken by their vehicle. Autopilot some day, will be better than any of us. As a rider myself, I am not afraid of Autopilot, just inattentive drivers.

youtube · AI Harm Incident · 2022-11-29T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzxz48txMxhQeiCBNd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhuiJI9p_eVb8udIl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwzPyIM8H7iRtRiqqF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyKQLq0botMsua25RZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzR9i35eMPKGoLn7ZZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyF6aWxUW0YAyLJSst4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5oqkYGEbplxxy90d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgynoeuXTRbvaAdSGoZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyWHtt-bvpIkZp8YkR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyIIb2CL_D9Co96AR94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
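The "look up by comment ID" step described above can be sketched as follows. The response format (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys) is taken from the raw response shown; the parsing and lookup code itself, including the `lookup` helper name, is an illustrative assumption, not the tool's actual implementation. The sample below is truncated to two entries.

```python
import json

# Raw LLM response as shown above, truncated to two entries for brevity.
raw_response = """[
  {"id":"ytc_Ugzxz48txMxhQeiCBNd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhuiJI9p_eVb8udIl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index each coded row by its comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (hypothetical helper)."""
    return codings[comment_id]

print(lookup("ytc_Ugzxz48txMxhQeiCBNd4AaABAg")["responsibility"])  # -> user
```

Indexing by ID up front is what makes "inspect the exact model output for any coded comment" cheap: each coded dimension in the table view can be read straight off the stored row.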