Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
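For scripted lookups outside this viewer, here is a minimal sketch. It assumes the raw responses are stored on disk as JSON arrays of per-comment codes, matching the "Raw LLM Response" example at the bottom of this page; the directory name and file layout are illustrative, not the project's actual storage scheme.

```python
import json
from pathlib import Path

def lookup_coded_comment(comment_id: str, responses_dir: str = "raw_llm_responses"):
    """Scan stored raw LLM responses for the entry that codes a given comment ID.

    Assumes each file in `responses_dir` holds a JSON array of objects with an
    "id" field plus the coded dimensions (responsibility, reasoning, policy,
    emotion), as in the sample response shown below.
    """
    for path in Path(responses_dir).glob("*.json"):
        entries = json.loads(path.read_text())
        for entry in entries:
            if entry.get("id") == comment_id:
                return path.name, entry
    return None, None

# Usage: full comment IDs are required (the sample IDs shown below are truncated).
source, code = lookup_coded_comment("ytc_Ugw5GL5gHFEFix5GnUR4AaABAg")
if code:
    print(source, code["responsibility"], code["emotion"])
```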
Random samples

- "Learned this last week: 62 people control the equivalent of half of the world's …" (rdc_d7khw1q)
- "The real Joker was scared of the I.R.S but he was never scared of beating up Naz…" (ytc_UgwiJPL6z…)
- "I don't see how anyone can be OK with computerized driverless vehicles. Let alon…" (ytc_UgxtcVqKP…)
- "Imagine if billionaires were better people ... We'd build a better world with be…" (ytc_UgwfayDbe…)
- "It takee descriptive informatiom to marrow down certain detail, they are also ma…" (ytc_Ugy3jZwQL…)
- "Interesting take and it makes sense but lets say everything goes how you say. A…" (ytc_UgwXSK_N3…)
- "No offense, but every single person I’ve seen do apologia for AI art has used th…" (ytr_UgyU1q3vM…)
- "If we all banded together and refused to use AI It wouldn’t be as powerful I’m r…" (ytc_Ugw0Rw5rM…)
Comment
So the error was that the car allowed the driver to override the speed by continuing to press the gas pedal? In other words the autopilot gave the driver more freedom than it should have. But if the autopilot worked the other way and didn't let drivers override it then Tesla would be liable for even more accidents. There is a reason that so-called self-driving cars are required to have a human behind the wheel.
youtube · AI Harm Incident · 2025-08-16T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyTp4bS-FxEYBWa_2R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzPvbGYo29-rcR8b1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyR5B9KXHgr2nBlMMB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxZgShccTacBLeLF3x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugw5GL5gHFEFix5GnUR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxeZVWqD4x5xANjm8p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBrGwuGA7xpoPWgBB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzoCWixuWHaQ8HgI9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6mxPBMtanB5Is5hJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
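A raw response like the one above can be validated before it is accepted as a coding result. The sketch below checks each entry against the dimension values that appear in the outputs on this page; the allowed-value sets are inferred from these samples, not taken from an authoritative codebook, so treat them as assumptions.

```python
import json

# Allowed values inferred from the responses shown on this page; the real scheme may differ.
DIMENSIONS = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"indifference", "fear", "resignation", "mixed", "outrage"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed, in-vocabulary entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if "id" not in entry:
            continue
        if all(entry.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            valid.append(entry)
    return valid
```

Dropping (or flagging) out-of-vocabulary entries here keeps a single malformed model response from silently polluting the coded dataset downstream.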