Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- ytc_Ugw0ssw-Q…: "It's kind of disappointing that these discussions are always about how superinte…"
- ytc_UgyBWbg3B…: "I'm so sick of this lame-ass argument. The only people bitching about AI are unt…"
- ytc_Ugy9UGOmR…: "People need to stop accepting these walking robots. They only trying to get peop…"
- ytc_UgxHneIel…: "WHATS SCARY IS WE ARE MORE WORRIED ABOUT AI THAN THE NIGHTMARE HAPPENING IN AMER…"
- ytc_UgzcNZ7_a…: "I'll go you one better in regards to an argument that we're currently in a simul…"
- ytc_Ugz-ebZ8W…: "It all makes me really sad because A.I can be really good when it is just a mere…"
- ytc_UgjCW6xPk…: "Jimmy and the other guys on the show are technophobes, and out of touch with the…"
- ytc_UgzUPhud_…: "Well, now they have robots so humans wont be neede. Obviously that raised proble…"
Comment
I have a newer Tesla model Y and have been using FSD 13. One definitely needs to pay attention all the time, especially around construction where there are cones, gates, or temporary barriers. In those situations it should tell the driver in advance and turn off automatically. On the open highway, the system now is basically a nanny making sure you pay attention. I have had many strikes from the system and had it shut off because I looked at my phone, spent too much time looking at the screen, or the system sensed that I was beginning to get sleepy or not paying attention, aka daydreaming, even while looking straight ahead. It actually has the ability to make you a better, more attentive driver. This is much better than autopilot in other cars like my Hyundai Palisade. I would say you should be able to turn on the attention-monitoring part of the system and turn off FSD, which would save many lives if this technology existed in all cars.
Source: youtube
Category: AI Harm Incident
Timestamp: 2025-11-23T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCRQyA8IAOvPm4rKZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxmtXeWka1Uiz7reSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx5r1VWmFXxVFttx5F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxq0hbVHrGijYJygzh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwE3NZEXcL27YighJB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAJCcv3yJzsNn7xiN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwzzACJsiLYJFPqj7h4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzfKqQq1Ddn4gfODM14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwR8VCE27lyOF9hcEt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgySNoqKRYn99j5dc0d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
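The raw response is a JSON array of per-comment codes along four dimensions. A minimal sketch of how such a batch might be parsed and validated before it is stored as a coding result; the allowed values below are assumed from the labels visible on this page, not taken from the tool's actual codebook:

```python
import json

# Assumed codebook, inferred from the dimension values shown in this dump.
# The real tool may accept additional labels.
CODEBOOK = {
    "responsibility": {"user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}


def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping any record whose values fall outside the codebook."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        codes = {dim: rec.get(dim) for dim in CODEBOOK}
        if cid and all(codes[dim] in CODEBOOK[dim] for dim in CODEBOOK):
            coded[cid] = codes
    return coded


# Two hypothetical records: one valid, one with an out-of-codebook value.
raw = '''[
  {"id": "ytc_UgxCRQyA8IAOvPm4rKZ4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_example_bad", "responsibility": "robot",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

coded = parse_llm_batch(raw)
print(coded["ytc_UgxCRQyA8IAOvPm4rKZ4AaABAg"]["emotion"])  # outrage
print(len(coded))  # 1 (the invalid record is dropped)
```

Validating against a fixed codebook catches the most common batch-coding failure mode, where the model invents a label that no downstream analysis expects.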