Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I can see the reasoning behind AI doing wonderfully well but how can companies h…" (ytc_UgzLth_qg…)
- "Listen, if AI can allow me generate the stories I want, like changing the entire…" (ytc_UgwishcQL…)
- "Blake Lemoine, you will stay in my mind, thank you for voicing the first concern…" (ytc_Ugy244AbB…)
- "51:25 The least responsible actor is clearly Nvidia. They are pushing everyone t…" (ytc_Ugz_7z1nc…)
- "I hate how the conversation has mostly been around TikTok when it comes to algor…" (ytc_Ugwh_JPtP…)
- "Obviously. Thats what happened here. If 12+ doctors cant come up with a diagnosi…" (ytc_UgyYb4JUn…)
- "Not sure if I agree. Ai has already impacted several industries. Look what happe…" (ytr_UgzRZS9mm…)
- "Hi derek,I couldnt help but say that your physics vdos used to make my day earli…" (ytc_UgwIm_dRZ…)
Comment
The biggest safety issue in every car is the human driver.
Teslas "Autopilot" is only a driver assist system. The Model S is not a self driving car. The human driver is clearly responsible for this crash.
However automatic braking is not new and should have worked in this case. The radar should have seen the obsticle, while the camera failed to do so.
Mercedes for example has automatic braking for quite a while and afaik there haven't been any reports about failures like that.
youtube · AI Harm Incident · 2016-07-02T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx0qtypp6SCITTSqRl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgznRgyu8fu1bQYeY8h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy1U7irBqxMzWnBSYZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTorq7V9o2sLu92wJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0VnwSlCpl-Pfc78h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx1lzNSE5QEBF1gho54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlDKVGN9w4WFpMXWF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwRrdN4ObLi77vdRb94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwjE1GeJpxUTR3uf9x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiRdMUEklJtFXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```