Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Artificial "Intelligence" as we know it today is akin to "social" media. There's… (ytc_UgycOOETw…)
- Remember in I robot where Will Smith takes over the car manually due to it going… (rdc_d8auzt1)
- I mean, I kinda get this. I feel like I am fundamentally not understanding somet… (ytc_UgwFFkMAW…)
- Ai bros are always caught beating around the bush - its lazy. as soon as we call… (ytc_UgxiZD_dj…)
- Making a video titled "Why Fully Self-Driving Cars Are Almost Impossible | The L… (ytc_UgwN0A27H…)
- In 2023, an AI researcher at Google was fired for claiming that their language m… (ytc_UgxQQuE5m…)
- Thank you for adressing the problems with ai art! Means a lot for you to take ti… (ytc_Ugz4Xxwwo…)
- Professional artist. I’m not fed up with AI art. I even utilize it myself for us… (ytc_UgwE86Bb0…)
Comment
> I would say that such an accident is much more unlikely than described in the video. If _all_ cars are self-driving, there would be no reason to let them communicate with each other. In case of the accident described in the video, you could slow down the midlane ranging hundrets of meters back at an instant. Cars would be driving much closer to each other anyways, because they could accelerate and break virtually at the same time. Trafficjams would be a thing of the past - the entire que could just drive at 80 km/h.
- Platform: youtube
- Incident: AI Harm Incident
- Timestamp: 2015-12-08T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
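The table above records one comment's codes across four dimensions. A minimal validation sketch for such a record — assuming, hypothetically, that the category values observed in this page's output are exhaustive (this is an illustration, not a documented schema):

```python
# Allowed values per coding dimension. These sets only contain the values
# observed on this page; treat them as an assumption, not a real schema.
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value is missing or out of range."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
coding = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(validate(coding))  # []
```

A check like this is useful before ingesting a batch, since LLM coders occasionally emit values outside the intended codebook.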
Raw LLM Response
```json
[
{"id":"ytc_UghSiRcVXA-3FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg36gd_wQOCXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggzSEiGsQNLKngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghidMHZsCybB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjzNTXzuzIxOngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UghfmsovrnUJPXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjQy7gtc5pA_XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugio_pXgICTxCXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggKQCpjXBYZKXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgggitcG_CbrUXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
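A raw response like the one above can be parsed and indexed by comment ID, which is all a lookup needs. A minimal sketch, assuming the model output is valid JSON (the two records below are copied from the response above):

```python
import json

# Excerpt of a raw model response: a JSON array of coding records.
raw_response = '''
[
 {"id": "ytc_UghSiRcVXA-3FHgCoAEC", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UggKQCpjXBYZKXgCoAEC", "responsibility": "developer",
  "reasoning": "contractualist", "policy": "liability", "emotion": "approval"}
]
'''

# Index records by comment ID so any coded comment resolves in O(1).
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codings["ytc_UggKQCpjXBYZKXgCoAEC"]
print(record["responsibility"], record["policy"])  # developer liability
```

In practice the raw response should be parsed inside a `try`/`except json.JSONDecodeError`, since a model can return malformed or fenced output.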