Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I hope they will develop AI that will lead and guide the world, because we are f…
ytc_UgzZbh7Cc…
As a bug fixer it has made my job easy. You just drop error in that chat bot win…
ytc_UgyRlhqMK…
Let me tell you what AI training AI is going to do: poison the well of training …
ytc_Ugz497nbt…
Hey @saribhaider3985, thanks for dropping a comment! I appreciate your interest …
ytr_UgwgSninu…
takes one strong swift bad code used to hack AI - programmed to destroy all huma…
ytc_UghVP7t4I…
@Agente13840 "cry more" "luddite" "cope"
finally decided to learn the concept …
ytr_UgyrU2Xbm…
To this day I still don't understand why they're trying to achieve with AI. Like…
ytc_UgwgtEPQ7…
When you get ChatGPT into situations like this, you actually see an effect of wh…
ytc_UgzBxI_59…
Comment
2:25 The Tesla was on AUTOPILOT not FULL SELF DRIVING. Tesla’s autopilot is meant for highways and straight roads. NOT INTERSECTIONS. Not only this, but you are supposed to be fully attentive at all times when using these programs. That is made abundantly clear when signing up for and using these programs. Autopilot did exactly as advertised, however I will admit that the driver safety system should have taken over and stopped the vehicle. So not autopilot, and completely the driver’s fault, but the safety system did not work as intended. This should be a lawsuit against the safety system, not Autopilot. Not to even mention the fact that there are millions more human crashes per 100,000 miles than self-driving cars. You would not sue the car manufacturer if you were on say, adaptive cruise control, which is fundamentally the same thing (except for the steering obviously). The plaintiff is in the wrong!
youtube
AI Harm Incident
2025-08-15T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyRyYExbjBGN58LckR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1uaIjR8b2_oZMcVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxagnHKB5-C3ZS0RJJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzTePAZAbXFU8yQZnZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySTtEHDTMFct28Y-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw1KkYSh4ryH7NADE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_1sFSGf7fv5HB4vB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgycT5jiUPvdsr4-dkh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzrqJJ-VUowr7SuJ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9szE_xIODSoEP-uB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
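A raw response like the one above can be turned into per-comment coding records with a short parsing pass. The sketch below is a minimal example, assuming the four dimensions shown in the Coding Result table and only the value sets observed in this dump (the full codebook may allow additional values); the function name and structure are illustrative, not part of the pipeline shown here.

```python
import json

# Value sets observed in this dump; the actual codebook may define more.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    {comment_id: {dimension: value}} mapping, skipping any record whose
    values fall outside the observed sets."""
    coded = {}
    for rec in json.loads(raw):
        dims = {k: rec.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[rec["id"]] = dims
    return coded
```

Validating against an explicit value set catches the common failure mode where the model invents an off-schema label, so bad records are dropped rather than silently stored.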