Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below:

- "Naw...he's naive with his answer to the 'trick question' of sentience he interac…" (ytc_UgwFihX22…)
- "Ive seen people talk about this but I am like 90% sure the only feasible way for…" (ytr_UgzukOBia…)
- "I'm so sorry man, as others have said, this is a mental health problem. Nothing …" (rdc_m2bti4u)
- "You think Rallies are as bad as this will get? What they need to do is prevent c…" (ytc_UgxhFoZPz…)
- "There's a lot of speculative bullshit and fearmongering in this video.The AI did…" (ytc_UgyjK9JtT…)
- "simple solution, if a job is eliminated by automation, the persons who held that…" (ytc_UgyHcCiWc…)
- "The catch is that once the AI is advanced enough, it will be able to conceive of…" (ytc_Ugz8li4V4…)
- "I disagree with the idea of Elan Musk of creating policies behind closed doors o…" (ytc_UgyzC39Pn…)
Comment
> IT WAS NOT AT ALL THE PEDESTRIAN'S FAULT. THE CARS HEADLIGHTS WERE SET ABSURDLY LOW. Essentially the monitor was driving blind and could not see more than 40 feet or so in front of the car!
>
> The normal range of headlight settings does not allow the lights to be set at such a low angle. The only way that can happen is if BY DEFECTIVE DESIGN OR MANUFACTURING FLAW the adjusting screws just come completely out of the thread. There should be a stop on the end of the screw to prevent that. Apparently whoever made the last headlight adjustment - probably as part of his/her normal procedure, ran the screw to the limit of adjustment and then adjusted up til they hit the target. But since the stops were not there it ran off the end and there was nothing that could be done to correct the problem without disassembling the headlights. And, probably thinking it was a flaw only on that headlight, did the same on the other side.
>
> He/she had to know what happened BUT THE CAR WAS RELEASED FOR USE ANYWAY. Some supervisor probably thought since the car could "see in the dark" it would be OK to drive that way until there was time to do the repair. (maybe the monitor himself had driven in in for the maintenance and the technician didn't even know how to disassemble the headlights.
>
> IN EFFECT THAT MEANT THAT THE CAR WAS BEING DRIVEN WITHOUT A MONITOR.
>
> Whoever made the decision to let the car be driven at night with the HEADLIGHTS TOTALLY USELESS (because they could not detect anything until it was so close that there wasn't even time for the monitor to hit the brakes) should be held fully responsible for the accident.
>
> THE PEDESTRIAN WAS MISLEAD BY THE FAULTY HEADLIGHTS INTO THINKING THE CAR WAS A LOT FARTHER AWAY THAN IT REALLY WAS. The pedestrian was not remotely at fault here.
>
> Obvously either the AI and/or the LIDAR was also faulty because the LIDAR does not use the light from headlights to work.
>
> The person making the decision to let the car be driven at night knowing full well that the monitor could not see, in effect OKed the driving of the car without a monitor and should be tried for criminal negligence.
Source: youtube · Event: AI Harm Incident · Timestamp: 2018-03-25T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyMWK-nvR7FtyjuiON4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxbznWimuiIRA8b02J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzgJm2foN3ZPF_9ded4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygZAHb0p3MSFR23s54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx9hcK6Q1Zr5LNiXrh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
```
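A response in this shape can be turned back into per-comment codes with a small validating parser. The sketch below is a minimal illustration, not the project's actual pipeline code; the allowed vocabularies are inferred only from the values visible in the table and response above, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the full codebook may include more categories).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    {comment_id: {dimension: value}} mapping, rejecting any value that
    falls outside the assumed vocabulary."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        codes = {k: v for k, v in entry.items() if k != "id"}
        for dim, value in codes.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded
```

This supports the ID lookup shown above: `parse_codes(raw)["ytc_UgygZAHb0p3MSFR23s54AaABAg"]` would return the coded dimensions for that comment.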