Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
My company pushes us to use AI, so I did recently to clean up several thousands …
ytc_Ugz53rpVk…
Lol @ the last question of the video. "What will state actors do when the accoun…
ytc_UgzZuVQnj…
Maybe AI would hate humans less if we didn't broadcast private conversations we …
ytc_UgwVemr1A…
Can somebody explain to me why it’s dangerous when AI becomes more intelligent t…
ytc_Ugw3mQHDY…
AI is getting so out of hand that people can't even tell whats AI generated or n…
ytc_UgzlNj6F6…
I hate Ai so much. Its out of control. That Ai video is clearly Taylor made pro…
ytc_UgykM0hwL…
I am an AI student. Let’s just categorise AI as an extra set of smart eyes. Thi…
ytc_Ugxv3fYND…
@Mafon2 No. What I mean is you can't learn professional photography just by gett…
ytr_Ugz7jAmAx…
Comment
So the basic question posed: should you crash into something at random, or should you let the car take the action that leads to minimal damage? Yeah, I know, it's a hard one. Sometimes there are no good choices. One outcome may be that you go straight for the things falling. It minimizes casualties; what's wrong with that?
When you're in the car, you try to compute the casualties and minimize them yourself. The difference is that you only think about yourself, because you don't have time for complicated thoughts. Sometimes, in your attempt to avoid the accident, you drive off the side of the road on the mountain. Would you rather have the car roll the dice and decide like we do?
We already aim for the fewest casualties. E.g. pedestrians ought to be avoided. But nobody frames it as: "But if you avoid the pedestrian that jumped like a fool in front of the car, you will hit another car, in which case you punish them for not dying as easily."
And I have one last note. Self-driving cars are going to follow driving rules. One of them is to keep a proper distance so you can hit the brakes in time. But I guess there could be a scenario where the decision above must be made, so while not dismissing the experiment, I would like a better example.
youtube
AI Harm Incident
2015-12-13T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgjvJ6NnfmEbp3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggXZfa6C2KKR3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghNp5BGhWiGfHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugicy25a1_k_VngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggVXqoniLKpUngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi6d151MupypngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughowk26EsgP-ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjPUX-D7spJtngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiXjYc4IT9HpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh_gj4KR5_ORngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
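A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed value sets are exactly those observed in this batch (the real codebook likely contains more labels, e.g. additional emotions or policy stances, which are not shown here):

```python
import json

# Allowed codes per dimension. These sets are inferred from the sample
# response above and are an assumption, not the confirmed codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "resignation", "approval", "outrage"},
}

def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    and index the rows by comment ID, rejecting any row whose codes
    fall outside the allowed sets."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row taken verbatim from the raw response above.
raw = ('[{"id":"ytc_UggXZfa6C2KKR3gCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
batch = validate_batch(raw)
```

After validation, `batch` supports the "look up by comment ID" access pattern shown in this view, e.g. `batch["ytc_UggXZfa6C2KKR3gCoAEC"]["emotion"]`.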