Raw LLM Responses
Inspect the exact model output for any coded comment: look a response up directly by its comment ID, or browse the random samples below.
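If you need the same lookup programmatically, here is a minimal sketch of one way to do it. It assumes the coded records live in a JSON Lines file — the `raw_responses.jsonl` path and the record layout are assumptions for illustration, not this tool's actual storage:

```python
import json

def lookup_raw_response(comment_id: str, path: str = "raw_responses.jsonl") -> dict | None:
    """Return the raw LLM coding record for one comment, or None if absent.

    Assumes each line of the (hypothetical) JSONL store is one record shaped
    like {"id": "ytc_...", "responsibility": ..., "reasoning": ..., ...}.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the coding for the comment shown on this page.
print(lookup_raw_response("ytc_UgyHUa4MwqTkhowKM4h4AaABAg"))
```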
Random samples:
- “I’m trying to learn actual art partly so that I can help prove ai artists wrong…” (ytc_UgyRbN682…)
- “If AI is doing all the work, then who pays AI? It can't be us humans because we …” (ytc_Ugz7kMMge…)
- “I get what Rick’s saying, but here’s the thing — I do have a point of view. It's…” (ytr_Ugy0a6yrI…)
- “Does it really matter who makes art or does matter if the art is appreciated? A…” (ytc_Ugw1BLypm…)
- “Robot 1: oh no The box fell / Robot 2: BRO WHY YOU DO THA AAAAAAAAAAAAH…” (ytc_UgzOiAQqJ…)
- “worth pointing out also that all arguments against ai as far as i can tell only …” (ytr_UgzvWyQnz…)
- “Thank you for sharing your thoughts! If you're interested in exploring more abou…” (ytr_UgzwY7QGO…)
- “what if we create an ai that takes control of us for our own safety?…” (ytc_Ugym6zDXp…)
Comment
We have also kind of programmed ourselves what to do in situations like this. We know that we should avoid collisions with objects and cars, but given the circumstances we decide how to act. Perhaps if I saw someone on the right side of the SUV, I would prefer getting hit, but if the driver were far from the edge of the car I would rather hit the car. One can program a computer not to make specific decisions, but to consider the circumstances at the moment.
Edit: I wrote this without watching the entire video, but the motorcycle problem will not be a problem for a car. A robot wouldn't think ethically about "punishing", but would try to maximize the chances of survival for all sides, so choosing the helmet-wearing motorcyclist would be the "correct" choice, or perhaps the computer would prefer getting hit because its chances of survival are way better.
youtube · AI Harm Incident · 2021-06-28T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyHUa4MwqTkhowKM4h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHkU5bsdw5P7-8JZp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdBJyQxP861oljRU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzgl7iBXRe5tFYLHYt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzUKFC8r1V6HNUTydd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxFxLtLEjGce9wq8iZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxI7TNMt_TdEOfHQpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6WKiqU09QhjUsxKJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCgvj2u0e_RN985ix4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxaV7FZRp4oz9oNI6t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
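To turn a raw batch like the one above into per-comment rows such as the Coding Result table, one can parse the JSON array and validate each record before indexing it by ID. A minimal sketch follows; the label sets are just the values observed in this batch, and the real codebook may allow more (an assumption):

```python
import json

# Allowed labels per dimension, as observed in this batch only; the full
# codebook used by the coding prompt may contain additional values.
LABELS = {
    "responsibility": {"ai_itself", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "resignation", "fear"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array) and index records by comment id.

    Raises ValueError if a record is missing a dimension or uses an
    unexpected label, so a bad batch fails loudly instead of being coded.
    """
    indexed = {}
    for record in json.loads(raw):
        for dim, allowed in LABELS.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{record.get('id')}: bad {dim}={record.get(dim)!r}")
        indexed[record["id"]] = record
    return indexed

# Example with a hypothetical file holding the raw response shown above.
batch = parse_batch(open("raw_response.json", encoding="utf-8").read())
print(batch["ytc_UgyHUa4MwqTkhowKM4h4AaABAg"]["emotion"])  # -> "indifference"
```

Failing loudly on an unknown label is deliberate: silently coercing or skipping values would let a drifting prompt quietly corrupt the coded dataset.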