Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgyZS_qWn…` — "Self driving cars have been around for nearly forty years or so, the prank of tu…"
- `ytc_UgwV_pwnJ…` — "I mean if you think about it it’s just cuz the AI is efficient and is trying to …"
- `ytr_UgyRdN3R6…` — "Well, it would be the same as a self driven ambulance. Which is plausible, espec…"
- `ytc_Ugz8CS50M…` — "Human train AI, be replaced by AI. AI training AI. Human collect unemployment. A…"
- `ytc_UgyK2Qnv5…` — "A.I. FOUNDATION IS A FALSE GOD AND A RELIGION SO I SAY NO BECAUSE I WORSHIP THE …"
- `ytr_UgwxjzAeu…` — "we need to make these people in power (billionaires) see what were talking about…"
- `ytc_Ugx_RDicX…` — "They ignore that there are generational relationships here. Very young children …"
- `ytc_Ugwgch7-6…` — "I am more and more convinced thar AI have nothing intelligent, it is only langua…"
Comment
I get the complication involved here, but what needs to be understood is that in the situation described here, an accident is bound to happen. We’re focusing on *who* we want the accident to occur with. The way I see it, that’s inconsequential. A self driving car is designed and programmed by humans, so some our humanity is transferred there. Including our disability to make rational decisions when under extreme and sudden pressure. I say we don’t try and micro-analyse every decision a car made during an accident, because that’s the same as micro-analysing every decision a human made during an accident. Which is hardly a fair thing to do to the human. The car was made by a human, it will have human limitations.
youtube · AI Harm Incident · 2022-08-01T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxi9edyRH6MBe-gmlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyHzAmobw_w11Mnb-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxyiKB7SdSvnhxtM-N4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDx5EZ27hVBDO-G3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy52cuaNxv0pCVCSnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbIlfqTdOKTNc9ph14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxh5MKYmtMPkv_l1S94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlSt_4SUBX-NmdT0p4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx2arFYsbTn-QoyeUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
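Since the raw response is a JSON array keyed by comment ID, looking up the coded record for a given comment can be sketched as follows. This is a minimal illustration, not the tool's actual code; the `raw_response` string below is abbreviated to two records from the array above, and the function name `index_by_comment_id` is hypothetical.

```python
import json

# Abbreviated copy of the model's raw JSON response (first two records above).
raw_response = """
[
 {"id":"ytc_Ugxi9edyRH6MBe-gmlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgyHzAmobw_w11Mnb-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
record = coded["ytc_UgyHzAmobw_w11Mnb-d4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer resignation
```

The same lookup generalizes to the full array: once the response is indexed, the coding result shown in the table above (developer / consequentialist / unclear / resignation) is simply the record whose `id` matches the inspected comment.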