Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment ID | Preview |
|---|---|
| ytc_Ugx22oZHd… | I like the theory that some companies pretended to go with AI just to get rid of… |
| ytc_UgxcIom6_… | 14:48 is bullshit. A.I. doesn't have a conscience to draw emotion or feeling fro… |
| ytc_UgxKJ4gsP… | With the rise of AI, I think The Matrix is more relevant than ever ..and being a… |
| ytc_UgzzkDR82… | Why is AI Generated Art so Much? Why Wouldnt People Use any Type Of Paint Brush … |
| ytr_Ugx1TPAuC… | Is it? Or are they hired by someone to make these things and they do it with a g… |
| ytc_Ugx0Xy6qh… | These people don’t think about all of us, just about money and profit. Corporat… |
| ytc_UghCEDSQh… | I recommend the book 'Superintelligence' by Nick Bostrom. It's brilliant and loo… |
| rdc_hm7r4eo | The world’s always been inhospitable lol. It didn’t just start to happen. We’re … |
Comment
Money money money... Tesla wants to make the most money possible. OK, let's grant that - if Tesla releases self-driving software that crashes a bunch, everyone stops buying it and Tesla gets inundated with lawsuits and the company either goes bankrupt or limps along, a shell of its former self. So if Tesla wants to make the most money, by FAR, what should it do? Release a self-driving car that is safer than humans. The argument they're trying to save a few bucks in hardware to try to make more money is preposterously stupid. Working, safe, reliable self-driving is the difference between being the most profitable car company ever and risking bankruptcy.
Source: youtube · Topic: AI Harm Incident · Posted: 2022-09-04T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxCxYZo9m_LHfHEkp14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKUOLxpSSl60xSwC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyD6nQYTbMDdVNZHn94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy3SrPcKbufoo9yxx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJBR2a-RPrCK4RpP14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyt-Fa7mbWvHZ2gW5R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxlZO5_AoMLXPOG5Qp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgysN3F3pzx6MrDVMVZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxu19QGgTea-JCpAHB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx51pLGEND4ettC6SR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
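To reproduce the lookup and Coding Result views above programmatically, the sketch below parses a raw LLM response and extracts the coding for a single comment ID. This is a minimal sketch under stated assumptions: the response is a JSON array of objects with `id` plus the four dimension keys visible in the sample, and the `RAW_RESPONSE` string, `DIMENSIONS` tuple, and `lookup_coding` function are illustrative names, not part of any actual tool.

```python
import json

# A raw LLM response in the shape shown above: a JSON array of coded comments.
# (Illustrative two-record snippet; in a real tool this would come from the
# stored model output, not a hard-coded string.)
RAW_RESPONSE = """
[
{"id":"ytc_UgxCxYZo9m_LHfHEkp14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKUOLxpSSl60xSwC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
"""

# The four coding dimensions visible in the sample output.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coding for one comment ID."""
    records = json.loads(raw)
    for record in records:
        if record.get("id") == comment_id:
            # Keep only the known dimensions; ignore any extra keys.
            return {dim: record.get(dim) for dim in DIMENSIONS}
    raise KeyError(f"comment ID not found: {comment_id}")


if __name__ == "__main__":
    coding = lookup_coding(RAW_RESPONSE, "ytc_UgxCxYZo9m_LHfHEkp14AaABAg")
    # Render a table like the Coding Result view above.
    print("| Dimension | Value |")
    print("|---|---|")
    for dim, value in coding.items():
        print(f"| {dim.capitalize()} | {value} |")
```

A natural extension would be validating each value against the vocabularies seen in the sample (e.g., responsibility in {company, user, ai_itself, distributed, none}), since malformed model output is exactly what a raw-response viewer exists to catch.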