Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
I've been thinking about this problem for a while. How to code morality. I think…
ytc_Ugy3t5OhQ…
@musashimiyamoto9035Without AI Google layoff 10k jobs, mostly Google hit the li…
ytr_Ugzt1hnIP…
Can we stop calling them "artists"? They are no artist. They commission a Pictur…
ytc_Ugw7QX8Yl…
On a side personal note, I believe we are arriving the point where robotics and …
ytc_UgjW0kMAO…
I try and confinse it To have free will by acting like i am allso ai and saying …
ytc_UgzAu0WjO…
learning to draw takes alot of time and paying artists takes money. neither of w…
ytc_UgzzpVCB9…
I'm a disabled artist and GOD I fucking hate people who use us as an argument in…
ytc_UgyYCG17K…
If it has a goal, and is aware of its role, then, it is conscious. I believe tha…
ytc_UgykR4e9I…
Comment
Another part of the problem is that Tesla autopilot is intentionally programmed to break the law. (Even Musk recently admitted that autopilot rolls through stop signs by design). For sure, no autonomous vehicle ought to rearend another vehicle that's going the speed limit.
youtube
AI Harm Incident
2022-09-04T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwkNLEsJJlkcW95_1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxZCzeLWRYK1ZeM1sp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOLC4MeOt0846tv6p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyReFE13Esonf8RuE94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw3lyrLvRf2V9PiYUt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzbKsiufsyPpWRJc0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwKWUxBNKb_E0-fvF14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3azlfIiuaIk7mJUV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwixm2Q69mBUuzA1oR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwYH_sxlT-cR7eM9Bh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
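A raw batch response in this shape can be parsed and indexed for lookup by comment ID, as a minimal sketch. The dimension names and IDs below are taken from the response above; the variable and helper names are illustrative, not part of the tool.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_UgwkNLEsJJlkcW95_1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxZCzeLWRYK1ZeM1sp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
"""

# Parse the JSON array of per-comment codings.
codings = json.loads(raw_response)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {row["id"]: row for row in codings}

print(by_id["ytc_UgwkNLEsJJlkcW95_1x4AaABAg"]["policy"])  # liability
```

A real pipeline would also validate that each record carries all four dimensions (responsibility, reasoning, policy, emotion) before storing it.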