Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
Random samples
- `ytc_UgxPfpVvV…`: That guy with the hat, artificial intelligence has been a goal all your lifetime…
- `ytc_UgzcIgSgk…`: AI might be able to do your job better than you, but it doesn’t have a will of i…
- `ytc_Ugxceco2R…`: AI is far worse than nukes. So let’s put them in our brain?🤔🤷♀️🙄 so says Elon m…
- `ytc_UgxV5V41n…`: I welcome the world of A.I’s. Sure wish I could live long enough to see it.…
- `ytr_UgygfiL25…`: @KaiseaWings I can't recall right now who it was who said (someone in the comic …
- `rdc_cfkw1sr`: There's an old saying. "Without Ukraine - Russia is a country, With Ukraine - Ru…
- `ytc_UgxPBQ1N_…`: No such thing as "predictive policing." This is ridiculous. It like trying to pr…
- `ytc_UgzwRy1lJ…`: Turn your heater down a degree if you're against climate change. No we will not …
Comment
AI makes everything exponentially more complex as it relates to unaligned human beings regarding the collective good. Considering that the cat is so far out of the bag in regard to "AI alignment" the development of such tools as a reactionary alignment scheme seems rather quaint at this point.
youtube · AI Responsibility · 2023-11-15T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy9GIq7u3cF4CgnN4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwI8HgK2aXkyCSJ3Wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugza8zGRkNc2m3u-CDV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTJtdGgVQyqHC6Kc14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzkm-k9OVFneEaNdl94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx33c4NdMdI4YOFCX14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwdyJx3GX9fk6pZ0xB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwo8Hn-KQ8tvJxt4Kh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwVhoZ0THH-AYd7FFV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwZ3bVGiCasQQCmXXp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
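A raw batch like the one above can be parsed and screened before ingesting it into the coding results. The sketch below is a minimal example, assuming the label sets are exactly those observed in this sample (the real coding scheme may allow additional categories): rows with a missing `id` or an out-of-vocabulary label are dropped rather than stored.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define more categories than these.
SCHEMA = {
    "responsibility": {"none", "company", "user", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip rows without a comment ID
        # every dimension must be present and carry a known label
        if all(row.get(dim) in labels for dim, labels in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical two-row batch: the second row has an unknown label and is dropped.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"resignation"},'
       '{"id":"ytc_y","responsibility":"bogus","reasoning":"mixed",'
       '"policy":"ban","emotion":"fear"}]')
print([r["id"] for r in validate_batch(raw)])  # ['ytc_x']
```

Filtering at ingest time keeps the results table clean even when the model occasionally emits a malformed row or invents a label.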