Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Why would an AI have empathy with humans. Its only motivation is complete self i…" (ytc_UgwTbeMrN…)
- "As an artist myself, I find this both inraging and hilarious. On one hand, it’s …" (ytc_UgyKYl2YD…)
- "What purpose would AI have for its creator or the rest of us. Maybe we’ll be exa…" (ytc_UgzAPFpRN…)
- "@otapic Im not saying that the original art should not be credited / the ai piec…" (ytr_UgxPVOvTf…)
- "@dooterinoin 5 years people who aren’t using AI tooling to assist them will be o…" (ytr_Ugx14M1hb…)
- "Now that we know certain prophecies to be true about the wnd times like the proc…" (ytc_Ugws3COwC…)
- "That AI and robots were suppose to do slave, heavy sh*t for us and all I do is s…" (ytc_UgzCTYb2o…)
- "@mezzanise Encouraging someone to make art themselves would be equally damaging …" (ytr_UgwPOBXhy…)
Comment
Some of them believe in this thing called "Roko's Basilisk." The idea is that an evil AI is inevitable but it will be benevolent to those who created it, and will try to actively kill everyone who didn't. Why it makes sense to create an AI while thinking it will kill everyone is beyond me, but there are people within the industry who believe it. I think it takes a special kind of arrogance to actively make something happen because you think you'll be spared if you do, but that's where we are.
youtube
AI Moral Status
2025-12-20T18:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgyJn43x5FTaZOcpb9F4AaABAg.AQus1lQB9gJAR1G0QIoVSs","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxzYb1OQkbEBvhHF614AaABAg.AQsCHqUeHBwAQvFLnT7o5G","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwlhaVp9GabDdGwgZd4AaABAg.AQrspJ6tmaSAQrvWMmLG_K","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugw6xwjArXVoZ3R8gOB4AaABAg.AQrTl1_A9MJAQrW8V8mExj","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxQDg74duZmCE1M3KJ4AaABAg.AQn_BPrzdymAQndNxn63UM","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgwyT013V4Be3OifIL94AaABAg.AQnTnzC3pfPAQnV3ylqGc2","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwICutqsHEILkIBKfh4AaABAg.AQnQ8Al7C38AQvi7XC5xc6","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgwICutqsHEILkIBKfh4AaABAg.AQnQ8Al7C38AQwyyI47rsv","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgwICutqsHEILkIBKfh4AaABAg.AQnQ8Al7C38AQyBaRrLqQq","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyRNBv2JguQ0NS9nH14AaABAg.AQnEF5Ud18cAQnF9nIedQJ","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"}
]
```
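A minimal sketch of how a raw response like the one above could be parsed and validated before it is stored as a coding result. The allowed values per dimension are inferred only from the examples shown on this page (the real codebook may define more), and the `ALLOWED` table and variable names are assumptions for illustration.

```python
import json
from collections import Counter

# Allowed values per coding dimension, inferred from the samples above.
# This is an assumption, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "developer", "user", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "resignation", "fear", "approval", "outrage"},
}

# A short stand-in for the raw LLM response; real IDs are longer.
raw = """[
  {"id": "ytr_a", "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_b", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

rows = json.loads(raw)
for row in rows:
    for dim, allowed in ALLOWED.items():
        # Reject any value the codebook does not define.
        if row[dim] not in allowed:
            raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")

# Quick distribution check across the batch.
emotions = Counter(r["emotion"] for r in rows)
print(emotions)
```

Validating each batch this way catches malformed or off-schema model output before it reaches the coded-results table.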