Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "When my dad was showing me a bunch of ai shit to draw and I had to explain what …" (ytr_UgzU9h2zJ…)
- "Sorry but ai art generation is good for people who don't have the time to devote…" (ytc_UgwEpkEuj…)
- "You'll still need an actual person behind that wheel in a semi... and *no AI is …" (ytc_UgjRWr86x…)
- "For every 1 genius creating something truly awesome with chatgpt and AI, there w…" (rdc_jklk2x7)
- "Trust me I have used copilot and it doesn't generate the perfect suggestions eve…" (ytc_UgzdcrsYK…)
- "What's biological intelligence is never been discussed like how the electron cho…" (ytc_Ugy-FccCi…)
- "While I understand how people can use for bad purposes, like we do with other st…" (ytc_Ugw7HapU2…)
- "But if the AI companies will replace almost if not all their employees and human…" (ytc_UgxdZ6obi…)
Comment
Wait the AI a nazi made became a nazi? Wooaaah who coulda seen thaaaaat
But yeah no, it's almost the perfect storm of ensuring The End. Capitalism never unhooks from money-makers until they stop making money. Ppl, wittingly or unwittingly, gave them a monster capable of ending humanity, which is bad enough, but they made it a business that could rake in cash. And that's all it took. In the end the extinction of humanity, peeling back all the layers and details, will and has always been Greed. Climate catastrophe from ravaging the earth, or nuclear annihilation from those who wany more power, a disease that fails to be stopped because the cure isn't good for shareholders. Greed will always be what ends us, the only question is in what form will the final bell ring, and when will it be struck?
Platform: youtube · Topic: AI Moral Status · Posted: 2025-12-15T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx7JNfbcvWlgsraDR94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyYPftS1TpOsFeHs0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzF2eBZVfkglB_garB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwncipLIvZXpDP72fN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxjJEOrEoSXOjjCqqF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyyY1SMsloxpUoPCct4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzxYWbzmNW9IHlBD1J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyqQ59snl980pwFDL54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzeeuIipDwmUxx84hd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyuEWDxovxc8xKaQpN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
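Because the raw LLM response is a plain JSON array of coding records, looking up a comment's coded dimensions by ID is a matter of parsing the array and building an index. A minimal sketch: only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the dump above; the variable names and the embedded sample records are illustrative assumptions.

```python
import json

# Two sample records in the same shape as the raw LLM response above.
# In practice this string would be the model's full JSON output.
raw_response = '''
[
  {"id": "ytc_UgwncipLIvZXpDP72fN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyuEWDxovxc8xKaQpN4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
'''

# Parse the array and index records by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up the coded dimensions for one comment by its ID.
coding = by_id["ytc_UgwncipLIvZXpDP72fN4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # outrage
```

The same index supports rendering a detail view like the "Coding Result" table above: fetch the record by ID and display each dimension/value pair.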