Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Great discussion. Reminds me that the disturbing thing about Lord of the Rings is that no one can be trusted with the Ring of Power. Gandalf is wise enough to understand even HE can't be trusted with it. It is comforting to think that the future can be secured as long as Ai development is kept away from bad actors and managed solely by good people. But if Tolkien's assessment of humanity is correct, the threat is not external to us, but within us when confronted with the possibility of wielding absolute power. While we can aspire to be Gandalfs, seems to me we're running low on Frodos.
youtube · AI Moral Status · 2025-11-07T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxn5ipi2RXqS-OCfyN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwVMuNj0Ht7jJHamMN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzfjVjUN5_VvtuQxI94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzrS-iMyGDbBbhq9Wl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxAIPAJip9IRsErZkZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxjzvqUWJicorFntxt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGAGGZIz-5DspMn4J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5l42IHAIaY-kUg_V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzEqJedDVi3v8AbWw94AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJ4kM03PZcC6RdIrV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
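The raw response is a JSON array, one object per comment, keyed by comment ID with one value per coding dimension. A minimal sketch of how such a response could be parsed and validated, assuming the allowed values per dimension are those seen in the examples above (the `ALLOWED` sets are an inference, not the actual codebook):

```python
import json

# Assumed value sets per dimension, inferred from the sample output above;
# the real codebook may define more or different codes.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: codes},
    rejecting any value outside the assumed codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

# Example: the row for the comment inspected above.
raw = ('[{"id":"ytc_UgzEqJedDVi3v8AbWw94AaABAg",'
      '"responsibility":"user","reasoning":"contractualist",'
      '"policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_UgzEqJedDVi3v8AbWw94AaABAg"]["policy"])  # → regulate
```

Looking a comment up by ID then reduces to a dictionary access on the parsed result, which is exactly what the inspection view above surfaces.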