Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- ytc_UgwcJIE0h…: "Wait. Robots are going to be involved in slavery if we can't fix the issue? I wo…"
- ytc_Ugz1JxtoK…: "For every conscious moment of human awareness there is a feeling. There is a con…"
- ytr_Ugy1YX2ae…: "Yeah. I once asked AI to retrieve a list of actions from a tree like structure. …"
- ytc_Ugwl4Xehi…: "Art dude talks nonsense the pictures look the same to me. you are living a delus…"
- ytc_UgyBx9DBd…: "I want nothing to do with AI. I know I unwittingly use it now, but I’ll never a…"
- ytr_Ugzact06O…: "It seems like you're expressing concern about humanity creating powerful technol…"
- ytc_UgyQ2ekAQ…: "I respect artists, I've paid for a bunch of commissions for DND characters, but …"
- ytc_UgwZG0jrz…: "I was like that. I totally believed that I just wasn't talented, and COULDN'T ma…"
Comment

> The age of AI has already begun, and humanity can no longer turn back.
> If democratic nations do not evolve rapidly, they will inevitably be devoured by authoritarian regimes that have already mastered AI.
> If we have the power to create AI and raise profound questions, then we must also be responsible for finding the solutions.

youtube · AI Governance · 2025-06-18T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzKMW6Y0OT6hTaUGE54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugysbuq9403cPD3mZNt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwubkC7tifs_5n2CJx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwZqxWqP4bKiWoIE0x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw9f9mpxFp6z1nIghR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxECXqYH8qfn5mzWq14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyiQg1FlDRUPSDGDrV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwmtC7fi939-RZLR_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz3CfR3V2PQ8YV0jCh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxOK-dI469z0yYDLKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
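The raw response above is a JSON array of per-comment codes across the four dimensions shown in the result table. A minimal validation sketch in Python, which parses such a response and checks every record against a set of allowed labels (the allowed values here are inferred only from the labels visible in this dump, not from the actual codebook, so treat them as an assumption):

```python
import json

# Allowed labels per coding dimension -- inferred from the values visible
# in this dump; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dump start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example: one record copied from the raw response above.
raw = ('[{"id":"ytc_UgzKMW6Y0OT6hTaUGE54AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate_coding(raw)))  # -> 1
```

Validating at parse time catches hallucinated labels before they reach the coding table, which is the main failure mode when an LLM is asked to emit a fixed taxonomy.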