Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "They said AI will be allowed to one day take over the Bureaucracy of the state a…" (ytr_UgyisTuOf…)
- "The thing is. If ai took over everything would look the same. Nothing new to cop…" (ytc_Ugwhsejh7…)
- "and the hypocrisy is they ask ai to write about a specific topic to ask the anot…" (ytr_UgyvVkten…)
- "Reminder Kwame N’Krumah of Ghana wanted to do this back in the 1970s with suppor…" (rdc_et7s8gd)
- "@bensalemi7783 Unironically some might actually think that. “If everyone is in t…" (ytr_Ugy7VQyQj…)
- "Anybody else having strong CATHY NEWMAN / JORDAN PETERSON FLASHBACKS right from…" (ytc_UgzwL2RGQ…)
- "no i completely disagree, i think it will be easier for the AI overlords to just…" (ytc_Ugz9f-S01…)
- "Is this actually real?? Like am not being deceived by the wonders of ai tech…" (ytc_Ugw7bXmRk…)
Comment
Philosophical / theoretical question: He says he believes we are living in a simulation. A simulation is an artificial existence that has us living “outside of reality”, so technically we are already living in a complete construct of artificial intelligence if we are living in a simulation, so why be threatened by the creation of more artificial intelligence, that we create inside the simulation? Do people fear the destruction of the simulation because that is all we are or do we fear the destruction of the simulation would reveal the truth and reality of our existence to us?
youtube | AI Governance | 2025-09-07T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwB5DbfeR-Q0wX5lL14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwn-XCd0zWf4PgK-pJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8jNCUQmFyHWoNGcJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIwtA_f3pIeR3SA2t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzidIC7WvMBhy5g1Ex4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwbp9qrYLzAfI5chF94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxtKlf5ItmrSSxeQid4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxemCj-XncHA7IRe714AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQNReAZ6ntIXRQS5J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgznP2zGYyblXNm7K294AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
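Because the raw model output is a plain JSON array, looking up a coding by comment ID is straightforward. A minimal sketch, assuming the four dimension names shown in the table above (the helper name `parse_codings` is hypothetical, and the input here is abbreviated to three of the ten records):

```python
import json

# Abbreviated copy of the raw LLM response shown above (3 of 10 records).
raw = """
[
 {"id":"ytc_UgwB5DbfeR-Q0wX5lL14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxIwtA_f3pIeR3SA2t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgznP2zGYyblXNm7K294AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
"""

# Dimension names taken from the "Coding Result" table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_codings(text):
    """Parse model output and index the coding records by comment ID.

    Raises ValueError if the output is not a JSON array of objects that
    each carry an "id" plus all four coding dimensions.
    """
    records = json.loads(text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    by_id = {}
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record missing keys: {missing}")
        by_id[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return by_id


codings = parse_codings(raw)
print(codings["ytc_UgxIwtA_f3pIeR3SA2t4AaABAg"]["policy"])  # regulate
```

Validating the structure up front is useful here because an LLM can occasionally return malformed or incomplete JSON; failing loudly on a missing dimension is preferable to silently storing a partial coding.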