Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “Train how to use AI” is literally so stupid, now I think of it. What do you mea… (ytc_UgxHS_-MC…)
- What they aren't telling you is that lots of the AI capabilities will just be us… (ytc_Ugwt12FxK…)
- The thing is AI is actually unbelievable amazing but we humans just use it for t… (ytc_Ugwdpl27G…)
- So i started to use character ai... I think it's supposed to make you sus… (ytc_UgxfFmwMC…)
- America would never transition our ICBM arsenal to AI. I used to build Minuteman… (ytc_UgzKB-h2a…)
- Debatable. Lots of developers are going for the most human like behaviors possib… (ytr_Ugz_XT4Bm…)
- The world is slowly devolving into a dystopian reality… Sad we get to see societ… (ytc_UgzzkyoTA…)
- If it’s being used in an industry (random example could be Disney or something) … (ytr_UgygR4-0C…)
Comment

> The most powerful people on the planet are almost unanimously in adamant support of population control, limitations on resources, and climate control. They have the financial power and influence to drown out any criticisms towards the development of AI that can help them achieve those goals. They just need to make sure that they’re in a protected status. Why would they care if a few billion ants die to help them achieve unchallenged domination?

youtube · AI Governance · 2023-07-07T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
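Each coded comment carries four categorical dimensions. A minimal validation sketch is shown below; the allowed value sets are inferred from the sample responses visible on this page, not from an authoritative codebook, and `validate_coding` is a hypothetical helper name:

```python
# Allowed values per dimension, inferred from the sample responses on this
# page (assumption -- adjust to match the real coding scheme).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "unclear"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above:
coded = {"responsibility": "company", "reasoning": "deontological",
         "policy": "regulate", "emotion": "outrage"}
print(validate_coding(coded))  # []
```

Running a check like this over every LLM response catches off-schema values before they reach the results table.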
Raw LLM Response
[
{"id":"ytc_UgyQECargBk5B23kTDd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxm8BPuSdlH3aPqtDB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx3kZEvFj9excJq1MN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw24RVdbcY0RwxjLq14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxpSKla79qYfqkPHvZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyBXiDoypIX8mp6u-l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwLBxgIRQjCkaI_MHh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyS3LLB14HsF6u8EK54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyUO0uaCZaqBUrr-iV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwzclVRAaVq37_vhwt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
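The raw response above is a JSON array with one object per coded comment, so the "look up by comment ID" step can be sketched as parsing the batch and indexing it by `id`. The field names come from the sample response; the helper name and the shortened two-record payload are illustrative:

```python
import json

# Raw LLM response: a JSON array, one object per coded comment
# (two records copied from the sample response above for illustration).
raw_response = """
[
  {"id": "ytc_UgyQECargBk5B23kTDd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyS3LLB14HsF6u8EK54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batched coding response and index its records by comment id."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgyS3LLB14HsF6u8EK54AaABAg"]["emotion"])  # outrage
```

With the index in hand, inspecting the exact model output for any coded comment is a single dictionary lookup on its `ytc_`/`ytr_` ID.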