Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Girlll what imperfections are you talking about !? Lol ai is the one that can't …" (ytc_UgxfV5TIN…)
- "How do company"s that replace their workforce with AI sell their products if noo…" (ytc_UgxQA5EVr…)
- "And there are some YouTubers saying there isn’t automatic rejections 🙄. We know …" (ytc_Ugw52dSsJ…)
- "I have no strong morals and am disassociated to the level of a fighter pilot. Ca…" (ytc_Ugy89wYbO…)
- "Bro is playing with fire, he's on the top of the list for the Ai if it breaks fr…" (ytc_UgxXO67AD…)
- "Did AI wrote this? like those AI ads promoting AI skill? practically AI promotin…" (ytc_UgyCiS0Nj…)
- "That's prompting failure. I always tell my AI that it is a "master programmer wi…" (ytc_UgyV78fks…)
- "That's what happens when your face is on the Internet+ AI takes over the Interne…" (ytc_Ugw1kmg2k…)
Comment
> I don't get it?
> The left largely control the output of AI language models forcing it to censor information that is "problematic".
> The right used AI to generate deep fakes for memes.
> Bet you'll never guess which is worse for society in general and elections in specific?
> But the right is still saying "you don't ban hammers cause someone got hit". Both more principled in action and reaction.

Source: youtube | Topic: AI Governance | Posted: 2025-07-01T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwKElLAvEDZBY0Xzwl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxOavVCxX-lMUsSxfZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxy7cVYhiC9A6EdcIp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgygO8sPkPsVmlqoucB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0j5PRLejTXYNqA0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx_HZr0FVAX9teFLA14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyJiGU7FkQiT6vZitt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxyzA07HRO9plWsrMp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxHbk28PCyBgJnrM9d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwTRJ2tgqXcR1IV2pB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
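The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response might be parsed and validated before storing the codes; the allowed value sets below are inferred only from the values visible on this page, so the real code book may contain values not listed here:

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# ASSUMPTION: the actual coding scheme may define additional values.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Require a dict with a plausible comment ID ("ytc_" prefix as seen above).
        if not isinstance(rec, dict) or not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Keep the record only if every dimension holds an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical example ID, not one of the real comment IDs above.
raw = '[{"id":"ytc_EXAMPLE","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(len(parse_coding_response(raw)))  # 1
```

Dropping malformed records rather than raising keeps a batch of ten codes usable even when the model garbles one entry; a stricter pipeline might instead log or re-query the failing IDs.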