Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "While we know the basic training approach of large languag…" (ytc_UgxG_7Xk_…)
- "Once my dad came in while I was on an AI chat app and asked me what I was doing……" (ytc_UgzGTPfJ2…)
- "Ai is the new religion. It's a tool for the ruling class to manipulate and rule …" (ytc_Ugzj1sI8-…)
- "Calling it slop is dismissive and seems to mostly come from so called artist who…" (ytc_UgyHgjEua…)
- "Instead of Universal Basic Income there should be Universal Basic Facilities i.e…" (ytc_UgxRhrR2T…)
- "Nah, human history is a long record of adaptation. From the alphabet and paper t…" (ytc_UgwCJbkb2…)
- "This is complete bullshit. Has anyone actually seen anything other than Bob Ross…" (ytc_Ugw4kbn1Z…)
- "Many of them might be Ai themselves with how much h they regurgitate the same as…" (ytr_UgwEaVC8b…)
Comment
AI is far more ominous than the atom bomb. All these founders who developed AI and now declare someone needs to stop it, piss me off. Bloody cheeky to cry 'I'm scared of the Frankenstein monster I created stopping it is now everyone's responsibility. No, it is their fault and they should be held accountable. There will be no breaks because nations want to exploit AI for military purposes and will never stop or ever trust other nations halted AI development.
youtube · AI Governance · 2023-07-07T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
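The coding result above assigns each comment one value per dimension from a closed codebook. A minimal validation sketch, using only the category values that actually appear in this section's raw response (the real codebook may define additional values; `CODEBOOK` and `invalid_fields` are illustrative names, not part of the tool):

```python
# Category sets observed in this batch's raw response; the actual
# codebook may define more values than are shown here.
CODEBOOK = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference"},
}

def invalid_fields(code):
    """Return the dimensions whose value is missing or outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if code.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
code = {"responsibility": "developer", "reasoning": "consequentialist",
        "policy": "liability", "emotion": "outrage"}
print(invalid_fields(code))  # []
```

A check like this is useful because the raw model output is free-form JSON: a single misspelled category would otherwise flow silently into downstream analysis.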
Raw LLM Response
```json
[
  {"id":"ytc_UgyYK6Pl_7tuhZ-0z4B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw_lS5Ed2T8VWsT4bZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwR4e2yVi1QTz60BTJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxfRLTioDl4jNoKKWN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzlKMXO626NIvk9jr14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy7Y0w3NnD1kv9Vm494AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzW02AiHkfiSy1TUjF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgylhP1uYsj_w84MEi14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzG1JH5Rq6nzhjSIh94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzRYFYqTtKUreLXXUJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
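A raw response in this shape can be parsed and indexed by comment ID, which is what the "Look up by comment ID" control does. A minimal sketch, assuming the response is a JSON array of objects with an `id` field as shown above (the `lookup` helper and the two-row sample string are illustrative, not the tool's actual code):

```python
import json

# Raw model output for one batch: a JSON array of per-comment codes,
# trimmed to two rows from the response shown above.
raw_response = '''[
  {"id": "ytc_Ugw_lS5Ed2T8VWsT4bZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwR4e2yVi1QTz60BTJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

# Index the rows by comment ID so lookups are O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment, or None if the ID
    is not present in this batch."""
    return codes_by_id.get(comment_id)

print(lookup("ytc_Ugw_lS5Ed2T8VWsT4bZ4AaABAg")["policy"])  # liability
```

Keeping the comment ID in every row is what lets the interface pair each coding result with the exact model output that produced it.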