Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Thanks for upholding human level perception which is something these early AI ro…
ytc_UgyzcZxQe…
I think that if there was an AGI and it was able to upgrade its self millions of…
ytc_UgwHI3YS3…
The difference is that even in digital art you still need something called SKILL…
ytc_UgzmNHgjJ…
By 2032: System reset and GSM cold phase will run in parallel. AI control is the…
ytc_Ugx59XvAf…
Nobody Doesn't like the feeling of being replaced by something. Like 5 years ago…
ytc_UgzPvpOqa…
"We have enough cases."
Should be, "we have too many cases."
I love Judge Fleisc…
ytc_UgzGGJsN9…
i always call them "ai image creators" because feel more accurated, never feeled…
ytc_Ugy1ekcrt…
I took a course at Harvard last year, and the way they handled their AI was that…
ytc_Ugwtwspeb…
Comment
I think politicians might actually surprisingly enough save us on this one, because they wouldn't give away their power to that degree to some robotic system. If everything is done by a robot - there's no society to dominate and rule over. If there's no one or only small amount of people who's able to buy the goods or services produced by AI - it's a crash and anarchy, they wouldn't want that
youtube
AI Governance
2025-10-20T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyz0fUVjo8KhsTChE14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzvv6WQbzp6mMMUTGZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDBSGs6_clTdfOnFB4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyZbtSV7KETijOaRrR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRYZfG41cE5p6nAPx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyhval6FlHID9VJvqB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwLNgoykqKc2azpb_R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyQD0t0cr-Heag4B1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxH-SLMrumKP4NFugt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzq_Cx5LNH8RU19Ayx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"}
]
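The raw response above is a JSON array of coded records, one per comment, with four coding dimensions plus a comment ID. A short script can parse such a response and sanity-check each record before it is stored. This is a minimal sketch: the field names follow the JSON shown, but the allowed value sets are inferred only from the entries visible here, and the real codebook may permit more categories.

```python
import json

# Allowed values per dimension, inferred from the records shown above.
# The full codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "government", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "indifference", "approval", "fear", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_Ugyhval6FlHID9VJvqB4AaABAg",'
      '"responsibility":"government","reasoning":"deontological",'
      '"policy":"none","emotion":"mixed"}]')
coded = validate_codes(raw)
print(coded[0]["responsibility"])  # government
```

Validating before insertion catches the common failure mode where the model emits an off-codebook label, so malformed batches surface immediately instead of polluting downstream tallies.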