Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below.
- "Very relieved to finally see a member of Congress speak about what will eventual…" (`ytc_Ugwc1LRvE…`)
- "This could have been 30 second clip. It’s algorithms pushing AI to your feed. It…" (`ytc_UgzSFfbho…`)
- "It doesn't matter to billionaires, they'll "buy" and have the best security in t…" (`ytr_UgxD8lsYk…`)
- "Palantir drone blowing up a toyota in afganistan with 10 innocents (7 kids)- whi…" (`ytc_UgyQW8Ki2…`)
- "@воининтернета hostility is a form of drive, and drive is something AI has to be…" (`ytr_Ugw3Zv8zb…`)
- "Please don't stop creating art. I love to look at art pieces made by humans. And…" (`ytc_UgyQA2V_L…`)
- "yea but do u want to continue to living in a system where in order to survive u …" (`ytr_UgxfFZywJ…`)
- "What I think a lot of people are missing is the fact that, as humans, we don’t w…" (`ytc_Ugx_MkJD5…`)
Comment
I think a lot of it is what we choose to accept as "good enough". If AI art is "good enough", we won't need real artists. If AI music is "good enough" we don't need musicians. If AI movies are "good enough" we won't need filmmakers. If AI podcasts are "good enough", we don't need podcasters. If we start to demand more rather than just accepting "good enough", then AI will take a lot longer to replace us. Human authenticity is one of the few things we have left, and we need to hold on to that.
youtube · AI Governance · 2025-09-05T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwtMZ498dGVfo_bcHd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwn4LMAaKJFfknwwI54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy736Rkwl_EJBQ7tyB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQ_XSNGfoAHITRtKV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyUAZPnKQPOmFODa_94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyyrIbaG3NGt7l-Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugysq7uIQRYlKcYFRO94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzQAc0zUuP3Vz60qzZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzKf2QovsHBfISLLsp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx9QPKaz0BPMgRvwT14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
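As a minimal sketch of how a raw response like this might be consumed downstream, the JSON array can be parsed and indexed by comment ID, checking that each record carries the four dimensions shown in the Coding Result table above. The function name and the validation rules here are assumptions for illustration, not the tool's actual implementation.

```python
import json

# The four dimensions come from the Coding Result table above.
REQUIRED_DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record lacks an "id" or any of the four
    coding dimensions. (Hypothetical helper for illustration.)
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing 'id': {rec!r}")
        missing = [d for d in REQUIRED_DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{comment_id}: missing dimensions {missing}")
        coded[comment_id] = {d: rec[d] for d in REQUIRED_DIMENSIONS}
    return coded

# Example with one record from the response above:
raw = '''[
  {"id": "ytc_Ugy736Rkwl_EJBQ7tyB4AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "approval"}
]'''
coded = parse_coding_response(raw)
print(coded["ytc_Ugy736Rkwl_EJBQ7tyB4AaABAg"]["emotion"])  # approval
```

Indexing by comment ID is what makes the "look up by comment ID" view above cheap: each coded record can be fetched in constant time once the response is parsed.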