Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Ai “artists” defenders are similar to flat earthers: denial, accusations, false … (ytc_UgwzEQoKW…)
- Will ChatGPT access this very video soon? And other similar AI videos and writte… (ytc_UgwLQ0vNk…)
- when America asked the GPT chat who would win the Russians or the Americans alon… (ytr_UgwcfyOeg…)
- I don't think this guy factors in the phenomenon of escalation. 100 years? Bro w… (ytc_UgwbnKg6K…)
- The guy just said for a $20 subscription you can get an AI Bot to do the job. My… (ytc_UgwaOz6Dq…)
- soon you won't have to worry about that because thanks to AI you won't have a jo… (ytr_UgzMKE9zI…)
- Instead of AI, Do it yourself mate. Also using other people's art was always hat… (ytr_Ugyx6sPDv…)
- I just don't want children at all now, just think about what hell teachers go th… (ytc_Ugy20tDCG…)
Comment
So much talk and criticism for AI doing what 1 percenter humans already do. Yet these same people also defend what the 1 percenter humans are doing. Why not take the anti-AI rhetoric and convert it to anti-greed rhetoric instead? Why do so many value greed/money/capitalism/jobs over humans in the first place? Conservatism = love of money. Progress = love of humans. Stop getting distracted!
youtube · AI Governance · 2024-07-01T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgybClLuoAI4Dj241dR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxR--ghM1hBvNBzSzJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx5GuCyK6rZcrRIXUJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyeEXZ7HF1megIaGFx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWuHc7ddca96AxB-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzX8AofQpnBvIeeVPN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx5sPaafb2vTVe_5dF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzXm4X7ZzIxYFAK8OR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugym_-2buvGuCQqDHc54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx3_3OaZfrT9uRNydB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
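A raw response like the one above can be parsed and sanity-checked before its values are stored against a comment. The sketch below is a minimal, hypothetical validator: the allowed category sets are assumptions inferred from the values visible in this sample output, not a confirmed codebook, and `parse_codings` is an illustrative name rather than part of the tool.

```python
import json

# Assumed category sets, inferred from the sample response above.
# A real codebook may allow more (or different) values per dimension.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

raw = (
    '[{"id":"ytc_Ugx5GuCyK6rZcrRIXUJ4AaABAg",'
    '"responsibility":"company","reasoning":"virtue",'
    '"policy":"liability","emotion":"outrage"}]'
)
records = parse_codings(raw)
print(records[0]["emotion"])  # → outrage
```

Failing loudly on an out-of-vocabulary value is a deliberate choice here: it surfaces model drift or formatting glitches at ingest time instead of letting bad labels silently reach the coded table.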