Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples (click to inspect):

- `ytc_UgywqUIcf…`: "Individual workers know that they are using AI to assist on tasks. They do this …"
- `ytc_UgyMVry6K…`: "A few years ago when I used to work in IT operations I thought all of that work …"
- `ytc_UgzlcJ2F6…`: "Anyone defending AI are just making excuses to be lazy and steal. I rarely say s…"
- `rdc_gtcx22c`: "> People are absolutely right when they say the planet doesn't give a fuck ab…"
- `ytc_Ugy4f2d5T…`: "It's the freckles. If I'm not mistaken, AI's textures are run on something along…"
- `ytc_UgxvBMXFx…`: "Everybody suddenly gets scifi brain whenever LLMs that are being called AI comes…"
- `ytc_UgzcwETlj…`: "I don't see AGI Superintelligence wiping out humanity. We are a necessary link …"
- `ytc_Ugyif0-IX…`: "Omg this facial recognition is racist and it need to be canceled immediately. Th…"
Comment

> UBI is such a useless idea...instead it must be made financially worthwhile for (fit and healthy) people to SHARE the jobs we would agree we actually NEED people to do and work LESS. Then there would be no such thing as unemployment or any shortages of jobs almost regardless of how much of the work robots, machines, automation or AI takes over doing for us.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Jobs |
| Posted | 2025-09-09T14:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzKCpy0pzuJ5ABiXC14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx8hErxXc9PG3q17rx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwb_uiTNKAZYW-GX5B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw02Ina63IW9EYX1Xh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzv2VI3wGQwmn0COHt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuSXBAvaMHR1I0NJh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYyrC6CAM6ardgIZ94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxS2J_4dzEOPhwYG4J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwWzF-JOfa4OBynzkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzEIGe21ytXyUdgAyN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
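A response like the one above can be parsed and indexed by comment ID before its labels are stored. Below is a minimal Python sketch of that step; the `ALLOWED` label sets are an assumption reconstructed from the values visible in this response and the Coding Result table, not the full codebook, and the function name `parse_batch` is hypothetical.

```python
import json

# Assumed codebook: label sets inferred from the labels visible in the
# raw response above; the real codebook may contain more values.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments),
    reject any out-of-codebook label, and index the rows by comment ID."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row from the response above, as a smoke test.
raw = ('[{"id":"ytc_UgwWzF-JOfa4OBynzkd4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(parse_batch(raw)["ytc_UgwWzF-JOfa4OBynzkd4AaABAg"]["policy"])  # regulate
```

Validating against a fixed label set at parse time catches the common failure mode of batch coding with an LLM: a response that is syntactically valid JSON but drifts from the codebook (a misspelled or invented label).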