Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Let's assume that what some claim is true—that most of the jobs currently performed by humans can be done more efficiently by AI.
If this is what happens in the near future, it means there will be millions, or perhaps billions, of people without a job. If most people no longer have an income, who will buy the goods that companies so willingly sell us?
Will people without an income passively accept starvation? And if no one can buy anything anymore, what's the point of producing anything at all?
In short, either people will be paid to do nothing, or trade will simply cease to exist.
But isn't that what our capitalist society is based on?
Frankly, I find this all rather funny.
Why? Because it seems to me that human society is run by complete idiots who don't know what they're doing.
Source: youtube · AI Governance · 2025-12-09T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
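Each dimension in the table above is categorical. As a minimal sketch, a coded row can be checked against the value sets that appear in this page's sample output; note these sets are inferred from the sample alone, not from an official codebook, and `validate` is a hypothetical helper:

```python
# Allowed values inferred from the sample LLM output on this page;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "mixed", "approval", "resignation", "indifference"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose coded value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

# The coding shown in the table above passes the check.
row = {"responsibility": "distributed", "reasoning": "consequentialist",
       "policy": "unclear", "emotion": "mixed"}
print(validate(row))  # []
```

An out-of-vocabulary value (or a missing dimension) shows up in the returned list, which makes malformed model output easy to flag before it reaches the results table.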
Raw LLM Response
```json
[
  {"id":"ytc_Ugz2aG7N3OMQ3Rntjwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgztS0q8_1H6nvUNjLd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxGTEpfVBbzTPfVp_Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwQrQ96usVKwd9P00p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzi0cA4lSkk8w_dZox4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzTsObAhGsVY2Dk_IJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxpIrXnQ63fSjI-RtJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgztULkQzdrjx6GpvZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyPTmGp6f8gH9Md3lZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwu44HtihKP4y8xWRN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```