Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I feel like content like this and the majority of AI doomerism is just sensation…" (ytc_Ugzdv7gXT…)
- "If most people lose their jobs because of AI, who will have the money to buy the…" (ytc_Ugx3vh5y_…)
- "😂😂😂😂😂, they are stupid. Let's assume they are correct about prediction(they are …" (ytc_UgxshXnE5…)
- "We are meat machines, we do exactly what the Ai is doing. We have to spend hundr…" (ytc_UgyoZ1pQE…)
- "If you believe AI is not controlled by the army, you may be an enemy, and not an…" (ytc_Ugzu3xOO8…)
- "These people who support AI as inevitable are the same people who would tell Ame…" (ytc_UgzXPCmZ7…)
- "Let me explain something ai is literally bullshit ai doesn't exist people to dum…" (ytc_Ugy57MoMO…)
- "You can’t stop progress. If it means that a few billion humans are SOL, so be it…" (ytc_UgySaKBAz…)
Comment
The problem I think is that ai is currently being designed to replace human work not to supplement it. Specialty ai systems for detecting cancer will not replace your doctors but generalized ai that targets white collar work will almost certainly be used to shrink staff. The evidence of this I think is obvious. The most expensive part of an organization in the US has been labor. That is why labor was sent off shore. generalized Ai is the exact same thing it will be used to lower the costs a business has. As for allowing people to work less and enjoy more benefits . . . there is absolute no chance that happens. It has been nearly 50 years since computers revolutionized modern jobs and the hours people work has not come down.
youtube · AI Jobs · 2025-10-07T16:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxrF2BfL85SR17vg7x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxdtPYsWJruyKcUgtd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxAUvYHCAv4WFHw-RB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzsfJ-Zoyqw4cnFSZF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzr1n9epVxKPm2o6IV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwD6QHjJJ0NXZI-LEd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxDCiaDUp3LDg_aH6d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyPJy-18x0aDs3s-5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyDxDhR2weSPFIsgst4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwsJVaGl4kVwSW-8G14AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"approval"}
]
```
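A raw response like the one above has to be parsed and checked before the codes land in the results table. The sketch below shows one minimal way to do that in Python; the allowed values per dimension are an assumption inferred from the codes visible on this page (the real codebook may define more categories), and the record IDs in the usage example are placeholders, not real comment IDs.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# codes shown on this page; the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a string comment ID...
        if not isinstance(rec.get("id"), str):
            continue
        # ...and an allowed value for each coding dimension.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Placeholder IDs for illustration; the second record uses an
# out-of-schema responsibility value and is dropped.
raw = '''[
 {"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_y","responsibility":"nobody","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''
print(len(validate_batch(raw)))  # → 1
```

Dropping malformed records rather than failing the whole batch keeps one bad line from discarding the other nine codes in a response.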