Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Yes! AI would not be nearly as scary if it were being introduced during a more r…
ytc_Ugz3PMtDX…
2 minites in and, gasp... the AI is acting exactly how humans would only more so…
ytc_UgyHcEQ8A…
pure stock manipulation while downsizing due to the economy. If the SEC was aliv…
ytc_UgyV-TOGK…
I feel like the AI engine is taking conspiracy theories about it and using that …
ytc_Ugwy27ypM…
I can understand someone using AI as an "what would X or Y look like?", but post…
ytr_UgyRWO8aW…
Dude was clearly outrage farming so he can spread the AI bro mantra of "anti-AI …
ytc_UgxnBLdI2…
Did YOU even check the Luddite history? They were just workers that just wanted …
ytc_UgyWEXtYT…
It's the fact that it is a beta product that only recently got very good. People…
ytc_UgxBa7yaK…
Comment
There are significant oversights in this scenario.
1. AI replacement will not strictly follow a linear, bottom-up pyramid structure. Rather, the hierarchical structure itself is unlikely to survive.
2. The focus is solely on corporate AI, but AI will also be adopted in politics, law, and systemic structures. It is impossible that corporations alone will possess AI.
3. You are oversimplifying the increase in productivity and failing to mention its benefits.
4. Will productivity growth really manifest as just a simple reduction in labor costs? If so, developing nations would be the most productive countries.
To be honest, this is just a cheap dystopia. Just ask AI what kind of dystopia lies ahead.
youtube
Viral AI Reaction
2025-11-27T18:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwLW50Igp1EvFne9yN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugzpjw7u_swLSSX8_S54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzvTyNOib_XaT5tpGJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx0qbx5qQtLGHXM6AZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgwDXc7Ctb77r9Wntil4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyyc-YRJXM6Y4u3SRx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxRsnI1EvUhP06JM0N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwJw2DzWyVsSQlQdVV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzWYa8JtnKBvAop2K14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzWGlOWULVeRUA2QTt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
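The raw response is a JSON array of per-comment codes across the four dimensions shown in the table above. A minimal sketch of how such output might be parsed, validated, and indexed by comment ID — the allowed value sets below are inferred from the codes visible on this page, not from any published codebook:

```python
import json

# Allowed values per dimension, inferred from the codes visible on this page
# (an assumption; the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject any record whose value falls outside the expected category set.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

With the response above, `parse_codes(raw)["ytc_Ugzpjw7u_swLSSX8_S54AaABAg"]` would return the distributed/mixed/unclear/mixed record matching the Coding Result table.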