Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We will seize back the means of artistry from the blue blooded bourgeoisie and h…" (ytc_UgyRgt__n…)
- "This is the same bs that Amazon chat bots have been doing for the past 2 years. …" (ytc_Ugw_qQgg-…)
- "I suspect this script was written by AI to warn us about the dangers of independ…" (ytc_UgxIneAbA…)
- "Because it’s AI enhanced…? I don’t understand. This is upscaling and probably do…" (ytc_UgwXjWzbh…)
- "I think your example lacks any actual grounding in reality. First and foremost, …" (ytr_UgzZAlbgD…)
- "This was a much needed interview. While everyone was singing to the Gallery, it …" (ytc_UgxvadcZD…)
- "AI should has been created for assistance not replacement, but those greedy CEOs…" (ytc_Ugwqr9YyI…)
- "Most scary it's if people really are stupid to make AI and to have trust in robo…" (ytc_Ugx4Uq2GK…)
Comment
Knowing human nature, why will those who control the robots want to build them for humans who will have no purpose but to consume? It would be easier and simpler not to build robot to serve 8 billion emotional, trouble making humans. When they can build enough robots to take care of all their needs for the controllers and their families? And if robots can work for free, why would we need money? Control of the robots, not money will shape the future.
And don't think you can just shut the robots down. With AGI they can think a thousand times faster than you and shut you off before you can shut them off.
Platform: youtube · AI Jobs · 2026-02-20T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwI92fA5SrALfYyUTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugww1XojyRhKvC3Mb9V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwaQIM07y-f0XzXKrF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugza7wiYpqlfkA-0Spt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx3sJk52IXv2Nfq6tZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzB7oL20f-iaPfpjuZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz3vFXRFmBDun4WoYp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyWEuaWNG9OM4yVnJV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugylaj5EUoeaK57R-xl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwBaPwoI7hytk_sb1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```