Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> AI is extremely stupid to the point where it can't teach you to do long division without 10 prompts. Everybody's way over-playing this thing and the whole narrative about AI replacing jobs is about getting people to enlist in the military, and pumping up institutional infrastructure stocks. Once they do a sell off of the data center builder stocks they'll ease the gas pedal off the fake news AI shit. I don't know if anybody remembers Trumps big meeting with all the tech CEO's where they were talking about 2 trillion dollar deal or something. They know they don't actually need these data centers but they're getting free money. They can pay to play by buying off the media to spin a lie that they need more data centers.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2026-04-05T00:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw6NRFr44WEKesJfq54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxG-tlIEiDdnXBEHOx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzcBuDfvADFRUAZDyh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwcW084eUVFsIDyIsV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxUj3FWykCa35pFDY54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4oQ7D07gt1CXaPwx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_KQcndkDCVNucoUZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJm1VdpDyQsro1-UR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz-fs01-LsYjutdEJ94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwj2tDLgXFOWOj7b-14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
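A raw response like the one above can be checked before it is stored: parse the JSON array and confirm each record carries an `id` plus the four coding dimensions from the table. A minimal sketch follows; the allowed value sets are inferred from the sample responses shown here and should be treated as assumptions, not the full codebook.

```python
import json

# Dimension -> allowed values, inferred from the sample responses above.
# These sets are an assumption; extend them to match the actual codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-schema records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"outrage"}]')
print(len(validate_coding(raw)))  # prints 1
```

Validation failures raise with the offending comment ID, so a bad record can be re-queued for recoding instead of silently polluting the results table.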