Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzOBLcNh…: I realllllly despise the SilValley Ai tech cult sentiment that "we need to build…
- ytc_UgwlQ3CAx…: Honestly I believe a significant number of humans don't care if AI destroys us. …
- ytc_UgzeS0Tvc…: AI can kill us all, easily. We experience time, a computer doesn’t. It just need…
- ytc_UgzwDJxiN…: AI art is still art, technically. It looks like art and by all measures of defin…
- ytc_Ugx2LpR_w…: Not that I agree with the AI artist, but Charlie's counterpoint to AI being the …
- ytc_UgyorWRox…: The way humans are destroying planet by their consumer mindset approach for ever…
- ytr_UgxmK6zjM…: @skyswimsky1994 1. The problem with your claim is that „AI” „art” goes against t…
- ytc_UgzYB2EH5…: Can we start calling it AI images, please? We all agree AI images is not art. Al…
Comment
AI-generated code isn't "good enough" yet, but many real-world apps are mundane, not critical. Security concerns are important, but won't halt progress. Historically, it never has. Think of the early days of servers, of browsers.
Regardless of how close AGI is, dismissing AI’s transformational potential is as unwise as assuming dystopia is imminent.
What I'd like to see people talk about more is the culture of burnout that we're fostering. Today it's "because AI", but yesterday it was the "grind mindset", the "hero". Endless names to "don't live, just work more and more and more", an endless race to the bottom. That's not on technology, that's on policy, on leadership, on culture. It stops when we get together and say, “This is not acceptable”.
youtube · AI Jobs · 2026-02-06T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwiIw_HBESWs-3gwGp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxryxZ22s1Uyb0rfGR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxnfpd_IRVPhfNaikh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyGqeZb6XoT1u6vRcp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyIv_BXSbkTE1T-6Bx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz2s9PVjCqHLPUDUcV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxe0LOqN9R74cXyBq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbloCWCYrI8bjgsnN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwqr9Hty2T-Z0LOZcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4P00M_rhZXTzFEf94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
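Each raw response is a JSON array of per-comment code assignments across the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed into a lookup-by-ID map; the allowed value sets below are inferred from the sample output above, not from an authoritative codebook, so unknown labels are kept but flagged rather than dropped:

```python
import json

# Dimension vocabularies observed in the sample response; the real codebook
# may allow more values -- these sets are an illustrative assumption.
DIMENSIONS = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "fear", "resignation", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw LLM response into {comment_id: codes}.

    Values outside the known vocabulary are prefixed with "UNKNOWN:" so
    drift in model output is visible instead of silently accepted.
    """
    coded = {}
    for rec in json.loads(raw):
        codes = {}
        for dim, allowed in DIMENSIONS.items():
            value = rec.get(dim, "unclear")
            codes[dim] = value if value in allowed else f"UNKNOWN:{value}"
        coded[rec["id"]] = codes
    return coded

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
print(parse_response(raw)["ytc_x"]["emotion"])  # approval
```

Keying the result by comment ID matches the "Look up by comment ID" workflow above: a coded comment's dimensions can then be fetched directly from the map.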