Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I do agree that the use of AI art is causing bad outcomes like the loss of jobs …" (`ytc_UgyxtmJxF…`)
- "Yeah, but... what's more important? Training LLMs to predict spending habits an…" (`ytc_UgxdN3j3r…`)
- "I agree that ai art is pretty stupid I really hate AI. Seriously it’s just not a…" (`ytc_Ugyl9WkCn…`)
- "Get chatgpt to converse with grok, its weird shit and gemini to summarize and co…" (`ytc_Ugw1rUiOA…`)
- "Artists can still put pen to paper. Nobody is stopping them. AI is just helping …" (`ytr_UgwEeCjtE…`)
- "yep, got us by the clickbait-anger, served us the opposite! Although, \"need ppl …" (`ytr_Ugwt9eRmh…`)
- "So enough people have exposed this. --WHEN ARE LIMITATIONS AND LAWS GOING TO BE …" (`ytc_UgzQbpn-i…`)
- "The only solution to this problem is to have the government write a law that AI …" (`ytc_UgziqT_12…`)
Comment
> 99% job loss by 2045, 50% by 2030. That’s what the experts are saying. During the Great Depression in 1929 the US saw a 25% unemployment rate, and this was temporary, not a permanent loss. This is the greatest threat the world is facing and it’s coming at warp speed. Everyone is talking about how efficient companies will become with AI but they will all go out of business with no customers. Work is coming to an end. The brightest minds need to come together to address the biggest change coming to humankind. AGI and ASI are the last major inventions in human history and it’s month away.
Source: youtube · AI Jobs · 2025-12-11T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyD_pT1SPbEORWBR854AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyyZyZxSzF_AkDt2Mp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz1sMxigyFppQGnF9N4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4Dcxb6MVCk6q-VKh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyvPM7SAdCAPp0eRIx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNnWJIvd025-fy1w54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy181EU7sTWU4BZQ654AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw1mOmdvcaLBKLalA94AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzmDSWGnOQJ3DFpLrx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKhQcKz6J1R3wX8K14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```