Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytc_Ugzn7b5z4…: "I'm not a real fan of your work, in fact I don't think I've even once fully watc…"
- ytc_UgxSJ42in…: "Ai will be smarter than any human in 5-10 years. Be wary. The end is nigh.…"
- ytc_UgzX8AofQ…: "I am getting ready to buy my first house. And while I'm hearing all of this, I'm…"
- ytc_UgwRfNlUV…: "i delete my ai chat acc after im done with it so no one can read them XD…"
- ytc_UgyxLgs1X…: "You’re telling me a program doesn’t care about the progressive bullshit we’ve ha…"
- ytc_UgyLZi_ml…: "Such an insightful discussion on why AI can never truly replace the human touch …"
- ytc_Ugyq6ZMXY…: "If the only thing that this grieving mother got out of this sad situation was th…"
- ytc_Ugyy8Y0z8…: "We are, by the things, fully immersed into the developed of the dystopia. The th…"
Comment
1980 - Computers are going to take all our jobs - Reality - far more jobs and better paying jobs were created than were lost
2000 - The Internet is going to take all our jobs - Reality - far more jobs and better paying jobs were created than were lost
2025 - AI is going to take all our jobs - Likely - far more jobs and better paying jobs will be created than are lost
Source: youtube · AI Moral Status · 2025-09-24T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwP7llphkOQwLmzMex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOqtsgTBKLnh6gO9l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgywIOUOYLnviRivsYd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBVuaXfKXF2clQiFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuUtI7zyoyStgL8IR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqH8w4OwVCWtC6liF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgznPwf24gjDEV0sJbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_Yqg3gID2gcjjPkF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwjBiXKPRmgf1Qg5MJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
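The per-comment lookup described above can be sketched in Python. This is a minimal illustration, not the tool's actual implementation: `index_codings` is a hypothetical helper, and the five keys are simply those visible in the raw response shown here (the sample data below is truncated to two entries for brevity).

```python
import json

# Two entries copied from the raw LLM response above, for illustration.
raw_response = '''[
  {"id": "ytc_UgwP7llphkOQwLmzMex4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]'''

# Keys observed in the displayed response (an assumption about the schema).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse one raw model response and index codings by comment ID,
    dropping any entry that lacks one of the expected coding dimensions."""
    entries = json.loads(raw)
    return {e["id"]: e for e in entries if EXPECTED_KEYS <= e.keys()}

codings = index_codings(raw_response)
print(codings["ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg"]["emotion"])  # outrage
```

Indexing by ID makes "look up by comment ID" a dictionary access, and the key check guards against a model response that omits a dimension.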