Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "WHY I HATE CAT'S OUT OF THE BAG ARGUMENT: It's two arguments being presented as …" (ytc_Ugxts5t3z…)
- "She was already manipulating you with her answer...almost EVERYONE is already in…" (ytc_UgxcWGL3E…)
- "How much you bet that the ai users that use the disabled ppl excuse turn around …" (ytc_Ugwg8GGW6…)
- "In that situation, you could propose generating more articles with AI, saving yo…" (ytc_Ugyiphw0h…)
- "get your exoskeleton 🦾🦿 and prepare to be assimilated 🧠, we're doomed as a speci…" (ytc_UgwgArCK2…)
- "Unamazed! Humans will never stop lofting people to deity status. It's human natu…" (ytc_UgyqRDsnb…)
- "Commented something before but I have a theory / If a robot were to hand draw Ar…" (ytc_UgyoWtOQ8…)
- "@ 3:07 not the robot clock and its robot eye turning on right as soon as he says…" (ytc_UgyT15-Q0…)
Comment
Neil’s probably right that AGI isn’t around the corner (and now the scaling law has failed), and AI today is still pretty narrow. However, consider China: 40 million of the 200 million urban workers are in precarious gig jobs, as technology and automation have transformed the landscape, especially in manufacturing. The disruption has started! Even if AI is still a narrow thing, it’s shaking up people’s lives big time. So, ok, don’t freak out about an AI takeover just yet. Still, seriously, we need to get our act together and figure out how to help those who are actually being, or will be, affected by this now.
youtube · AI Moral Status · 2025-09-26T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwP7llphkOQwLmzMex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzOqtsgTBKLnh6gO9l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgywIOUOYLnviRivsYd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxBVuaXfKXF2clQiFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgzuUtI7zyoyStgL8IR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzqH8w4OwVCWtC6liF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgznPwf24gjDEV0sJbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz_Yqg3gID2gcjjPkF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwjBiXKPRmgf1Qg5MJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
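The raw response above can be turned into the per-comment lookup this page offers ("Look up by comment ID") with a few lines of standard-library Python. This is a minimal sketch: the field names match the records shown here, but the full set of allowed values per dimension is an assumption based only on this sample, and the two embedded records are copied from the response above purely for illustration.

```python
import json

# Two records copied from the raw LLM response above, for illustration only.
RAW = '''[
 {"id":"ytc_UgywIOUOYLnviRivsYd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]'''

# The four coding dimensions visible in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model output and key each record by its comment ID,
    rejecting records that are missing any coding dimension."""
    by_id = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

coded = index_by_id(RAW)
print(coded["ytc_UgywIOUOYLnviRivsYd4AaABAg"]["emotion"])  # fear
```

Keying by ID rather than list position is what makes the "look up by comment ID" view cheap: one parse, then constant-time access per comment.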