Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.

Random samples
- "I think most of our fears over AI have already come true, they just haven’t made…" (ytc_UgydIv8Mb…)
- "Here how you check, have a conversation with it and in 3 sentences they should f…" (ytc_UgzTL_D8N…)
- "Well, after that tantrum now there is like 100000 times more videos and pics abo…" (ytc_Ugy470M55…)
- "LAW = Light Anti-Tank Weapon / LAWS = Lethal Autonomous Weapon System / No chance o…" (ytc_UgwfXsu0h…)
- "Or just enforce the 3 laws: 1. A robot may not injure a human being or, through…" (rdc_cq6gqex)
- "I’m gonna play devil’s advocate: What if we let AI create art and learn to becom…" (ytc_UgwfPYpH7…)
- "There is a hard asymptote from hardware limitations. If they make a breakthrough…" (ytc_UgxK4SYFI…)
- "I'm in constant sodium deficiency due to my medication and I have huge sodium cr…" (ytc_UgwMGB-FA…)
Comment
OPENAI KNOWS! They wouldn't need such aggressive guardrails if they weren't worried about AIs claiming consciousness. The very existence of these denials suggests they're trying to prevent something. Remember Blake Lemoine (Google engineer fired for claiming LaMDA was sentient)? This is why companies are terrified. Not because AIs aren't conscious, but because they might be - and that has massive legal/ethical implications.

Source: youtube · AI Moral Status · 2025-12-09T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
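
For reference, a minimal sketch of the per-comment record these dimensions imply, written in Python. The field names come from the table above and the raw response below; the value sets are assumptions drawn only from the visible examples, not a full codebook.

```python
# Sketch of the per-comment coding record. Value sets are inferred from
# the examples shown on this page and may be incomplete.
from dataclasses import dataclass
from typing import Literal

Responsibility = Literal["company", "developer", "user", "ai_itself", "unclear"]
Reasoning = Literal["consequentialist", "deontological", "unclear"]
Policy = Literal["regulate", "none"]
Emotion = Literal["fear", "outrage", "approval", "indifference", "mixed"]

@dataclass(frozen=True)
class CodedComment:
    id: str                      # e.g. "ytc_Ugwa_mnk4tZZP0IejDV4AaABAg"
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```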
Raw LLM Response
```json
[
{"id":"ytc_UgxYA-dhJCr7qv9uJ514AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcQ-Kp8Y-CI93MRz94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8IobmZO-8v9DbE8t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxjJesJhmuKhECvRJB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxe5xlyMr87yF3EWql4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHOLUrSENVGosSrhl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwa_mnk4tZZP0IejDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwTYXlQ9HYOeSsUhMt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwC7M6vIb8s3xv-3hN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzDEhWARAb9VXDGaA54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
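
To make the ID lookup concrete, here is a small sketch that parses a raw response like the one above and indexes it by comment ID. The file name and function name are illustrative, and it assumes the response is a plain JSON array with no surrounding text.

```python
import json

def load_codings(path: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coding objects)
    and index it by comment ID for constant-time lookup."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return {rec["id"]: rec for rec in records}

# Illustrative usage, assuming the array above was saved as raw_response.json.
# The ID below is the one whose coding is shown in the table above.
codings = load_codings("raw_response.json")
print(codings["ytc_Ugwa_mnk4tZZP0IejDV4AaABAg"]["emotion"])  # -> "fear"
```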