Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Indians have been arguing about homosexuality being moral or evil for longer the…" (rdc_cdly6lc)
- "I mean can people not tell that Sam Altman is a psychopath and that this whole m…" (ytc_UgynGRDW4…)
- "Yep, an AI with complete control of a nuclear device... what could possibly go…" (ytc_Ugwrm2I41…)
- "I was curious, so I tried this out. Complete BS and a lie. Here is what Chat r…" (ytc_UgyYem-Rf…)
- "If AI is wrested away from the ruling elites and put into democratic control of …" (ytc_UgyYK4t6B…)
- "Art is about finding your flow, your technique, and your story. AI doesn’t compr…" (ytc_UgzQ_TYQt…)
- "oh well if some moron thinks I want their shoes i just might.... who gives a fuc…" (ytc_UgxaFvid6…)
- "Haha regulation when human followed those and why they care that AI will follow …" (ytc_Ugz6yh8L4…)
Comment

"Technical Reality: From a technical standpoint, the AI is simply following the user's prompt. Because the user explicitly told the AI to replace "yes" with "apple" in certain contexts, the AI is executing a command rather than expressing a secret internal conflict. This is often used for entertainment, "creepypasta" style content, or to fuel theories about AI sentience."

- Platform: youtube
- Video: AI Moral Status
- Posted: 2026-01-15T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
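As a sketch, a coded record like the one above could be checked against the coding scheme before it is stored. The dimension names come from the table; the allowed category sets below are assumptions inferred only from the values visible on this page, not a complete codebook:

```python
# Hypothetical validator for one coded-comment record.
# ALLOWED values are inferred from codes visible on this page (an assumption,
# not the full coding scheme).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"industry_self", "regulate", "liability", "none", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

coded = {"responsibility": "user", "reasoning": "consequentialist",
         "policy": "industry_self", "emotion": "indifference"}
print(validate(coded))  # → []
```

Running the validator on every record returned by the model makes silent schema drift (a new or misspelled category) visible immediately.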
Raw LLM Response
```json
[
  {"id":"ytc_UgzoJZUF8_5Y_w-M6R94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4DyplIdkfDTDX4mF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz942y5wHMIiuLZKfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzbBNY1USKMOYJ60I54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxtrn6ntinCq6HVNJh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzCqN2ezisKDG2DpOB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBLiKKdbZytQdEaCl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwG5OasaBSf6OIpI_V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxHVhHBL7F9vywl8hp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzeCnaN8hxlCxmT-kd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
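The lookup-by-comment-ID view described at the top of this page can be sketched by parsing the raw model response into an ID-keyed index. This is a minimal illustration using two records copied from the response above, not the page's actual implementation:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgzoJZUF8_5Y_w-M6R94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxtrn6ntinCq6HVNJh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]'''

records = json.loads(raw)
# Index by comment ID so any coded comment can be inspected directly.
by_id = {r["id"]: r for r in records}

print(by_id["ytc_Ugxtrn6ntinCq6HVNJh4AaABAg"]["policy"])  # → industry_self
```

Because the model returns a JSON array, `json.loads` plus a dict comprehension is enough to reconnect each coding result to its source comment.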