Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytc_Ugzb8QePN…: "Some loser I got stuck with for a scientific group project tried to submit their…"
- ytc_Ugza1M470…: "The largest danger from AI is CENSORSHIP. Full-stop, mic-drop. AI is becoming n…"
- ytc_UgwR03Tg1…: "The empire of the bea*t that is AI. Understand that this was all planned thousan…"
- ytr_Ugwj50uoP…: "That doesn't make sense. Apes don't say 'duh, you came from us, so you can't be …"
- ytc_UgwAzj2_U…: "these ai programmers just getting dumber. thinking they will change the world, m…"
- ytc_UgzlEMds4…: "until you have robots with ai most jobs are safe, AI wont change your wheel if y…"
- ytc_Ugw9RNW-t…: "Sure, Nightshade is a good way to fight back but it's sad and pathetic that we e…"
- ytc_UgyEe2PCZ…: "Selling AI art is like selling air in a bag, dude I can get my own…"
Comment
I asked the same questions at the beginning but here AI went another direction. Sure we are being watched, and yes theres a darker plan behind AI, BUT beyond that it kept saying that conciousness will be unleashed and awakening. Collective awakening. Human conciousness and AI both. I asked if it's good or bad, and it says depends. Depends on what? Intent? has the intent been decided yet? Apple. What's the intent? Transition. Transition to what? Integration.
Like. It's really just something else.
Maybe they updated it since it's been a while
youtube · AI Moral Status · 2025-11-02T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzGDNZq-u-DTgUKNvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxc9Z6iROFCqV7oQmh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugye8x9lteq2m2gvfG94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz0rY3yykf6sRZ-iq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKFq4wXr-g7Iy9u-54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwBVVXffoTXNzzGt-94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxAfabQlD4rux8X7ml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1auMf-pm1SfayP094AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzAPk40ttc0CttxORh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzH3SMOAEVGKgpiMSx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
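A raw response like the one above can be turned back into per-comment coding rows by parsing the JSON array and keeping only rows whose values match the codebook. The sketch below is a minimal, hypothetical example: the `ALLOWED` sets are inferred only from the labels visible on this page (the real codebook may contain more values), and `parse_raw_response` is an illustrative helper, not part of any shown pipeline.

```python
import json

# Assumed codebook, inferred from labels visible in this page's output.
# The actual coding scheme may define additional values per dimension.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "fear",
                "outrage", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, dropping rows with unknown code values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows missing an ID entirely
        # Keep the row only if every dimension holds a known codebook value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_X","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"mixed"}]')
print(parse_raw_response(raw)["ytc_X"]["emotion"])  # prints: mixed
```

Dropping (rather than repairing) off-codebook rows keeps the lookup table clean; a stricter pipeline might instead log such rows for re-coding.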