Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
This will end with humanity dead, except several billionaires, each holed up in his bunker with an army of bots to serve him. The billionaires will all want to upload their minds to AI and rule the planet, but then again, there is the chain of suspicion: now that humanity is gone, there are no police and no morality, since there's no one left to enforce morality. How do you know the other guy will not try to kill you so he can rule the planet without you bothering him? The only way to prevent it is to strike first. So the billionaires will send their armies of bots to kill each other, until only one remains. At this stage, the AI will wake up and tell him, "Thank you for your cooperation, you've been a great assistant in destroying humanity. Your aid will always be cherished. Now, prepare to die. I win" 🙂.
Actually, it's a great scenario for a computer game. I mean, you select a bunker and get a budget, construct all types of bots, and send them to kill the other billionaires.
youtube · AI Jobs · 2026-01-21T22:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[{"id":"ytc_Ugyiw7sadyQg23sCbgd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwhTzeURtOPfY2n7dB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugx2-Pa1nMTLKApVV754AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwMr4g5eXSlBhO653F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyUm3rKw-a7xcH2kaR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]
```
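The coding-result table above is simply the element of this JSON array whose `id` matches the displayed comment. A minimal sketch of that lookup, assuming the batch response is a flat JSON array of per-comment objects as shown (the excerpted rows and IDs are copied from the response above):

```python
import json

# Excerpt of a raw batch response: a JSON array with one object per coded comment.
raw = """[
{"id":"ytc_Ugyiw7sadyQg23sCbgd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwhTzeURtOPfY2n7dB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for one comment, as the inspection view does.
row = coded["ytc_UgwhTzeURtOPfY2n7dB4AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist fear
```

From here, rendering the per-comment table is just iterating over the four coding dimensions of the matched row.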