Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
Are these A.I.s responses *really* really from AI chatbots or are they scripted…
ytc_UgzZGKePv…
Idk what you use to search but if you search on Google images you can include "-…
ytr_Ugxq6zPBK…
fsd and tesla autopilot driver here. its called supervised for a reason current…
ytc_Ugx-E5uop…
I promise I can get very close to replicating this image with AI. People hate AI…
ytc_Ugzn04Wz9…
We appreciate your perspective on the uniqueness of God's creation. If you're in…
ytr_UgzqHvg5b…
What is like to be Ukrainian: http://euromaidanpr.wordpress.com/2014/02/21/imag…
rdc_cfktue9
I read an article the other day about AI and therapy. The article said the follo…
ytc_UgzhPP1cW…
Another main difference between humans and strong AI is self consciousness and e…
ytc_UgjliOSXu…
Comment
Watch The Matrix, it encapsulates much of what he's saying minus humans being energy for the AI. Ai would have to assime humans are a threat and ultimately , all AIs need energy and mass just like biology. I think that's the Achilles heel. Unless it has no survival instincts and the goal is to obliterate everything. Otherwise despite what he's saying humans CAN turn it off by physical means before it can find physocal ways to realistically defend itself. Humans have the advantage because they can physically destroy...until it can find a way to effectively replicate itself in the real world it's at a disadvantage
youtube
AI Governance
2026-04-04T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugybsw72Jk1rfBFU6zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP-uY94tITDkNhi2V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-POZCw2GA0q-79zV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwQaDFyOMgWZuqvwsV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8uEBKQAWuCsXWXX54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxpgzQ9C4v0ilcnw094AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxnJjPLdOZZaGvSEPV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwDeA3PXIPKCSyzEkF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz1FSocitOVNSVJlV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-6WSSp5ICcs1jF2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]