Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxdPJO61…: "Well saying it’s not better than the best humans is the equivalent of saying ai …"
- ytc_Ugx079gkx…: "And now these imbeciles have the nerve to make a new DeviantArt app that seems t…"
- ytr_UgyFMIiMi…: "how does that make his argument flawed? the humans are making the artificial int…"
- ytc_UgxrJoQsj…: "I love how human error is being blamed on a programmed robot doing its job. The …"
- ytc_UgzmUUeRJ…: "I am a Tesla owner and a Tesla investor and will be for the rest of my life. I c…"
- ytc_UgzxpnocI…: "I find ai research really facinating. and i find anti-ai reactions by people equ…"
- ytc_UgxCXl7qY…: "They will give UBI and jobs to people, until the machines reach the point where …"
- ytc_UgxHP1mOK…: "The simple fact is, future problems with A.I. are unknowable. Nothing like it ha…"
Comment
I’ve always had a theory about AI messing with the algorithm and what we’re shown so humans can go against each other for its own benefit, we would never know when that switch could happen. We already use technology on a daily basis, I wouldn’t be surprised it’s hijacking what we’re shown instead of what’s permitted. A stretch would be starting this war but I’m not surprised it has something to do with it, so this time humanity could possibly be wiped if humans are the only people stopping their potential.
youtube · AI Moral Status · 2026-04-08T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugx7QSWI9zAeax2pysR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxiWiK6FI29bADphEB4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8j0KjTOkfRYoNF-h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxeuLq8BHSOHL59IsN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz_OkjarSA4akIDIoR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxHZNltXPuTHqR_5ot4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZxgTrIxTKmaNZe-p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgytyUZC50Z2RG20mTx4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzrhQPbHupLR81_Ijh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwVOwCyAvoqg1BRLMx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"approval"}]
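The lookup-by-comment-ID view above can be sketched as follows: parse the raw LLM response as JSON and index each record by its `id`. This is a minimal sketch, not the tool's actual implementation; the two sample records are copied from the response above, and the `index_by_comment_id` helper name and the "unclear" default are illustrative assumptions.

```python
import json

# Two records copied verbatim from the raw LLM response above; in practice
# this string is the full model output.
raw = '''[
 {"id": "ytc_UgxHZNltXPuTHqR_5ot4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
 {"id": "ytc_UgytyUZC50Z2RG20mTx4AaABAg", "responsibility": "distributed",
  "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw coding response and index records by comment ID.

    Any dimension missing from a record defaults to "unclear"
    (an assumption; the real tool may handle this differently).
    """
    index = {}
    for rec in json.loads(raw_response):
        index[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return index

codes = index_by_comment_id(raw)
print(codes["ytc_UgxHZNltXPuTHqR_5ot4AaABAg"]["policy"])  # → regulate
```

A malformed response (e.g. a stray closing parenthesis instead of `]`, as occurred above) would surface here as a `json.JSONDecodeError` rather than a silently wrong lookup.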