Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Human empire formation is futile because the founder is mortal and his progeny are randomly flawed.
AI empire formation is only futile if AI can't solve interstellar space travel. Because Earth is theoretically doomed by the death of our Sun. So if AI can't escape the solar system, and Earth, then it's empire would die.
Or perhaps just go dormant residing on one of the surviving outer planets?
Long time frame scenario, but AI thinks well beyond civilization life spans.
Source: youtube | Posted: 2025-11-18T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
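The coding result above assigns one value per dimension. A coded record can be represented as a small validated type; the allowed values below are a sketch inferred only from the codes visible on this page (the full codebook may define more), and the class name is hypothetical:

```python
from dataclasses import dataclass

# Allowed values per dimension, inferred from the codes visible on this
# page -- an assumption, not the project's full codebook.
RESPONSIBILITY = {"none", "ai_itself", "developer", "company", "distributed"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"none", "regulate", "liability"}
EMOTION = {"indifference", "fear", "outrage"}


@dataclass(frozen=True)
class CodingResult:
    """One coded comment: an ID plus a value for each dimension."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self):
        # Validate each dimension against its allowed value set.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"bad responsibility: {self.responsibility!r}")
        if self.reasoning not in REASONING:
            raise ValueError(f"bad reasoning: {self.reasoning!r}")
        if self.policy not in POLICY:
            raise ValueError(f"bad policy: {self.policy!r}")
        if self.emotion not in EMOTION:
            raise ValueError(f"bad emotion: {self.emotion!r}")


# The record shown in the table above, with a placeholder ID:
r = CodingResult("ytc_example", "none", "consequentialist", "none", "indifference")
```

A record with an out-of-set value (e.g. `responsibility="society"`) raises `ValueError` at construction, which surfaces coding drift early.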
Raw LLM Response
```json
[
  {"id":"ytc_Ugy-DJmzl-_M3l1H3hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTe26_CcboNWZN1XZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzAVUGcN10LG7FHdHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzcTnm2M0ew2T8st5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzbftfdQRkl0IR2jWF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyAV-CxB5Q3kyaScXN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy-I4j8FvsZPixAB7N4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwE_zgCF3VNLSk5BUR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugztrzg2rXnFHvLz8it4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwKywTbqJZqC6bL7w14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
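The raw response is a JSON array with one object per coded comment, each carrying the same five keys. A minimal sketch of parsing and sanity-checking such a batch (the `parse_batch` helper and the sample string are illustrative, not part of the actual pipeline):

```python
import json
from collections import Counter

# Every entry in a batch response must carry exactly these keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and check each entry's shape."""
    entries = json.loads(raw)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of coded comments")
    for i, entry in enumerate(entries):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {i} missing keys: {sorted(missing)}")
    return entries


# A one-entry sample in the same shape as the raw response above:
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"}]')
batch = parse_batch(raw)

# Tally a dimension across the batch, e.g. the emotion distribution.
tally = Counter(e["emotion"] for e in batch)
# tally == Counter({"fear": 1})
```

Validating shape before storing results catches truncated or malformed model output, which matters when a single response codes ten comments at once.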