Raw LLM Responses
Inspect the exact model output for any coded comment.
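The same lookup can be reproduced offline. Below is a minimal sketch that scans saved batch responses for a given comment ID; it assumes the raw responses are stored as JSON files under a hypothetical `raw_responses/` directory, so the directory name, file layout, and the `find_raw_response` helper are illustrative, not part of the tool.

```python
import json
from pathlib import Path

def find_raw_response(comment_id: str, response_dir: str = "raw_responses"):
    """Return the path and parsed batch containing the given comment ID.

    Assumes each coding batch was saved as a JSON file holding a list of
    records with an "id" field (a hypothetical layout, not the tool's
    actual storage format).
    """
    for path in sorted(Path(response_dir).glob("*.json")):
        try:
            records = json.loads(path.read_text())
        except json.JSONDecodeError:
            continue  # skip batches where the model returned malformed JSON
        if any(rec.get("id") == comment_id for rec in records):
            return path, records
    return None, None

path, batch = find_raw_response("ytc_UgzLcPspYA6TTpRECOB4AaABAg")
if batch:
    print(f"found in {path} (batch of {len(batch)} coded comments)")
```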
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwznubgD…`: What's funny about the Racist AI at the start? They kinda asked it a bunch of l…
- `ytc_Ugy7eU7MM…`: There is such a thing as AI art. Traditional artists are using a thing called in…
- `rdc_nw629a2`: It doesn't store your history, it's a private search engine, but if you need it …
- `ytc_UgyarBKTz…`: AI "art" implies that there is an expression of creative skill from the human mi…
- `ytc_UgwivqHjA…`: The real artists’s work has so much more character and life to it, ya know? Like…
- `ytr_UgwRweQx6…`: There is a lot wrong with what you said. It didn't randomly do anything. It isn'…
- `ytr_UgxY7kGbl…`: Haha, I can see how you might think that! The design choices for Sophia definite…
- `ytc_UgwAYICGU…`: I just firmly believe we need to give up on the idea of “controlling” such an in…
Comment
And yet science fiction like Isaac Asimov, Arthur Clarke, Frank Herbert, Glen Larson, and Anne McCaffrey foresaw artificial intelligence as a threat to humanity. Science fiction has been warning society for decades. Our own modern day prophets have been telling us for decades not to lightly open this Pandora's box. But our scientists and pure researchers have no imagination to have read from those authors and to consider ethical questions. The movement toward AI, I will not call progress, has far outpaced all philosphical boundaries exploring it. I am deeply alarmed that Isaac Asimov's 3 laws have not been the bedrock underpinning the development of AI.
youtube · AI Harm Incident · 2025-09-10T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
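For downstream analysis, a coded record like the one above maps naturally onto a small typed structure. This is only a sketch: the `CodedComment` class is hypothetical, and the value lists in the comments are just the codes that appear in the raw batch below, not necessarily the full codebook.

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coded comment; fields mirror the dimensions in the table above."""
    id: str
    responsibility: str  # observed values: developer, user, ai_itself, distributed, none
    reasoning: str       # observed values: consequentialist, deontological, virtue, contractualist, mixed, unclear
    policy: str          # observed values: regulate, liability, none
    emotion: str         # observed values: fear, outrage, resignation, approval, indifference, mixed
    coded_at: str        # ISO 8601 timestamp of the coding run

# The record from the table, using the ID of the entry in the raw batch
# below whose codes match it:
example = CodedComment(
    id="ytc_UgzLcPspYA6TTpRECOB4AaABAg",
    responsibility="developer",
    reasoning="consequentialist",
    policy="regulate",
    emotion="fear",
    coded_at="2026-04-27T06:26:44.938723",
)
```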
Raw LLM Response
[
{"id":"ytc_Ugzg3_8a3DFjnZTsE0h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxwoOz1u6NjUoBW7954AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx11KEc-JwBDrIPPyd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz5er2ffUHWiWnYK354AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzCnwUSg2IrEoTK9HZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzLcPspYA6TTpRECOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugypro70lNeHMU7G3xJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzcEHIkf1puNkcow1Z4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzWy-V0P_fyp5vtBBJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz1_5wr0el45qkVFBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
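Before a response like the one above can be joined back to the comment data, it has to be parsed into per-comment records. A minimal sketch follows, assuming the response is a well-formed JSON array as shown; the function name and validation logic are illustrative, not the project's actual post-processing code.

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch_response(raw: str) -> dict:
    """Parse one raw LLM batch response (a JSON array as shown above) into
    a dict keyed by comment ID, checking each record for the expected
    coding dimensions."""
    coded = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '<no id>')} is missing {sorted(missing)}")
        coded[rec["id"]] = rec
    return coded

# e.g. pull one comment's codes out of the batch above:
# parse_batch_response(raw_text)["ytc_UgzLcPspYA6TTpRECOB4AaABAg"]
# -> {"id": "...", "responsibility": "developer", "reasoning": "consequentialist", ...}
```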