Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyI_Mw9-… : "Self-driving should never be legalized without their own isolated Lanes. It's ne…"
- ytr_UgyhlQkYP… : "Ideally yes.. However prices don't go down immediately, that impact takes time f…"
- ytc_Ugwz8xepK… : "During my school days, I used to ask Clippy questions as if it was an AI. Looks …"
- ytc_UgwfcF0DL… : "What can you do in the face of AI on social media? Get off social media.…"
- ytc_UgwILSFFp… : "I don't think so i ant giving no robot a machine gun you can if you want to but …"
- ytc_UgzTFO8nP… : ""STEEL" is an episode of The Twilight Zone, featuring Lee Marvin and aired in Oc…"
- ytc_UgwJh0Q6j… : "AI knows that human males are the most vicious and dangerous predators on Earth.…"
- ytc_UgzpsJ3Vn… : "Yeah wait, my girl asked me about the name of a song the other day, i couldn't r…"
Comment
Hahaha ... the fact that we are created is a logical imperative. Our children are procreated in our (physical) image, and this humanist genius has been attempting to create a digital image of the previously created human brain ...which lacks the character of mind.
And the primary concern he keeps coming back to is the criticality of a moral code ...which is a lot more complex and necessarily sacrificial than he expects (and fears) from a self-preserving super intelligence.
And to solve the latter problem, he recognizes the importance of a "good" human establishing the rules for AI ...which runs counter to his expectations that super intelligence will make obsolete the reality of metaphysical / spiritual emotion and mind vs a digital machine simulating the biological brain.
youtube · AI Governance · 2025-06-17T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwAiGC1TXKVxyNvxGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxbcyXm0zYItadye_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwHwjplJBxe6H_PSwN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugxox7PzZIeaD6kmIRl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugy1Ha24gD6NZVGjrOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwBWSDQMfUL7Ckyb9p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxBWnjvOeu5pxzdg894AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx43vFV8_0-3AfjulF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugwe9SEfMvZAxxSVDxx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
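The raw response is a JSON array of per-comment codings keyed by comment ID, so the "look up by comment ID" step amounts to parsing the array and indexing it into a dictionary. A minimal sketch (entries abbreviated from the response above; only the field names shown in that output are assumed):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Abbreviated here to two entries; field names match the output above.
raw_response = """
[{"id": "ytc_UgwAiGC1TXKVxyNvxGZ4AaABAg",
  "responsibility": "none", "reasoning": "unclear",
  "policy": "none", "emotion": "fear"},
 {"id": "ytc_UgwHwjplJBxe6H_PSwN4AaABAg",
  "responsibility": "developer", "reasoning": "deontological",
  "policy": "unclear", "emotion": "fear"}]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
coding = codings["ytc_UgwHwjplJBxe6H_PSwN4AaABAg"]
print(coding["responsibility"], coding["reasoning"])  # developer deontological
```

Because each coding carries its own `id`, the same index works regardless of the order in which the model emits the entries.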