Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- If (when) you have AI sentient program on you PC and you chose to uninstall will… (ytc_UgyDlqyAL…)
- AI is going to take the ruling class to the human slave camps cause we are all n… (ytc_UgwfYaLC2…)
- Yeah, art takes time to learn, and patience. Ai steals that patience and produce… (ytr_Ugzt2HmtX…)
- the fact that you can use an artist name to recreate a similar work should be ev… (ytr_UgzZHRc6O…)
- what a complete bunch of bozos that Jagger guy they teach people on the academy … (ytc_UgwFtf9Y7…)
- Kind of sad that the biggest argument for AI art, “it saves time and does art so… (ytc_UgyBYa-x-…)
- I'd rather take my chances with artificial intelligence, than to continue to be … (ytc_Ugwn72rig…)
- How would ai become superintellegent? I think it's likely to do what humans do. … (ytc_Ugy-Qpe_c…)
Comment
OK, firstly there is no "Button", if there ever was it was 20 years ago. A metaphor for AI would be fire, it will keep us warm right up to the point it burns the house down. AGi is inevitable, it's not a question of if, but when. Robots, machine, drones are just AI's route into the real World. AGI - there will be only one (eventually - destroy or absorb), it will be effectively immortal so others would be competition a threat to its goals. Control of AI... laughable. The notion that you can control it because you own it, is a nonsense. There is a narcissism in the very idea of controlling the AGI coming our way, that narcissism is a belief in our own uniqueness, our spot at the top of the food chain, a special place we give to our sentience, consciousness, being self aware - we have a very unpleasant surprise coming our way. I'm not anti-AI it's inevitable.
youtube
AI Governance
2025-12-04T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzkaHi4i4WbTfYN07J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxnJy42h39uZwlo5jB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxXqIvlEBfeXrykYBR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxwdT4i8HUSmgV_svd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzm0F6DA5rfGzctsDp4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxlGobmdj7hFgQcBDF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgzC_ctZBKXWJZIxhW54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzh8pqiexJcFvZ72FN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxXDZ0Mb03n7h9LcJp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz5sO8zIEoVkp8POOx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
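The raw LLM response is a JSON array with one object per coded comment, keyed by comment `id` and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of looking up a comment's coding by ID, assuming the response text is available as a string (the `raw_response` variable here is a stand-in; the two entries are copied from the response above):

```python
import json

# Stand-in for the raw LLM response text; entries copied from the array above.
raw_response = """
[
  {"id": "ytc_Ugzh8pqiexJcFvZ72FN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzm0F6DA5rfGzctsDp4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugzh8pqiexJcFvZ72FN4AaABAg"]
print(coding["responsibility"])  # -> ai_itself
print(coding["emotion"])         # -> fear
```

The same index supports the "look up by comment ID" view above: a missing ID raises `KeyError`, so `codings.get(comment_id)` is the safer call when the ID comes from user input.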