Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Uhuh, so would you HONESTLY prefer seeing a Jackson Pollock "painting" rather th…
ytc_UgwVMP__S…
Who cares? I do not get the hate on AI, you need an image of a cat with armor, s…
ytc_UgxONwiCR…
@swamp-k6w Because you did not, you just assumed something because you wanted to…
ytr_UgwKQRIgF…
The "sketchy stuff" is all standard for ad supported Play Store apps. Biometr…
rdc_nfwza3r
Your argument about stealing/inspiration is spot on but I would like to add, the…
ytc_Ugwnz1r-p…
Mer. ChatGBT was telling the truth over and over. It is a language model, a co…
ytc_UgxZ_LON5…
Ai does not think ai processes and whatever it processes whatever data algorithm…
ytc_UgzdoJbMA…
AI will do everything better than us, but it can't connect it all together. Say …
ytc_UgzOnUCN6…
Comment
It maynot become conscious in its present digital form,, but what happens when you combine Quantum computing with Ai??
Some have suggested the possibility that human brains are quantum computers. And Ai if made powerful enough and NOT conscious is the exact reason to be afraid of it, if allowed to much control over our lives, or becoming toooo dependent on it. My Tesla isn’t conscious, but it does one helluva job “ Simulating a human driver”. The bigger question is : exactly WHAT is consciousness? Is it an emergent property from massively parallel processing? And what about the emerging bio computing that combines silicon chips with cloned human brains neurons??
youtube
AI Moral Status
2025-07-21T16:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzD8fBjDLrqqWPHKMt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxNhW21g0MUhKFmf94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8Vc1EnKBngx-CM7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXXAm9pD-4kW0_GSJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzcdBpQKuIWq8YZmRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhjqFDqW1D1sDEiBd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzgqyp4BZdgC2uoztZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZXRtSC1uLRSwb8yB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzd0Uk6PM4mFlirukp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyU-wlihuqTcwIuoON4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
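A raw response like the one above can be validated mechanically before its codes are stored. A minimal sketch in Python, assuming the dimension values observed in this sample are the allowed set (the actual codebook may define more values; `validate_coding` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# Assumption: the real codebook may permit additional values.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "indifference", "approval", "resignation", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook.

    Raises ValueError on a malformed record, so bad codings are caught
    before they reach the results table.
    """
    records = json.loads(raw)
    for rec in records:
        if not isinstance(rec.get("id"), str) or not rec["id"]:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

# Usage: a one-record response with values drawn from the codebook.
sample = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)
records = validate_coding(sample)
```

A record that uses a value outside the codebook (say, a misspelled emotion) fails fast with a `ValueError` naming the offending comment ID, which is easier to debug than a silently miscoded row.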