Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "I DO agree here that A.I. is really dangerous here, but, I think that nukes woul…" (`ytc_UgxVoyRkk…`)
- "Meanwhile Elon Musk is saying we need to get the birthrate up. What for? Sound…" (`ytc_UgzoWrWEl…`)
- "bro actually said people have a skill issue while he argues in favor of ai art…" (`ytc_UgwMETmxU…`)
- "there is no \"dark side\". it all depends on how you want to prepare for the adven…" (`ytc_UgwSvkCBO…`)
- "@captrodgers4273 No it wasn't, that's the point. AI learns to program itself eve…" (`ytr_Ugz4ph9VY…`)
- "Government contracting big AI companies is worrying. Palantir management is ext…" (`ytc_UgzwsWrSp…`)
- "They used to show this kind of thing in horror films. An unsuspecting person walks along and…" (translated from Russian) (`ytc_UgwCjT7c4…`)
- "AI will mostly favor the rich...no one talks about this because they are working…" (`ytc_UgxatmtSC…`)
Comment

> @Chris, it is coming with stuff it's programmed to say. It doesn't even take in account that people do not only populate the planet, but also leave it just as they came. The Depopulation solution it gives, just seriously shows blatantly its ideological Origin and tells me EXACTLY what kind of people control this AI ...
> It's biased Shortsightedness. It's like with a knive the one behind it, gives it the brand of a serial KILLER ... or ... the one of Serial COOK-ER.
> Bottomline?! It's the User's or PRogrammer's bright or dark Side who gives it its color ...

Source: youtube · Video: AI Moral Status · Posted: 2023-03-04T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx5XeRkqY3IrOOlPc94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyvAVCYOY8h1X35jCN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxV2wJVZeStjTUsPdx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyxeovyW_tmnOAKSet4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyrOXFkZfuftiSRyLp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxB3xp7szWTAC2BYtF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxDo1XwF7dHgUH43Zx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwdAqSf6LW5OZPRbhx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxl3GVSCqYlTswaI9R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwOQlpmX3Fkli4YFj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
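The raw response is a JSON array with one record per comment ID, so the look-up-by-ID workflow above amounts to parsing the array into a dictionary keyed on `id`. Below is a minimal sketch of that step, with a light validation pass; the label sets in `ALLOWED` are only the values visible in this response, not necessarily the full codebook.

```python
import json

# Label sets observed in the response above (assumed; the codebook may define more).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into a {comment_id: codes} lookup table,
    rejecting any value outside the expected label sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.pop("id")
        for dim, value in rec.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = rec
    return coded

# Two records taken verbatim from the response above.
raw = '''[
  {"id":"ytc_UgxDo1XwF7dHgUH43Zx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwOQlpmX3Fkli4YFj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''
coded = parse_response(raw)
print(coded["ytc_UgxDo1XwF7dHgUH43Zx4AaABAg"]["emotion"])  # outrage
```

Keying on the comment ID is what makes the "look up by comment ID" view cheap: a single parse yields constant-time lookups for every inspected comment.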