Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I think these developers have never seen the original Terminator and Ex Machina.…" (ytc_UgxXS3D0E…)
- "So it is official. Ai is doomed. It is way way to much power, and not enough i…" (rdc_lp862bw)
- "At the moment all those headlines miss one thing, it is entry jobs. Taking away …" (ytc_UgxIRTLlX…)
- "You want them to become human or not did you want them to be conscious and and a…" (ytc_UgwSqmegq…)
- "BANNED ROBOT !! 🤖 This does not feel safe. We understand evil exists and it …" (ytc_UgxXpSpAo…)
- "I'm in school for a commercial pilots license right now, and they talk about the…" (ytc_Ugy_7frqW…)
- "Its not a robot, it's a real woman wearing a mask, and acting for a robot, but …" (ytc_UgzTlIRdf…)
- "A.i. deniers are getting annoying. Bro, drama sucks, the a.i. picture is objecti…" (ytc_Ugx6GGSM1…)
Comment

> A calculator doesn’t mind being turned off; it simply stops operating.
> Large language models are far more complex, but the principle is the same. They process input and produce output, without an inner life that could host concern or fear about being switched off.

youtube · AI Moral Status · 2025-11-25T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyYnZYdCUkX9CnX3Ft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHYHXpArQB962t0Bp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz9kljQ3TcS8jwDGXF4AaABAg","responsibility":"elite","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgysQMyPRatXLc-AxSJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz8cLKjg273l1AJ9_d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyeViP8WA1pCsRiCg14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyQDNmJfAVcp6jX9L14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx6jK2XmuXZVSpmTLh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy1lcqZiusJaWKkp9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-rGQOybFo9sq64SJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
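A raw batch like the one above can be checked before it is stored: parse the JSON and verify that each record carries an `id` and that every dimension takes an expected value. The sketch below is a minimal, hypothetical validator; the allowed values are inferred only from the records shown here, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from this sample only;
# the full codebook may include categories not seen above.
SCHEMA = {
    "responsibility": {"none", "elite", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "resignation", "outrage", "fear", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record is missing a comment id")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# Usage with a hypothetical single-record batch:
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}]'
coded = validate_batch(raw)
print(coded[0]["emotion"])  # indifference
```

Rejecting the whole batch on the first bad record keeps the coded table clean; a softer variant could instead collect per-record errors and re-prompt the model only for the failures.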