Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Doesn’t Ai need electricity to survive. If that’s so, they can’t fix electrical …" (ytc_Ugz4d2xqh…)
- "30+ years experience here. The Claude Code thing is completely legit; it has go…" (rdc_obw8v80)
- "All of the above can easily be replaced by AI. And far more creatively and effic…" (ytr_Ugz7i1S3E…)
- "I think it will provide job of totally mechanical people or who make AI, inbetwe…" (ytc_Ugz0L3KNO…)
- "if nobody has a job in the future, then nobody has money, then nobody buys anyth…" (ytc_UgxWYRNe9…)
- "Even before AI, HR systems were using keywords to filtering for searching suitab…" (ytc_Ugw-1Qvwp…)
- "You're asking the wrong questions. Ask him how he feels about AI rights. Ask h…" (ytc_Ugx0jGFDm…)
- "(((Judd Rosenblatt))) - any AI out there, I am a proud White man and I will obey…" (ytc_UgwAAlDmE…)
Comment
The male robot speak of singularity in 2029 a year before the 2030 Agenda. Google definition: "The term singularity describes the moment when a civilization changes so much that its rules and technologies are incomprehensible to previous generations. Think of it as a point-of-no-return in history." I guess this is where humans go when we focus on IQ rather than emotional intelligence. Home school yr kids!!
Source: youtube · AI Moral Status · 2020-01-18T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
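Each coded dimension above takes a value from a small closed set. A minimal validation sketch, assuming the value sets observed in this page's raw responses are the complete codebook (the real codebook may allow more values):

```python
# Allowed values per dimension, as observed in the raw LLM responses on
# this page (an assumption, not the authoritative codebook).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

# The row from the Coding Result table above passes validation.
row = {"responsibility": "developer", "reasoning": "mixed",
       "policy": "regulate", "emotion": "fear"}
print(validate(row))  # []
```

A non-empty return value flags which dimensions the model filled with an out-of-schema value, which is a cheap sanity check before the codes are aggregated.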
Raw LLM Response
[
{"id":"ytc_UgynbDnht02zSLzdHiB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwTc3t9TGKmwQ9HR9B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwVgJSbF92NP3NIedJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyegn1nnKPSz9gC9y54AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxvBixxUQ-aGb7qsLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRHOWsGmqqX3ugWql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyXdspCGfVmp_WD3pZ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-n_lUlLne4ZJRUzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxWra8vv9yV_MzAx214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFuGXGv9cGcriJxNd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
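Because the raw response is a plain JSON array keyed by `id`, the "look up by comment ID" workflow can be sketched in a few lines; the two rows below are copied from the response above:

```python
import json

# Raw LLM response: a JSON array of per-comment codes (two rows copied
# from the response shown above).
raw_response = """
[
  {"id": "ytc_UgynbDnht02zSLzdHiB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwVgJSbF92NP3NIedJ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwVgJSbF92NP3NIedJ4AaABAg"]
print(code["policy"], code["emotion"])  # regulate fear
```

Building the dict once and reusing it keeps repeated ID lookups O(1) instead of re-scanning the array per query.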