Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgzitOEQ3…`: "A human pretending to be an AI producing AI content bores me more than an AI try…"
- `rdc_j4xxql8`: "Not true. The overriding promise for his first election was simply - Developme…"
- `ytc_UgwCpp492…`: "AI: yeah the black folks don't get help, are criminals, and I don't like them P…"
- `ytr_UgxL1c9vX…`: "I understand where you're coming from! The dialogue may seem scripted, but it's …"
- `rdc_fvvy9k8`: "It's predicting based on a single tweet. Not an entire profile. That makes it im…"
- `ytc_UgzjHKUtJ…`: "As someone who uses 'delve', 'commendable', 'meticulous' and hyphenates words co…"
- `ytc_Ugzpx2bZP…`: "Thanks for the episode. So many things about this guest, I understand he comes/i…"
- `rdc_enj8bvb`: "I hate this comparison, and how nonchalantly people disregard the fact that ther…"
Comment
We will definitely programme A.I.s that interact with the real world to feel physical pain. It would be dumb not to. Pain (under the right circumstances) is an excellent teaching tool for avoiding damage and avoiding more damage if already damaged. It is also a motivator to repair any damage. In a sense we have already done this with some machines. The check engine lights in cars for example, are car speak for "I sense something is wrong with my engine, please have me checked out". The only difference between the car's pain and a human feeling sharp pains in their chest is that humans also feel fear.
youtube · AI Moral Status · 2017-02-24T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Uggbtq-WGdMdsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggAjot1l7w9IngCoAEC","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggEmH3Lq4V_vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ughlh2BiQzNAdXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugg0tBq-Ha2NR3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghbXQbC6Eut-HgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiaJXOE27QNsXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UggWMgkXXwlosXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggATgq0eeHyfXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UghF5eT9DDh8F3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
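The lookup-by-comment-ID step described at the top of this page can be sketched in a few lines: parse the raw model response as JSON and index the coding records by their `id` field. This is a minimal sketch, not the tool's actual implementation; the `index_by_id` helper name is illustrative, and the two records are copied from the response shown above.

```python
import json

# Two records copied from the raw LLM response above; the real response
# is a JSON array with one coding record per comment.
raw_response = """
[
  {"id": "ytc_Ugg0tBq-Ha2NR3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghbXQbC6Eut-HgCoAEC", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coding record."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_Ugg0tBq-Ha2NR3gCoAEC"]["emotion"])  # approval
```

With the records indexed this way, the "Coding Result" table for any comment is just a field-by-field read of one dictionary entry.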