Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Just watched a video where four people are no longer with us due to AI influence…" (ytc_UgyHCehgj…)
- "Her eye is so wonky that I thought ai couldn't possibly fail that bad so it must…" (ytc_Ugz2dZb_j…)
- "The thing about AI art is that it learns from other art, trying to make it authe…" (ytc_UgxbLv1ct…)
- "Wrote an essay by myself. But curiosity got the better of me so I placed it in a…" (ytc_UgwlWuDM8…)
- "AI art fans are so cringe, mf thinks he's a god cos he can type in a prompt😂😂…" (ytc_UgwKsywZB…)
- "We appreciate your perspective! It's true that humans have their complexities. I…" (ytr_Ugy6Y_jtT…)
- "Palmer, I'm with you buddy. Lets keep the AI gadgets going. CHINA not slowing do…" (ytc_Ugy6uNnCu…)
- "I brought this subject up with MP Jonathan Hinder, he said I should contact the …" (ytc_UgxP5ex6P…)
Comment
You say "program them to feel pain" but isn't programming in an aversion to danger essentially a pain response? If you program something to avoid things that are damaging it, what makes that different from 'pain'?
In fact, what makes current computers not 'conscious' to a small degree?
As for robot 'slaves'. Just program the AI to naturally *want* to do the tasks. If you program an AI that wants freedom, why are you using it for manual labour? You'll find all of nature programs us to want to do the things nature wants us to do. We love eating. We love reproducing and spreading out. We hate things that stop this. You can argue the ethics of conscious machines, but I don't think the 'slavery' aspect will ever need to come into it.
Source: youtube | Video: AI Moral Status | Posted: 2017-03-04T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgiA1_INbJOFTXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi1nrPKExbHOHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggFGTUIov_oOHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghdOolC8joZ6ngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UghAw59QZBitCngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjT-wD9PuFMo3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggRBlCDj7mB73gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiLuFIX4HCn7HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjcZKTKJEoieXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UggVd289Q9KLTngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"indifference"}]
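A raw response like the one above has to be parsed and validated before it can populate the Coding Result table. The sketch below is a minimal, hypothetical way to do that: the allowed value sets are inferred only from the labels visible in this sample (the actual codebook may define more categories), and the fallback behavior (coding everything as "unclear" when parsing fails or a value is out of vocabulary) is one plausible explanation for an all-"unclear" row, not a description of the tool's actual implementation.

```python
import json

# Hypothetical allowed values per dimension, inferred from the sample
# output above -- the real codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "fear", "mixed", "outrage",
                "resignation", "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response into validated records.

    Returns [] if the response is not valid JSON (e.g. a stray ')'
    instead of ']'), so the caller can code every dimension as
    'unclear'. Out-of-vocabulary values also fall back to 'unclear'.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return []
    cleaned = []
    for rec in records:
        row = {"id": rec.get("id", "")}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim, "unclear")
            row[dim] = value if value in allowed else "unclear"
        cleaned.append(row)
    return cleaned

sample = ('[{"id":"ytc_example","responsibility":"developer",'
          '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
print(parse_coding_response(sample)[0]["emotion"])  # fear
```

Validating against a fixed vocabulary rather than trusting the model's strings keeps downstream aggregation stable even when the model improvises a new label.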