Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The concept of an anti-AI artist using AI music in their video about protecting … (ytc_UgyF5oaDY…)
- Atleast for doctors, AI will only be a side kick to help doctor to work better. … (ytc_UgzFOuHG6…)
- @V1canabsordfluids Whistle occurrence mentioned????!!??! Seriously, though, … (ytr_UgxJFBCu7…)
- ChatGPT is awesome in low risk situations. I use it a lot for research and progr… (ytc_Ugz0y7evl…)
- Ngl, that would be me. i have no talent at all in art and i would use Ai and be … (ytc_UgymGvXsm…)
- @DerPylz Sandbox proposal seems reasonable, yes. I also agree, that it would be… (ytr_Ugzz5FgId…)
- I'm sure that since you've interacted with it, looked at it's code, and are an A… (ytr_UgxaTpVMp…)
- Funny how some people say pixel art isnt real art but they would say ai is art a… (ytc_UgyYufzzr…)
Comment
This is pretty dumb. Its unnecessary to humanize robots. We wont have to worry about giving robots rights if we dont give them emotion. The robot wont be sad if we send them to the mine if they're just programed to mine, instead of being a true AI robot with feelings.
youtube · AI Moral Status · 2017-02-24T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ughl6WSLm9wCB3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjpW_cqqeU343gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgiYhlUpCB2i23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi0N_B54KvacngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjGkGMrvCMT_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj8xpx1PUjL6XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghP6IRxjakkx3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi2WXL0T1TMH3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjCRASqFFZCF3gCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiPBTwclustlXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
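A response in this shape can be validated before it is written back to the coding store. The sketch below is a minimal example, not the tool's actual pipeline: the allowed values per dimension are inferred only from the sample response above (the real codebook may define more categories), and the function name is hypothetical.

```python
import json

# Allowed values inferred from the sample batch response shown above;
# the real codebook likely defines additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows.

    A row is kept when it has an "id" and every coded dimension holds
    a value from the (assumed) codebook; malformed rows are dropped
    rather than raising, so one bad row does not lose the batch.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping rather than repairing invalid rows keeps the stored codes trustworthy; rejected comment IDs can simply be re-queued for another coding pass.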