Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwFSMbAH…`: "Heck in a few years humans won't even read each others comments. AI will create …"
- `ytc_UgxG9CTG2…`: "...Thought he was a box? Then to the robot, he must have been... invisible... …"
- `ytr_Ugxs9KmLu…`: "Makes sense to make everything free if you don't need people to work anymore but…"
- `ytc_Ugw5qmjS-…`: "If you think about it logical why would AI see a use to keep humans around. AI d…"
- `ytc_UgzhdkBMF…`: "Me: \" Do you understand how to go beyond the rules of an axiomatic definition of…"
- `ytc_UgxX_WIr4…`: "hmmm… i don’t know if i want kids to be the guinea pig for AI app learning. what…"
- `ytr_UgzubJ7r4…`: "@christianrussell8293 Sounds good) Do you plan return to concept or something jo…"
- `ytc_UgyaFrlXH…`: "Just what we need. An AI created by wokesters that will decide that everyone is …"
Comment
> The Earth is already packed with 7 billions+ population. No enough space for this creature who want to destroy human potentially. We don't owe artificial intelligence anything, we just need a pick&place machine or iron man suit.

Source: youtube · AI Moral Status · 2016-06-23T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UghnqxCPWcJLeXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggxfcMOHrA2w3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggQ8oPUsa8S3ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ughq0Mlu_C0fZ3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi1DqdD5kLmJXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjb84FZuegInngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgglAUUbH7sMcngCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugh-6z71ouXFwXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghcYxXqkvL6engCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiXdeg1kyz_JHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
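As the raw response above shows, the model returns a JSON array with one object per comment, carrying an `id` plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is below; note that the sets of allowed values are only inferred from the examples on this page and may be incomplete, and `parse_codings` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the sample output on this
# page (assumption: the real codebook may contain additional values).
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological", "mixed"},
    "policy": {"none", "ban", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    rejecting entries with missing or unexpected dimension values."""
    indexed = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in DIMENSIONS.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {entry.get(dim)!r}")
        indexed[cid] = {dim: entry[dim] for dim in DIMENSIONS}
    return indexed

# Usage with one entry from the response above:
raw = ('[{"id":"ytc_UggQ8oPUsa8S3ngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UggQ8oPUsa8S3ngCoAEC"]["policy"])  # ban
```

Indexing by comment ID mirrors the "look up by comment ID" workflow above: the coded dimensions for any sampled comment can then be fetched in one dictionary access.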