Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Robot don't move and box like a human does especially side steps in this video i…" (ytc_UgxH9SCbF…)
- "An AI artist is like calling yourself a rocket scientist when you play Kerbal Sp…" (ytc_UgwW6iryG…)
- "So they used a race based segregation to train the AI but still wanted AI to do …" (ytc_UgzONdLX5…)
- "I hate code made by ai. It is looks good. Fast and functional. Until i need to r…" (ytc_Ugxv15Gzs…)
- "The Sad thing is Law Schools teach Students that Facial Recognition Software at …" (ytc_Ugzaa7sne…)
- "How on earth did driverless cars become a thing? Who thought they were a good i…" (ytc_UgzESg9ef…)
- "So take a bunch of ai and put them into their own world with jobs, people, every…" (ytc_UgxUFW23C…)
- "Traditional physical artist, one thing I've never had to do is show process vide…" (ytc_UgwGNm51U…)
Comment
I love how this video falls into the SAME fucking trap EVERY discussion on this topic falls into. The "Humans are special and different than computers! We have free will and a soul!" No... no you don't. The brain is just a VERY VERY *VERY* complex "AI". When it comes to man-made ideals like "rights", our brains are literally no different than an AI given enough complexity and computational power.
Meanwhile everyone in this comments section keeps talking about how "superior" humans are to robots. Yeah. Take a look at our history, actually *know* something about neurology. Until you have done both those things, shut the hell up.
youtube · AI Moral Status · 2017-02-23T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghUUTPKDH88uXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgieKUMMYUHrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggNXuR9Uuu-dHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj0k6LggtSpPngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghIMGtVMpeoXngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UghsHvsEZa7QpHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugh-wxpuQ7IOdngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjx2gfLE92JJXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi_RzdM3NNBsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Uggfa3awuUzm_3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]