Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Why would anyone need to justify using AI? Just go ahead and use it. It's all go…" (ytr_UgwL0Qs2o…)
- "I understand your concerns! The investment in AI like Sophia does spark a lot of…" (ytr_Ugx0H6tYa…)
- "@roxsy470 oh now what's this? Talking like a big shot? Bro get a life. Artists …" (ytr_UgzHmVm8J…)
- "You have solidly moved me in the opposite direction as your video was intended t…" (ytc_Ugy_p9tXb…)
- "11:41 this is what happens when people are whipped up into a frenzy without logi…" (ytc_UgzTTNWU0…)
- "An Ai is like a psychopath. Training itself to mimic human behavior and over tim…" (ytc_UgxY9s_v8…)
- "😂yo if you beat a human what makes you think your gonna beat a robot…" (ytc_Ugy4pxVT1…)
- "AI CAN TALK TOO. SO YOU WILL GO TO MC DONALD FLIP SOME BURGERS... SPEAKING ALL …" (ytc_UgwXzHGl7…)
Comment

> Isn't AI nothing more than programming that has the possibility of being adaptive? Just because the program would account for more possibilities and situations than the average program giving us the illusion that it has consciousness, does not mean it really has consciousness in the same way we do. Where humans have an actual experience of being, while an AI computer would only present to us the illusion that the AI has the same experience... but it would still be just an illusion.

youtube · AI Moral Status · 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugj-GG9HRZn1i3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghBGDS4uJuvI3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiqMqcGlJ3kTXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UggBfEyjI40hpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugiq0IRMB0CCh3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh-w9BtvkjcungCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugjqh3cjpA79VXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugj0_G0fNIn_JngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugj2N050ddsTSHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgizsfMfud5iSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
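A batch response in this shape can be checked before its codes are stored. The sketch below parses the JSON and validates each row against the four dimensions shown in the table above; the allowed value sets are only those observed on this page (the full codebook may define more), and the `ytc_`/`ytr_` id prefixes are an assumption drawn from the sample ids.

```python
import json

# Allowed values per dimension. NOTE: these sets are inferred from the
# values visible on this page; the actual codebook may include others.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every coded comment."""
    rows = json.loads(raw)
    for row in rows:
        # Comment ids on this page start with ytc_ or ytr_ (assumed convention).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# Usage with a single made-up row (hypothetical id, not from the data):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # prints 1
```

Failing fast here is deliberate: one malformed row usually means the model drifted from the schema, so the whole batch is worth re-running rather than silently dropping rows.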