Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Will smith in I Robot asked if AI could.... And now that AI is better then you w…" (ytc_UgymkMcRt…)
- "What's worse than humans and ai combined? Demons worshipped by humans that overr…" (ytc_UgwCQvxeT…)
- "10:38 ffs my father was a plumbing central heating engineer at the American Emba…" (ytc_UgzY6JUOa…)
- "See it would be my janitor ai chats that I would be scared about lmao…" (ytc_UgzQk11Fe…)
- "The regular school structure is outdated , also teachers aren't paid enough and…" (ytc_UgwVr-haa…)
- "@moverseve ok but that still doesn't prove that AI will never be conscious? Most…" (ytr_UgxNSShD2…)
- "The largest benefit to autonous robots in war is, hands down, going to be number…" (ytc_UgyRq0iMi…)
- "Would not be able to sleep at night knowing that robot is in my house!…" (ytc_Ugyl3KPkH…)
Comment

> in my opinion neil is not underestimating AGI or AI. He is instead overestimating the CEO’s and general public who pull push and make the decisions despite them being the good choice or having value. People will still purchase various forms of it to fill whatever void and do other jobs as well.

Platform: youtube · Topic: AI Moral Status · Posted: 2025-07-23T16:2… · ♥ 197
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyAxAfp0HrNJEtDJrh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGFeN7Sx5i3NSDEUR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2GKIxUk892yaABSZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxmdp33praTZigNdTR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzcioOvqVDbHz6FR14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgynG7Bcdj9eMwXmYQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1ntHZgea8HIaBqwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhELyIPy_sMeVG7Nl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjqLVkRbazcTh3DB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw1vkD0cT_zOCvuk_94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
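The raw response is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step, assuming the response is available as a JSON string (two rows below are copied verbatim from the array above):

```python
import json

# Two rows taken verbatim from the raw LLM response shown above.
raw_llm_response = """[
  {"id":"ytc_UgyAxAfp0HrNJEtDJrh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxGFeN7Sx5i3NSDEUR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the codings by comment ID so any coded comment can be
# retrieved in O(1), mirroring the tool's lookup-by-ID feature.
codings = {row["id"]: row for row in json.loads(raw_llm_response)}

row = codings["ytc_UgyAxAfp0HrNJEtDJrh4AaABAg"]
print(row["reasoning"], row["emotion"])  # virtue outrage
```

A dict comprehension is enough at this scale; for a full export, the same index would typically be built once and reused across lookups.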