# Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I wish people would stop call LLMs AI - precisly for those reasons. It is a stup…" (ytc_UgziQkgRU…)
- "Your way of thinking about this and your entire value system in general are miss…" (ytc_Ugyt6ZUol…)
- "I like AI in the form of chats. But AI art has never set right with me. Not only…" (ytc_UgwSWk2lW…)
- "@brianmi40 Exactly. It's not specialized robots versus multipurpose ones, we'll…" (ytr_UgyA1GAiJ…)
- "Are guys stupid don’t do the robot 🤖 is going to end This world 🌎 don’t don’t…" (ytc_Ugye26lfh…)
- "SETI failure to find ET is thus explained : Techno-Lifeforms invent AI not long…" (ytc_UgyHPdpNx…)
- "What if we all get to buy the AI on credit. We send the AI to work for us. It is…" (ytc_UgygOblNV…)
- "But if companies and governments need to ever develop more advanced AI to stay c…" (ytc_UgwM_7wiS…)
## Comment

> This channel was better in tackling questions about physics. This thing about robot rights are nonsensical. If mankind creates AI robots then mankind is superior to the created being. Why should mankind give rights to robots created by mankind? It's not even alive to begin with. It's interesting in movies but totally unrealistic.

- Platform: youtube
- Video: AI Moral Status
- Posted: 2017-02-23T16:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[
  {"id": "ytc_UgiveMjZemHGGHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UghOnqpItWsoN3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghxIkKCF0da9ngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggQC_X6GCXb-XgCoAEC", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgiZTomR8t9t8XgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghPnX8p8kXgNngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjkdfxV0TC693gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgimBcFcL1grSHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjwqWnr_kYH83gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UghMs1kjBq3vf3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
```
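The raw response is a JSON array with one record per coded comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and looked up by ID, assuming the allowed dimension values are only those visible in the samples here (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# this is an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"mixed", "approval", "indifference", "outrage", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into a dict keyed by comment ID,
    keeping only records whose every dimension has an expected value."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

# Usage: look up one record from a (shortened) raw response.
raw = """[
  {"id": "ytc_UghMs1kjBq3vf3gCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""
coded = parse_raw_response(raw)
print(coded["ytc_UghMs1kjBq3vf3gCoAEC"]["emotion"])  # -> outrage
```

Filtering on `ALLOWED` drops malformed records silently; a production pipeline would more likely log or re-queue them for re-coding.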