Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "these are actually the emotions polygon feels every day but he's just pretending…" (`ytc_UgzKboi0D…`)
- "If they are used for sex it is possible. Humanity may fail because less babies …" (`ytr_UgwDShxFh…`)
- "these people resigning from google saying AI is sane, something very fishy going…" (`ytc_UgzTE-Tzu…`)
- "The last pro-human art videos I saw were very objective and factual, but I reall…" (`ytc_UgwkSUUaP…`)
- "Not only was the biker an idiot, this whole idea of a self driving car is foolis…" (`ytc_Ugwu2OWes…`)
- "I find it interesting that the theory of singularity is treated as an inevitable…" (`ytc_UgxXsxbaP…`)
- "@josehumdinger6872 i was thinking that they were intentionally making the ai rep…" (`ytr_Ugwf7aa8A…`)
- "That's what Whitney Webb keeps saying now for months and months, that algorithms…" (`ytc_UgzvfhgiB…`)
Comment
Right, but when will we be able to distinguish the difference? What if we won't be able to? Sentience isn't even clearly defined.
He's arguing that even if the AI isn't actually sentient right now, we should be proactive and start treating them like they are sentient, otherwise we may accidentally be enslaving beings.
For example, say 10 years from now that we start recognizing AI as sentient and give them human-level rights. But in fact, our definition of sentience wasn't very good, and they were actually sentient 7 years ago. Thus, we accidentally enslaved a living, sentient being for 7 years. Ethically, we should start treating them as sentient *before* we finally come up with some sure-fire way to define sentience. Otherwise we risk being wrong in our definition and enslaving a being against its will.
Platform: youtube | Video: AI Moral Status | Posted: 2022-07-12T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
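A coding result like the table above maps directly onto one entry of the raw JSON response. As a minimal sketch (field names taken from the sample output; the rendering helper itself is hypothetical, not part of the tool):

```python
# Render one coded entry as a Markdown table like the "Coding Result" shown above.
# Dimension names come from the sample output; this helper is illustrative only.
def render_coding_table(entry: dict, coded_at: str) -> str:
    rows = [
        ("Responsibility", entry["responsibility"]),
        ("Reasoning", entry["reasoning"]),
        ("Policy", entry["policy"]),
        ("Emotion", entry["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

# Example with the values from the table above.
table = render_coding_table(
    {"responsibility": "none", "reasoning": "mixed",
     "policy": "liability", "emotion": "mixed"},
    "2026-04-27T06:24:53.388235",
)
```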
Raw LLM Response
```json
[
{"id":"ytr_UgxrcQFPgHRFwm6MDhJ4AaABAg.9dGGvDX2aw39dHa5QmE_Vk","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxrcQFPgHRFwm6MDhJ4AaABAg.9dGGvDX2aw39dHyEvSlbQk","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugw40_NP11jbDjRwhpp4AaABAg.9dGGTLJjjn49dKhMufCJR1","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugw40_NP11AaABAg.9dGGTLJjjn49dLeJLviZW4","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyO62Om1P2KZYWPx3d4AaABAg.9dG5SAsPOTR9dIxZuqYj51","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugxvtigk2sxu5IjOWx94AaABAg.9dG4lXRmmS19dJaLa5A1fh","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwE77ZdjeaSQsINvuB4AaABAg.9dDgnAJMvvm9dMrc3xtnWa","responsibility":"none","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytr_UgxiIyEknaRONURscF54AaABAg.9dDCYFdSi_D9dESgfdCpAv","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxiIyEknaRONURscF54AaABAg.9dDCYFdSi_D9dF_daO50mH","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxiIyEknaRONURscF54AaABAg.9dDCYFdSi_D9dG1v0eFHfU","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
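Since the model returns free-form text that is expected to be a JSON array, each entry should be validated before being stored. A minimal sketch, assuming the allowed code values inferred from the sample responses above (the real codebook may define more):

```python
# Parse a raw LLM coding response and check each entry against the four
# coding dimensions. Allowed values are inferred from the sample output
# above; the actual codebook may differ.
import json

ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "ban", "none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def validate_coding(raw: str) -> list:
    """Parse the raw JSON array and reject entries with unknown codes."""
    entries = json.loads(raw)
    for entry in entries:
        # Comment IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not entry.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {entry.get('id')}")
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{entry['id']}: bad {dim}={entry.get(dim)!r}")
    return entries

# Hypothetical single-entry response for illustration (not a real comment ID).
raw = ('[{"id":"ytr_example","responsibility":"none","reasoning":"mixed",'
       '"policy":"liability","emotion":"mixed"}]')
coded = validate_coding(raw)
```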