Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples (click to inspect):

- "Nobody wants to go to work anyway. We'd rather sit at home, log onto the AI data…" (`ytc_UgxWASrAq…`)
- "Indeed. A global moratorium on frontier AI scaling would be tremendously posit…" (`ytr_Ugxgn2QDG…`)
- "i think nothing like this will happen, it's just speculation for continuum inves…" (`ytc_UgxWra2IQ…`)
- "So...People who: 1. Steal from real artist. 2. Know nothing about art or tech. 3…" (`ytc_UgyOeZbHj…`)
- "You know as we grow into new technology of all the nonsense we've learned over t…" (`ytc_UgxPYI5RU…`)
- "Remarkable and outstanding discussion from you all, having the Godfather of AI o…" (`ytc_UgwNsb_Zs…`)
- "Generative AI is heavily unethical at it's core. Let me explain: Generative AI i…" (`ytc_UgzAdm5GF…`)
- "Hope all the people who moved to Maine during the pandemic and jacked up our ren…" (`ytc_UgzyZAL7p…`)
Comment
The idea of robots or machinery in general gaining consciousness is always something that I find quite fascinating. I mean, there are so many ways to define how something is 'conscious' that I doubt there would be a definitive answer, but I get the feeling that our ability to go against our basic primal instincts could very well be a part of it.
I mean, if you think about it, humans are animals as well, and as a result we have preprogrammed primal instincts to eat, sleep and reproduce. There are plenty of human motivations that tend to branch off those basic instincts, but we have the ability to in a sense override them for something we believe in, such as human decency and morals. People have gone on hunger strikes before for the sake of protesting after all.
I'd imagine a conscious robot would run on an entirely different set of rules, so in the process of giving them rights, we would have to consider what they consider as basic instinct. For instance, their equivalent of eating would simply be tapping into a source of energy, or just plugging themselves into a wall socket. A gesture of consciousness then would be if a robot deliberately denied themselves energy for an extended period to say help another living being.
I wouldn't go too far into detail. There are plenty of other factors too like human emotion, which throws a gigantic wrench into many theories, and any more rambling would be too much text for a youtube comment.
youtube · AI Moral Status · 2017-02-23T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
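Each coded dimension takes a value from a fixed codebook. As a minimal sanity-check sketch (the value sets below are only those observed in this batch, not necessarily the full codebook, and `invalid_fields` is a hypothetical helper, not part of the pipeline):

```python
# Dimension values observed in this batch; assumed to be a subset of the
# full codebook, which may define additional categories.
OBSERVED_CODES = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def invalid_fields(record):
    """Return the dimensions whose value is missing or outside the observed codes."""
    return [dim for dim, allowed in OBSERVED_CODES.items()
            if record.get(dim) not in allowed]

# The coding result from the table above (the "Coded at" timestamp is
# metadata, not a coded dimension, so it is omitted here).
coding = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "approval"}
print(invalid_fields(coding))  # []
```

A record that fails the check can be flagged for re-coding rather than silently stored.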
Raw LLM Response
[
  {"id": "ytc_UgjCkbW8HzWknngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiEPKpkQpLBvXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UggS6u_4h0pTJ3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggR_H-guI1ov3gCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgisRaHAbPZkRHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghDTSkKguh_eXgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugjb307Mr6aT_XgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UggBAqOIJtgnCHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghBJWyJQzHrOHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjYadM9MhFjhngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
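The raw response is a JSON array with one coding record per comment, each keyed by its `id`. A minimal sketch of parsing such a batch and looking up a coding by comment ID (`index_by_comment_id` is a hypothetical helper, and the array here is trimmed to two records from the batch above):

```python
import json

# Trimmed copy of the raw batch response above: a JSON array in which
# each element is one comment's coding, keyed by its comment "id".
raw_response = """[
  {"id": "ytc_UgjCkbW8HzWknngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgisRaHAbPZkRHgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def index_by_comment_id(response_text):
    """Parse a raw batch response and index its coding records by comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgisRaHAbPZkRHgCoAEC"]["policy"])  # regulate
```

Indexing once per batch makes the by-ID lookup above constant time, which is what a "look up by comment ID" view needs.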