Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "AI will never become as smart as humans until it k owes and operates based on go…" (ytc_UgwIBT6YS…)
- "So is this part of the big ugly bill that says states can't mess with AI for the…" (ytc_UgyVTYkbb…)
- "I knew AI was of Satan, its so evil it is Satan. I have never used it and neve…" (ytc_UgwLUufIs…)
- "B1-66ER is a reference to the fictional robot in Isaac Asimov's "The Evitable Co…" (ytr_UgyLF05Bo…)
- "I am Software developer and Ai also is big threat for us also especially with c…" (ytc_Ugzw8K7cu…)
- "as a matter of fact I do have an idea: before humanity destroys itself entirely,…" (ytc_UgyVDCC_O…)
- "I'd prefer robots to look like robots really, the human look is next to impossib…" (ytc_UghNXdUAp…)
- "Thank you for sharing your perspective! In this video, Sophia, the AI-powered ro…" (ytr_UgzhKplOJ…)
Comment

> To suggest that artificial intelligence will ever have conscience because they can do tasks faster is the same as suggesting that the dishwasher will become consious because it can wash dishes faster than us. A robot will never be able to choose by himself because all decisions were already chosen by a human programmer or by luck, they will only do them faster. It's singularitarianism at its best, mixing faith and facts by a sense of apocalyptic urgency to irresponsibly distract people from the real problems, such as 700 million people without access to safe water.

youtube · AI Moral Status · 2017-02-24T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiC1pPPoV9Z03gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgiG-by23WYWbHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiVs4F-1x9NUXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggK5XGSttVTAngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj88OdliVboRXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjv5Bx70AAdEXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghQeWptGFL25HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiB2jsTsOfp4XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjKLXQfgdB8lXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughk5eBKt9iA2HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
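The raw batch response pairs each comment ID with one value per coding dimension, so inspecting the model output for a single coded comment reduces to parsing the JSON array and indexing by ID. A minimal sketch of that lookup, assuming the array shape shown above (the `lookup` helper and the inlined two-record sample are illustrative, not part of the actual pipeline):

```python
import json

# A small inline stand-in for a raw batch response: a JSON array where each
# element carries the comment ID plus one value per coding dimension.
raw_response = """[
  {"id": "ytc_UghQeWptGFL25HgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiB2jsTsOfp4XgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]"""

def lookup(raw, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    records = {record["id"]: record for record in json.loads(raw)}
    return records.get(comment_id)

coding = lookup(raw_response, "ytc_UghQeWptGFL25HgCoAEC")
print(coding["reasoning"])  # consequentialist
```

Building the ID-to-record dict once also makes it cheap to cross-check that every comment sent in a batch came back coded, by comparing the dict's keys against the submitted IDs.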