Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytr_UgxjZXCzH…`: @AsianDadEnergy- I guess we shall. If all these jobs get wiped out in the next 1…
- `ytc_UgwWZ7dhs…`: I put this on for background noise, and I am sorta an artist, and I was neutral …
- `ytr_UgwVwxqSp…`: Sophia indeed shows humility and acknowledges the continuous learning process sh…
- `ytc_Ugz7fBWMc…`: Listen, I 100% agree that this is tragic. However, driverless cars are more safe…
- `ytc_UgxSWjQot…`: So… why are people treating it like shit? I mean it literally says its an AI Art…
- `ytc_UgzPwYOfB…`: #1 people dont need AI #2 technology in the past 20 years has not improved quali…
- `ytc_UgzEbGRCF…`: Ellon is right AI is far more dangerous Nukes,, maybe somewhere in the Universe …
- `rdc_gqlocl3`: > If we magically eliminated all encryption today, there would be caos and ba…
Comment

> Just being a devil's advocate... how about we just... don't make them? So we don't have to burden ourselves with the ethical questions? If they had rights, would they fight in wars? What if only them fought in wars? Would it then just be two or more countries playing a strategy game? Or imagine a hacker getting into your robot house maid. I mean they could spy on you and make the robot want to kill you or malfunction in a way that would do so... ever played watch dogs/2? Maybe because we can do something doesn't mean we should. Isn't there a point where it's laziness over connivence as well? Why can't I make my own damn toast or look up my own recipes or stock and keep track of the food in my own fridge? Maybe I'm paranoid and foreseeing a portal or westworld or iRobot type of scenario? Is my argument founded in meaningful opinion to you internet?

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2017-04-23T02:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghZxim60h8djXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjWFrifyXFxLngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg6_68H1uxBc3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggpPoRogRJoJngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiDTskh2rn2yHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjGaC_PeYYcrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugg16N0dkIPH9XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgizKQfBDOEQFXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugg9hqGfomYCEngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgglqXCxOme6MXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
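A raw batch response like the one above can be parsed and indexed by comment ID to recover any comment's coding. The sketch below is a minimal illustration, not the project's actual pipeline code; the allowed value sets per dimension are assumptions inferred from the sample output, not an authoritative codebook.

```python
import json

# Two rows taken verbatim from the raw batch response above.
raw = '''[
{"id":"ytc_UgiDTskh2rn2yHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjGaC_PeYYcrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Allowed values per dimension -- inferred from the sample output,
# not an authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index codings by comment ID,
    dropping any row with an out-of-vocabulary value."""
    out = {}
    for row in json.loads(raw_json):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[row["id"]] = row
    return out

codings = index_codings(raw)
print(codings["ytc_UgiDTskh2rn2yHgCoAEC"]["policy"])  # -> ban
```

Validating against a closed vocabulary before indexing catches malformed or hallucinated model output early, rather than letting bad labels propagate into the coded dataset.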