Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "it's not far fetched. When you combine everything that came before. Self aware…" — ytr_UgwOG0TeO…
- "Even the disabled who need wheelchairs to function don't get them for free. We h…" — ytc_Ugw-0XObE…
- "Autonomous cars are cool but they're not going to fix traffic. Yall need trains …" — ytc_Ugy2muqpP…
- "And it is proven that people are very bad at being able to quickly take over fro…" — ytr_UgxniPHSo…
- "I would never put my trust in a fallen angel technology because anything can go …" — ytc_UgztULtqd…
- "I just hope Ai realizes how much of a parasitic disease humanity is, and does th…" — ytc_UgyVLEeI7…
- "People who interact with AI romantic/sexual partners or rotobic partners have on…" — ytc_Ugx0Lf5VG…
- ""You mean you spent all that time and effort invalidating real artists so you co…" — ytc_UgzQX3p_T…
Comment
>This is a non-argument. Letting people die when organs are available is an arbitrary action of uncertain morality. Merely "copping out" isn't a satisfactory solution.
Despite the fact that this was never an argument per se, just what I thought was an interesting observation, I don't see why this is a non-argument; it actually seems more elegant than the in-hospital lottery in terms of fortune and reducing arbitrary action, as the random universe, not man, doles out the lottery. Plus, your statement "letting people die when organs are available" is pretty appalling considering (a) those organs aren't *available*, as they're in use by an autonomous person, and (b) you're still killing people.
>Right, if you think that killing is noninstrumentally wrong, then that is an answer to the proposal. But the state is really only putting someone at a risk of death, so you have to explain why we should treat this case differently than instituting a draft or hiring someone for a dangerous job.
This is fundamentally different from a draft or hiring for a dangerous job. Both those examples require *consent*: consent to the social contract, where the military is the fundamental force behind the state's keeping order (and most people don't die in the military, whereas death here is certain), and consent to the dangerous job, because you want money or whatever is offered. But even further, my argument is that the state not only ought not have this authority to kill based on biopolitical governance, but also that it would be proactively killing its citizenry despite contracted duties elsewhere. To digress a bit, this is why the entire notion of obligation is founded on the negative, not the positive: I don't have to help others, I merely can't proactively harm them, i.e. I don't have to save Sally's life, I just can't kill her.
>First of all, this fear is unfounded because the current organ waitlist system works fine, without any of this hypothetical discrimination. We're talking about
Source: reddit · Topic: AI Moral Status · Posted: 1402033853.0 (Unix timestamp) · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-25T08:13:13.233606 |
Raw LLM Response
```json
[
  {"id":"rdc_cfkw04q","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"rdc_cfl560i","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"rdc_ch4nk0c","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"rdc_ch4zdd0","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_ci0i07o","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
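The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by comment ID — the helper name `codes_by_id` and the `"unclear"` fallback for missing fields are assumptions for illustration, not part of the tool:

```python
import json

# Raw LLM coding response, as in the example above (two rows shown).
RAW_RESPONSE = """
[
  {"id":"rdc_cfkw04q","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"rdc_ci0i07o","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
"""

# The four coding dimensions reported for each comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_by_id(raw: str) -> dict[str, dict[str, str]]:
    """Index a parsed response by comment ID, keeping only the known
    dimensions and defaulting absent ones to "unclear" (an assumption)."""
    rows = json.loads(raw)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }

index = codes_by_id(RAW_RESPONSE)
print(index["rdc_ci0i07o"]["reasoning"])  # mixed
```

Indexing by ID mirrors the "look up by comment ID" workflow of the page: a single parse yields O(1) access to any comment's codes.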