Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @group555_ So AI can’t be bad, if it’s the user choosing to do something bad wit… (ytr_UgwEiUuS2…)
- People need to understand that AI, specially they way we are doing AI now, will … (ytc_UgzwtvaxC…)
- Once humanity is wiped out, AI will be able to access all our stuff and create n… (ytr_UgxdPPhJS…)
- Much like humans, the AI will go after themselves before they go after those who… (ytc_UgzDNidg1…)
- The outcome of all this could possibly be what we saw in “The terminator” movie.… (ytr_UgxLXt7Ic…)
- If robots demanded rights we'd have a huge step back because then we couldn't ev… (ytc_Uggr0MzI-…)
- still better than somebody stealing somebody else's photos and videos to use or … (ytr_Ugzy5y1es…)
- While it's physicaly imposible now, what if a robot is programed to have the qui… (ytc_UghIMGtVM…)
Comment
Until AI actually developes feelings and self-consciousness, they are a "thing". A tool created by humanity to make lives easier.
But as soon as AI gets self-conscious we need to grant them the same rights as humans. After all they developed a way of feeling. I am pretty sure humans would not like it if their "creator" denied them any rights,so the same logic should be applied to robots.
youtube · AI Moral Status · 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UggkDVnEVMM5ZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghQMWB6J9eJNHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjX_pMm2KXZEHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugh6LaNQ51EM83gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjyF-xTboJ9T3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjI-Vcvzkq8V3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"disapproval"},
{"id":"ytc_UggOhHBMeoRfD3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggDprghN-jrzXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghmqeH7DCeN_ngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjG1rU7TdnyyXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
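A raw batch response like the one above can be parsed and validated before its codings are stored. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those seen in the responses on this page (the real codebook may include additional categories) and that comment IDs use the `ytc_`/`ytr_` prefixes shown:

```python
import json

# Allowed values per coding dimension, inferred from the responses above;
# the actual codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "approval", "disapproval", "fear",
                "outrage", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError on malformed rows or out-of-vocabulary values,
    so a bad batch fails loudly instead of polluting the dataset.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not cid or not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad or missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: invalid {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID also makes the per-comment lookup shown in the Coding Result table a single dictionary access, e.g. `coded["ytc_UghQMWB6J9eJNHgCoAEC"]["policy"]`.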