Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If anybody is interested, there are a boatload of films that ask this question, a large amount of books, and a remarkably well-sized body of philosophy surrounding this; my favorite film on the subject will forever be Ghost in the Shell, that begs us to consider the consequences of transhumanism, AI, the rights governed by their existence and ours, individuality, the definition of being human, and what consciousness is. I'm talking, of course, of the Mamoru Oshii version from the 90's, not the Scarlet Johannson film. Seriously, even if you don't watch anime, it is an engrossing film with a lot to digest.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-24T20:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugha-oJt_DsgWXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiJNQy1_UpMX3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgjTxWp0UNLVk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggPw_bN0ng11ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiRC98B3aBkr3gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiKrWdtOG_Tx3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UghJw5uloiiWqngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghfigvnGzz6L3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiLxWdsHUjz6ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg11ud6zdAB_XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]