Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click an entry to inspect):

- "My worry is not that AI will take over. My worry is that humans will take over A…" (`ytc_UgwePP7wE…`)
- "I had AI create an estimate to model different scenarios showing when 50% job lo…" (`ytc_Ugx-E_Uyb…`)
- "Are you also AI? What a well worded summary that adds nothing to the conversatio…" (`rdc_lu68adw`)
- "I'm a baker. My gf is learning to be an electrician. We have very few worries ab…" (`ytc_UgzeUVti2…`)
- "U decide if the information is good 😂 ai just give info it's not that deep…" (`ytc_UgxwA3K7O…`)
- ""Hmm I wonder why this corporate news agency is talking about real issues for on…" (`ytc_UgweJjoxx…`)
- "Senator Sanders here is the message I sent to a dozen of my informed friends: A…" (`ytc_UgyQaAJb3…`)
- "Ai art is cool for a concept or a draft, putting ideas into action but not for t…" (`ytc_UgyFgyrN-…`)
Comment
The question is why would we create robots with the power to feel in such a way that we humans would feel our very existence was in danger? Humans and animals already exist, and thus we create rights for ourselves and other organisms, like animals. But sentient AI doesn't exist yet. If you don't want to answer the messy question, don't create the conditions for the problem to arise in the first place.
youtube
AI Moral Status
2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugj-GG9HRZn1i3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBGDS4uJuvI3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgiqMqcGlJ3kTXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggBfEyjI40hpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiq0IRMB0CCh3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh-w9BtvkjcungCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugjqh3cjpA79VXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj0_G0fNIn_JngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj2N050ddsTSHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgizsfMfud5iSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
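A batch response like the one above is only usable if every row parses and every dimension takes an allowed code. Below is a minimal sketch of how such a response could be validated before ingestion. The `SCHEMA` values are inferred from the codes visible in this page; the full codebook may contain additional categories, and the function name `validate_batch` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may define more categories than these.
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only rows that fit the schema.

    Rows missing an "id" or containing an out-of-schema value are dropped,
    so malformed model output never reaches the coded dataset.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Dropping (rather than repairing) invalid rows keeps the coded dataset conservative: a row that fails validation can be re-queued for another coding pass instead of being silently coerced.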