Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I had absolutely no idea that Elon was involved in the founding of OpenAI, this …" (ytc_UgwIyfvos…)
- "My sister lives in Madagascar right now as a biology researcher and it's not a p…" (rdc_dpcmc2j)
- "So the "godfather of AI" thinks the only way to ensure it isn't used against us …" (ytc_UgxUtyxsZ…)
- "I feel like the script for this video was written by AI. It's good detailed inf…" (ytc_UgwanCMVh…)
- "New technology should always be regulated until the impact is understood. The An…" (ytc_UgwwMaqkv…)
- "Some idiots proudly working for AI companies they have no idea they will he kick…" (ytc_UgzKcGVa3…)
- "For those who think this is crazy, think about the software used in those machin…" (ytc_UgyxjqxS1…)
- "The way I like to think about it is can people distinguish good cgi in movies?..…" (ytc_Ugw2gXzlM…)
Comment
This is the one video by Kurzgesagt that I find pretty illogical. Robots we make cannot go on to create robots that are more intelligent than them, and if they do, it's because we programmed them to, hence cancelling them out as the original creators, seeing as though we created the designs or the base ideas and concepts ourselves. They have no free will, or creativity, or sentience. They are not capable of thinking up new ideas or designing a robot far better than any human could have. The human brain is... trillions of miles ahead of any comouter chip. Yes, a computer may be able to calculate fasted than a human, but humans have creativuty and ingenuity. Computers just do what we tell them to do.
Then again, this entire video might be a what-if scenario just for the sake of talking about robot rights, and if that's the case, I apologize for being an inCOMPETENT OAF.
Source: youtube | AI Moral Status | 2017-02-23T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UggjdW6J36gm6ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjxFsSVeCDnM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugggw7gD7mYXeHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgggWgOiFrTcO3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugg7XZMoCWN3CXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UggI6IWQzP_T3HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiuuZL8zufHAngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgglkOmxDN21AHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghwYK5jq-QSJHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiwNbDwLt7DeHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
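The coder returns a JSON array with one object per comment, each carrying the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of loading and validating such a response, assuming only this observed shape (the `parse_response` helper and its drop-malformed-rows policy are illustrative, not part of the actual pipeline):

```python
import json

# Two rows copied from the raw response above, for illustration.
raw = '''[
  {"id":"ytc_UggjdW6J36gm6ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgiwNbDwLt7DeHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Keys every coded row must carry, per the format shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_response(text):
    """Parse a raw coding response, keeping only well-formed rows."""
    rows = json.loads(text)
    return [r for r in rows
            if isinstance(r, dict) and REQUIRED_KEYS <= r.keys()]

coded = parse_response(raw)
print(len(coded))           # 2
print(coded[1]["emotion"])  # fear
```

Checking the keys before use matters here because a model can emit truncated or malformed JSON rows; silently dropping them (or, alternatively, raising) keeps the downstream coding table consistent.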