Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Should OpenAI win this, it will set precedence for any media pirating. Get a mov… (ytc_Ugw3sLRKU…)
- Ilya has moral compass coz he was born in Russia u all hate so much. Westerns do… (ytc_UgzByd18W…)
- But then AI doesn’t need life. They need electricity and that’s it. So, why shou… (ytc_UgxrJsGh0…)
- so if there is zero regulation does that mean people will be able to get AI to d… (ytc_UgyHcAbC7…)
- These AI gentleman groomers have literally no idea how systems work. And they h… (ytc_UgwYxSf6s…)
- THIS One of my favorite parts of looking at art is the wonder of how a HUMAN cre… (ytc_Ugz_cclRX…)
- Surveillance capitalism, yes, and we have to rectify it. But much worse than tha… (ytc_UgxWrebGy…)
- Honestly, let these companies have all the AI slop all they want. If you don't g… (rdc_o8eihg0)
Comment
Really interesting conversation. The intro section was class. The AGI debate is difficult, as we don't know how to properly define intelligence in the first place, so we wouldn't know if/when we reach AGI. What constitutes intelligence?
Neil mentioned that the people who are positive on AI aren't getting interviewed, which is not entirely true, as (selfish plug incoming) the podcast I host on this channel is focused on interviewing the people who work/build with AI, create new versions of AI, new products and services with it, research it, etc.
Although cautious, they are generally optimistic about its potential for us and think it can do good. Worth checking out if you want to peek behind the curtain at the people in the AI trenches.
Source: youtube — AI Moral Status — 2025-07-26T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxKjMQMiY0D_Jky8Hd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxoL4rYdT2HXsT8Kox4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy4D1CY1EZcmwmwZHV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzNKofaUWJuEIGBcNR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxPHn4CP4W20bUfVl94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnfuPKd9S8W8RlFuB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzo8Hzv5nQoEIKytxV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzEx2SymCzGKbDkQp54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxR27gLKlykmg2lO5V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzsx8PBwtjMhlJpJn14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
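The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response could be parsed and validated is below; the allowed dimension vocabularies are inferred from the values visible in this sample and are an assumption, since the actual codebook is not shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumed vocabulary; the real codebook may differ).
DIMENSIONS = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"resignation", "outrage", "indifference",
                "approval", "mixed", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    keeping only records whose values are in the known vocabularies."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {k: v for k, v in rec.items() if k != "id"}
        if all(v in DIMENSIONS.get(k, set()) for k, v in codes.items()):
            coded[rec["id"]] = codes
    return coded

# Usage with a small (hypothetical) response string:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"approval"}]')
result = parse_coding_response(raw)
print(result["ytc_example"]["emotion"])  # approval
```

Dropping records with out-of-vocabulary values (rather than raising) is a design choice that keeps one malformed line in the model output from discarding the whole batch.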