Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I will take what he says on AI, and which he obviously knows directly and better…
ytc_Ugyg6RTpk…
The utopian dream of AI is a pipe dream and without purpose humans will become c…
ytc_UgyoRbKTZ…
It is definitely safer if there was no AI research until Stephen can prove it ca…
ytc_Ugy_7Llsl…
The car automatically tired to stop! you can clearly see the smoke about 50-100 …
ytc_Ugygnp4z4…
I don't hate AI, I just feel bad for people who can't distinguish between AI and…
ytc_UgzhGndu7…
I feel like people become their true selves when they are interacting with AI.
…
ytc_UgxEVWEYk…
Yet another example of how useless self-driving cars are. This guy very nearly g…
ytc_UgwuRqRGm…
I challenge an ai artist to make an ai art generator that produces half decent a…
ytc_UgxBCYlVY…
Comment
Come now, this video is just trying to plant the seeds of fear and paint the future like an apocalypse and world ending catastrophe. Calm down, ai isnt working or thinking like humans. We fail to understand that this lifeform is different from ourselfs when it comes to no real emotions, no chemical reactions in the brain, no primitive Brain that still has ancient parts within, no real desires. Horror stories about the end of the world come in all shapes an sizes, so lets just see where this leads yo instead of predicting the end of the world in 10 years 😂😂😂😂
youtube
AI Moral Status
2025-05-29T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzIAIwKIjJDCqjOdsB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwt5GkjgjwYMfeQ1UV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw1Zh1OrFvfXi2Y-eF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "ban", "emotion": "mixed"},
  {"id": "ytc_UgxbpjZv3bhvr2MfxhZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy4ROt7YVLPLFw4Rtp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwlVMHzvVQai3Ecgtx4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwLrkd2E5kqJX-Fggx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzyw_aFOutsQfCfEZ54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw7D63oxqWiX_hbAM94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyUNWQ6A1wURGjOzcF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
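A batch response like the one above can be parsed into a lookup table keyed by comment ID, which is how the "look up by comment ID" view finds a coded record. The sketch below is a minimal, hedged example: the field names come from the sample batch, the two inlined records are copied from it, and the completeness check only verifies that the four coding dimensions are present (the sample is too small to treat its value sets as an official codebook).

```python
import json

# Two records copied from the raw batch above, for illustration only.
raw_response = '''[
  {"id": "ytc_UgzIAIwKIjJDCqjOdsB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwt5GkjgjwYMfeQ1UV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Fields every coded record carries in the sample batch.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into a {comment_id: record} index."""
    index = {}
    for rec in json.loads(raw):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        index[rec["id"]] = rec
    return index

index = parse_batch(raw_response)
print(index["ytc_UgzIAIwKIjJDCqjOdsB4AaABAg"]["reasoning"])  # consequentialist
```

Indexing by the comment ID returned in each record (rather than by list position) keeps the lookup robust if the model reorders or drops records in a batch.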