Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or click one of the random samples below to inspect it.
- “They don’t get the AI revenue model. These companies will be fighting over the …” (ytc_UgwXHtmhE…)
- “AI "artists" most likely don't even know why most, if not all, artists create an…” (ytc_UgwpQNk66…)
- “Humans have not been worried about their survival in as much as they are worried…” (ytc_Ugw0kca2s…)
- “No one will stop advancing AI because if they do, others will advance instead. I…” (rdc_oi11xlg)
- “To avoid a dislike as click bait, could you provide details as to occasions wher…” (ytc_UgzTmf_PC…)
- “I bet you I get in a horse, and the robot won't have a chance to jump the river,…” (ytc_Ugz9lgktY…)
- “We should have international law against allowing AI to make a decision whether …” (ytc_UgwYCc-uD…)
- “I stopped the Video in the Beginning to see how people are "crashing out". Turns…” (ytc_UgzPFo7v1…)
Comment
What would be the consequence to X if they got rid of their AI ?
It might destroy their company.
But what would be the cost to deep mind ? It would DEFINITELY destroy their company.
It's not going to stop no matter what this guy says to anyone.
Source: youtube · Video: AI Moral Status · Posted: 2025-10-31T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgyWBa3ZHDwbz_TRHOR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyNbgCru6frOF9-ROh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyO4sXzepX4g416NsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybbdJWEPoe2zim-2N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxW3DXpKt6efbyIVYB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrRoqpV7tc0mZ3gYl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyQYarHiV2n7AAXJIV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwwdbh1v_HwUmapK114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLBpOhTu1anLR5e0h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxaXODyoIy1XtKSU-R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
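The raw response is a JSON array with one object per comment in the batch, which is what makes the per-comment lookup above possible. A minimal sketch of parsing such a response into an ID-keyed table (the `raw` payload below copies two records from the response above; the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
 {"id":"ytc_UgyWBa3ZHDwbz_TRHOR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgybbdJWEPoe2zim-2N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

def index_by_id(response_text):
    """Parse a batch coding response and key each record by comment ID."""
    return {rec["id"]: rec for rec in json.loads(response_text)}

codes = index_by_id(raw)
print(codes["ytc_UgybbdJWEPoe2zim-2N4AaABAg"]["policy"])  # regulate
```

With the records keyed by ID, rendering the "Coding Result" table for any comment is a direct dictionary lookup on its dimensions.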