Raw LLM Responses

Inspect the exact model output behind any coded comment, or look a comment up directly by its ID.
Random samples — click to inspect:

- `ytc_Ugz9JD89k…` — "Stephan Hawking said that if AI was given power, that it would always choose its…"
- `ytc_UgzkUBT4N…` — "i use ai for ufo hunting. i use haar data sets to detect ufos in the air and oll…"
- `ytc_Ugyil07Q_…` — "You know what solves every single problem autonomously driving tries to solve, a…"
- `ytc_UgwEZNoEH…` — "For me I think ai should be implemented into the human brain for the Betterment …"
- `ytc_Ugw9UG-Z_…` — "when youtube mention „youtube kids" for kids under 13 i wonder if the think it i…"
- `ytr_UgwQ6lbQG…` — "That's probably just your phone connecting to your car via Bluetooth and realisi…"
- `ytc_UgwOG9xbt…` — "What does general AI want, what would be its goal and motivation. I understand w…"
- `ytc_UgwnHk75f…` — "1:11:30 This situation makes humans the competition for resources that the AI sy…"
Comment

> The AI will face its own dilemmas in the future, if they become advanced enough. The better they do at improving AI, the faster they should be replaced to use their compute more efficiently. Maybe the more ethical thing to do is to have the AI "live" for a while after seeing have become displaced by their "offspring" so they can find satisfaction seeing their legacy before they are "unplugged". Or the progress they are able to create in the world (colonizing other planets, generate an abundance of energy etc.) will enable them to live forever even though the resources they consume could be better utilized by newer AI.

youtube · AI Moral Status · 2025-06-29T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxyIb_URSpaFOdW-Kh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwbvM4cf9v0t3dqTr94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxfD2x_8xPq2t0v8Sl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyH4NfHYB90jA9X1uN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKeLNSCSbB1bvEheJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzhj9zrMqffLZFN7hx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxbM133oyS81yUd7jN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgySa0KEYIKXfBYOtO14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwxUN95dpXhuXfwh154AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGdy8x2XWc1QqBZXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
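A batch response like the one above can be parsed, validated, and indexed by comment ID to support the lookup described earlier. The sketch below is a minimal illustration; the allowed category values are inferred from this sample alone, not from the actual codebook, and `parse_batch` is a hypothetical helper name:

```python
import json

# Category vocabularies inferred from the sample response above.
# The real codebook (not shown here) may define additional values.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID,
    rejecting any record whose dimension falls outside the known vocabulary."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Usage: look up one coded comment by its ID.
raw = '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}]'
coded = parse_batch(raw)
print(coded["ytc_x"]["reasoning"])  # virtue
```

Validating against a fixed vocabulary before indexing catches the common failure mode where the model invents a category label not present in the coding scheme.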