Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "13:00 elon's 12 seconds of silence about this. Probably most important question …" (ytc_UgzM5jtnm…)
- florianschneider3982: "1. ai does steal art, that's very well known that most ai …" (ytr_UgxTkTE42…)
- "AI "art" is just a pretty image generated by typing a prompt. Basically you're m…" (ytc_Ugxua3poz…)
- "Everyone's so busy freaking out about AI being evil, they completely forgot that…" (ytc_UgyuVREgB…)
- "Making robots sentient is the stupidest thing we could do, especially since its …" (ytc_UgyorlZZ8…)
- "PL/1 and JCL are mainframe things. That's why you don't hear of them in Silicon…" (rdc_gly8jon)
- "Ai is already poison to the internet. Image databases everywhere are flooded wit…" (ytc_UgxbgiM-k…)
- "A pure delight seeing someone with this kind of platform advancing this viewpoin…" (ytc_Ugx_ZA66R…)
Comment
So basically, this video is slow rolling the Terminator effect, but in a different decade. Instead of 1984, it's 2034. Instead of Skynet, it will be Starlink or the equivalent. In the Terminator series, once AI became self-aware and was given control of the military weapons systems, it immediately began to aggressively eliminate the perceived threat. The human race. In this video, it uses viruses and biological means and quietly takes over. It's not logical, but it's the way leftist humans think. AI will simply pit the 2 sides against each other and use that to distract them into war with each other. Then step in as their "savior" and wipe them both out.
Platform: youtube | AI Moral Status | 2025-04-27T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzbDDC1AMWay2Y6Ghp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy9PD8bzyu2pshB-Q54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy0K8DCC3XjcVyVtXN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzNnFMlTEkOpd2WXyV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwcHku5NTreMECXc514AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVnxg0f2DHWeqUudZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwc-nToVOOb2N-VpYh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwWweoTpFJ9L6SJMIV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzrw-USzcdFKUkyGpl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyN3RJTc08wEqpnbeB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"}
]
```
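The model returns one JSON array per batch, with each record carrying the comment ID plus the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed, validated, and indexed for lookup by comment ID follows; the allowed-value sets are inferred from the samples on this page (the full codebook may include additional categories), and `SCHEMA`, `validate`, and `by_id` are illustrative names, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the coded samples above.
# Assumption: the real codebook may define more categories than these.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

# A one-record excerpt of the raw response shown above.
raw = '''[
  {"id": "ytc_UgzbDDC1AMWay2Y6Ghp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def validate(batch):
    """Keep only records whose coded values all fall inside SCHEMA."""
    return [
        rec for rec in batch
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

records = validate(json.loads(raw))
# Index by comment ID so a "look up by comment ID" query is a dict access.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_UgzbDDC1AMWay2Y6Ghp4AaABAg"]["emotion"])  # -> fear
```

Dropping (or logging) records that fail validation guards against the model emitting a label outside the codebook, which would otherwise silently skew the dimension counts.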