Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is great but for a non tech person it can all be even simpler than this. Yo…" (rdc_kk35uth)
- "Well someone still need to program AI … AI cannot write its own program and prog…" (ytc_UgzJcEzeY…)
- "The manner in which AI is being presented to us by business is that it’s major p…" (ytc_UgwvHlVyh…)
- "It’s far more than you say, but you have to be highly awake to find that out. Hu…" (ytc_UgwGJNcQd…)
- "> Ah, yes, more nukes. THEN everyone will be safe! The only nuclear devices …" (rdc_dl07tr0)
- "cause its ai strapped to traditional lasik? ○ It creates a digital twin of your…" (rdc_m1ybgnw)
- "Not only do we not want AI data centers we also don’t even want AI. It literally…" (ytc_UgyvTSKLM…)
- "@TheBerylknight No. that’s not how this works with Super Intelligent AI. You jus…" (ytr_Ugw0hBfu2…)
Comment
Interesting talk.....with that said, he's selling two things: bitcoin (he's invested in it) and fear of AI (which his business is in AI safety). I don't doubt the concerns raised and I don't know about the timeline.....but it's difficult to look past the conflict of interest here as well. Regardless, the most dangerous part of the world today is that it has inconveniently become the most divided that it's been for the past few decades. Globalism is important when considering how to approach these concerns.
youtube · AI Governance · 2025-09-06T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwAZ1MTxSna7HJroaB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzCjjcrWrWB5lVHDLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxKAMCwz8lep7w0714AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx0kCVmg1KxqFiIUPd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzJGnxpYCGb25CECjN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyv6Zc9bth551xMiZ14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxMeKF9dCwDVd6DdY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyWQY4tJYAALq70EC94AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzThRXluJvW2EFPgvl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugxfn2ppd0G_TtROjC94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
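The raw response above is a JSON array of per-comment codings, one object per comment ID. The "look up by comment ID" feature this page offers can be sketched as a small parse-and-index step; the snippet below is a minimal illustration using two entries copied from the batch above (the function name `index_by_id` is hypothetical, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (abbreviated here to two entries from the batch above).
raw_response = """
[
  {"id": "ytc_UgyWQY4tJYAALq70EC94AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgxMeKF9dCwDVd6DdY54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index each record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
row = codings["ytc_UgyWQY4tJYAALq70EC94AaABAg"]
print(row["responsibility"], row["emotion"])  # prints: company mixed
```

The printed dimensions match the Coding Result table shown above for the selected comment (Responsibility: company, Emotion: mixed).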