Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "when u realize that there is no place to hard-code in those three laws of robots…" (ytc_Ugzg_nxZe…)
- "So you mean artist is the same ai because artist need to memorizing every single…" (ytr_UgzEIsIZo…)
- "Hey there, I'm specialist in Frontend software development (UI Architect). I hav…" (ytc_UgxtvR5O3…)
- "We've known this outcome for a long time. This is why quite a few tech people, i…" (ytc_UgwamQgNu…)
- "An artist can use AI and just tell this guy it's all from their point of view. 😀…" (ytc_UgybSS6VR…)
- "Yes and no. Because AI needs an input (and an output) prompt, the current way it…" (ytr_Ugy0bB56B…)
- "@badlybad1656 The problem isn't the AI itself. It's the humans I do not trust. O…" (ytr_UgzGHWnIr…)
- "Just let the robots buy all the stuff. It be an AI and communist revolution at t…" (ytr_UgypNBy_9…)
Comment
Eric Schmidt represents the bullish perspective while Nate Soares stands for the bearish one. I don't doubt that tech executives discuss AI risks, but I believe they'll push forward regardless of how serious those risks are — simply due to competitive pressure. Google, for instance, was relatively cautious about their AI assistant until OpenAI launched ChatGPT.
We should also acknowledge the hypocrisy at play: while these tech billionaires publicly evangelize an AI-driven era of abundance, they're quietly building bunkers in Hawaii and other remote locations. That's hardly the behavior of people who genuinely believe the future will be utopian.
youtube · AI Governance · 2026-03-22T18:0… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxSp6Ls9VbI6OdwSHh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzpnnSl8HbwTc0o7Mt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzrfCmMWsyRHJo5mSZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwp77NMGC6LAMyQCIN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxKk_z5K8KBHdGF9OR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxNqYTotGxlJvtBF6R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyBl8PztBJOfXXHLZR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwT813VBJ7fFC9Rv3l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEBcceq8XHCQkTYpN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxj-KnIt6rwczLt8l14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]