Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- "Don't worry about Ai. Worry about Sam Altman a human. Watch his train wreck of …" (ytc_UgzXI4OQ9…)
- "And another reason to mask up in public. Even if a mask only stops 30% of the vi…" (ytc_UgwKa1qiC…)
- "Chatgpt can be hacked. Not saying his was hacked for sure. But I don't use it no…" (ytc_Ugw9h4FK8…)
- "The second shot Looks just like my wife when she doesn't have make up on…" (ytc_UgxROm9jN…)
- "Even if Tesla could achieve FSD Robotaxi service, who would let a stranger use y…" (ytc_UgyvQz26C…)
- "So even after black mirror they still decide to go ahead and make these. Huh. Gu…" (ytc_UgwpkE7_V…)
- "I think AI is over hyped. the CEOs are over promising and going to under deliver…" (ytc_UgxnToFRA…)
- "'Smarter than we are', not 'Smarter than us'. AI at least never makes that mista…" (ytc_Ugyt-clNj…)
Comment
But why should this be inevitable? If the critical mass of people worldwide would refuse to use this technology and would boycott the companies working on superintelligence, the industry would not invest a single Dollar into the further development of AI. No further investments from shareholders /the industry would make it impossible for states to create a superintelligence. Such a common approach would also avoid the upcoming financial crisis once the AI bubble collapses (its just a matter of time). Why are humans so stupid and rather willing to put the survival of our entire species at risk for personal wealth? If this isn't the core of intellectual stupidity, I don't know what is.
youtube · AI Governance · 2026-04-10T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzr5AISxs1E6AYuwjZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJSGmP4YZXEEpw_zx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzXOeF7l4qD9JIFC0d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZheIhL5ZJ9I1-ljJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVmZy7c9ga4ZLVIn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSIFmqUmiuJKsrbhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyrx5fJ0inUihpRtx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwkh8sp0lU9jR6Y4zx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz7YUl3BDUWIwEsERt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_WJCgeaUoyZlOBJR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
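Since the raw model output is a flat JSON array keyed by comment ID, looking up the coding for any given comment is a single dictionary build plus an index. A minimal sketch in Python (the variable names are illustrative, and only two records from the array above are reproduced here):

```python
import json

# Raw model output: a JSON array of coded records, one per comment.
# Two records are copied from the response above for illustration.
raw_response = """
[
  {"id": "ytc_UgzJSGmP4YZXEEpw_zx4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzXOeF7l4qD9JIFC0d4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
"""

# Build an index from comment ID to its coded dimensions.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the coding for one comment by its full ID.
coding = records["ytc_UgzJSGmP4YZXEEpw_zx4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban outrage
```

The same index also makes it easy to cross-check a displayed Coding Result against the raw response, since both carry the same dimension names.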