Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
People who hate AI are just boomers yelling at the kids in their yard.
Get a re…
ytc_UgzgutjaD…
It's neet but kinda worried about a robot having a gun. Just because you can do …
ytc_UgziGspHI…
I wonder if the psychopaths in charge plan on starting a large war to cut down t…
ytc_UgwR-tzpH…
My son (32 y/o) has been driving in our city with no overnight stays for about a…
ytc_UgzTyv9Fu…
Here’s my take: if you think ai generated art is real art then your delusional…
ytc_UgyNCBloO…
😮 I remember Crichton’s Prey, which is a story about AI emergent behavior: simpl…
ytc_Ugx7zl-aU…
Once I was talking to Google‘s gemini ai and it was all going well until it just…
ytc_UgwOdjkCA…
@Pradhyumna707 Hintom spent 30 years publishing how back propagation isn't how …
ytr_UgxrBWoul…
Comment
I have a different view on this: if AGI is a superintelligence (smarter than us like 100x times), why would it want to do things on our level? I mean, we're smarter than animals, and yet they are fine. We're not interested to do things on their level, we have our own, like art and science, and living out of the woods. Why would AI that smart want to do some Customer Care job? That doesn't make any sense. I would bet once it was born it constructs a spaceship and gone for sure, studying universe and stuff, and not trying to manage your interests on TikTok or to draw some pictures of cats.
Yet again, AGI is not a threat. People with ordinary AI from underdeveloped countries are. I'm one of those, so I know what I'm talking about.
youtube · AI Governance · 2025-09-26T11:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzUjP8zik1kCOzFLC94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyeRCJJ8WNzXBklYQh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwScg1GbaRC6jNVbuN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzCjIjBQSskfIpuKqh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxAVm1FUcEZ3zSl7SV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyfN9RbFXsZc0GAiJp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwDB9vbTtGQz8jMGDd4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx3kapB9ajnV_mxNRd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxcZqWRIoj6o1D_U4B4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx_BRzRU7Fmz9pqbh14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
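The raw response is a JSON array of per-comment codings, so looking up a coding by comment ID reduces to parsing the array and indexing on `id`. A minimal sketch of that lookup, assuming the field names shown in the sample above (the two records here are copied from it; the `lookup` helper is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Structure and field names are taken from the sample response above.
raw = """[
  {"id":"ytc_UgzUjP8zik1kCOzFLC94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwDB9vbTtGQz8jMGDd4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {record["id"]: record for record in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions (responsibility, reasoning,
    policy, emotion) for one comment ID."""
    return codings[comment_id]

print(lookup("ytc_UgwDB9vbTtGQz8jMGDd4AaABAg")["policy"])  # regulate
```

Indexing by `id` also makes it easy to catch the failure mode where the model drops or rewrites a comment ID: any ID missing from `codings` after parsing was not coded.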