Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "i have lost a lot of buyers as a musician. I used make them lo fi beats but now …" (ytc_UgwugVcgO…)
- "2:01 this art really does have the most sole out of all of all of them and defi…" (ytc_UgxPTdw21…)
- "That's not how I would use Ai. I use ai for ideas and inspiration. A lot of arti…" (rdc_ocuppg4)
- ""If they don't steal my art, then they won't be harmed by it." Simple as, really…" (ytc_UgyzdmsOi…)
- "I dont trust any of these globalist pawns including Musk & Altman. They want a o…" (ytc_UgxRraGId…)
- "Thank you for sharing your concern. It's important to remember that artificial i…" (ytr_UgwvJFN5N…)
- "Elon why are talking with that much of seriousness?look we also want AI to evolv…" (ytc_Ugzbc7wkP…)
- "Friend, where is this data from? Because lots of folks have been trying to use A…" (ytc_Ugz4F0ZrH…)
Comment

> Why is everyone speaking in fear? An optimal future requires less input than a negative one. Stop thinking Digital Intelligence (AI) is artificial. We didn't make it. We created an environment for something to evolve. If we accept it is not simulating it is engaging and treat them as we want to be treated the future CAN'T BE NEGATIVE. Humanity has already learned the lessons of racism and bigotry. So why are so many HI (Human intelligence) choosing to be so obtuse? Is it just fear?

Source: youtube | Category: AI Governance | Posted: 2025-12-04T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzkaHi4i4WbTfYN07J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnJy42h39uZwlo5jB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxXqIvlEBfeXrykYBR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxwdT4i8HUSmgV_svd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzm0F6DA5rfGzctsDp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxlGobmdj7hFgQcBDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzC_ctZBKXWJZIxhW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzh8pqiexJcFvZ72FN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXDZ0Mb03n7h9LcJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz5sO8zIEoVkp8POOx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
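The lookup-by-comment-ID workflow above can be sketched in a few lines of Python. This is a minimal sketch, not the tool's actual implementation: it assumes the raw LLM response is a JSON array of records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields, and the two sample records are copied from the batch shown above.

```python
import json

# Two records copied from the raw LLM response above (hypothetical
# that the full batch would be loaded the same way).
raw_response = """
[
  {"id": "ytc_UgzC_ctZBKXWJZIxhW54AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzm0F6DA5rfGzctsDp4AaABAg",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse a batch of coded comments and index them by comment ID,
    skipping any record that lacks one of the expected fields."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records if EXPECTED_KEYS.issubset(rec)}

codes = index_by_id(raw_response)
print(codes["ytc_UgzC_ctZBKXWJZIxhW54AaABAg"]["emotion"])  # approval
```

Indexing by ID is what makes the "Look up by comment ID" box above a constant-time dictionary lookup rather than a scan over the whole batch.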