# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples
- `ytc_Ugx4Fi6GV…`: "I know nobodys going to see this, especially not Lavender. But what is the POINT…"
- `ytc_UgxJ16zcP…`: "I have a great analogy for why digital art and AI "art" aren't the same: Imagin…"
- `ytc_UgwL2ivvH…`: "It's horrible. But isn't this precisely what LLMs do? (They call it AI but that'…"
- `ytc_UgxB2GuRs…`: ""Even lost in the hype, Sir Roger Penrose remains a lighthouse of reason—remindi…"
- `ytc_Ugz9tjt4t…`: "I was watching a video of a white guy visiting a village in Uganda 🌍. One kid st…"
- `ytc_UgzuYrk9w…`: "He says 2 scenarios, either people misuse AI for war or AI turns on humans.... u…"
- `ytc_Ugzms-ENu…`: "Neil and company appear drunk once again. My dog (Mr Carl Sagan) is like wtf is…"
- `ytc_UgxegSCcQ…`: "I get there are issues and concerns relating to AI, but why does she seem extra …"
## Comment
all these people that are responsible for creating AI dismissing and avoiding the consequences it will have on future generations is disgusting, knowing they’ll be gone by then. no accountability, just whatever makes money and advances our tech without any thought of adverse effects. these are all issues that should have been accounted for before AI was created.
fuck the older generation. getting rich and dying with their money and leaving us to clean up after them or suffer in their wake.
youtube · Cross-Cultural · 2025-10-20T18:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgwYdVXLA72ExmlTjH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxZTT3BcwOQkvt2rPd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxBtjKT-GAL1vmOxFN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw4ehRiXbnPc9vueZJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxPMy6vUWaqzJMVvEJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz1KHb1bouPQdQWqxB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwd437ofOhLD99Xpzx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzK77BiNJVx8aTSN514AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1XXy28bR0uDPzM414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzi7nsFVnBT0X0sBbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
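The batch response is a JSON array of records keyed by comment ID, one per coded comment. A minimal sketch of how such output could be parsed and indexed to support an ID lookup like the one above — the field names come from the response itself; the key-validation step and the `index_by_id` helper are assumptions, not the pipeline's actual code:

```python
import json

# Two records copied from the batch response above, truncated for brevity.
raw_response = '''
[
  {"id":"ytc_UgwYdVXLA72ExmlTjH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz1KHb1bouPQdQWqxB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
'''

# The four coding dimensions plus the ID, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a batch response and index coded records by comment ID."""
    records = json.loads(raw)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing keys: {missing}")
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugz1KHb1bouPQdQWqxB4AaABAg"]["policy"])  # → regulate
```

Indexing by ID up front makes each lookup O(1), which matters when a single batch response covers many comments.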