Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I'd like to see a robot remodel an old house....I think my job is safe for as lo…" (`ytc_UgwyxO2R1…`)
- "In my opinion, this Echo character is entirely illogical. If it was omnipresent …" (`ytc_UgwCH0sUo…`)
- "Got into a debate with a coworker over AI and art and I mentioned companies usin…" (`ytc_UgybDnG4L…`)
- "Don't forget to sue the FUG out of the software maker. Because a report was done…" (`ytc_Ugx07Tsiy…`)
- "@T in this particular case, they need, in fact, if you hear the whole video and…" (`ytr_UgwKrQWE-…`)
- "I can build an AI enterprise grade automation system within 3 months, alone as o…" (`ytc_UgzHVB2lq…`)
- "No rll I am always nice even to Siri I don’t care if she not ai…" (`ytc_UgwY9rKrU…`)
- "a good number of top AI researchers are already of the opinion that LLMs are not…" (`ytc_UgwG4Gbfx…`)
Comment
Who’s going to want to watch a machine be a “better” person? How can you matter to something that doesn’t innately care? AI can live a million of your lives in a second but who will appreciate being with that? What can AI experience; nothing. Not on its greatest day can it have meaning. It has no spirit. It has no hope. It can’t eat or process food. It constantly requires a grid, data and updates. It could care less if the world is destroyed tomorrow. It’s the devil’s humanity.
youtube · Cross-Cultural · 2025-10-15T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyEGssUIBjDkvZ5m894AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4GLuCSL-P1h8flbV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6AoKr2dGqb744ryx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwNOQtV9p0Ph8SMFZt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxnFGtg3o6KRyz-Bu94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxmzPxIGqnla4eabEB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwxN8waU2EXkBfDFxF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxkPn-5XrKP5pigPQF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy0J_yEZlsSSnJS81l4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0sCUof0EIMn-OOdZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
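A raw response like the one above can be checked before it is merged into the coded dataset. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the examples visible on this page and may be incomplete relative to the actual codebook, so adjust them accordingly.

```python
import json

# Allowed codes per dimension, as observed in the samples on this page.
# These sets are an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed", "unclear"},
}

def validate_rows(raw: str) -> list[str]:
    """Parse a raw LLM response (a JSON array of coded rows)
    and return a list of human-readable problems found."""
    problems = []
    rows = json.loads(raw)
    for i, row in enumerate(rows):
        if "id" not in row:
            problems.append(f"row {i}: missing id")
            continue
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"{row['id']}: bad {dim}={value!r}")
    return problems
```

Rows that return no problems can be ingested as-is; anything flagged (an unknown code, a missing `id`) is a candidate for re-coding rather than silent coercion.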