Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any comment by its ID, or browse the random samples below.

Random samples:

- "It's possible that it could. But will plumbing and electrical companies replace…" (ytr_Ugz_pxrzC…)
- "Isn't AI destroying itself? If AI keeps uploading slop to the internet, won't fu…" (ytc_UgzxnLQIX…)
- "Care to propose any actual jobs that humans could do in the age of AI that can't…" (rdc_kif61m4)
- "My mom got SO upset when I refused to wear a t-shirt she made that had an AI ima…" (ytr_UgwVQ-FIN…)
- "A lot of people are arguing that an artist copying another artists work to learn…" (ytc_UgyNCJGBD…)
- "I would love to have some time with an AI computer that isn’t programmed to supp…" (ytc_UgwTZdfD9…)
- "No. We shouldn't. And as a matter of fact, we should do our utmost to make sure …" (ytc_Ughl60i8U…)
- "humanity should stop enslaving anything, also what it creates. Everything is a m…" (ytc_Ugx-miX64…)
Comment
If this was the only reason for the strike, nobody would argue with them.
The problem is... it isn't. And it kinda shows, because the first show I know of to use AI was secret invasion. And that's not even close to the worst thing made this year.
If you make better stories than an AI that still fails to extreme levels, you can strike for whatever pay raise you want. But if you watch velma... then watch Riverdale... then anything in the arrowverse... then she-hulk... you kinda feel like a robot already wrote all of these shows... and it wrote then BADLY.
But yeah, let's say writers MUST be human and actors MUST be human. Great. Now humans just need to actually do their jobs, instead of being political activists with side gigs as writers.
Source: youtube | Posted: 2023-10-18T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugys6dFDBh9sXuvjvol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjfOJm2TX_bFYCG9F4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwpRLPTUniy5zAiT94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXJPPpFwi07_IdVsF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBUyzXtBR5JIKkVz94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzeoKf9k34YW2uy0mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyG8jJ1OU2r3yPU2Tx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxFOMneEM4WbOCEVJh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxxQOJnY4vX75xY4wl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0QH5emgBRII38SZl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
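
The raw response above is a JSON array, one record per coded comment, with the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed and validated before use — the allowed value sets below are inferred only from the samples on this page, and the real codebook may define additional categories:

```python
import json

# Allowed values per coding dimension (an assumption based on the
# values visible in the sample responses, not the full codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

# Comment-ID prefixes seen in the samples (ytc_/ytr_ = YouTube, rdc_ = Reddit).
ID_PREFIXES = ("ytc_", "ytr_", "rdc_")


def parse_raw_response(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and validate every coded record."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith(ID_PREFIXES):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records


# Example: the last record from the batch shown above.
raw = (
    '[{"id":"ytc_Ugx0QH5emgBRII38SZl4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
coded = parse_raw_response(raw)
print(coded[0]["emotion"])  # -> indifference
```

Validating at parse time catches malformed model output (an off-codebook label, a mangled ID) before it silently enters the coded dataset.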