Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID.
Random samples:

- "Imagine AI already controls the Youtube Comment section by pushing and publishin…" (ytc_UgwJAmpc3…)
- "I think it's pretty funny how now we complain about AI art when it takes away fr…" (ytc_Ugz1cqb5z…)
- "Programming is in the unique position that fits this bill in that: - everyone k…" (rdc_nm8zok8)
- "Did everyone forget we cloned a sheep 30 years ago. Siri hasn’t made an advancem…" (ytc_Ugw0BMQHm…)
- "AI isn't killing the job market. It's your lack of actual skills. I did a busine…" (ytc_UgyoP66iJ…)
- "You are asking your guests ethical questions but they are not experts. Contact m…" (ytc_Ugwl-tSVP…)
- "AI doesn’t need to kill us. we’re already doing a great job of it. It literally …" (ytc_Ugx8m8moo…)
- "“…autonomous cars on the roads, drones in the skies, run the power station…” sou…" (ytc_Ugz3jShHs…)
Comment
The fact that people are saying that “you posted it on the internet, it’s free for use” it’s incredibly wrong! Sam already clarified in the video but I can’t with people. Someone also just saved the picture of someone else’s art on twitter then posted on Facebook saying it’s AI art when the watermark IS STILL THERE!
The fact that a whole damn lot of people think that just because artists posts their art online, their own passion, their babe basically post it just for free use. There’s a reason why the words “consent” and “plagiarism” exist. It’s to keep people in check. It is true that I want to see AI to evolve but I think it’d just be best if the industry themselves hire artists for them to use references of. Or pay the artist for consenting that a certain art piece would be used for reference to the AI.
Also merry crisishiljermer I don’t know how to spell chrjisismenrsts.- ;-;
Source: youtube · Video: Viral AI Reaction · Posted: 2022-12-24T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxeBGxNNSz_FZHlLPV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzjgYuiQ1RwSVx3swZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyquGfv4A9phCHLpAV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyPEmtnYGIeICfmTd94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwdsOn7UsZ5UQeOVBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxsngZ3MdJwxbWRpWd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxWfQvQnN4doUc3GoB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwhPSHEdZymFdbxFGN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzKmdtyutJbIqWTd4l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz-yWdmTaSuoJji2Sx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
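The raw response is a JSON array of per-comment codes, and the page supports lookup by comment ID. A minimal sketch of how such a response might be parsed and indexed for that lookup (field names are taken from the response above; the allowed values per dimension are an assumption inferred only from the codes visible in this sample, not from the full codebook):

```python
import json

# Allowed values per dimension, inferred from this sample alone
# (assumption: the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "indifference", "approval", "fear",
                "mixed", "resignation"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the records by comment ID,
    dropping any record whose dimension values fall outside ALLOWED."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
codes = index_codes(raw)
print(codes["ytc_example"]["emotion"])  # outrage
```

Validating against an explicit value set means a malformed or hallucinated code is silently excluded rather than stored; a stricter pipeline might instead log or re-prompt on such records.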