Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
What's more ableist, saying that Generative AI is bad or claiming that disabled …
ytc_Ugx0vL3h8…
"I panicked" you are a bot! Even the best ai can not stop larping as a human no …
ytc_UgwquvARt…
11:00 I think the case of "modern" art actually has some well reasoned validity…
ytc_Ugyf_iBdi…
Theoretically, AI can be infinitely better than us humans can ever hope to be. W…
ytc_UgxCqsdsd…
Tools like OpenAI will discourage the development of critical thinking because p…
ytr_UgxGvvYuc…
There will be a day when people get violent and kill ai machines all AI machines…
ytc_UgwJMCivm…
Tbh, this probably could make human made art more valuable, since they are not m…
ytc_Ugzw4g_Ge…
When Ai becomes incorporated with Robotics. Even plumbers and mechanics won’t b…
ytr_Ugygx1-vV…
Comment
You do realise that the main money maker for AI is not image generation? You are trying to fight a titan worth billions with a toothpick; poisoning your images might slow them down a bit, but I think their scraping methods will just improve.
I have an art degree and do software engineering as my full-time job ... there is no fight ... you either embrace it and take your work to the next level, or die not using it through lack of efficiency. If you want to keep doing illustration, make it on a real canvas; that is something where, even if an AI could do it, the value would be null. The main value for AI right now is code, and all the training data is free and publicly available, so it's not going away anytime soon, but you are right that some sort of compensation for being training data should be mandatory.
youtube
Viral AI Reaction
2025-08-24T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwVsq08xy6SGh1j1Pt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxzuZ2MsOR0LK870o14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyI1bQ9YGzuZFG8Zyx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCTaIMim7HiHrnQr14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCumOyZ-VMX15qr8F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTRgF8XQNYShjx7iB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxeE5mph-4Px79pX6p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu0TWKF3BiTS3-TtJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxsD7YPcMMLxAQShxd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEnhUFcUkkfKF1qA14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
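A minimal sketch of how a raw response like the array above might be parsed and validated before storage. The allowed values per dimension are inferred from the sample output shown here; the real codebook, field names, and ID prefixes (`ytc_` for comments, `ytr_` for replies) are assumptions based on this page, not a documented schema.

```python
import json

# Allowed values per coding dimension, inferred from the sample JSON above;
# the actual codebook may include additional labels.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    mapping from comment ID to its codes, dropping malformed rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep only rows with a recognised ID prefix and valid code values.
        if cid.startswith(("ytc_", "ytr_")) and all(
            codes[dim] in ALLOWED[dim] for dim in ALLOWED
        ):
            coded[cid] = codes
    return coded

raw = (
    '[{"id":"ytc_x","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"industry_self",'
    '"emotion":"resignation"}]'
)
print(parse_codes(raw))
```

Validating against a fixed label set like this catches the common failure mode where the model invents a code outside the codebook; such rows are dropped rather than stored.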