Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect
7:38 "if you export the hard parts ..." seems to imply you're saying that the ro…
ytc_Ugyzu7-YJ…
People need to stop using any AI period. We do not need any of it.…
ytc_UgwJrodgs…
⚠️⚠️⚠️ARTISTS!⚠️⚠️⚠️
Your Dreams of being Artist and not being replaced by AI c…
ytc_UgwcUHR3g…
AI making video about AI :D
In reality the smartest AI invented so far is a lit…
ytc_UgxJs9sQ6…
People will start to ask why we need next generation if AI can replace Human. If…
ytc_UgxKdsj2A…
There is already problems.... If the ai suddenly dicide it wont count too 100000…
ytc_Ugz7Uu1cd…
Of course it is a liar, incredible insight... Have you ever heard about what kin…
ytc_UgwEHxQA4…
Its nice to stop bad guys, but this is an example of how elon musk said A.I "can…
ytc_UgyYcw5wi…
Comment
Another major problem with the AI development is that the people in charge aren't teaching it emotional intelligence, just pure cold 'logic' based intelligence. We as humans need to stop ignoring the fact that we have multiple intelligences, that make good human beings. Our "leaders' have zero emotional intelligence so they've created something with also zero emotional intelligence and they are surprised it would want to wipe us out when we have those same 'leaders' on literal TV saying they think that homeless people should be killed. You can't make this shit up.
youtube
AI Governance
2025-09-18T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyH1dQLBc12uZCFrEF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugylbs6gKEw_IDUKIn94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwYkkUdLuwkf4cbJxN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxfJAunlm-D3a48TEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyvli20WdsA6IrZZrJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-1k3vZaoOx1Jcgw94AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwzhGo2q4GnREMTIY94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw7spbs6HSg-8WUVUR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw7E_1nXvW4l8xCQbF4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwAxd9-ejVbzW-zywF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
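A minimal sketch of how a raw response like the one above could be parsed and validated before the codes are displayed. The four dimensions and the value sets below are inferred from this page's table and JSON; the allowed-value lists and the `parse_codes` helper are assumptions for illustration, not the tool's actual schema or API.

```python
import json
from collections import Counter

# Assumed dimension vocabularies, inferred from the sample output above.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological",
                  "contractualist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "indifference", "resignation",
                "approval", "fear"},
}

def parse_codes(text):
    """Parse the model's JSON array, keeping only rows whose values
    all fall inside the expected vocabularies."""
    rows = json.loads(text)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Two rows copied from the raw response above.
raw = """[
  {"id":"ytc_UgyH1dQLBc12uZCFrEF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwAxd9-ejVbzW-zywF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

codes = parse_codes(raw)
counts = Counter(row["responsibility"] for row in codes)
print(counts)  # e.g. Counter({'developer': 1, 'ai_itself': 1})
```

Indexing the valid rows by `id` would then support the "look up by comment ID" view shown at the top of this page.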