Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Artist don't give a fuck if you have ethical models that were trained on consent…" (ytr_UgwiaMo60…)
- "Noname-mi1oo it might not be my problem but, to me, it feels soulless. Where's t…" (ytr_Ugwfi8aRC…)
- "Everyone should read Capital, Marx addressed all this over a century ago. This i…" (ytc_UgwEE-KRH…)
- "Not a single job has been replaced by AI yet, but it may replace some jobs in th…" (ytc_UgylqmhYv…)
- "AI and Automation are two different fields. Yes, there can be some overlap, but …" (ytc_Ugy53aztP…)
- "Did a mini "interview" on the topic of AI generated art a while ago and even CHA…" (ytc_UgyFGoToy…)
- "How fast they are expanding??? They plan to go from 1500 cars to 3000 cars in 2…" (ytc_UgzpuhQNm…)
- "You're thinking like a realist, and we respect that. Yes, AI is surpassing human…" (ytr_UgzE4qz0M…)
Comment
And what if your neurolink is contaminated by a digital virus...or hacked...our last freedom is the one of our thoughts..this can lead the owner of neurolink or some skilful hackers to be able to "read" people mind/thoughts (output) quite easily. Hence, if people are dishonest they might develop these kind of hacks. Very dangerous. Or if installed in the Nucleus Accumbens, you could render a whole population addict to the content broadcasted (input). Don't become the next Openheimer! Let the AI be the most intelligent and don't put humans at risk of losing their freedom. That's kind of the worst case scenario but you have to reflect on this...to make your neurolink safe enough for humans. Beware of pre-clinical studies on monkeys because you would quickly get "planet of the Apes". Proceed with extreme caution!
youtube · AI Governance · 2025-09-16T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgykudEW0U8eHLPL4w94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgweBfPh4a4wU6JVkBF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwFb26J8JlGn2mBn4J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGrd-PxBulEPeThzN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwVSW-dIXQv1V9M_jh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy0H4Nq-dk9ngkFPlJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyR9yJaxaMTNufBixJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz7l4nG0SzFX_GKjp54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxWcjXLPMNTuQfOHgt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugx2jWVdGQilILuYRB14AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
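Because the raw LLM response is a JSON array of records keyed by comment ID, the "look up by comment ID" step reduces to parsing the array and indexing it by the `id` field. The sketch below is a minimal illustration, not the page's actual implementation: the `index_by_comment_id` helper name is an assumption, and the payload is a shortened two-entry sample of the response shown above.

```python
import json

# Shortened sample of the raw LLM response above (two of the ten records).
raw_response = '''
[
  {"id": "ytc_UgweBfPh4a4wU6JVkBF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgykudEW0U8eHLPL4w94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
'''

def index_by_comment_id(payload: str) -> dict:
    """Parse the model output and index each coding record by its comment ID."""
    records = json.loads(payload)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
# Looking up the selected comment's ID recovers the dimensions shown
# in the Coding Result table (developer / consequentialist / regulate / fear).
print(codings["ytc_UgweBfPh4a4wU6JVkBF4AaABAg"]["policy"])   # regulate
print(codings["ytc_UgweBfPh4a4wU6JVkBF4AaABAg"]["emotion"])  # fear
```

In practice the same parse can also validate the response: any record whose `id` is missing or whose dimension values fall outside the codebook can be flagged before the codings are stored.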