Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzhvW4B3…: "Can we cut the whole AI funding shit. I mean cmon we’ve seen the terminator bro …"
- ytc_Ugzo3L3CC…: "If we are foolish enough to give an artificial intelligence consciousness then w…"
- ytc_UgyPny6RZ…: "Well actually ai programmers actually use ai to run programs to make better ai n…"
- ytc_UgzR1tsgH…: "Easy solution to keep human jobs . Tax companies that don't employ human worker…"
- ytc_UgzD-DtPy…: "AI \"art\" is about as creative as messing around with the settings of a video gam…"
- ytc_Ugz4irRl2…: "this is why llm training data centers should be moved to the middle east. let th…"
- ytc_UgxFr6vtZ…: "Well the thing with media not covering people are rising up - it doesn't mean it…"
- ytc_Ugy_QYdCX…: "THE MOST HIGH WOULD NOT CREATE A WORLD OF CHAOS AND EVIL if you ponder on the hi…"
Comment
Despite his brilliance and achievement, I find it hard to respect Mr.Hinton. If he spent his entire career designing and building neural networks, why is it that he is only talking about their danger now when he is ready to retire? He really is a modern day Oppenheimer, but even Oppenheimer gave up on some future advancement when he decided to talk against nuclear proliferation.
Source: youtube · AI Governance · 2023-06-09T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxep-sB529TcYARBSd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyHHxq2z5eThcnlNy14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy_EfaIWPplBMKgQ794AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4en2HyR52WnWYX5t4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxGu3BGJo6ygz6qJrJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwrk02uEi2HMDpMDcJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyE4Rxvvoo3uLtn2DJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2SyH8I4aYZ1HEIYh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrPHPGIOicQVVUHux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwy1PXGecsEocjfoHF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
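The lookup-by-comment-ID step can be sketched in Python. The dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown above; the allowed value sets are only inferred from this one sample response and are almost certainly incomplete, so treat `CODEBOOK` as a hypothetical stand-in for the real codebook:

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (the real codebook may define more categories).
CODEBOOK = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def parse_response(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: coding}, dropping any
    record whose values fall outside the (assumed) codebook."""
    coded = {}
    for rec in json.loads(raw):
        values = {dim: rec.get(dim) for dim in CODEBOOK}
        if all(values[dim] in allowed for dim, allowed in CODEBOOK.items()):
            coded[rec["id"]] = values
    return coded

# One record copied from the sample response above.
raw = '''[
 {"id": "ytc_Ugz2SyH8I4aYZ1HEIYh4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''
coded = parse_response(raw)
print(coded["ytc_Ugz2SyH8I4aYZ1HEIYh4AaABAg"]["responsibility"])  # developer
```

Indexing by the `id` field is what makes the "look up by comment ID" view above cheap: each inspected comment maps straight to its coding without rescanning the batch.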