Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I’m sorry, wdym “predictive policing”? I thought that was some shit from dystopi…
ytc_Ugyj5xopo…
@zelle8651 if you want to give up, you can. If you want to fight, it will delay…
ytr_UgzWujMYv…
A dude in my class submitted AI art as his concept art assignment three weeks ag…
ytc_UgzWvXIcE…
I wonder if AI would be able to take over a tennis coaching or jobs that involve…
ytc_UgwNfSFyB…
And this is why you hardwire stuff. Can't shut the AI down by software? Pull the…
ytc_UgwSY71IH…
that's why these AI lawsuits have their merits as we figure out how to deal with…
ytc_Ugz3dUBfK…
@moonstoneblast6065 technically the problem is more in the sharing (which acts …
ytr_UgzDX3J7V…
not surprised really. I've been all over Asia and there's blatant (but not hate…
rdc_clv2uod
Comment
I really like Steven but this episode really highlights just how out of touch with reality he is, AI is great, it will make plenty of people very wealthy but my concern is what about the rest of the world? Like the people who’s jobs will be taken over and the Jeff bezo’s of the world won’t have to worry about paying humans. Ok so if you’re not super rich what happens? I feel like desperation will set in crime very violent crime will skyrocket, the world will be super wealthy and brutal poverty. No in between. I see the world becoming borderless in the next 10-15 years so then what? This guest brings great points unfortunately it’s not stopping it’s only getting bigger. For the first time I think we will extinct ourselves in maybe 30-40 years.
youtube
AI Governance
2026-01-14T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlTf6Bh_TnZL5Enmp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUNhLNHBmnSZedMoJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzPslccHQRft-fGy6h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWHjjq1r8KldtG_VR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugym25fS9C-A7ANMiJJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxzA4Oji-LyKoaF1lR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy75zseYlt_4EeFHOV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx0Zl9dKrgKOHlG6xB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJ5_wtZ2tbkkZO3Ul4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
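A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming a codebook whose allowed values are inferred only from the labels visible on this page (the real coding scheme may include values not shown here); `validate_batch` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Hypothetical codebook, inferred from the values visible in the table and
# JSON above; the real allowed sets may differ.
CODEBOOK = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "disapproval", "resignation"},
}

def validate_batch(raw):
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError("record missing id: %r" % rec)
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: invalid %s=%r" % (cid, dim, rec.get(dim)))
        coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# Single-record example using one ID from the response above.
raw = ('[{"id":"ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded["ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg"]["policy"])  # → regulate
```

Indexing by ID also gives the per-comment lookup shown in the "Coding Result" panel: fetch `coded[comment_id]` and render its four dimensions.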