Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I remember seeing videos of guys kicking robots to test their balance algorithms…" (ytc_UgyYeAYtc…)
- "When Sam Altman says we should be careful of the answers given by ChatGPT, why a…" (ytc_Ugycq_yXH…)
- "😂😂 what soft people. If it's AI why even care to begin with. Say no and move …" (ytc_UgyKfyHxV…)
- "They dont care about safe AI, all they care about is pushing out the next agent …" (ytc_UgyNcrUpW…)
- "As much as I'd like to dump on Elon/Tesla and watch them sink - there's no excus…" (ytc_UgzQ4tgfI…)
- "And somehow, the CEOs will make it out with multi-million dollar severance packa…" (rdc_ndymlbj)
- "ehhh not really, chatgpt will combine the average of information, so it won't re…" (ytc_UgylwdS8G…)
- "We worry that AI would eliminate us if it gained consciousness because deep down…" (ytc_UgxndD5OH…)
Comment

> For christians, the upcoming dystopian world just sounds like people living during times of revelations where the whole world only has 1 religion and thats the beast. Without the beast system, no one can buy or sell, economy will be run by them.
> Even if you hide in the jungle with food AI will hunt you down till you pledge allegiance

youtube · AI Governance · 2025-10-31T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
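The table above can be rendered directly from a stored coding record. A minimal sketch, assuming a flat dict-shaped record whose field names are inferred from this page (the record shape and `render_coding_table` helper are illustrative, not the tool's actual API):

```python
# Hypothetical record shape, inferred from the "Coding Result" table above.
record = {
    "responsibility": "ai_itself",
    "reasoning": "deontological",
    "policy": "unclear",
    "emotion": "fear",
    "coded_at": "2026-04-26T23:09:12.988011",
}

def render_coding_table(rec: dict) -> str:
    """Render one coding record as the markdown table shown on this page."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", rec["coded_at"]),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

print(render_coding_table(record))
```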
Raw LLM Response
[
{"id":"ytc_Ugxh2clLUvKhrO4ucZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyBsJ3bDnq9I3ZvNPN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkF0pHOW-UolctoZl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyQc_7U9dGcQT22W154AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzgjvl6_YmchZVaQjp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-bEmQseqqPiJ82UR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzyE22fQmDCE3-SxjV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyLaMBjlFBwrP3U0hF4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzmvbnkJGjAIC-p-_94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz0JhZ5KSLBvPrveZh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
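A batch response like the one above can be parsed and indexed by comment ID for lookup. A minimal sketch: the allowed value sets below are assumptions inferred from the values visible in this sample, not a definitive codebook, and `index_codings` is an illustrative helper rather than the tool's own function.

```python
import json

# Assumed allowed values per dimension, inferred from this sample only.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and return {comment_id: coding dict}."""
    by_id = {}
    for rec in json.loads(raw):
        # Reject records whose values fall outside the assumed codebook.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

raw = (
    '[{"id":"ytc_Ugxh2clLUvKhrO4ucZV4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
)
codings = index_codings(raw)
print(codings["ytc_Ugxh2clLUvKhrO4ucZV4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each coded comment resolves to its four dimensions in one dictionary access.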