Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "So basically, if an AI of some sorts, suddenly became self-conscious , and had f…" (ytr_Ugwu5KhSa…)
- "If it can kill job can leak organisation data which humans are incapable off..b…" (ytc_Ugz_0_Zib…)
- "Why think they will gain sentience? He says people cannot define it but why is t…" (ytc_Ugxi21zQe…)
- "The elites don’t need people to buy their stuff. They need us to do jobs and the…" (ytc_Ugysi2GSn…)
- "Sorry everyone, don't blame AI on this one. The value was ruined a long time ago…" (ytc_UgzrJIb8I…)
- "do all these digital artist expect us to draw for 16 hours when i can make somet…" (ytc_UgwiGKwKI…)
- "A drop of water in the ocean as far as AI contributing to climate change. Pointl…" (ytc_UgwQhmk_3…)
- "fellow artists, make what you can before Ai takes over everything 🫡 bust of luck…" (ytc_UgzIW-MZ2…)
Comment
> i feel like at this point we're genuinely going to have to have a Cyberpunk 2077 type situation where AIs get out of control and end up requiring an entire reset of our internet and societal structure to be able to actually seal them away after some horrible apocalyptic type AI takeover. They'd end up acting on their own, reproducing and doing anything in their power to further their own goals, like the very frightening rogue AIs of said franchise.
Source: youtube · Topic: AI Harm Incident · Posted: 2025-09-11T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzaJH83UyX6B1d9qQR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgylVvmPnRYJHEj0ayV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwcsq1MaUtnZBXXwrJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugzc2L0QRjU-lOScfl54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy3XzoWAA9qK51BNNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwXTuq5NWPgyNqRfMh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz3DeFrsVTsQnFbidZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzJVCZenruiOHye_s94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxYDHtMJrAYGmuJUPV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwtsN3mwuvmsuNSzfd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
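The raw response above is a single JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of the "Look up by comment ID" step, assuming the model output parses as clean JSON (a production pipeline would also validate the dimension values against the codebook):

```python
import json

# A two-row excerpt of the raw LLM response shown above.
raw_response = '''
[
  {"id": "ytc_UgzaJH83UyX6B1d9qQR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwXTuq5NWPgyNqRfMh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
'''

# Index the codings by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwXTuq5NWPgyNqRfMh4AaABAg"]
print(coding["policy"], coding["emotion"])  # → ban fear
```

The dict comprehension mirrors the tool's lookup-by-ID behavior: parse once, then fetch any comment's four coded dimensions in constant time.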