Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
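For example, a lookup helper might scan the stored response files for the entry whose `id` matches. A minimal sketch, assuming the raw responses are saved as JSON files, each holding an array like the one shown below; the directory layout and helper name are hypothetical:

```python
import json
from pathlib import Path

def find_raw_response(comment_id: str, responses_dir: Path) -> dict | None:
    """Scan stored raw-response files for the entry coding a given comment.

    Assumes each file holds a JSON array of objects with an "id" field,
    as in the raw response shown further down this page.
    """
    for path in sorted(responses_dir.glob("*.json")):
        try:
            batch = json.loads(path.read_text(encoding="utf-8"))
        except json.JSONDecodeError:
            continue  # skip files where the model output was not valid JSON
        for entry in batch:
            if entry.get("id") == comment_id:
                return entry
    return None
```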
Random samples — click to inspect
- `ytc_UgimVR2aj…`: "A robot can't be conscious, it can only be programmed to act as if it is. Giving…"
- `ytc_UgxvFylCv…`: "Funny how initially Sam Altman was pushing for regulation right after open AI pu…"
- `ytr_UgxPuSqvN…`: "And that’s exactly why AI imagery shouldn’t be called art. It has no conscious i…"
- `ytc_UgzdDbK9a…`: "Lol😂😂 AI WILL REPLACE 10X MORE JOBS THAN IT WILL CREATE!!! GET READY FOR THE FUE…"
- `ytc_Ugy9fhI8_…`: "To ask these question to an Ai , is just SAD.. shows i dont hv support frm.real …"
- `ytc_UgzR19agA…`: "talking about being disabled, i wonce came up to one of my friends who tried to …"
- `ytc_Ugxex1d_u…`: "This feels a lot like trickle down economics, reframed around AI. There is heavy…"
- `ytc_Ugx3EWBGR…`: "Honestly, AI can be a great tool to make your life easier and make some tasks, e…"
Comment

> That's no what the video is saying, it's not "AI wants to..." it's following it's orders. It's a tool like you said doing what it was made to do, BUT it can do it on it's own, even taking decisions to fulfill it's function. It's preventable, just like security protocols exist for construction places, when those tools are made they put human safety above everything no matter what. What the video says though, and it's being mirrowed in the comments, is that those who are making the tools don't care about human safety.
youtube · AI Harm Incident · 2025-07-28T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
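Downstream code could carry these dimensions around as a small record type. A minimal sketch, using the field names from the table above; the value vocabularies are inferred from the raw responses shown on this page and may not be exhaustive:

```python
from dataclasses import dataclass
from datetime import datetime
import warnings

# Label sets observed in the raw responses on this page;
# the actual codebook may define additional values.
RESPONSIBILITY = {"developer", "company", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"regulate", "industry_self", "none", "unclear"}
EMOTION = {"approval", "fear", "indifference", "resignation", "mixed"}

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def __post_init__(self) -> None:
        # Warn rather than reject: a label outside the observed sets may
        # still be a legitimate codebook value.
        for field_name, value, allowed in [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]:
            if value not in allowed:
                warnings.warn(f"{self.comment_id}: unrecognized {field_name} label {value!r}")
```

Warning instead of raising keeps one unexpected label from dropping an otherwise valid row.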
Raw LLM Response
```json
[
  {"id":"ytr_UgxASsktvLEwAjh2H754AaABAg.AL8i-Z65gSiAMIUWQs8ZSW","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxiHjSONWgF9MytblR4AaABAg.AL7xTGhZJnqALL16xkbUWK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxLg69yQ3o68BlWYA14AaABAg.AL7dIEe8EwXALBBJEhYEX4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugws29neEky5h8y5r4F4AaABAg.AL7KdI6HwzvAL85ez1YTKK","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugws29neEky5h8y5r4F4AaABAg.AL7KdI6HwzvAL9sXWfXEiU","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzbzBmLQYKOxNNbITN4AaABAg.AL6KoiLPeQnAL7PFZbZFpX","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgzbzBmLQYKOxNNbITN4AaABAg.AL6KoiLPeQnAMMxCDpBAt8","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_UgzRZaN5Z1zjWMAqD_h4AaABAg.AL69YcijqY8AL6A6MHlpQw","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyFE3-0NjInX_I13Th4AaABAg.AL63OYcBbizAL8xhfiSABI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyFE3-0NjInX_I13Th4AaABAg.AL63OYcBbizALEfyfCVp78","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
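Because the model emits a bare JSON array, downstream parsing has to tolerate the occasional malformed batch. A hedged sketch of that step; the skip-bad-rows strategy is an assumption, not necessarily what this pipeline does:

```python
import json

def parse_llm_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response into coded entries, keeping only well-formed rows."""
    required = {"id", "responsibility", "reasoning", "policy", "emotion"}
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return []  # the whole response was not valid JSON
    if not isinstance(data, list):
        return []  # expected a JSON array of coded comments
    return [row for row in data if isinstance(row, dict) and required <= row.keys()]
```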