Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The enemy of humanity will be the established elite the industries initially imp…" (ytc_Ugw4OY83m…)
- "Fun fact, about the captcha where a user has to click on all images with a bus o…" (ytc_UgxnXYhAI…)
- "If Americans live in a surveillance state, how is it that there are so many porc…" (ytc_Ugx9qgg6o…)
- "AI generated imagery should be used in art as references and inspiration, tools …" (ytc_Ugxola2Sl…)
- "Guys is real dont say is fake because im he box he robot use on worker and now m…" (ytc_UgwpobaNd…)
- "What’s really great about the paragraph instead of the ai art joke is it looks l…" (ytc_UgzrAT1Qh…)
- "No,this Button is always pressed, a car cannot make any damage not create any ri…" (ytr_Ugz0Sfbxl…)
- "Great clip. I have not yet gotten past insulting the chatbots that have been inf…" (ytc_UgyYZJsOb…)
Comment
They can’t but it kinda gets tricky if an autonomous drone actually makes a mistake and I.e. targets American ship or something like that
Now the Chinese couldn’t say „oopsie, coding error, sorry”, they would have to lie that this was a rogue pilot but that’s kinda tricky if pilot doesn’t exist and there’s no one to prosecute
So having or even testing these weapons would be unnecessary liability to the owners - those in power don’t want any stupid robot to create a major international incident by mistake so I think this agreement will actually achieve its goals
Keep in mind that world leaders are almost exclusively narcissistic control freaks (why else would you want to become a president?) so it kinda makes sense to not offload thinking to machines. If international incident is to happen they want to make sure it was because _they_ ordered it, not an accident
reddit · AI Governance · 2023-11-12 UTC (Unix 1699783757) · ♥ 23
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_k8woe3m","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"rdc_k8wtmg7","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"rdc_k8y4f22","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"rdc_k8wopbc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"rdc_k8wmgld","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
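The raw response above maps each comment ID to one coding row per dimension. A minimal sketch of parsing that output and looking up a row by ID, assuming the JSON shape shown (the helper name `index_by_id` and the constant `RAW_RESPONSE` are illustrative, not part of any real pipeline API; the two rows are copied from the sample output):

```python
import json

# Two rows copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """
[
 {"id":"rdc_k8woe3m","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
 {"id":"rdc_k8wtmg7","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the model output and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)
row = codings["rdc_k8wtmg7"]
print(row["policy"], row["emotion"])  # → liability fear
```

Indexing by ID also makes it easy to detect duplicate or missing codings before writing results back to the comment store.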