Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I want to down vote this because I really don't like it.... But then I remembere…" (`rdc_detye1k`)
- "This has aged poorly. Here's the actual interaction between curators of AI art a…" (`ytc_UgwzCFPv5…`)
- "If AI takes over our jobs who then pays taxes to govts to pay their people and s…" (`ytc_Ugz5aUM2R…`)
- "Ai needs artist to work, but artists don't need Ai, that's the difference, ai is…" (`ytc_UgwGlt1Wp…`)
- "The thing I've always wondered is that can governments afford to NOT step in and…" (`rdc_jd7wisf`)
- ""Hey, can you drop off this load from Seattle WASHINGTON to Daytona Beach FLORID…" (`ytc_UgxmvR8Z7…`)
- "@For Paws … there's nothing "communist" about china. its capitalist chine. and t…" (`ytr_UgzHTdn6t…`)
- [translated from French] "Hello, ''AI'' — where does its intelligence come from? Moreover, when I hear sovereign…" (`ytc_UgxaPRmYw…`)
Comment
Sounds a bit like a more nuanced and updated version of what Asimov proposed over 80 years ago :) Props to both gentlemen.
"A robot may not injure a human being or, through inaction, allow a human being to come to harm."
"A robot must obey the orders given it by human beings except where such orders would conflict with the First Law."
youtube · AI Governance · 2025-08-14T13:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzRtarstrWwRQMu22B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwan1_9DvbcG1rRH_F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyROXPeek-vNeZ3gTR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZqjoPmfec_UbUZyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx0MY7jTLC3DrmuIdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhET_KY1Z7alju2wd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7m3TG2agDZKUskWN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySqql1ODiK3_Nl19N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyBVV-hqlslcb-LIk14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0Erj-kHymCzqunUN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
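The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions shown in the coding table. A minimal sketch of how such output could be parsed and validated before it feeds the result table — note the label sets below are only the values observed on this page, assumed for illustration, not necessarily the full codebook:

```python
import json

# Dimension labels observed in the sample response above; the real codebook
# likely defines more values (hypothetical subset, for illustration only).
DIMENSIONS = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    lookup table keyed by comment ID, dropping malformed entries."""
    coded = {}
    for item in json.loads(raw):
        cid = item.get("id")
        if not cid:
            continue  # skip entries without a comment ID
        codes = {dim: item.get(dim) for dim in DIMENSIONS}
        # keep only entries whose values all fall within the known label sets
        if all(codes[dim] in DIMENSIONS[dim] for dim in DIMENSIONS):
            coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_UgzZqjoPmfec_UbUZyB4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"approval"}]')
print(parse_response(raw)["ytc_UgzZqjoPmfec_UbUZyB4AaABAg"]["policy"])  # liability
```

Keying by comment ID mirrors the "Look up by comment ID" view: once parsed, the coding result for any comment (like the `liability`/`approval` row in the table above) is a single dictionary lookup.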