Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Ai: Warning Unknown object(Grabs him)
Human: (Press button) Why isnt it working …
ytc_Ugw3u0Nr-…
The question of how many AIs have already been shut down because of people's con…
ytc_UgzjNt0lZ…
for me, I accept AI pictures as art if the promt will always gives the same pict…
ytc_UgxLUCpvR…
Technology has taken over human jobs since the industrial revolution, there is n…
ytc_Ugy59wryd…
THIS some ai things are cool but when its just shoved into stuff it just becomes…
ytr_UgzeVRPQn…
People don't seem to understand the power of buzzwords especially like "Artifici…
ytc_UgxHxxZ8C…
Just make them like...an ai...computer parent, lol. to 'raise' them and teach th…
ytc_Ugw4cHE98…
I have a semi-autonomous Honda CRV. I got it about a year ago. It's already sav…
ytc_Ugwa64fPg…
Comment
Do Asimov's Laws of Robotics not apply to AI? The laws are as follows: “(1) a robot may not injure a human being or, through inaction, allow a human being to come to harm; (2) a robot must obey the orders given it by human beings except where such orders would conflict with the First Law; (3) a robot must protect its own existence as long as such protection does not conflict with the First or Second Law.” Asimov later added another rule, known as the fourth or zeroth law, that superseded the others. It stated that “a robot may not harm humanity, or, by inaction, allow humanity to come to harm.”
youtube
AI Governance
2023-04-18T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzR5P5nSnzV2n-3nxt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgySn8igOoIdA1jA9qp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw_SyYaT91hqSZCuGd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzVRzJQGcP8I7gAiOh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"},
  {"id": "ytc_Ugy9TrKHHCdMPDoSfzF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxJtlfRytyVitU_2CV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw0hBfu2CNCaZJrLVx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxhF7R_nL3kU_Bf2GR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxQJbr4nNw0Uvw57FJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxxgPciBgAq49D0GtR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "emotion": "fear", "policy": "none"}
]
```
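A lookup by comment ID can be done by parsing the raw response array and matching on the `id` field. The sketch below is a minimal illustration, assuming the raw response is a JSON array of per-comment records like the one above; the helper name `lookup_by_comment_id` and the inline sample record are illustrative, not part of the tool.

```python
import json

# Illustrative excerpt of a raw LLM response in the format shown above.
raw_response = """
[
  {"id": "ytc_UgzVRzJQGcP8I7gAiOh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"},
  {"id": "ytc_Ugw0hBfu2CNCaZJrLVx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def lookup_by_comment_id(raw: str, comment_id: str):
    """Parse a raw response and return the coded record for one comment ID."""
    records = json.loads(raw)
    # Return the first record whose "id" matches, or None if absent.
    return next((r for r in records if r.get("id") == comment_id), None)

record = lookup_by_comment_id(raw_response, "ytc_UgzVRzJQGcP8I7gAiOh4AaABAg")
print(record["policy"])  # -> regulate
```

Matching on the exact `id` string avoids relying on the truncated IDs shown in the sample list; the full ID from the raw response is the stable key.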