Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect:
- "The only issues, do you think China is going to stifle their AI developers? Thi…" (ytc_UgxxKPGBK…)
- "Why would I trust a US. government led by Trump to do anything better with AI th…" (ytc_Ugz1f6Cr0…)
- "@Heaver I never said they were,if they wanna do ai art and never learn to draw…" (ytr_UgyAWgk4M…)
- "I paid for your AI class and it didn't get the link to watch your video and when…" (ytc_Ugx--ned2…)
- "Looks like AI art really inspired them, and now this video will ensure that that…" (ytc_UgxPmF42P…)
- "AI female robot, fucks on 1st date and never gets a headache with volume control…" (ytc_Ugx_2xw34…)
- "2:00 well I'm not saying we should stop or that we should implement regulations …" (ytc_UgyS7jCK1…)
- "I've heard these two arguing a long time ago. The butler-looking robot, Hahn, se…" (ytc_UgyfkVZbl…)
Comment
The robotic laws must be embedded deep into programming the issac isomov robotic laws the Three Laws, presented to be from the fictional "Handbook of Robotics, 56th Edition, 2058 A.D.", are:[1]
First Law
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube · AI Governance · 2023-07-07T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
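The four dimensions in this table come from a fixed coding scheme. As a point of reference, the sketch below collects the category labels that actually appear in the coded records shown on this page; it is an illustrative summary only, the project's full codebook may define additional labels, and the `check_record` helper is a hypothetical name rather than part of the tool.

```python
# Category labels observed in the coded records shown on this page.
# Illustrative only: the full codebook may define additional labels.
OBSERVED_CATEGORIES = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning":      {"deontological", "consequentialist", "mixed", "unclear"},
    "policy":         {"none", "regulate", "ban"},
    "emotion":        {"indifference", "fear", "approval", "resignation",
                       "outrage", "mixed"},
}

def check_record(record: dict) -> list:
    """Return the dimensions whose value is not among the observed labels."""
    return [dim for dim, allowed in OBSERVED_CATEGORIES.items()
            if record.get(dim) not in allowed]
```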
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5sDo63gW6Yv5Cy8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxMtjEkXddaRf_AY_t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugxp2ToDwzWnJgn3rlh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx_X2oLuUWgS574vQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzal945MQpjRpHO1xV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYHMBnGWS5d34WoKN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw32eWr6CLmIVufUD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzZdat7GrtsGDVQenB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1Dw0O3viJ8TDbWMV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw87U76ibO8mRZbTXR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
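To mirror the "Look up by comment ID" workflow programmatically, a minimal sketch might parse the raw response and pull out one comment's codes. The `raw_response` string below is truncated to two of the records above, and the `lookup` helper is a hypothetical name used for illustration, not part of the tool.

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw_response = """
[
  {"id": "ytc_Ugw5sDo63gW6Yv5Cy8N4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxMtjEkXddaRf_AY_t4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"}
]
"""

def lookup(comment_id, payload):
    """Return the coded record matching comment_id, or None if it is absent."""
    return next((r for r in json.loads(payload) if r["id"] == comment_id), None)

record = lookup("ytc_UgxMtjEkXddaRf_AY_t4AaABAg", raw_response)
if record:
    # Reproduce the Dimension / Value rows of the Coding Result table.
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dim.capitalize():15s} {record[dim]}")
```

For the second record, this prints the same developer / deontological / regulate / indifference values shown in the Coding Result table above.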