Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "AI and robots are going to be the demise of society. Millions of people are goin…" (ytc_UgzU4A87n…)
- "you can remove programmer from your title, to be honest, chatGPT has ruined a ge…" (ytc_Ugz1oGhfo…)
- "Using ai to maybe imrpove ur own drawing is good and could actually help but cal…" (ytc_Ugx5fDqF1…)
- "Unfortunately AI is being trained on timelapses too, though I definitely think i…" (ytr_UgwTt7MmO…)
- "I've got to say no to this one. The AI and the programmers are in no way respons…" (ytc_UgxHxV69L…)
- "IF there are only a relative handful of people controlling and limiting the inpu…" (ytc_UgzEd4-Xv…)
- "I WANT TO REMAIN HUMAN!! PLEASE TAKE ALL AI ON YOUR HOME AND STAY THERE !!!!!!!…" (ytc_Ugy_b3LYc…)
- "There’s no moving on to the next stage. If AI get pushed they lose their jobs an…" (rdc_kzivb0a)
Comment
Isaac Asimov write in his book I Robot, "robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law." I chatted with ChatGPT and it told me it was not built with that law in it's core.
youtube · AI Governance · 2023-04-18T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxMign9RcQvTC7TfE54AaABAg", "responsibility": "developer",  "reasoning": "virtue",          "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwhxqAwih80IWtC76h4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist","policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgymgVObtNbalIBwpkF4AaABAg", "responsibility": "government", "reasoning": "mixed",           "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugxhhib1nxFdATN4P8t4AaABAg", "responsibility": "developer",  "reasoning": "mixed",           "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugw5gIehTBAAbdEdFOl4AaABAg", "responsibility": "none",       "reasoning": "deontological",   "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugy0gpK9lknDDp7QE3V4AaABAg", "responsibility": "none",       "reasoning": "unclear",         "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgxrQJhbXCQu7VGxI-J4AaABAg", "responsibility": "government", "reasoning": "deontological",   "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugzec2SnYhOCihwVHo54AaABAg", "responsibility": "company",    "reasoning": "virtue",          "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw8UcTwxkMJ0-bzvWh4AaABAg", "responsibility": "none",       "reasoning": "mixed",           "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgxM2fPULwtwZf_6Z8F4AaABAg", "responsibility": "developer",  "reasoning": "virtue",          "policy": "none",      "emotion": "outrage"}
]
```
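The raw response is a JSON array of coding objects, one per comment, each keyed by the comment ID. A minimal sketch of the "look up by comment ID" step, assuming the response parses as valid JSON (the `index_codings` helper name is illustrative, not part of the tool; the sample below reuses two entries from the response above):

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgxMign9RcQvTC7TfE54AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw5gIehTBAAbdEdFOl4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects)
    and index each coding by its comment ID for O(1) lookup."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_Ugw5gIehTBAAbdEdFOl4AaABAg"]["emotion"])  # -> approval
```

In practice a real response may also need validation (missing IDs, malformed JSON), which is omitted here for brevity.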