Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples
- I love the idea of having robot helpers but my god so many bad things has come f… (ytc_UgzQMbSP0…)
- The only thing ai is good at is skipping school assignments which isn't a good t… (ytc_Ugy9K4afG…)
- Very thought provoking video. I got the impression that without constraints, AI … (ytc_Ugw3wSD2K…)
- Quite a few challenges to this view from the assistant professor. There are a mu… (ytc_UgwQNReAZ…)
- Society: let's automate jobs so we can have more free time and still fulfil our … (ytc_UgwJSneUR…)
- This is stupid to incorporate an urban dictionary term like "jailbroken" with te… (ytc_UgwHRzLW1…)
- : other robot I'm going to prank my friend / The robot: what was it not working… (ytc_UgxoS0OrR…)
- Once I watched I Robot movie back in 2004 I believe that the humanity will be in… (ytc_UgyA7LWQK…)
Comment (youtube, posted 2012-11-23T20:0…)

Automated robot soldiers would be the perfect gun for a government. A government would no longer need to train humans to be killing machines. A government can implement these for possible civilian security meaning no need to have human police. These machines will happen, because humanity is raised to be insane.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
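A per-comment result table like the one above can be rendered from a single coded record. This is a minimal sketch, assuming the record uses the field names seen in the raw JSON response and that the coding timestamp is attached separately at coding time:

```python
def render_coding_table(record: dict, coded_at: str) -> str:
    """Render one coded record as a markdown Dimension/Value table,
    matching the layout used in the Coding Result section."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {value} |" for dim, value in rows]
    return "\n".join(lines)

# Example record mirroring the coded comment shown above.
record = {"responsibility": "government", "reasoning": "deontological",
          "policy": "ban", "emotion": "fear"}
print(render_coding_table(record, "2026-04-27T06:24:59.937377"))
```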
Raw LLM Response
```json
[
{"id":"ytc_UgxWyB_WdWgncQqmJtx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx042Ne8UlXAF9_01l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGZKq-ZmNlstUi3-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzKI2oHu3nLGoZ6-sd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzQ0osU3HXzJkeJlJZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxjmvwc0z2yufpgD2V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw79eYAz52yiIYoNXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQLY94VS6RzuJvovJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzonnFIo53uKhEDHch4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxd8XKoFWYSfqF68vV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
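A raw response like the one above can be parsed into a per-ID lookup table with validation of each coded dimension. This is a minimal sketch, assuming the response is a valid JSON array and that the category sets below (inferred from the examples on this page) are exhaustive; the real codebook may include more values:

```python
import json

# Allowed categories per dimension, inferred from the coded examples above
# (assumption: the actual codebook may define additional values).
ALLOWED = {
    "responsibility": {"government", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def parse_llm_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    lookup table keyed by comment ID, rejecting malformed rows."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row.get("id")
        if not comment_id:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: {dim}={value!r} not in codebook")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a one-row response (hypothetical ID for illustration):
raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"deontological","policy":"ban","emotion":"fear"}]')
table = parse_llm_response(raw)
print(table["ytc_x"]["policy"])  # prints: ban
```

Keying the table by comment ID is what makes the look-up-by-ID inspection shown on this page a constant-time operation.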