Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugyth2o7D…: In March I was walking. I stopped and turned to take a photo at curbside. At the…
- ytc_UgxkIhBS0…: WTF is an A.I. scientist? Are they claiming to know something about "machine psy…
- ytc_UgxhwzQxH…: What a jerk, even manipulates arguments with Ai. The Ai is not 'lying' it's a la…
- ytc_UgwZn34oY…: What kind of crack have you been smoking old man you forget about what happened …
- ytc_Ugzu4Zcd0…: It IS time to Outlaw Automation! Humans NEED work, period and not everyone want…
- ytc_UgwWG3Fyv…: If only people in our current government cared instead of deregulating AI for th…
- ytc_UgxEPSK31…: He is missing the point, it will actually be beneficial to humanity to have a mo…
- ytc_Ugylj09WW…: There are other options too. You can partner with AI and learn coding, web devel…
Comment (youtube, 2019-04-09T00:5…)

I think the premise of the video is flawed. A surprising amount of the industrial machinery that keeps our world running would be quite efficient at killing humans. Robert Miles has an excellent video on convergent instrumental goals. The short version is that AI will likely conflict with humans because AI will want to use resources that we also want to use. Banning weaponised AI won't stop that from happening. It will keep us from getting ready for it. We should not ban AI weapons, we should learn how to manage them safely.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxwDmRv-1TI3gqTduB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyLuozTlt3q_RgYkRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzCBEB2HvTDQAatp6N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwpuCDVf3uqAOjVgf14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxm_8_T4f3JRBa97yl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzm8FhTsAR23JSA7WF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzQutxwzoqfvnNwotN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw3K8SFlnwelU-0KYN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxNsufN4wxM7DEn-w94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzRP22GnRtJOrgmH_x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
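The lookup-by-comment-ID step above can be sketched in a few lines: parse the model's raw JSON array and select the record whose `id` matches. This is a minimal sketch, assuming the raw response is exactly the JSON array shown (fields `id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `lookup` helper name is hypothetical, not part of the dashboard.

```python
import json

# Assumed shape of the raw LLM response: a JSON array of coding records,
# one per comment, as shown in the "Raw LLM Response" section above.
raw = '''[
  {"id": "ytc_UgyLuozTlt3q_RgYkRB4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "ban",
   "emotion": "fear"}
]'''

def lookup(raw_response: str, comment_id: str):
    """Return the coding record for one comment ID, or None if absent."""
    for record in json.loads(raw_response):
        if record["id"] == comment_id:
            return record
    return None

coding = lookup(raw, "ytc_UgyLuozTlt3q_RgYkRB4AaABAg")
print(coding["policy"])  # -> ban
```

The selected record carries the same four dimensions rendered in the Coding Result table (responsibility, reasoning, policy, emotion).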