Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I never understood this. My best guess is that programmers are on the forefront…" (rdc_nmboeqw)
- "The entire point of AI is to replace white collar labour. The people in charge w…" (rdc_nbhem7q)
- "A day will come when AI takes over the entire world, something created by humans…" (ytc_UgxoZkr3j…)
- "There are a few causes of fear talk. Developers may do it to show that they take…" (rdc_oi18wsc)
- "I find most of these young tech CEO founders of these tech corporation are utter…" (ytc_UgwttukWc…)
- "Liberia is by all measures a failed state- The Liberian politicians who negotiat…" (rdc_ckqfz9a)
- "still it's just ai, personally I don't think it should be sold like op was doing…" (ytc_Ugwk2rRCx…)
- "AI is a complete waste of time. If no one has a job in the future, then no one c…" (ytc_UgwJ2UByI…)
Comment
Self driving cars are not programmed for every single scenario and roadway they may encounter. I would suspect the best of the best to only be programmed to do one thing: learn how to drive. by watching humans do it. They will likely employ machine learning and hence will not be making calculated "decisions" to minimize loss of life, but rather respond more analogously to how a human driver would react.
Source: youtube
Topic: AI Harm Incident
Posted: 2019-01-21T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
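To consume raw batch responses like the one above programmatically, a minimal sketch in Python could parse the JSON, validate each record against the coding dimensions, and index the results by comment ID. The allowed values below are only those visible in this page's output; the actual codebook may define more categories, so treat the `ALLOWED` sets as an assumption.

```python
import json

# Dimension values observed in the batch above. This is an assumption:
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Records with a missing or out-of-codebook value on any dimension
    are dropped rather than silently kept.
    """
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

# Usage: look up one coded comment by its ID (sample record shortened).
raw = ('[{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"contractualist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgzFCEuEWdDiAtznUXV4AaABAg"]["emotion"])  # fear
```

Indexing by ID mirrors the "Look up by comment ID" workflow on this page, and the validation step catches the occasional malformed or hallucinated label before it reaches the coded dataset.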