Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- `ytc_UgzJ8S851…`: People needing rides (all of us) should learn to BOYCOTT driverless cars. You ca…
- `ytc_Ugy1uuD-_…`: All they need is to polish this and invent artificial wombs and women will be ob…
- `ytc_UgxlC1GUZ…`: Surely nobody would ever want a robot messing around in their house, it would fr…
- `ytc_UgxTiqKAL…`: UK is also full of cameras, except they don't use AI as efficiently as China.…
- `rdc_f1u4x0p`: Quite frankly, Australians could do with holding an anti-authoritarian rally abo…
- `ytc_UgyEF_7PP…`: 13:34 Turing said AI would think just not like us. He didn’t say anything about …
- `ytc_UgwewJQfE…`: That’s fake because robots can’t have emotions and they are made by humans we ha…
- `ytc_UgwR-nCX2…`: AI is just a tool like any other, albeit more impactful. I can't but have a cyni…
Comment

> I like robots, but why create too much ai robots in the first place to ask these quesions. Just create it automatic enough and ok. They will 90% will do revolution if we will

youtube · AI Moral Status · 2020-12-17T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzqMT6pGcgLAG2eq0B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyHJ7VUd62CYmTbo9N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwpe6rdMtxiluBP1lR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzjlnDLi1mQHdLCmJl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxTx9CBxK64iGeZ3Z54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzTolftlrANaC1ZRXd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz2aGP7Dar-lCdZ2E14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzqyxvOBJPhhfVf_T14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzIAUnwHFJrUcVp07l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyCubf3fc-RzYSKn5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
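The raw response is a JSON array with one coding per comment ID. A minimal sketch of how such a batch could be parsed and validated before it reaches the dashboard; the per-dimension vocabularies below are inferred from this one sample (the real codebook may include more categories), and `parse_coded_batch` is a hypothetical helper, not part of any shown pipeline:

```python
import json

# Allowed values per coding dimension (assumed vocabularies,
# inferred from the sample response above).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Rows with a missing ID or an out-of-vocabulary value are skipped,
    so one malformed row does not poison the whole batch.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage: one well-formed row and one with an out-of-vocabulary value.
raw = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_bad","responsibility":"nobody","reasoning":"unclear",'
    '"policy":"unclear","emotion":"unclear"}]'
)
result = parse_coded_batch(raw)
# Only the row whose values are all in vocabulary is kept.
```

Skipping invalid rows rather than raising keeps a single hallucinated label from blocking the rest of a batch; a stricter pipeline might instead queue such rows for re-coding.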