Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI by nature is non-deterministic, you’d never want use it where you need a pred…" (ytc_Ugw04JeSE…)
- "Invest your time into being a tradesman ai won't be taking those jobs anytime so…" (ytc_Ugw4fVdQz…)
- "Copyright is an interesting phenomenon. Literally, it's only enforceable in west…" (ytc_UgxEI3fKy…)
- "Also, use Version History in Google docs to show all your revisions. If OP actu…" (rdc_kgqbtbh)
- "My family are artists and i want to do it too, AI is okay when you need some sch…" (ytc_UgxbLnBPT…)
- "4:50 yep, it's been 2 years since this video got released, and ChatGPT is STILL …" (ytc_UgwDmFq9A…)
- "should have kept working on that education then you would be creating the AI not…" (ytc_UgwDOr6J2…)
- "Of course they will, all the wealth being created (by AI instead of humans) will…" (ytr_UgwxAbydC…)
Comment
> Self driving cars are a version of the trolley problem. Do we want to choose fewer deaths, or the deaths of those who are not supposed to have died without our having made this choice? Self driving vehicles make it more evident that those who are behaving safely could be killed by a car which would not have killed them had the cars which killed them been controlled by a thinking human. I hate to think that I could teach my child how to be safe and have my child be killed by a vehicle that did something completely unpredictable and potentially, unavoidable.

Source: youtube · Posted: 2023-08-08T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxAcV3-jeGRD8Ee6zx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwnChjmSX_yHITtIzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfQrExD2D6_UQHyEp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugykx2wAY5dREF-SFFJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxVTuSUCocmKajIjtN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwx1y44FZI776ewm9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwXrz0slgBdwc2zw1l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyFCnmOfLiYdmU-wYF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwaemTH8eWUccvEhjt4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyt4Mx2dB7uiJ5TBdV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
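A raw batch response like the one above can be checked before coding results are stored. The sketch below parses the JSON array and flags any value outside the codebook; the `DIMENSIONS` vocabularies are inferred from the values visible on this page, so the real codebook may include more categories.

```python
import json

# Allowed values per coding dimension — inferred from the samples shown here;
# the full codebook may define additional categories (assumption).
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"resignation", "indifference", "fear", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM batch response; return a list of problems (empty if clean)."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    problems = []
    for row in rows:
        cid = row.get("id", "<missing id>")
        for dim, allowed in DIMENSIONS.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"{cid}: {dim}={value!r} not in codebook")
    return problems
```

Running this on the response above would return an empty list, since every value falls inside the inferred vocabularies; a malformed or off-codebook response yields one message per bad field, keyed by comment ID for lookup.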