Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “AI can’t ever be 100% human, just like humans will never be 100% AI. It just can…” — ytc_UgwQZlOHF…
- “AI is a fraudulent hoax / I never cared enough to explore it / South park revealed t…” — ytc_Ugx0YNlx8…
- “The future is going to be like the wild west with train robberies except in stea…” — ytc_Ugzdpn4_6…
- “Silicon Valley guy telling an audience of Silicon Valley people that they need t…” — ytc_Ugyj0P0Gk…
- “Ive built a chatgpt kind (LLM) from the ground up before, in its data all it has…” — ytc_UgxOXKlY6…
- “The way tucker spins this into some Democrat nefarious issue is beyond me, sound…” — ytc_UgwAQ85aD…
- “Well, if millions and millions of us are out of work and not therefore earning a…” — ytc_UgzBIpRt5…
- “Yeah but good luck trusting a robot to carry your baby or put an IV on you when …” — ytr_UgyNuZhO1…
Comment
From a 2016 interview:
"During a Q&A session following Tesla’s announcement yesterday, Elon Musk was asked if Tesla would be liable if one of its driverless cars gets into an accident. Musk quickly answered that those types of incidents would be something for the insurance companies to figure out.
“No, I think that would be up to the individual’s insurance,” Musk answered. “If it is something endemic to our design, certainly we would take our responsibility for that.”"
So is the great Elon going to stand up and offer these families and individuals a sliver of his fortune for the hardship his company's experiments have caused? Or will he and his lawyers bury all these claims behind a wall of paperwork and lawsuits?
youtube
AI Harm Incident
2024-12-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugymxs7I-T47d-2EPgl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxS8QN44BNGQCNwYK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz5A5m2klmeRivyB5d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwS_LepYfJbvFnKSMR4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzebMplL1mA9E_4LPF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz8rez9_oL_bu9knnV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzS8ooHOA5c2aTeTiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzfLUOfQuTAP5UQmkF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwiKaMdsF4mK-Aom514AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugydqn1QFkeeyNgxTd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
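The raw response is a JSON array of per-comment records, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and validated before storage, assuming the value sets visible on this page (e.g. `company`, `contractualist`, `liability`, `indifference`) approximate the full schema — the `ALLOWED` sets and `validate_batch` helper are hypothetical, not part of the actual pipeline:

```python
import json

# Hypothetical allowed values per dimension, inferred from the
# labels that appear in this page's coding output.
ALLOWED = {
    "responsibility": {"developer", "company", "distributed", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value for {dim}: {rec.get(dim)!r}")
        # Keep only the schema fields, keyed by comment ID for lookup.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID is what makes the "Look up by comment ID" view possible: `validate_batch(raw)["ytc_UgwS_LepY…"]` (with a full ID) would return exactly the dimension/value pairs shown in the Coding Result table.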