Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
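A minimal sketch of what such a lookup might involve, assuming the raw responses are persisted in a SQLite table keyed by comment ID; the table name, column names, and database path are illustrative assumptions, not the tool's actual storage:

```python
import json
import sqlite3

def lookup_raw_response(db_path: str, comment_id: str) -> dict | None:
    """Fetch the raw LLM response recorded for one coded comment.

    Assumes a table raw_responses(comment_id TEXT PRIMARY KEY,
    response_json TEXT); this schema is illustrative only.
    """
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT response_json FROM raw_responses WHERE comment_id = ?",
            (comment_id,),
        ).fetchone()
    finally:
        conn.close()
    return json.loads(row[0]) if row else None

# Example: the ID matching the coding shown further down this page.
# print(lookup_raw_response("codings.db", "ytc_UgxJ9c89mNiilIm2Ybp4AaABAg"))
```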
Random samples
- "What I have calculated so far is, AI can not overcome every field of society unl…" (ytc_UgxM-dYLg…)
- "@nae-nae-99fsd is not autopilot it’s a completely different system. Full self dr…" (ytr_Ugz-4oIrt…)
- "AI art is only as good as the data set it was trained on, which is regular art m…" (ytr_UgwzUzZ0f…)
- "BRO I SERIOUSLY WENT ON TALKIE AI TO CHAT WITH ANIME MEN…AND I DID IT FOR LIKE …" (ytc_Ugx3aMH2g…)
- "Y'all got chats you don't want anybody to see, while I'm here just using charact…" (ytc_UgxD6yFwA…)
- "@moonlitxangel5771my second comment said I wasn’t generalizing. This is definit…" (ytr_UgzxVR_go…)
- "Excellent pod cast!!! Thank you. I have already recognized no democracy can sur…" (ytc_UgwiQ42ZP…)
- "Gemini is consistently the worst for me every time. Google assistant was more a…" (rdc_mi769wb)
Comment
All these people on here commenting about how people get hit by cars driven by humans more than autonomous cars. First of all, we know that. There are more human driven cars than autonomous cars. That's a bullshit argument, though. That's like saying more people get killed by bullets than rail gun projectiles...
What this is about is liability under the law - if an autonomous car operating in full Level 5 mode hits and kills a human, who is responsible? The car? The car's owner? The manufacturer of the car? The manufacturer of the sensors? The algorithm programmer? Who?
Platform: youtube · Posted: 2018-03-26T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
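The four coded dimensions form a small closed vocabulary. A sketch of a record type for one coding, using only the label values visible on this page; the codebook's full vocabularies may well be larger:

```python
from dataclasses import dataclass

# Label sets below include only the values visible on this page;
# the actual codebook may define additional labels.
RESPONSIBILITY = {"company", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "contractualist", "unclear"}
POLICY = {"liability", "regulate", "unclear"}
EMOTION = {"outrage", "fear", "indifference", "unclear"}

@dataclass
class Coding:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp

    def is_valid(self) -> bool:
        """Check that every dimension carries a known label."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```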
Raw LLM Response
[
{"id":"ytc_Ugx_cYfqNU3aDhdPlt14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwHuKYnGxVVo6rFHhx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9rO2wNOT-ikBr4B14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzojAAVa671GkDhr9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyKQyZyilGfh0x72Gt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzgX2RvjccjA-YPstx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxJ9c89mNiilIm2Ybp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxEzCpiwo6VeTlj5jd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy63pl_ZifMbW-O96h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzyGRU8kjp5e7zZ_SJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
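The model codes comments in batches and returns a JSON array with one object per comment, each carrying the comment's ID plus the four coded dimensions. A minimal sketch of parsing such a batch back into per-comment codings; the function name and error handling are assumptions, not the pipeline's actual code:

```python
import json

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse a batched coding response into {comment_id: coding}.

    Raises ValueError if the payload is not a list of objects or
    if any object is missing one of the five expected keys.
    """
    required = {"id", "responsibility", "reasoning", "policy", "emotion"}
    items = json.loads(raw)
    if not isinstance(items, list):
        raise ValueError("expected a JSON array of codings")
    out = {}
    for obj in items:
        missing = required - obj.keys()
        if missing:
            raise ValueError(f"coding missing keys: {missing}")
        out[obj["id"]] = {k: obj[k] for k in required - {"id"}}
    return out
```

Keying the result by comment ID makes it easy to join the model's labels back onto the source comments, and strict key checking surfaces malformed batch items early rather than letting partial codings slip into the dataset.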