Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Too bad this guy doesn't really understand the limitations of LLMs and neither d…" (ytc_UgxfSH0sA…)
- "Always wondered...without biological needs, if Ai did get rid of humans, what mi…" (ytc_UgyAwYJv4…)
- "@noname7271 I'm talking from the pov of the media and people not involved with a…" (ytr_UgznWWQ3n…)
- "@Gooseofthefallen No, he does have a point. Ai users don't seem to have much ove…" (ytr_UgwRzc3nS…)
- "So much easier to make them look like the plastics now because they all look the…" (ytc_UgyXDZ7Ji…)
- "Is there any scenario where humans need AI, and AI need humans in order to survi…" (ytr_UgzLbdLud…)
- "did anyone see in the first vid the robot girl looking at her? that was so crepp…" (ytc_UgymYIkg7…)
- "Call me a Luddite but I see no benefit to the regular person from a successful i…" (ytc_UgyfJ5HfW…)
Comment
I still question the usefulness of self driving cars. You forgot to mention potential glitches in the software that can be even more dangerous to the cars decision making. I get ticked when people think a better future a freedom is through advancing tech to were they want more and more government involvement.
youtube · AI Harm Incident · 2018-10-18T12:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxNu6orq72VYmfHfwB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZh_afhC_OOFGLQXR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyCTPsECAP96PGh3Hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgxJKKQ9sqMp_Ti81H54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxqNM8gW2hHoyqHe5l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxnIUdclFIQ-4Rv2z54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzlBBbufEX3_0ASe354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwCGqFtlXMJ6s8DwaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwtXdfQuJN50FtVCfZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyosQqVFTeUat0GwLx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
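The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the field names match the raw output shown here; the required-key set and the helper `index_codings` are assumptions for illustration, not part of the tool):

```python
import json

# Two entries copied from the raw response above, used as sample input.
RAW = '''[
  {"id":"ytc_UgzZh_afhC_OOFGLQXR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyCTPsECAP96PGh3Hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"resignation"}
]'''

# Keys every coding object is expected to carry (assumption inferred
# from the response shown above).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model output and build an id -> coding lookup,
    skipping any entry missing a required key."""
    entries = json.loads(raw)
    return {e["id"]: e for e in entries if REQUIRED_KEYS.issubset(e)}

codings = index_codings(RAW)
print(codings["ytc_UgzZh_afhC_OOFGLQXR4AaABAg"]["responsibility"])  # company
```

With the full array loaded, the same lookup reproduces the "Coding Result" table for any sampled comment.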