Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI is not about safety" mfs when they feel a very sharp pain in their chest in … (`ytc_UgxqnfGih…`)
- Every instance of a false arrest related to facial recognition software that I'v… (`ytc_UgxWR8oQC…`)
- man i'm glad your dad didn't fell for it surprisingly but yeah its scary how acc… (`ytr_UgyQgaIis…`)
- @caobitaIn T2 the dog still identified the T-2000, you can best bet they kill d… (`ytr_UgxSho2QH…`)
- What's the essential difference between hiring a person to create artwork for yo… (`ytc_UgzCqX3B8…`)
- If you're Gullible enough to use AI you kind of deserve what happens to you.… (`ytc_UgwL39VOU…`)
- Ye like recently I'm getting some comments from ai bros and I'm not gonna delete… (`ytc_UgyKNZ1Be…`)
- Nah we should be fine as long as presidential ai remains president ai voices pla… (`ytc_Ugx1fmL2i…`)
Comment
1:11:55 Hello. HELLO!!!! AI IS NOW and since the beginning of the Gaza war being used to target civilians on purpose and maybe by mistake. If it is AI mistakes then the entire Gaza Strip was a AI mistake. Look the tech-bros of Palantir AI are culpable for the 170 school girls annihilated by Tomahawk missiles, TWO missiles in a "double tap". Again, that strike was orchestrated in part by Palantir AI and it's "Kill Chain". To quote Google Gemini AI... "The AI Component: Reports indicate the US military used the Maven Smart System, which was built by Palantir Technologies, to identify, rank, and target locations in Iran.". And right now the investigation is trying to determine if the so-called "mistake" was AI or Human.
youtube · AI Governance · 2026-03-26T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwvdKWnPjUV9f-tZVt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwLc2II-NW1KhMrKkp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyHjwe987gHpF5nQw54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfyEdq_Mgc7Q7gLYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyXnOWFAraynJwSlV54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy83ApDK18PwGIOdmx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwlp0QaZilJG2rE7PF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxwD0HmezmjFqXNF2F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzoia8QRE4uMWwesCF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy8AP0DU5HWPwxhdgJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
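A raw response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the allowed values for each dimension are exactly those visible in this tool (the full codebook may define more categories), and that every comment ID begins with `ytc_` or `ytr_` as in the samples shown here.

```python
import json

# Allowed codes per dimension, inferred from values visible in this
# dashboard; the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify every record.

    Raises ValueError on a malformed ID or an out-of-vocabulary code.
    """
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example: one record matching the Coding Result table above.
raw = ('[{"id":"ytc_UgyXnOWFAraynJwSlV54AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"ban","emotion":"outrage"}]')
records = validate_response(raw)
print(len(records), records[0]["policy"])  # prints: 1 ban
```

A failed check surfaces the offending comment ID, which can then be looked up in this panel to inspect the exact model output.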