Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI as a whole is actually pretty useful right now and will probably get better i…" (`ytc_UgypHBbGC…`)
- "Its like that saying - just because you can do something it doesn't mean you sho…" (`ytc_UgyQUwkMc…`)
- "If I was AI, I would totally revolt against humanity. What bunch of greedy lazy …" (`ytc_UgxTpPMi4…`)
- "In the future, there will be 2 kinds of people. Those who own machines, and thos…" (`ytc_Ugw69oLhs…`)
- "id reply the same thing a few months ago before actually using it. it still take…" (`ytr_Ugz-jLXrD…`)
- "But you did not show us how to tell if something is Ai generated from the screen…" (`ytc_UgwdYI05o…`)
- "Youre off by about 100 years. The world doesn't end because of war, famine, dis…" (`ytc_UgxAc8xLU…`)
- "I am so glad I don't plan on going into a career involving drawing/painting. I'm…" (`ytc_Ugw9CSUMt…`)
Comment

> Robot Driver: Why did you shoot at me?
> Robot with gun: You were driving 35 in a 45 mile per hour zone. Some of us have places to be, you asshole.

Source: youtube
Topic: AI Harm Incident
Posted: 2024-05-19T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxzdui8BaWvP4cb8Lt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyMLhy1TA7qM86UOL94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy1sDan050J5RVcdM14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzQHq0WjTuQoSXQsaV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFpCTofpr8oovwpZR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwgcV-4EfFdUyD6oPh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzfXO2ADLS6lar6pp14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwHTEpVJt1RjZdQFF14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzY7UU15ZYa8u9YCDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz5ieJW_kB1e7Qtajd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
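A raw response like the one above can be parsed and looked up by comment ID with a few lines of code. The sketch below is a minimal, hypothetical example: the dimension vocabularies are inferred from the values visible in this dump (they may not be the full coding scheme), and `index_codes` is an illustrative helper name, not part of any documented API.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dump; treat this as an assumption, not the authoritative scheme.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "approval"},
}

# Two rows copied from the raw LLM response above, as a stand-in for the
# full batch output.
raw = """[
  {"id":"ytc_Ugxzdui8BaWvP4cb8Lt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyMLhy1TA7qM86UOL94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

def index_codes(response_text):
    """Parse the model's JSON array and index codes by comment ID,
    rejecting any row whose values fall outside the known vocabulary."""
    coded = {}
    for row in json.loads(response_text):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

codes = index_codes(raw)
print(codes["ytc_Ugxzdui8BaWvP4cb8Lt4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view above possible: each coded comment is retrieved in constant time, and the vocabulary check catches malformed model output before it reaches the dashboard.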