Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Happy thanksgiving and you be well as well, doc. >>:=) Glad he was able to make the full recovery in this one. It just goes to show that in medical or any other important decision, you can't be relying on AI for sound advice. If AI is misinterpreting the nature of what you want to do or for what purpose, the advice could be incorrect, not to mention the dataset it is pulling from may not be the best either. Things like diet/medical decisions, one really should be doing deeper research via reliable sources or speak with a doctor who can relay more about what you're trying to learn about or any ill effects that may come with it. Gotta be careful out there...
Source: youtube | AI Harm Incident | 2025-11-28T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwSnZQD-OsPoXlLwit4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzPKSFDMBCoSBhbvht4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw8lNF-yoq9ipqGtqJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQcsONsgJyHZjKdgV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxf-9hlzxc1ytVhGIV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrjA7-aOECSGjVFw94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugzo8UQ_yOcx7w5aXT94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxqT3optPIFNP6dXsJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz2AsZN2ei7f-yeLGt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVZfwi4Aja4WmLYPF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
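The coding result above assigns each comment four dimensions (responsibility, reasoning, policy, emotion), and the raw LLM response returns one JSON record per comment. A minimal sketch of parsing and sanity-checking such a response, assuming the allowed value sets inferred from the visible records (the real codebook may define additional categories):

```python
import json

# Hypothetical allowed values, inferred from the sample records above;
# the actual codebook may differ.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate_coded(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the sample start with ytc_ or ytr_.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwrjA7-aOECSGjVFw94AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"approval"}]')
coded = validate_coded(raw)
print(coded[0]["policy"])  # liability
```

Rejecting a malformed record at parse time, rather than at analysis time, keeps a single bad LLM response from silently skewing downstream dimension counts.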