Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Here is—word for word—something I wrote for speech and debate. I hope the ai bro…
ytc_Ugz1MyT_j…
What sources did you use for these ridiculous claims?
AI means "Averaged In…
ytc_UgxJxNcY3…
Try researching anything online as an older person who received an education and…
ytc_Ugx77ya9j…
We should have international law against allowing AI to make a decision whether …
ytc_UgwYCc-uD…
@GustavoSilva-ny8jc Honestly, imo in 300-400years, AI will be able to create ab…
ytr_UgzFfPDOG…
Sam is demonstrably not a 'good' person. Wealth is his driver, at best. Same wit…
ytc_Ugwo8mdO5…
Talk about “reverse psychology 101”….
EVERYTHING is “recorded and monitored for…
ytc_UgxZb-fmy…
It’s incredible how ai can go from a corn candy to flying to sun with Justin bie…
ytc_Ugxd97HMp…
Comment
Literally why do they not just solve the problem by going medical physicians promoting AI. That way you can actually have a medical diagnostic machine with AI and even probably put phone technology into it there are a lot of people to learn more medical stuff for they can do with themselves.
Individuals learn off experience from older individual or by trial and error.
If you have a whole bunch of medical individuals putting together the llm for the actual language model medical case studies for actually properly interpreting information. But until you actually start putting medical case studies information with practical experience on how to solve the solution yes the AI is basically just going to be orientation of probably not giving proper feedback information.
youtube
AI Harm Incident
2025-11-12T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy50h0d81u2f8_f2RV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzZgMgTmxwa25Lqx1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzAdZBO5nixZyPcadx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxOEMuGEre579p2NoR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxfWrYb2Ma0xlFJn-R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLexvVVcr8TLdE8w54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2iJjFJb86UYPPI1h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxnoFd0_TQH-DNZQkR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyjEMHXiqWfWRr0oMB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwdz00fB9o_TdBrjUh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
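The "look up by comment ID" step above can be sketched in a few lines: parse the raw LLM response (a JSON array of coded rows) into a dict keyed on `id`, then check each row against the codebook. This is a minimal sketch, not the tool's actual implementation; the `SCHEMA` value sets are inferred only from the labels visible in this dump, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"mixed", "consequentialist", "virtue", "deontological"},
    "policy": {"unclear", "industry_self", "none", "liability"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "resignation"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (JSON array) into a dict keyed by comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

def invalid_dimensions(row: dict) -> list:
    """Return the dimension names whose coded value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items() if row.get(dim) not in allowed]

# Usage with a single hypothetical row:
raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}]'
coded = index_by_id(raw)
print(invalid_dimensions(coded["ytc_x"]))  # → []
```

Keying on `id` makes the "inspect the exact model output for any coded comment" lookup O(1), and the validation pass flags rows where the model drifted outside the codebook (e.g. an unexpected emotion label) before they reach the results table.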