Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:
- `ytc_UgxmJQKnG…` — 0:15 OK I'm a bit concerned with how the video starts. You should not be compari…
- `ytc_UgxbXUNRM…` — I am a Gen AI Product Specialist. DO NOT RELY ON GEN AI FOR CRITICAL MEDICAL ADV…
- `ytc_Ugy8fnxOy…` — robots like this don't exist. and nor does AI. robots can barely walk let alone …
- `ytc_Ugwx7d-zl…` — Wtf, are U stupid, He fighting a medal and steel,he would never knock out no rob…
- `ytc_Ugw8FmE8x…` — 5:05 Hmmm...sounds like we should get rid of AI then while we still have the cha…
- `ytc_UgyWvEfRT…` — I mean no need to be mean to a random guy just cuz the pictures he posts are ai…
- `ytc_Ugyebo3dI…` — "Would you like to destroy humanity one day?".....nice, that she has to think ab…
- `ytr_UgxeNjvvK…` — Ai steals from other people's art and mashes it together. It has no "imagination…
Comment
The biggest problem is that therapists are also using AI.
Source: I was a suicidal man and got sent to the hospital. In the psych ward my therapist told me how I felt and I said a word similar to feeling down. She said let’s see what and pulled up Google Gemini. She types in words that sound like down feeling and she highlighted them as “Worrysome?” “Depressed?” “Saddened?” And this was not a new therapist she said. “I have treated a lot of patients. I have been doing this for 25 years.”
youtube · AI Moral Status · 2025-08-23T19:4… · ♥ 29
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugzhr5YiABoG43VH3st4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwb7yOgmKRYbyqtYZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxrYrmW-_5_bzZ3GhN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzv827oB3mGGRYCsPZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy4m2gpG_JAu7EawZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzR32nZ9h536SKCenJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3irVGwQOBvRW7qYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJ28DKC06ddlWByMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2ODPQ0c5D19iFcOB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyXK7Uxa_8P5XU5_gN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
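The raw response above is a JSON array of per-comment codes, one object per comment ID, with the four dimensions from the coding-result table. A minimal sketch of how such a batch could be parsed and validated before use — note the allowed values below are only those observed on this page, and the full codebook may be larger:

```python
import json

# Dimension values observed in the responses on this page.
# ASSUMPTION: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: codes}.

    Raises ValueError if a record is missing a dimension or uses a
    value outside ALLOWED, so malformed model output fails early
    instead of silently entering the dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example on one record copied from the response above:
sample = ('[{"id":"ytc_Ugw2ODPQ0c5D19iFcOB4AaABAg",'
          '"responsibility":"company","reasoning":"consequentialist",'
          '"policy":"regulate","emotion":"outrage"}]')
codes = parse_coded_batch(sample)
```

A lookup like `codes["ytc_Ugw2ODPQ0c5D19iFcOB4AaABAg"]["policy"]` then returns `"regulate"`, matching the coding-result table shown for the selected comment.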