Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- AI is so not good! It literal had to steal, plagerize to exist. It's built on a … (ytc_UgxWB_1hu…)
- "Exposing the Dark Side of AI data centers" buy intentionally misrepresenting wh… (ytc_UgwgG-inH…)
- This gonna get spicy. Because if Times win. AI will have nothing, because AI onl… (ytc_UgwkgXPuK…)
- If you need ai to help you imagine things your brain was already cooked to begin… (ytr_Ugzo2U4B6…)
- My take is that ontological descriptions made by humans are never 100% accurate … (ytc_UgxMoXLqq…)
- I really don't get how some of ai supporters perceive making art as some sort of… (ytc_UgypJwRLA…)
- I only use ai when I need something that doesn't exist or I can't create. I can'… (ytc_Ugz-w4ihU…)
- I work with a woman that uses ChatGPT for her health. She gave herself iron po… (ytc_Ugw9F43o3…)
Comment
I would have thought it was a bad idea too, mostly because it can give you bad info, but a family member used it to try and figure out what was going on with them & it was actually very helpful. It accurately reflected what I found in separate research from legitimate sources, and put it into a “compassionate” dialogue. I was even able to use it to help me with how to deal with their situation. AI has a load of issues, but people dealing with mental illness are DESPERATE. quality mental healthcare is so inaccessible. And so many people don’t understand. So this might be an outlet. Just giving you a different perspective. In the moment you do not care if someone writes a book about your problems.
youtube · AI Moral Status · 2025-07-23T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugz-Wwmx4olwPK8N4Qx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgymM-NXotowhjX-64B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgwxsuT4KvmVJoUaZeR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_Ugzzi3q1uETPOyfMXd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgymLkbeuCuoHQoDN5Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_Ugw2HmD6GJs5VhZ6OWR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},{"id":"ytc_UgwvEkNy9hnLddUYf254AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},{"id":"ytc_UgwNOUnN4Kr8FoPHvQV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgxV7nUnWOXLnEgEmqZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},{"id":"ytc_UgycaxNM-Kf8JoZFQY14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]