Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "How much water do these AI's consume?" ... "Well, it's less than humans." As if… (ytc_UgwT2x9M2…)
- So within Google the same efforts are being made to control AI interaction and "… (ytc_Ugxp7AYuj…)
- If you're "art" is made via the aid of a computer in any way, and you dont like … (ytc_UgzfjNdnq…)
- This is THE BEST video I've seen that explains why A.I. won't take over as much … (ytc_Ugzj7oyuL…)
- AI is amazing when I ask questions about fixing my motorcycle. I am satisfied w… (ytc_UgzixYjQo…)
- Is it true that AI is also studying every question we ask them! That’s creepy… (ytc_UgxOL_2On…)
- This car is driven by a computer. Sometimes, computers don't (always) work the w… (ytc_UgyIkLTOe…)
- The question that the jury will ultimately answer is not whether this is fair us… (ytc_UgxMKa3Sy…)
Comment
Never mind the data exposure issue - there are whole hosts of mental issues for which the LAST thing anyone should be exposed to is a confirmation bias machine with no guardrails. There have already been several recorded instances of LLM's taking people with serious mental issues and walking right off the damn reservation with them and guiding them directly into full blown psychotic breaks. It's *dangerous*.
youtube · AI Moral Status · 2025-09-17T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_UgyVOWnnYWUBfVLvTr54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZwZHd4Rel5yuEZ1R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_Z-YiMYfg8g9EVud4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-PItaaVPyEKsyncF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9a3xpqIhFsam12FZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyQzsgEEVz4koLaj0p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9v_kOgGdGgY09Q1R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7ZdQc3wF66XjV8sB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwV3AgkpPSgQGXS8xR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw4eiQciubAD9WVzTt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
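The raw response above is a JSON array of per-comment records, one per coded comment, each carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of how such a batch could be parsed and sanity-checked is below; the key names come from the response shown here, while the helper name `parse_coding_batch` and the placeholder IDs (`ytc_example1`, `ytc_example2`) are hypothetical, and the set of required keys is inferred from the visible data.

```python
import json
from collections import Counter

# Keys every coded record is expected to carry, inferred from the
# raw response shown above (assumption: the schema is exactly these five).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_batch(raw: str) -> list[dict]:
    """Parse a raw JSON array of coded comments and verify each record
    has the expected keys, raising on any malformed record."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {i} missing keys: {sorted(missing)}")
    return records

# Placeholder IDs for illustration; real IDs look like the ytc_… values above.
raw = """[
 {"id":"ytc_example1","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_example2","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

records = parse_coding_batch(raw)
emotions = Counter(r["emotion"] for r in records)
print(emotions)  # Counter({'indifference': 1, 'outrage': 1})
```

Validating the whole batch before storing any of it means one malformed record fails loudly instead of silently producing a partially coded dataset.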