Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Blaming AI is a little weird on this one. Why would it or anyone else assume he …" (ytc_Ugwd_nZUb…)
- "Artificial intelligence has algorithms that let robots create their own new idea…" (ytc_Ugyd8UDRE…)
- "Completely deluded. The 'AI' is just a program that is highly trained to convinc…" (ytc_Ugy96n9wD…)
- "Tell me one thing in which AI is able to replace humans as of today. Dont tell m…" (ytc_UgzoRhJDa…)
- "It's funny how people talk about how AI will take our jobs. Turns out AI takes m…" (ytc_Ugyg9DmmR…)
- "Kinda glad I could still recognize the first one as AI. The eyes were a dead giv…" (ytc_UgzdZax9H…)
- "When the input is extremely hard math problems, a request to write a new program…" (ytr_UgwTlwQcj…)
- "The only sustainable future is surrogate robotics. No company should be allowed …" (ytc_UgzHl9rBD…)
Comment

> When you rely on robots/AI to do your own thinking and possibly reasoning, that is scary! People will believe everything AI is telling them, and what is the best course of action to take! They already have AI boyfriends/girlfriends, AI psychologists, I suppose AI Pastors. It’s even getting more difficult to know the real from the fake in some of these YouTube videos! Most won’t know the truth from a lie!!

Platform: youtube | Category: AI Harm Incident | Posted: 2025-08-30T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_aSZ7SWWiN0uG0FJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzOV7RgDK8m8vOBuGx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxHFBUSmAT8dxDlx-Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwxUetV9ZWAU4MCPpl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyD3seX-rKkfHgxy154AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyLebQpJTlOoK76FXF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyqdm7cC0br5Du09h94AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwtj2YddjTDsFyjIAt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxZf4XId5V6wktuC_Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxQEOhwgLzzpJaHVe94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
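A raw response like the one above is only usable downstream if every row parses and every dimension carries an allowed code. Below is a minimal validation sketch in Python; the allowed value sets are an assumption inferred from the codes visible in this sample (the project's full codebook may define more), and `validate_codes` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed codes per dimension, inferred from values seen in this dashboard.
# ASSUMPTION: the real codebook may include additional codes.
SCHEMA = {
    "responsibility": {"user", "ai_itself", "company", "government", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate_codes(raw: str) -> list:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every row parsed and every dimension held an
    allowed code; otherwise each string describes one problem.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as exc:
        return ["invalid JSON: %s" % exc]

    errors = []
    for i, row in enumerate(rows):
        if "id" not in row:
            errors.append("row %d: missing id" % i)
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append("row %d (%s): bad %s=%r"
                              % (i, row.get("id", "?"), dim, value))
    return errors
```

Running this over each stored raw response before writing the coding result table would catch malformed JSON and out-of-vocabulary codes (e.g. a misspelled `"consequentalist"`) instead of silently storing them.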