Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I’ve painted warhammer for 25+ years and I don’t call me self an artist because …" (ytc_Ugz8GcAkz…)
- "The robot is saying their dangerous now if that isn't a clear as day answer idk …" (ytc_UgxtjU9Es…)
- "Some quotes of professor Hinton from the book Genius Makers by NYT journalist Ca…" (ytr_UgzoWp_VU…)
- "you \"feel like\" but feelings can often be very deceptive. Your comment tells me …" (ytr_UgyLycwlm…)
- "The judge issued an incorrect verdict (on 2:50), equating human learning and the…" (ytc_UgyKHWjaH…)
- "As a truck driver who loves his job and is proud of it, I hate that driverless t…" (ytc_UgwlMDdr1…)
- "All people make mistakes. THE CREATOR of AI is human. What makes you believe tha…" (ytc_Ugz3bcKLU…)
- "Amazon just let go of four people—two of them because of the AI system in the tr…" (ytc_UgzjBRP98…)
Comment
Ok, I watched the whole thing and by the end I was reminded of the movie Hackers. Back when the internet was new, rollerblading Hackers chewing gum seemed like a real concern. There is this funny comic where there is a person furiously typing and narrating being hacked and he continues to try to defend the firewall and whatever and then this other guy just pulls the cable out of the wall. I feel like all this pearl clutching is kind of the same. The biggest risk with AI is that people will assume it can do things it just isn’t ready to do. There will be companies that lay off a bunch of employees and try to replace them with AI and then something the AI does will cause them to go out of business. That is a realistic threat.
youtube · AI Moral Status · 2025-10-30T20:4… · ♥ 47
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwZodMs5G-ScGJk6NJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzy2yuIDLIM_CF3lWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwmsPNYieT5vHcwryp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzU0EmuZ55E9T-ENeh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyacT5WrdoN34gM1jJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwmSr8MLRJYI62U8XR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyaNycF917xJFuPJzN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyEgAFkRq2FvPFgBpR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxsez32VUPDBBhb8xl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwe3oAVH-HeJftRBHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
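A raw response like the one above can be indexed for the "look up by comment ID" view. The sketch below is a minimal, hypothetical illustration: it assumes each response is a JSON array of rows carrying an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`); the function and variable names are not from the source.

```python
import json

# Two example rows copied from the raw response above (truncated for brevity).
raw = """
[
  {"id": "ytc_UgwZodMs5G-ScGJk6NJ4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzy2yuIDLIM_CF3lWN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a raw LLM response and key each coding row by its comment ID."""
    rows = json.loads(response_text)
    indexed = {}
    for row in rows:
        # Reject rows that are missing the ID or any coding dimension,
        # so a malformed model response fails loudly instead of silently.
        if "id" not in row or any(d not in row for d in DIMENSIONS):
            raise ValueError(f"malformed coding row: {row!r}")
        indexed[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return indexed

coded = index_by_id(raw)
print(coded["ytc_Ugzy2yuIDLIM_CF3lWN4AaABAg"]["reasoning"])  # consequentialist
```

Keying by ID this way makes a single coded comment retrievable in constant time, which matches how the inspector pairs each sample's ID with its coding result.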