Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I think, what makes matters even worse, is the fact, that the person using AI to…" (ytc_Ugw04TNKS…)
- "@tidalshooter9778 As for AI, these systems are actually massively overrated and …" (ytr_UgwQ1zl8i…)
- "I was never a huge fan of co-pilot vehicles, and now I see why 😂…" (ytc_UgzC8d6Zi…)
- "Meanwhile Larry Ellison & Oracle is quietly getting access to everyone’s onl…" (rdc_o87jnxu)
- "Too me personally AI is harmful, because of what is one mans creation, is mans d…" (ytc_UgzIBDRt9…)
- "I wouldn't doubt it. I'm creating enterprise applications with the help of agent…" (ytc_UgyYG76zx…)
- "Let’s be CLEAR: what would be far far worse than the free market and big compani…" (ytc_UgxrOtwTd…)
- "I wonder is it possible to have a worldwide non-profit organization that can hel…" (ytc_UgyvKQ-1E…)
Comment
This guy seems incredibly naive, if not dishonest. Solving the world’s greatest problems is not profitable for companies that treat diseases, make medicine, sell oil, manufacture weapons, or try to get you to upgrade your gas guzzling SUV every few years. That’s not why companies want AI. They want it to make more money, and employees will be collateral damage. Sorry to sound so cynical, but if you think big companies are going to keep paying people for jobs that an AI can do just as well (or better) for the cost of a robot and maybe a licensing fee, think again. As soon as we have AI robots that can do our jobs well enough to not be a liability, and for less than the cost of a human employee? They’ll order in bulk before you can read your pink slip. It’s already happening. Just ask some of the 9000 people Microsoft just laid off.
Source: youtube
Timestamp: 2025-08-04T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwE71bPC21aQ5LoZvt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw8pTjJExa0dYtFeTJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTS5BWoP6xm4ifVTh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz75DjyMjka1_VKoZZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyQ4sSJhSMPah0oAcN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz_vYnuVr2TLY9OehF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_eiq4-kY_xryGqPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxVO4JQ7gE3VNAJpjt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHaKQQjifz3S0Z_Dh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy8ohR6br8QWsagLVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
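The lookup described above can be sketched in a few lines: parse one raw LLM response as JSON and index its rows by comment ID so a coded record can be retrieved directly. This is a minimal sketch, not the tool's actual implementation; the `raw_response` sample reuses two rows from the response above, and the function name `index_by_id` is an assumption.

```python
import json

# Sample raw model output, shaped like the response shown above
# (two rows excerpted; IDs come from the raw response itself).
raw_response = """
[
  {"id": "ytc_UgwE71bPC21aQ5LoZvt4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw8pTjJExa0dYtFeTJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse one raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
record = codes["ytc_UgwE71bPC21aQ5LoZvt4AaABAg"]
print(record["responsibility"], record["emotion"])  # company outrage
```

In practice the parse step would also need to handle malformed model output (for example, a `try`/`except json.JSONDecodeError` fallback), since raw LLM responses are not guaranteed to be valid JSON.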