Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Fake hype bullshit, don't believe any of it. Even the alarmist "anti-AI" stuff i…" — ytc_Ugz_V4OKe…
- "If y’all aren’t boycotting any business supporting driverless deliveries, you’re…" — ytc_UgwwBHavK…
- "Robot "I don't have time for modesty, I want to create the singularity tomorrow"…" — ytc_Ugy0QMiEE…
- "I believe humans are paid for their time not productivity. If I can make 1 widge…" — ytc_UgxW2_B29…
- "People use Luddite as an insult against people who don't like AI. But that comes…" — ytc_UgyGqz3nh…
- "Dealing with medical issues, I’ve been asking ChatGPT all my questions and it’s …" — ytc_Ugy99HCxA…
- "A hidden stealth off switch inside A.I. Robots could stop a takeover easy as 1, …" — ytc_UgxzSs0zR…
- "As a pixel artist and programmer, I definitely see how it could be upsetting if …" — ytc_UgwdHDjRm…
Comment
“But I didn’t use Open AI, so it’s fine”
No, you just used a application which almost certainly just used OpenAI’s API, so two tech companies have your deepest darkest secrets.
Edit: Yes, I am aware of self hosted models and my comment doesn’t relate to self hosting, it relates to third party companies.
Regardless of self hosting, LLM therapy is a terrible idea.
youtube · AI Moral Status · 2024-08-30T13:1… · ♥ 26251
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyUmVNE1wQCK7-yyPR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxwUbktSCocj45nqR94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwtd0tbQY-GQFYS61J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz6pB-M8AeaMMoKWWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxqzx74AJdFkIfRmWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzpUAU1utHPydnd7h94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKFHJxdSkiD1fO32x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyppLZJ5sC6xaeC5nd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxv38mtu4UzJCDkS5l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyRJnEg8d2mJ2pbiL94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
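A raw response like the one above can be parsed and indexed by comment ID in a few lines. The sketch below is a minimal, hypothetical example (not the dashboard's actual implementation): it assumes the model returns a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields, as shown in the sample response, and builds a dictionary for ID lookup.

```python
import json

# Hypothetical batch response in the format shown above
# (field names taken from the sample raw response).
raw_response = """
[
  {"id": "ytc_UgyUmVNE1wQCK7-yyPR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxwUbktSCocj45nqR94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the parsed rows by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codings["ytc_UgxwUbktSCocj45nqR94AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate outrage
```

In practice the raw model output may contain malformed JSON or IDs that were never requested, so a production version would wrap `json.loads` in error handling and validate each `id` against the batch that was sent.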