Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The Point is not whether AI can do, even do better, than current humans can. The Point is that the human race is CHOOSING TO DEVELOP MACHINES instead of CHOOSING TO DEVELOP HUMAN BEINGS. Humans can be trained to have as few car accidents as self-driving cars, etc. So WE the humans, must choose: will it be the machines, or us? Will we choose for the Human Race, or choose to destroy or at least ruin it by immoral unethical choices.
youtube · AI Governance · 2025-09-07T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwcFKvZgtIpERESJQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0CDjStTmeP0gRbzF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzESjZ__oZyeRqlcpN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgybRxkmIWSFQmc-X_F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxa29C-z3QnhQh2pqx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzN1VsTCU6grJc3_AB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyIIvjzmiuF1u_UmzJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxwu_3gWepbJl1Pc1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_Ugy1tn-kEwjapUzl7gd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
  {"id":"ytc_UgzAbmPA2gWZXXn2D5l4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
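A raw response like the one above can be parsed and sanity-checked before its codings are stored. The sketch below is a minimal example, assuming the allowed values for each dimension are the ones that appear in this page's table and response (the `SCHEMA` sets and `validate_codings` helper are hypothetical, not part of the actual pipeline):

```python
import json

# Hypothetical allowed-value sets per coding dimension, inferred from the
# values visible in the table and raw response above.
SCHEMA = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "ban"},
    "emotion": {"approval", "fear", "mixed", "outrage", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coding against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {value!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgzESjZ__oZyeRqlcpN4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"fear"}]')
records = validate_codings(raw)
print(records[0]["emotion"])  # fear
```

Validating at ingest time means a malformed or off-schema model answer surfaces as an error tied to a specific comment ID rather than as a silently miscoded row.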