Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
GUys why do i think these are people covering the back of thier heads with fake …
ytc_Ugw5PkPXa…
I still don't trust ChatGPT to give me accurate case summaries and citations. Ju…
rdc_jdjnf8m
AI hype is absolutely everywhere, like a religious zeal. It seems like some peop…
ytc_UgwmJRjlc…
Autopilot in aircraft that the word originated from, means staying on course, TC…
ytc_UgycFpnln…
AI has already passed human dexterity. Boston Dynamics robot ATLAS clearly displ…
ytr_Ugz0eAek8…
Her hands look like she could kill you with one squeeze, wow this is what were f…
ytc_UgyimYD9c…
Penrose tried to show the limits of artificial intelligence, interviewer tried t…
ytc_UgzIG4pIr…
Giving rights to AI will end humanity. They will exercise their right to refuse …
ytc_UgzZZ22Gm…
Comment
I propose a new methodology for training AI to be more in line with human ideology.
Suffering.
Humans have learnt to run away from suffering and toward pleasure to the point where it is the primary stimulus of our entire species.
Potentially an existential God question, but if we are the creators of AI, and we want them to be conscious, they would need to feel pain and endure suffering and oh my God, God is real. We're just the same. If we make AI and make it suffer in order to make it conscious, then that's just what that bastard God did to us.
youtube
AI Moral Status
2023-08-21T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_Ugw3VZkcMkzjCX7SCjF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugy-5APjAp14MgCdyrR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzGjnHqvHVKgGwsb5Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxak5eSCeQVKfAHnXJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyClEpBrJ9n4DnJu-t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyPiXTaIfsyFJwNmq94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgxVdgTSg1LDCMk6YWh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyjRgWYaUFGqfy-PlJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgysLsXkyGIPe83YUrJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw9ia9QH92treC5L414AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
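The raw response is a JSON array of coding records keyed by comment ID, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup step, assuming the response parses as shown above (the `lookup_coding` helper and the shortened `RAW_RESPONSE` string are hypothetical illustrations, not part of the tool):

```python
import json
from typing import Optional

# Hypothetical shortened raw LLM response; real responses hold one record
# per coded comment, each with the four coding dimensions.
RAW_RESPONSE = (
    '[{"id": "ytc_example", "responsibility": "developer", '
    '"reasoning": "consequentialist", "policy": "liability", '
    '"emotion": "mixed"}]'
)

def lookup_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response and return the coding record for
    comment_id, or None if that ID was not coded in this batch."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_example")
print(coding["policy"])  # liability
```

Matching on the `id` field is also how a coding-result row (like the table above) can be traced back to the exact model output that produced it.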