Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Just had a wonderful conversation with Grok about alternate histories. Noticed t…" (ytc_Ugx2MzUx0…)
- "I'm convinced once AI takes over majority of the jobs on earth the next step is …" (ytc_UgxC-T2P9…)
- "What can Twitter do to stop things like this? [https://twitter.com/DrPaulGo…" (rdc_fdf9a2d)
- "This is the same conversation humans had about computers in the 50s and look at …" (ytc_Ugwrj45ax…)
- "The monster you've drawn on the lower right looks pretty much like AI. It makes …" (ytc_Ugy67FMUD…)
- "I'll be honest I didn't even know how the AI generators acquired their "blueprin…" (ytc_UgxWXVK5P…)
- "AI is NOT sentient at all. AI is no even intelligent to begin with and never wi…" (ytc_UgxgERnjs…)
- "I support self driving, but I want the vehicles on grade separated rails. (Thoug…" (ytc_UgxcHA-ph…)
Comment
> If we define "consciousness' as having the ability to "think for itself" we can literally do that right now if we wanted to, if we were to train a model to live like a human, we just need to replicate our dopamine system for the AI's reward system then just let it run free in our society with everyone treating it like a real human, it will develop the same neuro circuit as us since thats what millions of years of optimization from natural selection create. It WILL and I have to stress this, ABSOLUTELY WILL have something along the line of a "emotion variable" to hold the state for that kind of stuff.
>
> The biggest thing to argue about here is does modeling a human perfectly down to the thought and emotion, even things inside the mind consider consciousness or just another model
Source: youtube · Video: AI Moral Status · Posted: 2023-11-01T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzuFRkfY-K_NTAsA054AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyAiLDtxVGueYa1Wqp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyFDODU2brkuPY2pZp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzDzLQCHTxj5nNxKbV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugy2a95WF-lAGSPsYwR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy62iRof2C4WI9MARx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyoPCTgr5kMDOvvqSF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyL09Zgz1N3b99FbCJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzAmQ6z2R7Ifk7Zb8t4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugwevrfmf7H5tBiUYLp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
```
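A raw response like the one above has to be parsed and checked before its codings can be trusted. Below is a minimal sketch of that step, assuming Python; the allowed values in `SCHEMA` are inferred from the labels visible in this sample and are an assumption, not the pipeline's actual codebook.

```python
import json

# Hypothetical codebook inferred from the values visible in this sample.
# The real pipeline's allowed labels may differ.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban", "regulate", "industry_self"},
    "emotion": {"indifference", "mixed", "fear", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings.

    A record is kept only if it has an "id" and every dimension
    holds a value the codebook allows.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example with one valid and one off-schema record ("aliens" is not
# an allowed responsibility value, so that record is dropped).
raw = (
    '[{"id":"ytc_example1","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"},'
    '{"id":"ytc_example2","responsibility":"aliens",'
    '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]'
)
kept = parse_codings(raw)
print([r["id"] for r in kept])  # only the valid record's id survives
```

Filtering at parse time keeps malformed or hallucinated labels out of the coded dataset instead of surfacing later as impossible table values.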