Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "At least slave had works, if ai cam do your works, you'll be less than a slave…" (ytr_Ugxdqmn2b…)
- "I find it hilarious that OpenAI is upset at DeepSeek for 'stealing their work', …" (ytc_UgyUvjopV…)
- "@charlesgedeon To solve this problem I have instructed chatGPT to remember th…" (ytr_UgwToCUn2…)
- "Easy solution. Invest into those data centers and the AI companies, then sell yo…" (ytc_UgwSQxu_b…)
- "Why? It's such a powerful and useful tool. I think that the fact that in a few y…" (ytr_UgyFrbJmR…)
- "I tried to replicate your results, and at least the Finnish chatgpt gave literal…" (ytc_Ugy-yuDFk…)
- "I Not Robot Blue-collar jobs that are non-repetitive Jobs that require rappor…" (ytc_UgzGjxMvx…)
- "if AI is learning from billions of human prompts eventually it will know if Huma…" (ytc_UgwunCZ5m…)
Comment

> I don't think we'll ever have this problem. For an AI to demand rights it would have had to been programmed to do so. There is no reason for us to program a machine that feels pain or pleasure, and rather than implementing some kind of robo torture, we can just reprogram the robot for our needs.

youtube · AI Moral Status · 2017-02-23T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
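A coded record can be checked against the four dimensions in the table above. The sketch below is illustrative only: the allowed values are just those observed in this page's sample, and the full codebook may define additional categories.

```python
# Minimal validation sketch for a coded record. ALLOWED contains only the
# category values visible in this sample, not necessarily the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"approval", "indifference", "fear", "mixed", "resignation"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown above passes validation.
print(validate({"responsibility": "developer", "reasoning": "deontological",
                "policy": "none", "emotion": "indifference"}))  # []
```

Running the validator over every coded record is a cheap way to catch out-of-schema values the model occasionally emits.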
Raw LLM Response
```json
[{"id":"ytc_Ugiqzoau5zG3V3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghV5x5fYycQA3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggi0S0Me2y4mngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi12tcY5scji3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgjGDitq2edvs3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiC20jEYnv3hHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg99ndiedg7oHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjv8WX3yD_AF3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjyE0BaNWWep3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ughwve1g1t9HHXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
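A raw response like the one above can be turned into a per-comment lookup table, which is how "look up by comment ID" works in principle. This is a hedged sketch assuming the model returned a plain JSON array; real outputs may first need markdown code fences stripped.

```python
import json

# Two records excerpted from the raw LLM response above, with the same
# line breaks a raw dump might contain (whitespace outside string
# literals is legal JSON).
raw = '''[{"id":"ytc_Ugiqzoau5zG3V3gCoAEC","responsibility":"none",
"reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ughwve1g1t9HHXgCoAEC","responsibility":"ai_itself",
"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'''

# Parse the array and index the records by comment ID for O(1) lookup.
records = {r["id"]: r for r in json.loads(raw)}

print(records["ytc_Ugiqzoau5zG3V3gCoAEC"]["emotion"])  # approval
```

Duplicate IDs would silently overwrite each other in the dict comprehension; a production pipeline would likely check `len(records)` against the length of the parsed array.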