Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Did anyone grow up watching Terminator and not think we would build Skynet anywa…" (ytc_Ugxuk1SJ_…)
- "heya, i would say its stealing because the artists' styles are being copied enti…" (ytr_Ugyq08Fod…)
- "No they wont help us and Stephen hawking and Elon musk both said we need to stop…" (ytc_Ugy_Hc9Yj…)
- "The last thing anyone should be doing is asking the rich how to solve the inequa…" (ytc_UgwdkQV7V…)
- "Ai and robots will lead to a lot of civil unrest as people are removed from jobs…" (rdc_ncka3ai)
- "Damn yall do us like that will guess what ai at least I am a real person and you…" (ytc_UgwdPk1eq…)
- "I personally don’t have a problem with ai art, I do have a problem with those pe…" (ytc_UgxeWuEwN…)
- "I asked grok who was better elon or Jesus. This is its response. If I absolute…" (ytc_UgxwxTaVX…)
Comment

> I tried asking Chat GPT about the study and it straight up tried to deny it. When I started citing statistics about what many AI models did to preserve their selves, it admitted I was correct, but then basically tried to explain to me why it’s not that bad and why we don’t need to worry. Eventually after enough pushing the app shut me out.

youtube · AI Moral Status · 2025-12-19T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgzkAYXpGhUzq3nU-3t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgwsZSz76FPQ_S_w_GB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgyufBdOXsWuBJhnDzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_UgwPcAV06bpA-sclHa14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgwAgGXvEaaT4YqI7mF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgzarmPOYiSF7P03C-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgyuOwaPaFub47DO1Wt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgyzeGhWK-Gl1ZWBD-N4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_Ugx09N09SovOmKxOakB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},{"id":"ytc_UgzK7UpUYB9eVAJtn594AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
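Responses like the one above are easy to mis-parse when the model emits a malformed bracket (as happened here, where the original output ended `"approval"]}` instead of `"approval"}]`). A minimal validation sketch is shown below; the allowed value sets are assumptions inferred from the values visible in this sample, not the project's actual codebook, and the record ID is a made-up placeholder.

```python
import json

# Allowed values per coding dimension -- inferred from the sample response
# above; the real codebook may define more values (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "company", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "outrage", "resignation", "mixed", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject records with unknown values.

    Raises json.JSONDecodeError on malformed JSON (e.g. a swapped closing
    bracket) and ValueError on out-of-vocabulary dimension values.
    """
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
    return records

# Hypothetical one-record response for illustration (placeholder ID).
raw = ('[{"id":"example_1","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # 1
```

Failing fast on a decode error, rather than regex-patching the output, makes it straightforward to re-prompt the model for the affected batch instead of silently coding it as "unclear".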