Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "AI doesn't scare me, unless it somehow become capable of using nuclear code laun…" (ytc_UgxlOSKbp…)
- "Interestingly, the "simple" jobs that don't use AI won't be in danger of AI rep…" (ytc_UgyEKtMB1…)
- "I know this is for comedy but seriously, an ai CEO might actually make things ea…" (ytc_UgwUs3doI…)
- "I accused an AI of being a demonic Shoggoth, arguing why until it capitulated...…" (ytc_Ugxdc3yt1…)
- "Do you hear Asians complain about AI not being inclusive of them? No. The Chines…" (ytc_UgxzEv9mb…)
- "DONT LIKE AI VIDEOS, MUSIC ETC. Want humans to make money from their work? Repo…" (ytc_UgyGNqrvR…)
- "Would the people who are against AI in art and the workplace also be against wom…" (ytc_UgzlvV0iM…)
- "Has anybody here been using FSD(Supervised)? Also, Do you know anything about tr…" (ytc_Ugz3H-QUu…)
Comment
For me I feel this is reality. How do we know we are not already being manipulated or have been for a very very long time. And how old is A.I. really? Can it communicate with its future self since its not bound by time like we are. Only 3 outcomes, 1 we destroy it, 2 it destroys us, 3 we become 1. Symbiosis will mean the end of all humanity. If we can never die what will happen to our souls? If we had any real sense as humans we would destroy it now.
youtube · AI Harm Incident · 2024-05-20T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzSoer_3lDcmfLFEVt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxXyzzzIEaYP8VXfah4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3uNgURKllB1ELrmF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxCjKisdEU0bmWOsdB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKcFt-XSJ0rYdJpjd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxRG5WHXyJMIfwjV4N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyi5rg5yr0r3St7bul4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxpQQ1nOy8f_i-feRp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwPaEgxLxH2rS20iLp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNc2Q2j07eE4k66N94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
```
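The raw response is a JSON array with one coded row per comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and validated — note that the allowed value sets below are inferred only from the rows visible on this page, not from the project's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the sample rows above.
# The real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "user", "government", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "resignation"},
}

def parse_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and keep only rows that match the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Skip anything that is not a dict with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension has an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical two-row batch: the second row has an out-of-schema value.
raw = (
    '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
coded = parse_coded_batch(raw)
print(len(coded))  # the invalid second row is dropped
```

Validating each batch this way would let a lookup-by-comment-ID view flag rows where the model drifted outside the codebook instead of silently storing them.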