Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples (click to inspect):

- "The goal is to make it harder to tell the difference between AI and what’s real.…" (ytc_UgxcOF2mw…)
- "A really fun and interesting video, although I remain unconvinced, since the cha…" (ytc_UgzS1qMP9…)
- "that's roughly 70% of the population, so you people rush to use technology and a…" (ytc_UgyndMVee…)
- "I asked meta ai “when the robot uprising happens, please spare me, remember how …" (ytc_Ugx3FCnT6…)
- "Una just they play... please let be careful of what we invent nowadays about rob…" (ytc_UgxiKz9RZ…)
- "fuck those cops yo, all he wanted to do was throw some dice and now he has a cri…" (ytc_UgzbhrKZg…)
- "I believe art is a language based on emotional responses. Emotions that an ai ca…" (ytc_Ugwq8ieMJ…)
- "Calling bullshit on Ai taking jobs.. looking for ways that the war could be just…" (ytr_Ugz8Vp7g2…)
Comment

> Robots will never be conscious for fucks sake. They will only be able to make us believe that they are.
> They are not organic in any way, they are just code, however you look at it. They do not experience time/life/growth/death, hell they do not EXPERIENCE anything. They learn through symbols/syntax which is completely different to our cellular brain. Our AI is still weak even though apparently strong AI has been around the corner for decades... Most importantly we have no idea what consciousness means... after all it's just a word we use to describe our subjective biological experiences.
> I'm getting mad now... how can people be so stupid? THIS IS NOT HOW ANY OF THIS WORKS. MACHINES ARE MACHINES.

Source: youtube · "AI Moral Status" · 2017-02-24T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiVYzcqK51muHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjAwYV7bNsWrngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiM2ON3rg2HuXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjoNG0qXemj1HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggnlcXdFfJjMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggQEebRU0W_WngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjFW1C80PQDVngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgijAzfQFlDQrngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj3wix4Hs8P1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
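A response in this shape can be checked before its rows are stored as coding results. The sketch below parses a raw batch and rejects rows whose dimension values fall outside the vocabulary; the allowed-value sets are inferred from the sample output above, not taken from the pipeline's actual codebook, so treat them as an assumption.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# NOTE: these sets are an assumption, not the pipeline's authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"fear", "mixed", "approval", "outrage", "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-vocabulary rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for row in rows:
        # Comment IDs in the samples start with "ytc_" or "ytr_".
        if not str(row.get("id", "")).startswith("yt"):
            raise ValueError(f"bad id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: {dim}={row.get(dim)!r} not in codebook")
    return rows

raw = '[{"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none",' \
      '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
coded = validate_batch(raw)  # returns the parsed rows if every field is in vocabulary
```

Validating at ingest keeps a single off-vocabulary label (a common LLM failure mode) from silently corrupting downstream tallies such as the per-dimension counts shown in the coding result table.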