Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples:
- I find it weird that a jury of possibly unknowledgeable people on the subject ca… (ytc_UgxAIB5Nt…)
- Question: if like he says AI will solve every problem then and he doesn't have a… (ytc_Ugxc5LFAy…)
- He replaces his own coders , wich he owes his entire existence too as he was one… (ytc_UgzB5GI80…)
- Remember kids: Whenever you want a serious answer from an LLM, always type at th… (ytc_UgwgzyQfe…)
- And graphite is just 6 months from leaving the laboratory. Remember when Trucker… (rdc_gljuo35)
- And I'm guessing as AI takes over the job market in earnest, there's going to be… (rdc_njgw0pj)
- Ai apocalypse feels very much like Y2K all over again. 2000 came and went and we… (ytc_Ugxgw2n3Y…)
- AI is good writing poetry, but not good at writing fiction in my opinion. I’d ra… (ytc_UgzBEiOGa…)
Comment
i keep hearing AI evangelists talking about how AI isn't a bubble because it will be such a huge gamechanger that people will be willing to pour any amount of money into it no matter what, and it's like... great, man. y'know what else would be a huge gamechanger? unlimited energy. teleportation. biological immortality. the creation of complex matter from nothing. global political cooperation toward a post-scarcity society. proof of the existence of the human soul. the discovery of a galaxy-spanning spacefaring civilization. humanity developing actual psychic powers. like all of those things, the notion of AI would be a huge deal doesn't mean it's gonna happen in our current reality. even if it does, it certainly doesn't mean it'll do what you've imagined it will, or that it will happen the way you've imagined it.
youtube
2026-03-07T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxdPdXUWBuf4p_T_NN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6Gw3lOphHJDgV1-B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx63oDVhZFsi72ELeV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJD5tUib5-dsdtubl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-qU5tRjixkzzdyFt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz8ZOjsBB1fTDyxsOh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyX6KWzoPFHl48jF154AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyTby5plUCqwKekT294AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxS9at0qmHvY90hmiN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGCJgvaRLsKlxUd9F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
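For reference, a raw response like the one above can be parsed into a per-comment lookup and spot-checked in a few lines. The sketch below is illustrative only: `parse_batch` is a hypothetical helper, and the allowed value sets are inferred from the samples shown on this page, not from the full codebook.

```python
import json

# Allowed values per dimension, inferred from the samples above.
# The actual codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating each dimension."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-element batch, mirroring the format above:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_example"]["emotion"])  # indifference
```

Keying by comment ID makes the "look up by comment ID" view above a single dictionary access, and the validation step surfaces any off-codebook value the model emits.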