Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- How is this the beginning? I have witnessed this happening three decades ago… (rdc_lgqq6b0)
- It wasn't just the embargo. Cuba had heavy restrictions on buying and selling ca… (rdc_f9fae46)
- Saying you do AI Art immediately gives you sludge swamp aura with flies, mosquit… (ytc_UgxO-rJt7…)
- Poor baby robot person. The racists that created first computers are still poiso… (ytc_Ugy5zPrs-…)
- I understand why people would consider it silly to copyright an AI generated pho… (ytc_UgyU-NsAA…)
- AI isn't going anywhere. We've seen that automation and industrialization remove… (ytc_UgyMIV3v3…)
- "you wouldn't be as critical if I hadn't used AI" bish what are you smoking… (ytc_UgzYUG7Eb…)
- disagree with his take on astrology. i dont think everyone who suscribes to astr… (ytc_UgxV1JvDM…)
Comment
"Fixing the hallucinations" is not going to happen. The problem is context. Case in point you have a programming language such as python. In one package all the keywords have return values on principle. In the other package you have mixed case where some keywords return a value and some do not. AI has a lot of trouble with this because contextually it can go both ways because the standards are so open and human software developers do random things. So the LLM fails to account for it. A prompt for one package or the other may return the wrong result. Aka it hallucinates a solution that doesn't exist.
youtube
AI Responsibility
2025-10-01T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxabuYtvz5MNnv0PU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw6Sb-pOHkdfR564mN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxfy47jDn32oxNSP814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYcasGmq0ClU3f-X94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzZ2-qplMVQO86bX-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4SHM57xiAMGh8tpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXaC9rwHrhMk6L_u54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw55Y91z7iARWjpbtV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyhRAtRuqQmphrBokZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyIIi7dNM4EdTU197p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
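The raw response is a JSON array with one record per coded comment, carrying the four dimensions shown in the result table. A minimal sketch of validating such a response before ingesting it; the allowed values below are only those observed on this page (the real codebook likely defines more), and `validate_codes` is a hypothetical helper:

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# The actual codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "indifference", "resignation", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check that every record has an id
    and a known value for each coding dimension; raise if malformed."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with a made-up comment id:
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}]'
records = validate_codes(raw)
print(len(records))  # 1
```

Rejecting unknown values at parse time keeps a single hallucinated label (e.g. an emotion outside the codebook) from silently entering the coded dataset.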