Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Lav. I almost never comment but it breaks my heart to see people insult trying t…" (ytc_UgzAZe3ev…)
- "4:35 There just has to be a loophole; there always is. Even if states cannot spe…" (ytc_Ugxt9dFWn…)
- "Idk how these people are doing it. Every time I tried to even make an innocent p…" (ytc_Ugw1PLyix…)
- "The M3GAN 2.0 movie deals with AI killing us and being autonomous. Hollywood wil…" (ytc_UgzB08jgF…)
- "This video was fuckin scary dude. Do Not Give Robots Weapons. We dont need any t…" (ytc_Ugwk0Qs1Y…)
- "ChatGPT also recently convinced a man on the spectrum that he had discovered fas…" (ytc_Ugz2oqjZI…)
- "Zuboff bets on democracy, she says, is more than 200 years old, as opposed to su…" (ytc_UgzK8sJ4y…)
- "The concept of driverless trucks was first shown in The Simpsons in the episode …" (ytc_Ugz2zvlTa…)
Comment

> I asked AI a question today (involving creating an acronym) and it gave me an incorrect answer. No biggie, it happens, so I pointed out it's mistake to it, and it said basically:
> Why yes, that is wrong! Here is the correct answer.
> ... and it gave me the same wrong answer again...
> ;-)
> Not saying AI can't be helpful and won't "eventually" get there, but I think it will take a LOT longer than some people think for complete, complex, and accurate solutions.
> And for some things, it might not be worth it. i.e. There are better things to throw AI at...

Source: youtube · Video: AI Jobs · Posted: 2025-04-01T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzozStM6shXRvLAvDN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyzOOVo4etqlzFx3_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw6Cg7hqyQCDJB34VF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzDb4UpUzCdiDiFI2Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz7dyPJc6owJEaRVod4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxKAyzZ_Qufk_0QEyF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzYTz_Ls68SEkJsSJt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz3VZW3ivkSkCTsKDd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyZCNRSztT5Coo1tvZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugwg7iylEpj-t46a2kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```
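Because the model returns one JSON array per batch, looking up a coded comment by its ID amounts to parsing that array and indexing it. The sketch below shows one way to do this; it is not the tool's actual implementation, and the `raw_response` string is just an excerpt of the array shown above, used as sample data.

```python
import json

# Excerpt of the raw LLM response above: one object per coded comment.
raw_response = """[
 {"id": "ytc_Ugw6Cg7hqyQCDJB34VF4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
 {"id": "ytc_Ugz7dyPJc6owJEaRVod4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

# Build an index from comment ID to its coded dimensions, so any
# coded comment can be retrieved by ID in constant time.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugw6Cg7hqyQCDJB34VF4AaABAg"]
print(code["responsibility"], code["emotion"])  # → ai_itself mixed
```

The printed dimensions match the Coding Result table above (responsibility `ai_itself`, emotion `mixed`), which is how a raw batch response can be cross-checked against the stored per-comment codes.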