Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Does a knife, or a gun, or a bomb have a conscience? No. AI is no different, i…
ytc_Ugy4eLPd6…
Maybe the biblical story of the flood is something that we should reflect on. A …
ytc_UgwzOsN7G…
OpenAI is terrible. It say so many wrong things and completely breaks down on lo…
ytc_UgzTKDUbI…
They're not aware of when they're being tested, it's a different tip off, that t…
ytc_UgwdAOIw0…
What if the only way we can figure out consciousness is to make consciousness? M…
ytc_UgwCB76Gg…
@worldwithouttime Don't get me wrong… I'm not trying to put photography down or …
ytr_Ugznaj3G5…
In order to train an AI programmer model, You need data from real programmers. I…
ytc_UgxrZv4Pe…
This all great except these schools are very EXPENSIVE. They usually have a few …
ytc_UgwmuUHzn…
Comment
What would happen if the top 5 or so AI companies decided it was just too expensive to run and shut down all their compute farms? How many apps and website tools would suddenly go dark because they relied on those AI apis to do their magic? Is this part of what they mean by bubble?
reddit
Cross-Cultural
Unix timestamp: 1763023363.0
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_nopsjq6","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"rdc_nomaihk","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"rdc_nom2myr","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"rdc_nolphmp","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"rdc_nomyfrw","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
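The raw response above is a JSON array with one coding object per comment ID, from which the coding-result table for a single comment is derived. A minimal sketch of that lookup (the variable names are illustrative; the shortened sample below reuses two records from the response above):

```python
import json

# Raw batch response: one coding object per comment ID (abridged).
raw_response = """
[
  {"id": "rdc_nopsjq6", "responsibility": "company", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_nolphmp", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]
"""

# Index the batch by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coded dimensions for one comment by its ID.
coding = codes["rdc_nolphmp"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

This is why the "Coding Result" table and the raw response can disagree in granularity: the table shows one record pulled out of the batch by ID.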