Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click an entry to inspect)
- "How would hallucinations be fixed? The LLM is just guessing, you could never gue…" (ytc_UgyMrgxf4…)
- "Do yourself a favor, watch shadversitys deep dive on AI art . Because unlike you…" (ytc_UgwSBsdRj…)
- "@Forge6007 If you want something good from ai you need to know how to prompt…" (ytr_Ugz3lD_Ah…)
- "was signing up for chatgpt until i was asked for my phone number. why would the …" (ytc_UgwmtXH6-…)
- "Replace human go wrong ❎ Replace human while AI is not fully developed go wrong …" (ytc_Ugzz45IVF…)
- "Prediction: robot drivers will make great targets. Congress passes law making ro…" (ytc_UgiMFU0E_…)
- "I do draw IRL but I'm also using AI for my entertainment. And I'm sorry for usin…" (ytc_UgxKXMjwb…)
- "Oh, no! Soon, I shall be looking at a robot instead of my Goddess Palki. Even M…" (ytc_Ugws0zaA4…)
Comment
The companies that operate these AIs require a lot of energy (a sign that their processes are searches in only two dimensions: ELECTRICALLY *IN SERIAL*, which consumes a lot of energy), and THEY BELIEVE that KNOWLEDGE IS INFORMATION=DATA, and this IS NOT SO. It's as if they believed that PEOPLE EXIST INSIDE A TV SET😂*. It's ANOTHER AMERICAN IDIOTHY, which tries to replace the human being WITHOUT FIRST KNOWING IT.
youtube · Cross-Cultural · 2025-10-05T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugz6AqkLUUkeGcS-jJB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw3eu98pEGGDcPo95J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwt_l0Qy9wNqDb-78h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2uYI6S0HDHstD8yN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyH7iRXoxF4bxP3Jrl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxI6FgCfg2YbZwuA9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXfox1JqpvWNugEe14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxZ1-CBkrQDZ07WzAZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpDzlqfBQgdIkV7T54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz4DhdYXXBfFyDxNTd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
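The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of parsing such a response and looking a code up by comment ID; the allowed label sets below are inferred from the sample output, not from a documented codebook, so treat them as assumptions:

```python
import json

# Label vocabularies inferred from the sample response above (assumption:
# the real codebook may define additional values for each dimension).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "fear", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID.

    Raises ValueError on malformed JSON or out-of-vocabulary labels,
    so a bad batch is caught before it reaches storage.
    """
    entries = json.loads(raw)
    by_id = {}
    for entry in entries:
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {entry.get(dim)!r}")
        by_id[cid] = {dim: entry[dim] for dim in ALLOWED}
    return by_id

# Look up one coded comment by its ID (single-entry example batch).
raw = ('[{"id":"ytc_Ugw3eu98pEGGDcPo95J4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"unclear","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_Ugw3eu98pEGGDcPo95J4AaABAg"]["emotion"])  # outrage
```

Validating against a fixed vocabulary at parse time is what makes a malformed batch (or an LLM that drifts off-label) fail loudly instead of silently polluting the coded dataset.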