Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I think we as humans would have to go back in time if AI takes over most jobs in the world, we would have to relearn how to grow our own food, have farms and hunt our own food, try to survive in the wilderness, rather then being a part of a society where artificial intelligence has taken most jobs so humans are left jobless, maybe we should prepare and learn how people used to survive years ago, this is moving so rapidly, we all have to prepare, just for the sake of being safe
youtube
AI Governance
2026-01-22T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
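The four coding dimensions above can be expressed as a small schema. A minimal sketch in Python, where the allowed value sets are assumptions inferred from the codes visible in this viewer (the actual codebook may define additional values):

```python
from dataclasses import dataclass

# Assumed value sets, inferred from codes visible in this viewer;
# the real codebook may allow additional values.
RESPONSIBILITY = {"ai_itself", "company", "user", "distributed", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed", "unclear"}
POLICY = {"regulate", "none", "unclear"}
EMOTION = {"fear", "approval", "indifference", "mixed"}

@dataclass
class Coding:
    """One coded comment along the four dimensions shown above."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension holds a known code.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```

For example, the coding in the table above (`ai_itself` / `consequentialist` / `none` / `fear`) passes this check, while an unknown emotion code would not.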
Raw LLM Response
```json
[
{"id":"ytc_UgxkXEHfSMw447_MgT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGxjeum1yWr0O9nyN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwgPngjkXE7yKb3VRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjwboeGcac4PS6Vip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwPAajf3vTbGmSBPGF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzadN8hi8A9uyo1Qm14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzecqFjvU3Quf9Qh914AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKKDAMG3by9NzHJ-F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzc1B10YSpuUvQbRa14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgLim6B87R3av4Owt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
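A raw response like the one above can be parsed and checked before its codes are stored. A minimal sketch, assuming (as the response above suggests) that a well-formed response is a JSON array of objects carrying exactly these five keys; the function name and error messages are illustrative, not part of the actual pipeline:

```python
import json

# Keys every coding row must carry, per the response format shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(text: str) -> list[dict]:
    """Parse a raw LLM coding response, rejecting malformed rows."""
    rows = json.loads(text)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of codings")
    for i, row in enumerate(rows):
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {i} is missing keys: {sorted(missing)}")
    return rows

# Illustrative input with a made-up comment ID.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"approval"}]')
rows = parse_raw_response(raw)
```

A row missing any of the five keys (or a response that is not an array) raises `ValueError`, so partial model output never reaches the coding table silently.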