Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@platemax Google HAD those things, I don't think that is true anymore. If I recall, Elon has also stated he had a falling out with a Google co-founder because they were, at least in theory, okay with the concept of machine intelligence REPLACING (aka wiping out and taking over for) all humans. So my faith in Google having good intentions is zero. Anthropic says they care about alignment. They do at least seem to be spending marginally more on alignment and safety research than the other labs. But they are still in the race and contributing to the problem and risks. Sadly, no government, AI company, or billionaire that I'm aware of is actually trying to do good for humanity as a whole long term. They are all trying to maximize their short-term gains at the expense of humanity's long-term future.
youtube 2025-06-07T21:5…
Coding Result
Responsibility: company
Reasoning: deontological
Policy: liability
Emotion: outrage
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwgMCVUt2G7xe2l8A54AaABAg.AJ35drjcy0PAJ3Oe-iVH-h", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwgMCVUt2G7xe2l8A54AaABAg.AJ35drjcy0PAJ4tdl1Ot01", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgzcLY1zA-7BPxyhqp14AaABAg.AJ35KgVnc5rAJ5F8aIPx_8", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_Ugxio3XMveQozMOs9rR4AaABAg.AJ2sqfHuiLJAJ39Keet5SW", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugxio3XMveQozMOs9rR4AaABAg.AJ2sqfHuiLJAJ3aWWFZdJb", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "approval"},
  {"id": "ytr_UgzI57N8h5qnVDvNMkV4AaABAg.AJ2o3zjCwwzAJ3F1LALEfQ", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_Ugx0dFNpg3yl0Q3ufmp4AaABAg.AJ2_son0FgAAK8Mzl4T0WV", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugx0dFNpg3yl0Q3ufmp4AaABAg.AJ2_son0FgAAMbSfaRi_cg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugx0dFNpg3yl0Q3ufmp4AaABAg.AJ2_son0FgAAMvBPBDmnbf", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgylRE4Vf3c9Pv7FD1V4AaABAg.AJ2_GPfhr3YAJ2yoB-pACC", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
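To inspect the exact model output for a single coded comment, the raw response above can be parsed and indexed by comment id. The sketch below is a minimal illustration, not part of the tool itself; the id `ytr_example` and the `raw` string are hypothetical stand-ins for a real batch response.

```python
import json

# Hypothetical raw LLM response for one batch of comments
# (same shape as the array shown above).
raw = """[
  {"id": "ytr_example", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability",
   "emotion": "outrage"}
]"""

# Index the coded records by comment id for quick lookup.
records = {row["id"]: row for row in json.loads(raw)}

# Pull out the coding for one specific comment.
coding = records["ytr_example"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

Indexing by id makes it easy to cross-check any single comment's stored coding result against the exact values the model emitted in that batch.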