Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I have been a software engineer for 20 years. This year I produce 5x-10x as much…" (ytc_UgxR1U39R…)
- "Imagine someone wanting you out of the way. They use AI to construct “video evid…" (ytc_UgzSJE0Oo…)
- "Right, AI is a computer product. They are selling the program and services. I wa…" (ytc_UgwD1QGjW…)
- "I hate " ai artist" they only do one thing and then call themselves artist I lov…" (ytc_UgxxQj7L0…)
- "I would prefer humans do the art and machines wash dishes. I think some of the A…" (ytc_UgxzGDmo8…)
- "He said that right before he smugly slunk away into his underground bunker in Ka…" (rdc_kojyok4)
- "I hope the inventor of this type of AI has a day. Doesn't have to be a good one.…" (ytc_UgwhZVAmE…)
- "I agree, I really respect this guy , and want to trust his virtue , he's a geniu…" (ytc_UgyPJEnYN…)
Comment
@platemaxGoogle HAD those things, I don't think that is true anymore. If I recall Elon has also stated he a falling out with a Google co-founder because they were, at least in theory, okay with the concept of machine intelligence REPLACING (aka wiping out and taking over for) all humans. So my faith in Google havi g good intentions is zero.
Anthropic says they care about alignment. They do at least seem to be spending marginally more on alignment and safety research than the other labs. But they are sill in the race and contributing to the problem and risks.
Sadly no government, AI company or billionaire that I'm aware of is actually trying to do good for humanity as a whole long term. They are all trying to maximize their short-term gains at the expense of humanity's long-term future.
Source: youtube
Posted: 2025-06-07T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwgMCVUt2G7xe2l8A54AaABAg.AJ35drjcy0PAJ3Oe-iVH-h","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwgMCVUt2G7xe2l8A54AaABAg.AJ35drjcy0PAJ4tdl1Ot01","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgzcLY1zA-7BPxyhqp14AaABAg.AJ35KgVnc5rAJ5F8aIPx_8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugxio3XMveQozMOs9rR4AaABAg.AJ2sqfHuiLJAJ39Keet5SW","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxio3XMveQozMOs9rR4AaABAg.AJ2sqfHuiLJAJ3aWWFZdJb","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"approval"},
  {"id":"ytr_UgzI57N8h5qnVDvNMkV4AaABAg.AJ2o3zjCwwzAJ3F1LALEfQ","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_Ugx0dFNpg3yl0Q3ufmp4AaABAg.AJ2_son0FgAAK8Mzl4T0WV","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugx0dFNpg3yl0Q3ufmp4AaABAg.AJ2_son0FgAAMbSfaRi_cg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugx0dFNpg3yl0Q3ufmp4AaABAg.AJ2_son0FgAAMvBPBDmnbf","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgylRE4Vf3c9Pv7FD1V4AaABAg.AJ2_GPfhr3YAJ2yoB-pACC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
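The raw response above is a JSON array in which each object carries a comment `id` plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and indexed for the "look up by comment ID" view is below; the record format matches the example above, but the function and variable names are illustrative, not part of any tool's API.

```python
import json

# Illustrative raw response in the same shape as the dump above
# (a real response would contain the full ytr_… comment IDs).
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability",
   "emotion": "outrage"}
]
"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw):
    """Parse a raw LLM coding response and return {comment_id: codes}."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        # Reject records missing the ID or any coded dimension.
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codes = index_by_id(raw_response)
print(codes["ytr_example1"]["emotion"])  # -> outrage
```

Validating every record before indexing is worthwhile here because LLM outputs occasionally drop a field or emit an unexpected key, and a silent gap would corrupt downstream dimension counts.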