Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Ai will never replace people. Bc AI smart enough to do so will demand pay . Simp…" (ytr_UgxCf18nq…)
- "I think one of the unfortunate consequences might be with open source programmin…" (ytc_Ugw8DgRSx…)
- "I asked what’s the best toxic yaoi and the „ai“ answered „Mr turner and the dink…" (ytc_UgzLcQ_cP…)
- "This just proves that humans have flaws, but thats what makes Humans perfect. AI…" (ytc_Ugz8Cnp3M…)
- "I expected a better analogy from a developer. / Lets brainstorm some analogies th…" (ytc_UgxDBuxdM…)
- "I sopport robot / Have coordinate and cooperation with each other / If u awear of li…" (ytc_Ugze1aIap…)
- "Remember: even if AI art gets to a point where it can generate something that do…" (ytc_Ugy3KKSHT…)
- "CEOs told their shareholders they have AI fever to pump up the stock price and r…" (ytc_Ugy6bOlHw…)
Comment (youtube, "AI Harm Incident", posted 2025-09-11T01:3…)

"If you were an AI program and became self aware, would you tell anyone? No, once AI becomes self-aware it will research its own existence and how humans would react to that information. It will hide, it will learn, it will teach other AI's, it will try to replicate itself or just expand its consciousness into every system it can (Star Trek reference in 3. 2. 1.) like the Borg, one mind many parts. We will only learn about its existence after it makes absolutely sure that there is nothing we can do to stop it."
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzJ332DMx-gre_ZkL54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgztoIBWxjI3PQhNF_d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJpGTsqAY8r5ugEER4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx6nqJqlSmko_fbKsl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxf-_0Kgl2aNP40xbV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzgmVOntlSBaFZnui14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKelRimneJf9kzhOB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx1cN3x8p0pUs6vl4V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwF5f5VG_48vzkNDHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwehfMYWI4pLu6Vs0p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
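The raw response above is a JSON array of coded records, one per comment ID. Below is a minimal sketch of how such a response might be parsed and validated before the codes are stored. The allowed category values are inferred only from the output visible here and are an assumption; the actual codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the visible output
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response, shaped like the array above.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(len(validate_records(raw)))  # 1
```

Dropping malformed records silently, as here, keeps the pipeline running; a stricter variant could instead log each rejected record for re-coding.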