Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "in a society so profoundly sick, one could argue creating AI now will adopt its …" — ytc_Ugzd3mDPl…
- "@autohmae Right now it looks like a system where we are supposed to both trust …" — ytr_UgyTSejWe…
- "THEY DIDNT EVEN TRY TO MAKE THE AI LOOK BETTER? THE HAND ANIMATIONS STILL STARTE…" — ytc_UgzRGVSFB…
- "This is animated, and shoot over a green screen. Don't let AI test your intellig…" — ytc_UgwAWKbzF…
- "Does anyone think that if AI was more advanced would that make 75% of humans tur…" — ytc_Uggi0kM7h…
- "I asked AI if I should put ice on my bruise, It immediately cooked my brain unti…" — ytc_Ugyx2sORp…
- "A question I also find interesting and terrifying pertaining to this topic is wh…" — ytc_UgiS9-lmb…
- "I can see a day when we get tired of being told this is the next best thing and …" — ytc_UgwK4hvV5…
Comment
Tags: Authority · Institution · AI
Every single time technology makes a new world dynamic, many people die. Why? Because sociopaths seize the tech and use it for murdering for money.
AI is different to any other technological advancement as we are giving decision making power to something that can only look from a non-biological base of perspective. A perfect tool and excuse for the sociopaths in power.
Chances are though; you're not in a simulation and Dr has spent too much time not noticing things outside the research lab.
This topic and discussion are very dangerous, as you are talking about it not mattering if you are in a simulation. Empathy is a feeling that doesn't need reality to be empathy! It's actually a response to imagining from outside one's own perspective to begin with.
Platform: youtube · Topic: AI Governance · Posted: 2025-09-06T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
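Each coded comment is scored on the same four categorical dimensions shown in the table above. A minimal validation sketch follows; the allowed value sets are an assumption inferred only from the codings visible on this page, and the real codebook may contain additional categories.

```python
# Allowed values per dimension, inferred from the codings visible on this
# page (an assumption -- the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "developer",
                       "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "indifference"},
}

def validate(coding: dict) -> list[str]:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
example = {"responsibility": "distributed", "reasoning": "deontological",
           "policy": "regulate", "emotion": "outrage"}
print(validate(example))  # []
```

A check like this is useful before loading batch output into analysis code, since LLMs occasionally emit values outside the requested label set.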
Raw LLM Response
```json
[
{"id":"ytc_UgzDuQlA6Q2uF8t2xox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxdRpzEx84ouFicFep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwiG5XWlZFeLBJqoBV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxft9LqCh3BwvBUuD54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwESo-QedIpVgCJKv94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzedWS9mi840Ddzjcd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrLoDNXMlwfLjaIRB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzck8VixyLG-FM8aod4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJXJ0IA5LHiy90hJF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz8PRKtJAvpaUNbk2Z4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
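The raw response is a JSON array with one coding object per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" step: parse the batch and index it by ID. The single-row `raw` string here is copied from the response above for illustration.

```python
import json

# One row excerpted verbatim from the raw batch response above.
raw = """[
  {"id": "ytc_UgwESo-QedIpVgCJKv94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

codings = index_by_id(raw)
print(codings["ytc_UgwESo-QedIpVgCJKv94AaABAg"]["policy"])  # regulate
```

Indexing once up front makes the per-comment inspection view an O(1) dictionary lookup rather than a scan over the whole batch.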