Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- To fix that give ai human rights. Now you have to pay them. Say no to digita… (ytc_Ugztz8BJI…)
- IT'S TOO LATE!!! The big IT bosses who push AI forward ignoring danger - listen to… (ytc_Ugyxa3bp-…)
- "Those driverless trucks will kill, maim, injure actual humans. Is the Economy… (ytc_UgzEBTaiN…)
- These sociopathic priests of the AI cult are always scaring the sheep with imagi… (ytc_Ugwy7n3mT…)
- The only way to be literate is to keep practicing using it. Ironically there is … (ytc_Ugza6MVDd…)
- What's most trippy about this is that her left eye is significantly higher... me… (ytc_UgxlhCGrh…)
- Also if you're switching from OpenAI because of their DoD/DoW deal, you might wa… (rdc_o7xnv15)
- When most jobs become automated, who will be able to afford the items that are m… (ytc_Ugyppysyb…)
Comment
17:30 Neil deGrasse Tyson is TOTALLY wrong here, according to science data... [ which he loves ! ]
according to science data... [ ! ] An overwhelming majority (75%) believe technical researchers should be concerned about Catastrophic Risks (General risks).
(source: Severin Field surveyed a total of 111 AI experts)
Source: youtube · AI Moral Status · 2025-10-18T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgyJc2GePOl38RIb7Ld4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Ugxc2b_JzO5DHFmUxBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},{"id":"ytc_Ugwgz4eBSvmV63WgaO14AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},{"id":"ytc_UgzkY8IRhJzC7jeQyOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgzquBvwMbxPXxBTS0R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},{"id":"ytc_UgxR8LDXdBZbfIO2M654AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},{"id":"ytc_UgwKQ_aArcDcKiiwmHB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgzrOMDH0oCCWsV6xAF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},{"id":"ytc_UgyOGa8EmrGHXOFCyx94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_Ugw8GvlSf50F_muJN554AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]