Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Why would you question NDT about AI? I don't get it. You should have invited eit…" (ytc_Ugzk90dzq…)
- "I had 12 years of artwork in it. And to be honest? I've decided to remove every …" (ytc_UgwvEcLvp…)
- "I've been signed up to Chat GPT for a while. Chat gpt have all our email addre…" (rdc_njh995t)
- "Yes, because working in machine learning and data science makes you an unbiased …" (ytr_Ugy4b6M9E…)
- "I am so happy. Your health is doing better. I hope it stays giving better and Go…" (ytc_UgxHfbrCj…)
- "No America don’t be fooled by this Communist….He’s Pro China he wants regulation…" (ytc_UgxUgYyts…)
- "How long is it going to take for people to realise that ChatGPT is just a langua…" (ytc_Ugz7veMWd…)
- "well, in true uber fashion they jumpstarted their self driving car project by st…" (rdc_f6ze5st)
Comment
> The worst thing about Sam Altman is that he knows exactly how dangerous his AI could be, but then he thinks to himself, "but if i do more reasonable and safe AI research we won't make our investors happy, then someone else will make all the money, BuT I WaNt To MaKe AlL tHe MoNeY's!"

Platform: youtube
Topic: AI Moral Status
Posted: 2025-12-11T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgyHEL9aXmkwse6sd014AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmoqPtnSiECpc-lAF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzChfszO4tCIHgnIpt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOX6fRm2A7EkdaKEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwo0XSsUlR2C8Qgk8x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz07uwBM8eB8-Eexdt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyLBd0bJVGnjLJGh54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz2-j2Dnfmii6r1WgB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwkIo45-OPvrWRp9xt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxw91ytLwB2WDdvxGt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
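The lookup-by-comment-ID flow above can be sketched in Python: parse the raw LLM response (a JSON array of per-comment coding objects in the format shown), index it by `id`, and fall back to `"unclear"` for any dimension when the requested comment ID is absent from the response, which is one way a coding result can display `unclear` across the board. The helper names `parse_codings` and `lookup`, and the tolerance for a stray `)` closing the array, are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Two entries excerpted from the raw response above, in the same format.
raw = '''[{"id":"ytc_UgyHEL9aXmkwse6sd014AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmoqPtnSiECpc-lAF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse the raw response into a dict keyed by comment ID.

    Tolerates a model that closes the array with ')' instead of ']',
    which would otherwise make the payload invalid JSON.
    """
    text = text.strip()
    if text.endswith(")"):
        text = text[:-1] + "]"
    return {entry["id"]: entry for entry in json.loads(text)}

def lookup(codings, comment_id):
    """Return the coding for one comment; every dimension defaults to
    'unclear' when the ID is missing from the response."""
    entry = codings.get(comment_id, {})
    return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}

codings = parse_codings(raw)
print(lookup(codings, "ytc_UgzmoqPtnSiECpc-lAF4AaABAg")["emotion"])  # outrage
print(lookup(codings, "ytc_missing")["responsibility"])              # unclear
```

With this fallback, a displayed result of all-`unclear` values is ambiguous: it can mean the model coded every dimension as unclear, or that the comment ID never appeared in the response at all.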