Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgxSngMY8…` · Wtf? Why does the youtube algorithm always send me to this stupid BS from bloomb…
- `ytc_Ugxj_m3qK…` · this is so funny, AI doesn't need to destroy humanity, their doing a good job th…
- `ytc_UgxSGpG8h…` · There’s a pretty big fight going on in a Facebook group I’m in because someone w…
- `ytc_Ugz5LDnnN…` · We're only experiencing level 1 of AI; level 3 may be hear quicker than we think…
- `ytr_Ugz4VBl5Z…` · @WeirdcoreWilløw thats just not true lmao. And why is art in quotation marks. A…
- `ytc_UgxkGDwvB…` · I think it really comes down to accountability. In fields like medicine, there's…
- `rdc_mtrf5do` · I’m so sick of this pretending that the AI is actually thinking. All his AI does…
- `ytc_UgyBy0avH…` · But automation will significantly reduce the cost of goods and services to near …
Comment

> Giving robots all these feelings though seems like it is and always will be an unnecessary impediment to efficiency.
>
> Though I'm sure we'll do it anyway, there doesn't seem to be any reason at all that we would ever need to develop past conscious, goal driven, automatonic-robots into sentient, emotion-feeling robots, so theoretically this won't ever be a problem.

Source: youtube · AI Moral Status · 2017-02-24T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugizh8nsOE91DngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Uggyl3hVgRsJR3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugg3NvlXnGLtkHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjIf01qQSO2LXgCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UghC_fBL9IzRwngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgiW6UEDZs8n7XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgjZjn-YzcpzIngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
 {"id":"ytc_Uggbivfnf2X5BHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UggKtzN8-y1cSHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UghTBvSlrH_EcXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
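The lookup-by-comment-ID workflow above can be sketched in a few lines: parse the batch response, check that each record carries the four coding dimensions shown in the table, and index the records by ID. This is a minimal sketch, not the tool's actual implementation; the field names come from the raw response above, while the function name and the two-record sample are illustrative.

```python
import json

# Two records copied from the raw LLM response above (illustrative sample).
raw_response = """[
  {"id": "ytc_Ugizh8nsOE91DngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjZjn-YzcpzIngCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
]"""

# The four coding dimensions used in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a batch coding response and index records by comment ID,
    rejecting any record that is missing a coding dimension."""
    coded = {}
    for rec in json.loads(response_text):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_UgjZjn-YzcpzIngCoAEC"]["policy"])  # liability
```

A dict keyed by ID makes the "Look up by comment ID" view a constant-time fetch, and the missing-field check catches malformed model output before it reaches the display.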