Raw LLM Responses
Inspect the exact model output for any coded comment, or look it up by comment ID.
Random samples
- "I feel like they will never allow self driving cars and if they do its going to …" (rdc_d8aybz0)
- "AI is good for a reference pic but it dose not fell like I am making art when I…" (ytc_UgwqziM9a…)
- "Not smart at all, because most of the world is bias towards black skin people, s…" (ytr_UgyxmR23t…)
- "There is a difference between smarter and just having access to vast amounts of …" (ytc_Ugw45WSK4…)
- "I suck at art but having to use ai is a low blow for art…" (ytc_UgyeiAeEP…)
- "@nykaestra I mean, things have gotten a lot better since then. I remember when D…" (ytr_Ugwp02Mey…)
- "@javaoverride1the current architecture they’ve mapped and planned out for the ma…" (ytr_Ugz1vDea4…)
- "I’d say at least a third of the people who hate dealing with junior devs are hor…" (rdc_n9rezpb)
Comment
Okay just to be clear on one thing, Yudkowsky DOES think he's predicting the future with his far-fetched sci-fi nonsense; the man has been peddling Skynet-esque conspiracy theories since long before the advent of ChatGPT and LLMs. Just look into the rationalist community: he has honestly pushed conspiracies that have produced an alarming number of cults, and I honestly think the fact that the wider scientific community still takes him seriously is a little dangerous because of that. Nothing against Hank here, I just think this kind of thing needs addressing in regards to any mention of Yudkowsky
Source: youtube · Video: AI Moral Status · 2025-10-30T19:2… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
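A coding like the one in the table above can be checked against the schema before it is stored. A minimal sketch, assuming category vocabularies inferred only from the values visible in this dump (they are likely incomplete), with a hypothetical `validate_coding` helper:

```python
# Category vocabularies inferred from values seen in this dump; likely incomplete.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed",
                "indifference", "approval", "unclear"},
}

def validate_coding(coding):
    """Return the (dimension, value) pairs that fall outside the schema."""
    return [(dim, coding.get(dim)) for dim in SCHEMA
            if coding.get(dim) not in SCHEMA[dim]]

# The coding from the table above passes cleanly.
coding = {"responsibility": "developer", "reasoning": "mixed",
          "policy": "unclear", "emotion": "outrage"}
assert validate_coding(coding) == []
```

An out-of-vocabulary value (say, a model emitting `"user"` for responsibility) would be surfaced by the returned pair list rather than silently stored.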
Raw LLM Response
[
{"id":"ytc_UgwZaFKIYyCfdsSS1R94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy-H-lkhzRZ5AlKyL94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugytbd7OXqG2YXVgmGV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYSjrR-3YQGIB4WPl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxAveG1jMFX8tL4D914AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxdAjh07VRTFIFpxst4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwbH6zoV39vGOifhLt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwXTRkXwUWPubPuc-N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOJ3BpZjD82hW7Uwx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzBT387s47sOwcPs1Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
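The lookup-by-ID view above can be reproduced from a raw response like this one by parsing the JSON array and indexing it by comment ID. A minimal sketch (the field names match the JSON shown; `index_by_id` and the two-row sample are illustrative, not the tool's actual code):

```python
import json

# A raw LLM response: a JSON array of per-comment codings, as shown above
# (abbreviated here to two rows for illustration).
raw_response = '''
[
  {"id": "ytc_UgxdAjh07VRTFIFpxst4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwXTRkXwUWPubPuc-N4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_id(response_text):
    """Parse a raw LLM response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coding = codings["ytc_UgxdAjh07VRTFIFpxst4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

The first sample row is the coding shown in the table above for the Yudkowsky comment; any of the other IDs in the response can be looked up the same way.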