Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@ o0Theresa0o I'm irrelevant and YOUR talking about robots killing on their own!…
ytc_Ugy_7erKA…
Intellectually’ you’re a very smart parson,, but life skills you have none. You …
ytc_UgyXezTKf…
Suppose you want to launch a data-driven pharmaceutical startup. Under the new A…
ytr_UgyROzCiV…
It seems even more ironic, that this came out 1 day before Amazon, laid-off 30,0…
ytc_UgzslXRHb…
Study link: https://arxiv.org/abs/2602.14740
>AI Arms and Influence: Fronti…
rdc_o7byawd
They programmed the male robot to sound like the stereotypical male asshole idea…
ytc_UgzTvBC7I…
Saagar is apparently against all AI yet almost everyday he says “I asked this to…
ytc_UgyIYpm8X…
And eventually all of its work will be based on its own work. You still need a p…
rdc_n5iof1a
Comment
Truly illuminating discussion. Thank you! PS: One thing to mention: Humanity doesn't have much time left to answer some of the most thorny questions. I'd give ourselves anywhere between 1-3 yrs but not much more before the runaway AI scenario materializes. So please go figure it out. (Or let AI do it for us).
youtube
AI Moral Status
2026-03-02T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw93O8RgZRBklF64aV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwMQxDmCug_3NlePLp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzj5B2SmJqYyYxB9vt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJZukO05mT_gX3XEh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzxKpVRg69MlTaXTCd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyZIgOMzzLFnjrPV-F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyDF7XvoMtCFft7p-F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxRocDGw0B25BOD-AF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJpjHjk421DQKY8CB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdutVh37X0NHTcq2h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
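The raw response above is a JSON array of per-comment codings, so looking up a coding by comment ID reduces to parsing the array and indexing it by the `id` field. A minimal sketch, assuming the model output is well-formed JSON with the dimension keys shown (`responsibility`, `reasoning`, `policy`, `emotion`); the two entries below are copied from the response above:

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytc_UgxRocDGw0B25BOD-AF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdutVh37X0NHTcq2h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]"""

# Build an index keyed by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if it was not coded."""
    return codings.get(comment_id)

coding = lookup("ytc_UgxRocDGw0B25BOD-AF4AaABAg")
print(coding["emotion"])  # fear
```

In practice the parse should be wrapped in error handling (`json.JSONDecodeError`), since raw model output is not guaranteed to be valid JSON.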