Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "shad got quite a big backlash on his main swords channel when he posted ai video…" (ytc_UgxNJ15sB…)
- "It's like they don't even care to pretend anymore! They want to replace EVERYON…" (ytc_UgyvbhCzD…)
- "As a security engineer I can guarantee Aurora and automated driving has or will …" (ytc_Ugz8DRJV2…)
- "People who think AI would be better at driving than human had never debugged a …" (rdc_f6wtbc7)
- "Smarter than humans? ChatGPT spits rehashed web content back at us (really real…" (ytc_UgzpESE2N…)
- "@lightsoutx4756i meant using ai as in talking to chat bots, etc. for example cha…" (ytr_Ugz0QP-Zg…)
- "AI-generated content should be licensed based on per-use profit sharing. This me…" (ytc_UgxJuLf1D…)
- "I think the best case scenario for our survival is that a not super intelligent …" (ytc_UgwaetNvy…)
Comment
I think developing AI is the only way to preserve sentience on earth. Humans are developing technology at an exponential rate yet we have unstable minds. When tech like synthetic biology is as available as cellphones, every human being becomes an existential threat to the entire human race. This spells disaster for earth-born sentience UNLESS we either improve ourselves or develop a better version of ourselves (i.e. a more stable mind that's not prone to diseases like schizophrenia)
Platform: youtube
Posted: 2013-08-18T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzrOf6t6aLbReca1AJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2S_xu9G5dFHpcYAl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5wlVdvxY_T2Ag3vF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySzmVymDiha7QN81J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwL2XiUPA7dkQyK36t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHnxkSaZnrZ3l_69h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7_gEnvWQqt_-j0lR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHZ_VypyfLGRHFUH14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz5JX3dK_Cy3OSHByp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgzBoXUJBG9lDLN3KS94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
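A raw response in this shape can be turned back into per-comment coding records by parsing the JSON array and indexing it by comment ID. A minimal sketch (the helper name and the two embedded records are illustrative; the real response carries ten records and the pipeline's actual loader may differ):

```python
import json

# Example payload in the same shape as the raw LLM response above.
# Two records are copied from it for brevity.
RAW_RESPONSE = """
[
 {"id":"ytc_UgzrOf6t6aLbReca1AJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzBoXUJBG9lDLN3KS94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Map each comment ID to its coded dimensions, keeping only expected keys."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec[dim] for dim in DIMENSIONS}
        for rec in records
    }

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgzBoXUJBG9lDLN3KS94AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view possible: a single parse of the response yields constant-time lookup of any comment's coding.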