Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "It sounds like you found Sophia's appearance interesting! The design choices, in…" (`ytr_UgzrTOTb_…`)
- "@willjackson5885 Because usually as a human you can tell when the artists has put…" (`ytr_UgxFje0NJ…`)
- "Look at what happened at Walmart as well. People started refusing to use self ch…" (`ytr_UgzimJNNO…`)
- "Even if AI can handle code very good, it is crucial more than ever to first unde…" (`ytc_UgxBvsmRn…`)
- "@Ndax_254 sadly it started long before zakayo. Some big government players thrive…" (`ytr_Ugz1vGO3t…`)
- "You stood so high and mighty on your Artist Throne, believing to be immune to th…" (`ytr_UgzvK5-ms…`)
- "Wow these comments are heartless / The mind of a 16 year old child is impressiona…" (`ytc_UgyO3u-v9…`)
- "Hank Green *and* Eddit Burback both posting videos about AI that are over an hou…" (`ytc_UgyRuf-LE…`)
Comment

> I'm reminded of reading that Richard Gatling and Alfred Nobel believed that their inventions would end warfare because of the absolute carnage that their new technology could inflict compared to what came before.
>
> Our creations will indeed kill us all, it's just a matter of which inventions and at what time. AI looks like the leading candidate so far. There will never be a non-proliferation treaty on AI.

youtube · AI Governance · 2025-06-29T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwquaengYT7QHoDOEZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFLKyHSySKN86fECh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgybcpXvEPsiR2dLplp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyIo9pPPyM6ohJVNNZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9OkmL5CKKMrrp_VB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzfnWk5mo4hL2UtfkF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgykxVOuJeZi_UF5-Pp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxI8JpxZec8Zr5NfO94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzUi4BjIMB_WSdiUDp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzCp5U50_1Sab8zm514AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
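The raw response is a JSON array of per-comment codes keyed by comment `id`. A minimal sketch of how such a response could be parsed and validated before it is written into the coding table — the allowed label sets below are inferred from the values observed in this sample, not the pipeline's actual codebook:

```python
import json

# Label sets per dimension, inferred from the sample above (an assumption,
# not the real codebook used by the coding pipeline).
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: codes},
    rejecting any entry with an out-of-codebook label."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {entry.get(dim)!r}")
        coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded

# Two entries taken verbatim from the raw response above.
raw = """[
 {"id":"ytc_UgwquaengYT7QHoDOEZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy9OkmL5CKKMrrp_VB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""
codes = parse_codes(raw)
print(codes["ytc_UgwquaengYT7QHoDOEZ4AaABAg"]["emotion"])  # indifference
```

Validating against a closed label set at parse time is what makes a "click to inspect" view like this one trustworthy: any hallucinated label fails loudly instead of silently entering the coded dataset.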