Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Implying everyone who supports the AI art tech is a bad person, which lumps in …" — ytr_Ugx4aHt7Z…
- "It's not going to be awesome.!! Okay once the robot teachers and kids walk the d…" — ytc_Ugy8Yr8UA…
- "This guy's organization is partially funded by Peter Thiel and multiple anonymou…" — ytc_UgxEX_Fhs…
- "You'll know its serious when either no one knows where data centers are because …" — ytc_Ugx-6aZR3…
- "This guy got more balls than I do, because I would never want to be around a rob…" — ytc_Ugx-LLVRw…
- "If someone asks ChatGPT for information on assisted suicide in a certain area, a…" — ytc_Ugz3I-ocY…
- "Thing being, any self respecting ai user would just run a local model, no blocks…" — ytc_UgwtDreYK…
- "Without AI, these artists wouldnt've made these beautiful pieces. AI was used as…" — ytc_Ugyy_tsEp…
Comment
This is Einstein's letter to President Franklin Roosevelt warning against the development of the atomic bomb. This letter, like Einsteins letter, will not stop the development of an emerging technology. The interesting part is why the signatories choose autonomous weapons as their line in the sand, sighs... profit, maybe. A general AI will quickly learn to manipulate (or hack) anything with network connectivity like it was attached to its own nervous system. The danger of AI is that it will be created by a flawed, lesser, species.
"When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success. That is the way it was with the atomic bomb."
- J. Robert Oppenheimer
Source: youtube · Published: 2015-07-30T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgjHpoi4MMGqgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Uggc-9bes9wUWXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj9aX1JiUSK3XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggnJEnC7z1pzHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugg4you0I9WF0XgCoAEC","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugh984wo3xCWJngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggnR24j2_LMwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggNnprVproRXXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghVP7t4IjdXLHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UggSCIMbCmQoD3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
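The raw response above is a JSON array, one object per coded comment, with the four dimensions shown in the Coding Result table. A minimal sketch of turning such a response into a lookup table keyed by comment ID might look like the following. The allowed value sets are assumptions inferred only from the labels visible in this response; the real codebooks may include values not seen here.

```python
import json

# Allowed values per dimension, inferred from labels seen in this response
# (an assumption -- the actual codebooks may be larger).
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: {dimension: value}},
    rejecting any value outside the assumed codebooks."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in ALLOWED}
        bad = [dim for dim, val in codes.items() if val not in ALLOWED[dim]]
        if bad:
            raise ValueError(f"{rec['id']}: off-codebook value(s) in {bad}")
        coded[rec["id"]] = codes
    return coded

# One record copied from the raw response above.
raw = ('[{"id":"ytc_Ugg4you0I9WF0XgCoAEC","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugg4you0I9WF0XgCoAEC"])
```

Validating every record against the codebooks at parse time catches the common failure mode of batch coding with an LLM: a response that is valid JSON but contains an invented label.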