Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Most people dont still realize, that AI has now higher intelligence than human. …" (ytc_UgzSuKCBM…)
- "I guess I'm curious how you define productivity and output. LOC? We all know tha…" (ytc_UgxnH8aJl…)
- "Human self consciousness has been proof not to be dependent on our brains. There…" (ytc_UgwIlPNIa…)
- "I wholy agree with you! But another point on why I as an artist won't use AI, is…" (ytc_UgyrXPNxy…)
- "One of your pieces is worth more than 300000 AI generated ones, there's not even…" (ytc_UgxjIc2_N…)
- "The real answer is to feed that photo through an AI generator and have it pop ou…" (rdc_lq82ct1)
- "1. Read better txt who are behind microchip record audio tape 2. Stop fooled…" (ytc_UgyhpfMyo…)
- "What's going to happen when the cops are robots with ai and the judge is ai😅…" (ytc_UgwN0ZR2g…)
Comment

> AI is not going to dominate humans, even in the coming 3 centuries, by the way I don't think this world has more time left, 80% of Humans will be destroyed by Humans. Any time can WW3 start, and believe me, the destruction it will bring is just unimaginable. Sophia and Robots like her are modified like this, to make people think Oh my god, we're in danger, and the creators of these robots know the danger but they still create it? How fuck?

Source: youtube | Topic: AI Governance | Posted: 2024-06-28T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
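Each coded dimension takes a value from a small closed set. As a minimal sketch of how a coded row could be validated, the value sets below are only those observed in this sample's output, not necessarily the full codebook:

```python
# Value sets inferred from the displayed coding output; an assumption,
# not an official codebook definition.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the coded dimensions whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if row.get(dim) not in allowed]

row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "fear"}
print(invalid_fields(row))  # an empty list means the row passes
```

A check like this is useful before accepting a raw model response, since the model is not guaranteed to stay inside the expected label set.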
Raw LLM Response
```json
[
{"id":"ytc_UgyNMoEeHZKM8RvChx14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjLkyBJxWg8AlAc8N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzbDcMV_zlf1FIY8nV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyTbqKsB2FBXfuhN2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxz-AQ5IWEXRBlvCXx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyOmdDNNIF56w1HlRB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzOweAsecMP9IRDMvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxj0IzFmAhoaDE0o2B4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzUnwW6blIfWpQ4t6R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwwB9onGsfDdAiOgqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
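The "look up by comment ID" view above can be reproduced offline by parsing the raw model response as JSON and indexing it by `id`. A minimal sketch, with the response truncated to two rows for brevity (variable names are illustrative, not taken from the tool):

```python
import json

# Raw model output in the format shown above (truncated to two rows).
raw_response = '''
[
  {"id": "ytc_UgyNMoEeHZKM8RvChx14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxjLkyBJxWg8AlAc8N4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
'''

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

print(coded["ytc_UgyNMoEeHZKM8RvChx14AaABAg"]["emotion"])  # fear
```

Because each row carries its own `id`, the same dictionary can join the codings back onto the original comment table regardless of the order the model returned them in.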