Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "BS. There is more than one type of AI. The first things thinking AI's learn to d…" (ytc_UgzBE55gJ…)
- "Sorry bud if 3.5 million truck drivers lost there jobs to ai the us economy woul…" (ytc_UgwQM3GCi…)
- "@MusicAsWeMakeIt the problem with capitalism is capitalism doesn't mix with capi…" (ytr_UgzfpBoCX…)
- "Do not allow your kids talk or even have anything to do with AI It's demonic. Th…" (ytc_Ugw3jN_rt…)
- "People need to wake up and realize AI is a bad thing. What if some of these sci-…" (ytc_UgyqjlsDF…)
- "There is very much a debate there to be had, but on the topic of "doesn't both h…" (ytc_Ugy8Cbah6…)
- "I hate this. As an artist, i fucking hate this. I hate that REAL musicians will …" (ytc_UgxcfTCDa…)
- "The problem is that a lot of the corporate people implementing AI has no real in…" (ytc_Ugy4ElKyY…)
Comment
So yea i just asked my youngest son. He also said that if robots become smarter than us then yes we and my childrens' children are in big trouble. AI is good for certain tasks when it comes to public safety, medical breakthroughs, etc..how to cure and restore etc in good hands and minds and hearts but if man is going to play higher than GOD( greed, power, profit,control, irreversible coding/tech, etc it will get very dark . It got dark already without it being here to its' fullest potential..... Rest assurred. I agree.🙏🏾✝️
youtube
AI Governance
2025-06-17T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgwAiGC1TXKVxyNvxGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbcyXm0zYItadye_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwHwjplJBxe6H_PSwN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxox7PzZIeaD6kmIRl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1Ha24gD6NZVGjrOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwBWSDQMfUL7Ckyb9p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxBWnjvOeu5pxzdg894AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx43vFV8_0-3AfjulF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwe9SEfMvZAxxSVDxx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
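The raw LLM response above is a JSON array of coding rows keyed by comment ID. A minimal sketch of how such an array can be parsed to recover one comment's coded dimensions (the `lookup` helper is illustrative, not part of the pipeline; the payload here is a single row copied from the response above):

```python
import json

# One row copied verbatim from the raw LLM response above.
raw_response = '''[
  {"id": "ytc_UgwHwjplJBxe6H_PSwN4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "unclear",
   "emotion": "fear"}
]'''

def lookup(raw: str, comment_id: str):
    """Parse the model's JSON array and return the coding row for one comment ID."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return row
    return None  # ID not present in this batch

result = lookup(raw_response, "ytc_UgwHwjplJBxe6H_PSwN4AaABAg")
```

For this row, `result["responsibility"]` is `"developer"` and `result["emotion"]` is `"fear"`, matching the Coding Result table rendered above.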