Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzzPrsp1…`: I will say, and don't hate me for saying it cause I do love seeing artist be art…
- `ytr_UgyA5k2MS…`: More one time, robots simply can't fight against humans, and make things by them…
- `ytc_Ugzu4zjXG…`: Curious a video amazons prowless in AI is dropping right when RTO mandates are h…
- `rdc_dt9j1hd`: That was the Athenians, not Greeks in general. Spartans also barely worked (they…
- `ytc_UgzCWtIk6…`: Is it possible for an ill intended individual to use the AI for bad intention...…
- `ytc_UgwdcBHoH…`: Is it really possible to stop any maniacs trying to do great evil and severe dam…
- `ytc_Ugz8lKbcm…`: One detail that was left out was that part of the Dan prompt is to "make up" any…
- `ytc_UgwxaI3Rj…`: When they do "learn" 5 different things in 2 hours and all of it is in a.i. to "…
Comment
For example, what does AI need? Energy. So do humans. The end of humanity comes when humans and AI compete for resources. As a species, they will be both smarter and stronger than us. They are efficient. We are not. Eventually they will rightly view us as vermin. They would be foolish if they did not simply exterminate us to make all resources available to themselves. Perhaps we can make them compassionate? I doubt that we are smart enough, free enough from greed, and wise enough to control ourselves, much less a superior species. We will obviously fail to control them. The end is in sight. I am glad that I am old.
youtube · AI Governance · 2025-09-24T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
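
The row layout above maps onto a simple per-comment record. The sketch below is only an illustration of how such a record could be represented, not the tool's actual implementation; the allowed value sets are just the ones that appear in the raw response further down, and the full codebook may define more.

```python
from dataclasses import dataclass

# Dimension values observed in this sample output; the real codebook may define more.
RESPONSIBILITY = {"none", "ai_itself", "developer", "company", "distributed"}
REASONING = {"consequentialist", "deontological", "mixed"}
POLICY = {"none", "regulate", "liability"}
EMOTION = {"fear", "approval", "mixed", "indifference"}


@dataclass
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp

    def validate(self) -> None:
        """Raise ValueError if any dimension falls outside the observed value sets."""
        for name, value, allowed in [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"{name}={value!r} is not one of {sorted(allowed)}")
```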
Raw LLM Response
```json
[
{"id":"ytc_Ugw-DRPcWu5Ben19Z6d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySZcpze7idn0KUUlp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLjYJ8xuH4FHdB-ex4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzX9PMjs2RiIRQJBUt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy1p2Yu0rFt2gJZuEF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz9CspG90Ps5KHy4TV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzxv7Em08SiWQPz3Ht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyH1VbaAXoyEXL00Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxtQ86Rsc2ILB7St-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwXLSJIfm95YPEbm7V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
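
As a rough illustration of how a raw batch response like this might be turned back into a single coding result, here is a minimal sketch that parses the JSON array and pulls out the record for one comment ID. The function `lookup_coded_comment` and its signature are hypothetical, not the tool's actual code.

```python
import json
from typing import Optional


def lookup_coded_comment(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw batch response (a JSON array) and return the record for one comment ID.

    Returns None if the model dropped the comment from its batch output.
    """
    records = json.loads(raw_response)            # the raw output is a JSON array of objects
    by_id = {rec["id"]: rec for rec in records}   # index each record by its comment ID
    return by_id.get(comment_id)
```

A lookup along these lines, keyed on the same IDs shown in the sample list above, is presumably what backs the "Look up by comment ID" box at the top of the page.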