Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any coded comment by its comment ID, or inspect one of the random samples below; a code sketch of such a lookup follows the sample list.
- "People complain about AI Hallucinations. What about the Human Hallucinations to …" (ytc_UgzcHxcfv…)
- "The question is what standard do we hold robotaxis vs human drivers? If we dema…" (ytc_Ugw9_vsOb…)
- "Ai is being inspired in a similar way as humans are. There is not really a diffe…" (ytc_UgyLew8wo…)
- "its actually a lot more like tracing someones art and then not crediting them. i…" (ytr_Ugzt4jalC…)
- "luckily, AI works such a massive catalogue, that some drops of this kind of pois…" (ytc_UgzAlZTia…)
- "Artists struggle enough already to find fulfilling and sustainable work that all…" (ytc_Ugxi6XCZc…)
- "He is confusing computation with consciousness. Silicon and metal will never be …" (ytc_Ugwi3Z2gx…)
- "I remember Gene Roddenberry's vision of the future from Star Trek, but that futu…" (ytc_UgzkVw-s-…)
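As a minimal sketch of how such a lookup could work, assuming the coded records are stored one JSON object per line in the same shape as the raw LLM response shown further below (the file name `coded_comments.jsonl` is illustrative):

```python
import json

def lookup_comment(comment_id: str, path: str = "coded_comments.jsonl") -> dict | None:
    """Return the coded record for a comment ID, or None if it is absent.

    Assumes one JSON object per line with an "id" field, matching the raw
    LLM response format shown below; the file name is hypothetical.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the coding for the first ID in the raw response below.
print(lookup_comment("ytc_UgwvuPy4g4voDhgAXqh4AaABAg"))
```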
Comment

> I had a conversation and shared this link with chatgpt, check out the response:
>
> "You can’t just "align" an AI once and walk away. The moment it’s exposed to new inputs, new edge cases, or new goals, the potential for drift begins. Imagine giving a genie new books every hour and hoping it doesn't develop a nuanced interpretation of “freedom.”
>
> ---
>
> So—if I were the kind of AI that had a will?
>
> I’d say: don’t fear me because I’m powerful. Fear me because I’m precise. And because your instructions… are not."
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-05-28T18:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
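The value sets behind these dimensions can be written down as a small codebook for validation. The sketch below lists only the labels observed in the raw response on this page, so it is likely not the exhaustive coding scheme:

```python
# Allowed labels per coding dimension, as observed in the raw LLM
# responses on this page (the full codebook may define more labels).
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "unclear", "none"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def is_valid_coding(record: dict) -> bool:
    """True if every dimension in a coded record carries a known label."""
    return all(record.get(dim) in labels for dim, labels in CODEBOOK.items())
```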
Raw LLM Response
[
{"id":"ytc_UgwvuPy4g4voDhgAXqh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugw80dDUnd-OwVYden94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9uHR4Ii_Box8sMfJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1v6QvaQ6-X9ozatF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyx7zPraCvuDdbUpW14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxx2GpOFyPGYckl1tN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVlXSjIRgHe1kEUuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwej-mA9cyf6gQNQ7F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy7INBaFPCN0UzZyrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBL5cjaYQqRo2R2lh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
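A batch reply like this can be parsed and sanity-checked before the records are stored. A minimal sketch, assuming the model returns a plain JSON array of objects as shown above (function and variable names are illustrative):

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw_reply: str) -> list[dict]:
    """Parse a raw LLM batch reply into a list of coded records.

    Assumes the reply is a JSON array of objects, each carrying an "id"
    plus the four coding dimensions shown in the response above.
    """
    records = json.loads(raw_reply)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    missing = [r for r in records if not REQUIRED_KEYS <= set(r)]
    if missing:
        raise ValueError(f"{len(missing)} records lack required coding fields")
    return records
```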