Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
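Looking a coded response up by comment ID amounts to indexing the parsed records by their `id` field. A minimal sketch in Python, using one record copied from the raw LLM response shown at the bottom of this page:

```python
import json

# One record in the shape of the raw LLM response shown below.
raw = """[
  {"id": "ytc_UgzGSkrwJrn7aXPYS454AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

# Build an index keyed by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

record = by_id["ytc_UgzGSkrwJrn7aXPYS454AaABAg"]
print(record["emotion"])  # indifference
```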
Random samples
- i had this mostly written and I accidentally pressed the cancel button but here … (ytc_UgznBNboE…)
- I asked AI what to do to glow up. It said to drink the contents of a lava lamp. … (ytc_UgySfhVsP…)
- I honestly don't really think that AI will really "try to overpower us". AI does… (ytc_UgxBEC1Ih…)
- They want you to use the steven universe art style fr fr "it's not AI"… (ytc_Ugz3uAsuR…)
- Now the end of your clip is the most interesting. The crowding out impact is pos… (ytc_UgxQtyDBV…)
- AI and new Twitter (X) monster made by Elon musk This Social media Apps Like wh… (ytc_UgyPNQf6O…)
- 14:12 part of that money needs to go to using AI to solve humanities problems wh… (ytr_UgxpzCnzA…)
- Did you hear about amazons "AI driven" checkout-less stores? In the end it didn'… (ytc_Ugzr88z3Y…)
Comment
Climate change is a scam.
(Yes, climate changes. It has always been changing. But we do not even know which way it is going.)
In any case, even if I believed that climate change is caused by humans and that it is going to obliterate the human race... We are still talking about the timescales that make it all a joke of a threat anyway.
By the time we are endangered, we will have the technology to control it.
And if not, then sorry - I cannot for one moment believe that we have the capacity to predict what the climate will look like in 100 years today, but we will not be able to do anything about it 100 years into the future...
And just so it is clear, I am on board with the nuclear threats as well as biological threats (although I think both are less dangerous than AI), I just consider it ridiculous that "climate change" is being listed next to them as an actual doomsday scenario.
It just takes all the seriousness away from all of those other threats.
youtube · AI Governance · 2026-02-24T19:3… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyguxlSmhlIKh4gZdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgweDRYen7rHTPUc3lR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgywEqcCcgDCABDNsJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWtSy0N1tjtMhxcZd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVfVxsND3Ua3tNcqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwrISWJ7hLjSvcP1Zd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyK5F1g2Q8-W0m5Wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGSkrwJrn7aXPYS454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlL3VQZqpQHX0KRBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0R9HqqG275eEcUxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
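A raw response like the one above can be checked against the coding schema before the per-comment results are stored. A minimal validation sketch, assuming the four dimensions from the Coding Result table; the allowed value sets contain only the labels that actually appear on this page, so any fuller codebook is an assumption:

```python
import json

# Allowed values per dimension; only labels observed in the response above.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        # IDs on this page use the ytc_/ytr_ prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzGSkrwJrn7aXPYS454AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
coded = validate_batch(raw)
print(len(coded))  # 1
```

A record with an unknown label (say, a misspelled emotion) raises immediately, which keeps malformed model output from silently entering the coded dataset.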