Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I'm not an AI doomer. But I think this kind of reactive yanking a person out --… (ytc_Ugw03Hy_8…)
- in 1980 there were no web developers and IT guys etc..then overnight everyone wa… (ytc_Ugy0ihJ4h…)
- I would like to point out that while you're ultimately correct, LLMs do trend to… (ytr_UgwFO4vYT…)
- He pulled her face off - I'm getting flashbacks of the fembots from The Bionic W… (ytc_Ugz4FEY3F…)
- all future govts will be authoritarian and ai will be the gatekeepers. it is obv… (ytc_UgxpKZW55…)
- I look for what I call "logic errors". For example, an AI-generated image of a m… (ytc_UgzncD8aq…)
- Um gonna try explain how ai create imgaes / Ai first start searching by the promp… (ytc_Ugw1an6hW…)
- I’ve tried several AI companion apps and Nomi is definitely the best. The CEO an… (ytc_UgxNtoDkS…)
Comment
I hear a lot of could and may, but we're not seeing a lot of this actually happening. Why? Because AI can't replace humans... yet. Sure, it will be able to eventually, but it's not there yet. As for the question "how do you pay for it?" That's easy to answer, but difficult to implement, there should be an automation tax that is just as high as the human labor would be. If a company wants to automate a position, they can do so, but they pay just as much to the government as they would to a human. This allows them to get the "benefits" of AI, while not depriving humanity of its dignity. It also means that humans could continue to work along side AI. Taxing the AI companies (OpenAI, Anthropic, Tesla, etc.) won't be enough. They charge pennies on the dollar compared to what a human would cost, so they don't have the income to pay that tax. The companies that replace humans do.
youtube
AI Jobs
2025-11-30T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzGY5TmluyQiJ4_LgN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzv4jLTeBfNtvYi_6F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBft0IWs8AJFvBGcp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwbPOCZQJw11hL8ppd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzbBiAck0bLnAwsDI54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwsyv8iZcifQ4qcrit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXu-aVmMKV-HavbzV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy3imUCVPo3dTYQiFZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw7PAUUUPm0JAMoHMV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwLhL6uv6BThDR8Wmx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
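A raw response like the one above is a JSON array of per-comment records, one record per coded comment. A minimal sketch of how such a batch could be parsed and validated is below; the `parse_coding_response` helper and the `SCHEMA` of allowed values are hypothetical, with categories inferred only from the sample output shown here (the full codebook may contain more).

```python
import json

# Allowed category values per dimension, inferred from the sample
# response above (assumption: the real codebook may define more).
SCHEMA = {
    "responsibility": {"company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (JSON array of records) into a
    dict keyed by comment ID, dropping malformed records and records
    whose values fall outside the expected categories."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue  # skip records with no comment ID
        dims = {k: record.get(k) for k in SCHEMA}
        if all(dims[k] in SCHEMA[k] for k in SCHEMA):
            coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"indifference"}]')
print(parse_coding_response(raw)["ytc_example"]["policy"])  # liability
```

Validating against a closed set of labels before storing is what makes the "Coding Result" table reliable: a record with an out-of-vocabulary value (an LLM hallucinating a new category) is rejected rather than silently written through.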