Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "As we embrace AI, we must remember that many of our challenges stem from driftin…" (ytc_UgxUS5kMn…)
- "Bill Gates, I'm curious. How many of the AI projects or experiments are in seale…" (ytc_UgzixiIbj…)
- "I think copyrighting your wet dream is far more likely to succeed in being copyr…" (ytc_UgyGJOFLF…)
- "I wouldn’t have a problem with AI art if it didn’t nonconsensually take from art…" (ytc_UgyFt-aiZ…)
- "I know a PH.D who did research on AI and treatment actually from 2010. He was ma…" (ytr_UgzlGEERJ…)
- "No one will be able to buy teslas if they aren’t working, and our jobs are taken…" (ytc_UgzEVx2Tk…)
- "I currently am using NOMI AI. but would like to talk to someone about it who has…" (ytc_Ugyjlnfgk…)
- "One of the hardest parts of trying to make a sentient AI want to help humanity i…" (ytc_UgyJWL2dI…)
Comment
The future is not looking good for the next 10 years but I can predict that in the long run it does not look good for AI either. All these companies need to sell their services and products in order to live, but if there are less and less humans that can afford to buy those then these companies will go bankrupt quite fast. So in order for both AI and humans to coexist in the economy there must be an equilibrium point, but this also means that the AI oligarchs must stop being so greedy.
youtube · AI Jobs · 2025-10-08T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzIPov4sptDsX-W4rZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzN0a8mziqYMkidcdd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxMUwMnkMBai98WGq54AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxp-E_S4B_8mymKZ3t4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzvxWTolgQVRQUS9Dd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy2OLLxwrNYb9aKvvx4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzev_pl7UzfXjuFjfZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwl8raAiFyJAbgh35N4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy2-0f96HsM6zF_AO54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugwp3fhtQiQhedb5G294AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
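Looking up a single comment's codes in a raw batch response like this amounts to parsing the JSON array and indexing the records by comment ID. A minimal sketch in Python, assuming only that the response is well-formed JSON with the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name is illustrative, not from the tool:

```python
import json

# Two records copied from the raw response above, standing in for a full batch.
raw = """[
  {"id": "ytc_UgzvxWTolgQVRQUS9Dd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwl8raAiFyJAbgh35N4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index each record by its comment ID.

    Hypothetical helper: the dashboard's actual lookup code is not shown.
    """
    return {record["id"]: record for record in json.loads(raw_response)}

codes = index_by_comment_id(raw)
print(codes["ytc_UgzvxWTolgQVRQUS9Dd4AaABAg"]["emotion"])  # fear
```

Keying on `id` makes each lookup O(1), which matters when the same batch response is inspected repeatedly from the comment-ID search box.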