Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I still do not understand how AI is "helping" humanity. Has it obliterated poverty? Reduced crimes? Tackled climate change? providing purposeful living? It is doing the opposite.
I remember 5 years ago, I was preparing a presentation on AI on Microsoft PPT for my final year college exam sitting at my home in the month of May 2020. The first topic I wanted to tackle was "Why AI?" and the informal answer I came up with was "Because humans are lazy."
AI is not doing anything that humans can't do. It just does it faster. What are we doing with this saved time? Create more? Consume more? George Orwell accurately associated humans with pigs. All we do is think of consuming with the least effort possible.
I see AI as a technological weapon aimed to worsen situations.
| Field | Value |
|---|---|
| Platform | youtube |
| Title | AI Responsibility |
| Posted | 2025-06-03T10:0… |
| Likes | 464 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxFUx02CX1GA0smoON4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"hopeful"},
{"id":"ytc_UgxXfR6zPzET88PhFkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyF1HqjOzCEBNO8AVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXmnaTi_IyMiky53x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxo4z0IKOHyLx78tQd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyyLR_VKLuHyZbRjFN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwna-OJDmDQvDKDOhN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugz9d6RzEX8N0vOMf7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzy2FkWBitokr6XdZZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxnwzmJQx6uPEi2WHl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
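Each raw response is a JSON array of records coded along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated is below; the allowed codes are inferred only from the values visible on this page, and the real codebook may contain additional categories (assumption).

```python
import json

# Allowed codes per dimension, inferred from values appearing on this page.
# The full codebook may define more categories (assumption).
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"hopeful", "fear", "indifference", "resignation",
                "mixed", "outrage", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"outrage"}]')
records = validate_batch(raw)
print(len(records))  # 1
```

Validating at ingest time catches the common failure mode of batch coding with an LLM: a record whose value drifts outside the codebook (a misspelling or an invented category) fails loudly instead of silently polluting downstream counts.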