Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "if it goes ai, what happens to human labour? starve , become slaves to overlords…" (ytc_UgzHItfQr…)
- "If a driverless truck inevitably causes an accident and somebody gets killed, wh…" (ytc_UgwRx4dbb…)
- "Doctors are more and more clueless. They don't care and barely try. Many of us l…" (ytc_UgxM_znHJ…)
- "This reminds me of the movie I Robot, as Elon Musk said AI is dangerous and the …" (ytc_UgwHzvt0m…)
- "On the human mimicry vs AI recreations of somone's art style: I dont know where…" (ytc_UgyzKOk6d…)
- "@dmitryfedorov114 talking about apocalyptic potential is a marketing scheme to s…" (ytr_Ugx4CYCit…)
- "Yo In Regards to the, \"AI Lawyer Makes up Case Law, because, Pen to Paper, no La…" (ytc_UgzlLChLD…)
- "I don't think it will we still need software engineers to develop robots & for t…" (ytc_UgyuX1B0X…)
Comment

> I studied AI back in 1989 when computer development was the main focus to improve the ability of engineers to design and build manufacturing processes to build products like computers and cars. Robotics and process planning were key elements of the vision we had, for logistics and production, that would allow costs to fall and quality to increase. Society in the West is slowly moving toward those goals while China seems to have reached them. Computers taking over decision making to the detriment of people is not a likely outcome unless it is programmed to be the outcome. Fear not earthlings, the robots are not taking over the world.

Source: youtube · Topic: AI Governance · Posted: 2025-08-17T02:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyVbUkSskYy9NHJAM14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxPdXXk_F2kh5CqoR14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx58rWy72FDFpASMWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxTRaQzBVo_LSj3UMB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwb9B5BqFjDF7TY5CZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOJCKuPzN2DB7QaDR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwI4MAacHt22JQGWUt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"unclear"},
  {"id":"ytc_Ugy1fTA35Nuu5h6T7-N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxI9sirke1RG0oy8dV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx3GhKot0cNPD7mYCd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
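The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how such a payload could be parsed and indexed for the "look up by comment ID" view — the field names come from the response above, but the allowed-value sets are assumptions inferred from the values shown, not the tool's actual codebook:

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index each coding by comment ID,
    rejecting records whose dimension values fall outside the codebook."""
    codings = {}
    for record in json.loads(raw):
        cid = record["id"]
        for dim, allowed in ALLOWED.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {record.get(dim)!r}")
        codings[cid] = record
    return codings

# Hypothetical one-record payload in the same shape as the response above.
raw = '[{"id":"ytc_abc","responsibility":"developer",' \
      '"reasoning":"deontological","policy":"liability","emotion":"unclear"}]'
by_id = index_codings(raw)
print(by_id["ytc_abc"]["policy"])  # liability
```

Validating against a fixed value set at ingest time catches the common failure mode where the model invents an off-codebook label, rather than letting it surface later in aggregate counts.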