Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
There were talks that if robotics takes over our jobs, that universal income would be given out, to compensate the loss. Robotics keeps human beings safe from dangerous jobs that we shouldn't be doing in the first place. Because we as human beings don't have replacement limbs or body parts once we lose them. Dangerous jobs even takes precious lives. Which is no loss consequence to job employers who can easily hire replacement workers. No matter what jobs AI does there will always be need of workers to over see such operations and make needed repairs. Cartoon George Jetson working at some monitor control station over robotic operations could very well become a reality. I see no serious problem with that. What you and others should be more concern about is an AI president to ever happen.
| Source | Video | Posted |
|---|---|---|
| youtube | AI Jobs | 2025-10-14T22:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxLzBFN_XYAqiJ6dSt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzx1fTgdJE_tcA632Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgybDHnQwIBTOOH9faN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyaG03Lq302KcWYrZR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzn77sSh0Ndi0JQza54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFc7AE9kzE6Nq10054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgylcgElRAOtnP42PhZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw6L5zzHnx4oG7Qvf94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxi7HHdaXxNyOFewZp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwE7sCL2Xr5d0kLXfZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
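A raw response like the one above has to be parsed and sanity-checked before the rows reach the coding table. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this sample (the real coding scheme may include more categories), and the function name `validate_batch` is illustrative, not part of the pipeline shown here.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"government", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    A row is kept when it has an "id" field and every coding dimension
    carries a recognized label; anything else is silently dropped so a
    single malformed row does not poison the batch.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one well-formed row passes through unchanged.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"approval"}]')
print(len(validate_batch(raw)))  # -> 1
```

Dropping bad rows instead of raising keeps a 10-comment batch usable even when the model garbles one entry; the dropped IDs can then be re-queued for recoding.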