Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_Ugz5ZTah8…`: "It’s the hacks who write scripts for reality tv shows or Hallmark movies etc. th…"
- `ytc_Ugw04XsxL…`: "I don't agree with AI teaching. The robots sometimes give wrong information. We …"
- `ytr_UgyRQzq5j…`: "@Lolli-n3x No it isn't, unless you think farming automation was also a bad thin…"
- `ytc_UgyTXjZ1l…`: "These robots are more real than all of my dates in the last 10 years…"
- `ytc_UgwhJ2tZx…`: "Even if these companies who are trying to monetize the algorithm were to be shut…"
- `ytr_UgxnpW6Li…`: "@sheerdumbluck first off the video was not completed, so unless you know the fam…"
- `ytc_Ugw4A_roh…`: "The more I think about modern societies and how they develop the more certain I …"
- `ytc_Ugz7zNWhR…`: "great input, Marina. here's my take on how to fit into this new world: 1. Learn…"
Comment

> An AI Perspective: "Humans slow down productivity" ... If you have a car assembly line, with robot welders, and Bob shows up to do a spot weld, he slows the assembly line . . . Thus, the goal of any AI system (eg- Amazon package distribution) is to REMOVE humans from the loop . . . In the next 10 years you may choose a job that robots can't easily do (plumber), but at some point 90% of jobs will be automated . . . When you have social systems that are based on Labor (eg- Social Security), the government will have to make some changes.

youtube · AI Harm Incident · 2025-06-18T17:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwv7CAWPF0hnsBXK214AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxqorg3G0E1fXOJqzJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyyER4d1byDyNAGGK54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx85xXV9yi_B4b4uQ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzibp-cNMqYutkurJh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyG2g__FB-Zz-r6Zvd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyHzAkrfNQDfYzlc3t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwEpdvXAMJa9xigmjh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwBN67RSCX4kuKgSrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5ULWPvwBoc_qq1ON4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
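A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator; the allowed values for each dimension are inferred from the samples shown here and may not cover the full codebook, so treat them as an assumption.

```python
import json

# Allowed values inferred from the displayed samples; the real codebook may differ.
ALLOWED = {
    "responsibility": {"company", "government", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"resignation", "outrage", "approval", "indifference", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every coded dimension."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# Example: one row in the same shape as the batch above.
raw = ('[{"id":"ytc_UgyHzAkrfNQDfYzlc3t4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

Rows that fail validation can then be queued for re-coding rather than silently stored with an out-of-codebook label.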