Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugw4HdIfO… : 19:30: so, if we can't predict the future, just ask AI to predict for us :) Othe…
- ytc_UgwA2dC83… : What so no one is gonna mention how Han the robot on the right started taking ab…
- ytc_Ugy-b2nFJ… : Those who think the robot will not give the weapon back or will attack the human…
- ytc_Ugx31H58n… : This is really frightening since we are being desensitized to differentiate the …
- ytc_Ugzl1Paeo… : I think one of the biggest issues with AI art is that it looks fine at a glace, …
- ytc_UgzlcA2pv… : AI isn't about taking jobs but replacing employees because german immigrants are…
- ytc_Ugyc6RGHW… : Sora is the first stage of an AGI since it can replicate physics without ever le…
- ytc_Ugyrv371H… : Dear Professor, Your lecture raises fundamental questions, but there's an asp…
Comment
“Not to replace doctors with AI” my butt! I’m sure that wasn’t the direct purpose of this study right now but there will be an ongoing long-term effort to replace doctors with AI unfortunately. The more advanced AI gets the more pressure there will be. The AMA will keep the pressure at bay for awhile, they still have a lot of power
Platform: youtube
Topic: AI Harm Incident
Posted: 2024-07-16T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzUVR79bGJtQR310S94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4qidbZLlgsWxqeah4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWQWNUWy_zG27eb-54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzL4fjUpekCYJyfuKl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxlwoQE_feXWGaACwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwILJKHW69GzYFZbTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhDOwZ1QmyiXG5JgN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwxJoURbVjWWlJZ0nt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxKHbzWsSRAf9CWfZN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaKuec7tvGL4lWGFx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
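A batch response like the one above has to be parsed and checked before the codes can be stored, since a model may emit malformed JSON or out-of-vocabulary labels. Below is a minimal validation sketch; the allowed values per dimension are inferred from the sample output shown here, and the real codebook may include additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption: the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop records without a YouTube-comment-style ID.
        if not isinstance(rec, dict) or not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Drop records with any out-of-vocabulary dimension value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgzUVR79bGJtQR310S94AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1 valid record
```

Validating rather than trusting the raw output keeps a single bad record from corrupting the whole batch: invalid records are skipped, not repaired.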