Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugz94cQNI…: "AI? No. Companies replacing humans with AI at the pace they are going is going t…"
- ytc_Ugzi4pHkp…: "Nothing good is gonna come out of this!!!! This is gonna be disastrous! The male…"
- ytc_Ugw4Znmu6…: "Remember when a goated ancient guy spent years painting the perfect artwork? Pou…"
- rdc_mrrnbgo: "imagine telling an AI to fix a bug. only for the AI come up with a solution that…"
- ytc_Ugw0rLD1P…: "Chatgpt has one chat mate, and he's always stubborn, always asking a lot of ques…"
- ytc_UgymZMyyB…: "1 year later and we see people start to get locked away with symptoms some think…"
- ytc_UgwSkurOo…: "There’s no way his numbers on unemployment can be right. White collar jobs are…"
- ytc_UgylxJm20…: "Is everyone nuts? Do we actually believe that after decades and decades of study…"
Comment
This video made me question my utilitarian philosophy big-time - going well beyond the efficacy of self-driving cars. I realize now I was just generalizing by saying I believe in utilitarianism - saying stuff like "to minimize harm" or "to maximize happiness". But, as is usually the case when it comes to one's personal philosophies/ beliefs (especially if said "one" is twenty two), it turns out life is tricky and many a scenario could be thought up that leads to conflicting answers using the same basic rules.
youtube · AI Harm Incident · 2016-10-17T04:3… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgjGy_ree2B0EHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj96NpyN-f2BXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghqMvbGky59jHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi_k_2d8FQ3c3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugg_qQYiL1e7ZngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggUGDnRAEQYy3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggfRtqOpBkxgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggNnXWdPpcRW3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjqog_GKULDRHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghYlkS6IWtLL3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]
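The raw response above is a JSON array of per-comment codes. A minimal parsing-and-validation sketch (assuming Python; the allowed values per dimension are inferred from the sample response shown here, not from a full codebook, so treat them as a placeholder):

```python
import json

# Allowed values inferred from the sample response above; the real
# codebook may include additional categories (e.g. "unclear").
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping invalid rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # skip rows that lack a comment ID
        # Keep the row only if every dimension holds an allowed value
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
print(parse_codes(raw)["ytc_x"]["emotion"])  # prints: approval
```

Validating against a fixed vocabulary catches the most common failure mode of batch coding, where the model invents an off-schema label for one row; invalid rows are dropped rather than silently stored.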