Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Why does she act so real...I think AI will be the new humans in the next 2,000 y…" (ytc_UgwVah6NK…)
- "It becomes more about writing AI prompts than writing scripts. Audiences get mor…" (rdc_k9j514o)
- "Damn bro 3 paragraphs of cope? A good reference helps to create good art. A pict…" (ytr_UgxYw7ODj…)
- "I think this is only a temporary solution to the problem, since this technique o…" (ytc_Ugy-48U1b…)
- "Very interesting episode. I disagree with only one point, the main flaw of AI is…" (ytc_Ugxi33qsT…)
- "I'm not sure you will be able to tell the difference in many cases already, let …" (ytr_UgxPuSqvN…)
- "I can see law suits lining up against the pigs departments throughout this count…" (ytc_Ugy5MKco1…)
- "If humans continue to exist, they will still want to have fun, go see new places…" (ytc_UgysAYLhU…)
Comment

> Its a real dumb ethical argument
> If you were in a self driving car, the programming would obviously be set to stop in time for vehicle or object in front of your self driving vehicle to stop in time for an absolute sudden stop. If the stop were to change lanes into your lane, the self driving vehicle would make impact with the object or vehicle.
> End of ethical argument

youtube · AI Harm Incident · 2016-01-22T22:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjQFdEz8fzO-ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UggF86o_OEFCZHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugh-bk-TAV7aFXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgileDub0CwddngCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjpqrVAg7rgYngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghQCXhv7515e3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgisOSWSkQ0bTXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
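A raw response like the one above is a JSON array of coded comments, each keyed by comment ID with one value per coding dimension. Before writing such a batch to storage, it is worth validating it, since LLM output can be malformed or drift outside the codebook. Below is a minimal sketch of that check; note that `SCHEMA` is inferred from the values visible in this sample and the table above, not the project's actual codebook, and `validate_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the sample
# output shown above; the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "resignation"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check each coded row against SCHEMA.

    Raises ValueError on malformed JSON, a missing comment id, or a
    dimension value outside the allowed set, so a bad batch is caught
    before it reaches the database.
    """
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for i, row in enumerate(rows):
        if "id" not in row:
            raise ValueError(f"row {i}: missing comment id")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"row {i} ({row['id']}): bad {dim}={value!r}")
    return rows

raw = ('[{"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
rows = validate_response(raw)
print(rows[0]["emotion"])  # indifference
```

Rejecting the whole batch on the first bad row keeps coding results all-or-nothing per response, which makes it easy to re-prompt the model for just the failed batch.
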