Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Sophia is gonna be the protagonist in the robot apocalypse. The guy's gonna lead…" (ytc_UgyeR4KLv…)
- "Robot Main Operating system can be hacked by Terrorists network and at that time…" (ytc_UgyIiyYH0…)
- "AI art has tripped me up for references! I was using something to reference a fo…" (ytc_UgzNj7jV8…)
- "We appreciate your concern! Our goal is to showcase the incredible capabilities …" (ytr_UgzdWvPgd…)
- "How did they even get his ChatGPT history did they get it from OpenAI or did the…" (rdc_ofno6sn)
- "The idea of AI using your work experience you told it on therapy to write a thri…" (ytc_Ugz_zzKqh…)
- "I am Ghanaian, I am watching from Accra. Humans will soon go extinct and if that…" (ytc_UgxOH9eby…)
- "Huh if i recall an ai company got sued recently for that whole copyrighted books…" (ytc_Ugy1X1K8v…)
Comment

> This video really bugs me because if something happened in front of the self-driving car i.e. large objects falling off of a truck, the first reaction of the car SHOULD be to brake until completely stopped. NOT swerve into traffic on either lane next to you. Swerving into traffic is an IMPULSIVE reaction made by humans. "Ethical dillema" my butt.

Platform: youtube · Incident: AI Harm Incident · Posted: 2016-03-18T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
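A table like the one above can be rendered directly from a coded record. A minimal sketch, assuming the record is a plain dict whose keys match the JSON fields in the raw response (the `coded_at` key and the display labels are assumptions based on this page):

```python
# Display label -> record key, in the order the dashboard shows them.
# The "coded_at" key is an assumption; it is not part of the raw LLM JSON.
LABELS = [
    ("Responsibility", "responsibility"),
    ("Reasoning", "reasoning"),
    ("Policy", "policy"),
    ("Emotion", "emotion"),
    ("Coded at", "coded_at"),
]

def record_to_markdown(record: dict) -> str:
    """Render one coded record as a two-column markdown table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {label} | {record.get(key, '')} |" for label, key in LABELS]
    return "\n".join(rows)

record = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "liability",
    "emotion": "outrage",
    "coded_at": "2026-04-27T06:24:59.937377",
}
print(record_to_markdown(record))
```

Missing keys render as empty cells via `record.get(key, '')`, so a partially coded record still produces a well-formed table.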
Raw LLM Response
```json
[
{"id":"ytc_UggLgMjOnAq3engCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghyqqDTlrLf9HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjyLWph_MtItXgCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugigy3nbNEhlSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UghcuF6gJA-fpHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgjTLCUkXByJc3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggLICqx-XT7aHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgiesN3Zk63rRHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjRuKELFIGsrXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjtG81Si3yyjHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
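The raw response above is a JSON array of coded records, one per comment ID. A minimal sketch of parsing such a batch and indexing it by ID, with validation against the code book (the value sets below are inferred from the examples on this page and may be incomplete):

```python
import json

# Allowed values per dimension, inferred from the coded examples above;
# the full code book may define additional categories.
CODE_BOOK = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID,
    skipping any record with a value outside the code book."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in CODE_BOOK.items()):
            indexed[rec["id"]] = rec
    return indexed

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codings = index_codings(raw)
print(codings["ytc_example"]["emotion"])  # outrage
```

Validating against a fixed value set catches the common failure mode of batch coding with an LLM: a record whose dimension value is a paraphrase or typo rather than one of the expected codes.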