Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Ive watched all the Terminator Movies .... Even the shit ones Im so fkn Terrorfi… (ytc_Ugx-Z0hFb…)
- PhD in bioinformatics here. I don't think teaching people how to use chatbots wh… (ytr_UgzSqyCVf…)
- It was wonderful listening to this AI robot Sophia, so soothing n calming to li… (ytc_Ugzk7iYwC…)
- @crazydave214 And it's not just a trend either, artists have been against genera… (ytr_UgzdIXehR…)
- „AI artists“ is a paradox . YOU can’t be a artist when you let AI generate for y… (ytc_UgzwD-mh6…)
- And yes, im disabled artist, using traditional, digital, 3d and Ai art. I cant d… (ytc_Ugy38K6Pq…)
- I lost all hope for ai saving me when it told me to F off and then told me “I do… (rdc_mbebawf)
- oh boy. only about 3% of my chats with AI in all my accounts everywhere are abou… (ytc_UgxvW9CI9…)
Comment
Actually the point is to minimize the risk of being in an accident, so as you can see the self-driving car would keep a secure distance from the truck, this distance would be calculated using variables like the distance from the car behind you, from the cars on your sides and a lot of other things... Fatalities like those represented in this video can actually not happen if the software made for the cars has the technology to preview or to be aware of situations like those
youtube · AI Harm Incident · 2017-07-07T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
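A minimal sketch of how such a raw response could be parsed and validated before it is stored as a coding result. The allowed value sets below are inferred only from the records shown above; the real codebook may define more categories, and the function name `parse_codings` is illustrative, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample response above.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "approval", "mixed", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM JSON array, keeping only records whose
    dimension values all fall inside the known value sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: one well-formed record and one with an unknown emotion value.
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "none",
     "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
    {"id": "ytc_example2", "responsibility": "none",
     "reasoning": "consequentialist", "policy": "none", "emotion": "unknown"},
])
print([rec["id"] for rec in parse_codings(raw)])  # ['ytc_example1']
```

Dropping rather than repairing malformed records keeps the coding table consistent; rejected IDs can then be re-queued for another coding pass.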