Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples:

- ytr_Ugw1O2YOx…: "It sounds like you have strong feelings about AI! In the video, Sophia discusses…"
- ytr_Ugw634zwm…: "I went to public school and I did those things too. I was more into art and home…"
- ytc_Ugyt2GlkZ…: "There is no such thing as an "AI Artist". They are a commissioning an AI to gene…"
- rdc_j8vqayp: "Witnessing the rise of the first AI advocates in real time. What a time to be al…"
- ytc_Ugyxg6NZi…: "Even your Job at Breaking points paying you? You have no back up do you? AI is t…"
- ytr_UgwnR9SgT…: "Its too late, we are cooked already but most people don't know it yet. This is p…"
- ytr_UgxAHwoxj…: "@kaba_me The artist's work was used in the training data sets for these AI mode…"
- ytc_UgxIqHYw2…: "Watching documentaries like this one, one has to feel that Chinese people have b…"
Comment (youtube · AI Harm Incident · posted 2015-12-09T20:1…)

My thinking on this is that if the same situation was given with 1 alteration I see no REAL issue here. The one small change is that if ALL cars are smart/self driving then each car on the road would see the immediate danger and avoid it, while the others do the same. So when the first car must avoid the falling object the other car would avoid the now swerving one. A domino effect of sorts, now I can also see this leading to multi car pile ups, however that is to say any incident currently can lead to the same with smaller margins of error, a human driver can be distracted, the machine can't leading to over all fewer collisions.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg0f6gYDoM2u3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugg8lvN9vbzqmngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiOKnl1PCwC5XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghcyXISlo02pngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi2qolU0hnvF3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghXbi7fYusSUngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugj2jxTP45dbIngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiMqqVdmeG89HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj-Xh3Fxwz1RXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugh3GHPd7ug6D3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
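The raw response is a JSON array of per-comment records, each carrying the four coded dimensions shown in the table above. A minimal sketch of the "look up by comment ID" step, using field names and two records copied from the response (the `lookup` helper is an illustrative assumption, not the tool's actual code):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_Ugg0f6gYDoM2u3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj-Xh3Fxwz1RXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
"""

def lookup(comment_id, response_text):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for record in json.loads(response_text):
        if record["id"] == comment_id:
            # Drop the ID itself; keep only the coded dimensions.
            return {k: v for k, v in record.items() if k != "id"}
    return None

codes = lookup("ytc_Ugj-Xh3Fxwz1RXgCoAEC", raw_response)
# codes == {"responsibility": "ai_itself", "reasoning": "deontological",
#           "policy": "ban", "emotion": "fear"}
```

Because the model returns one record per comment in a single batch, a lookup like this only needs the comment ID to recover exactly what the model coded for that comment.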