Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I recommend to rebind the copilot button to f13, i did this with the "menu" butt…" (ytc_UgwyKf8VW…)
- "The US airforce drone AI was in a simulation and the drone wasn't allowed to kil…" (ytc_Ugyj4kZS6…)
- "Friday, October 31, 2025 . . . Greetings, Everyone. This now "ASI", Confabulates…" (ytc_UgxKiw9Ho…)
- "I truly HATE talking to AI chat bots and ALWAYS wait to talk to real customer se…" (ytc_UgyQYzYLL…)
- "Im not a fan of AI art. Or AI at all. But I wonder how many people that oppose A…" (ytc_UgzzUrGiM…)
- "It is all matter of money! AI is changing the works, the ideas and the path in…" (ytc_UgyZQzDmt…)
- "Yes....I've imagined it for years. You can ask A.I. to help you solve problems.…" (ytc_UgwjkDOzR…)
- "I do not know, maybe I am a noob, but whenever I try to use AI art generator (ei…" (ytc_Ugza72ei4…)
Comment
This ethics question would fall apart if all vehicles are self-driving. The vehicles surrounding the at-risk vehicle are responding to the danger of the at-risk vehicle as well and making calculated decisions on acceleration and deceleration to let the at-risk vehicle join in their lane.
youtube · AI Harm Incident · 2017-06-10T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UghMzFQ5uciXyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghSVao5v-7LzHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghWZB_DNXhaTXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj7Q3CElinFQ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgjTiwroBtb2T3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjIFRBxgjA2tXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghlT0jEO-duZ3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh8Tr7F8wrmeX3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugi0hd2FnlV7Z3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugil9BPZ0b0LongCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
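The raw response above is a JSON array with one coding record per comment, each keyed by its `id` and carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and looked up by comment ID, mirroring the lookup feature above; `index_response` is a hypothetical helper, not part of the tool itself, and the sample ID is taken from the response shown:

```python
import json

def index_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    objects) and key each record by its comment id for O(1) lookup."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# One record excerpted from the raw response above.
raw = '''[
  {"id": "ytc_UghMzFQ5uciXyHgCoAEC",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "indifference"}
]'''

coded = index_response(raw)
record = coded["ytc_UghMzFQ5uciXyHgCoAEC"]
print(record["emotion"])  # indifference
```

Because the model returns all coded dimensions in one flat object per comment, a dictionary keyed by ID is enough to drive both the lookup box and the per-comment result table.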