Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI art was definitely created by the kind of people who would beg for art to be …" (ytc_UgxhhAhqy…)
- "So they act in self defense only when it comes to betrayal and death. That human…" (ytc_Ugxt3UZPM…)
- "Quote tweeting and redrawing AI is just promoting the AI shit through engagement…" (ytc_Ugzj9eBe7…)
- "Calling artists gatekeepers for not liking AI is like calling runners gatekeeper…" (ytc_Ugyrt04jV…)
- "the lighting and texture in this generation are next level it is amazing to see …" (ytc_UgzjmNDwn…)
- "I'm nearing 70 and no AI "companions" for me. I recognize the human animal is so…" (ytc_Ugx7GAu_2…)
- "I dont think a robot would shut down a library and open a giant sweet shop so yo…" (ytc_UgzgyoriJ…)
- "That's why education needs to go back to the states. Dismantle the dept of Educa…" (ytc_Ugw6etK4C…)
Comment
According to Perplexity:
Tesla vehicles driving in self-driving (Autopilot/FSD) mode have a significantly lower crash rate per mile than the average human-driven vehicle in the United States....
—In the second quarter of 2025, Tesla reported one crash for every 6.69 million miles driven using Autopilot technology.
—For Teslas not using Autopilot, there was one crash for every 963,000 miles driven.
—The U.S. national average (including all vehicles and drivers) was one crash for every 702,000 miles in 2023, according to NHTSA and FHWA data.
This means Teslas in self-driving mode are about 9–10 times less likely to crash per mile than the U.S. average for human-driven cars.
youtube · AI Harm Incident · 2025-10-19T16:1…
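As a quick sanity check on the arithmetic in the comment above (the per-mile figures are the commenter's, attributed to Perplexity, and are not independently verified here), the claimed "about 9–10 times" ratio does follow from the stated crash rates:

```python
# Crash rates quoted in the comment (miles per crash).
# These are the commenter's claimed figures, not verified data.
autopilot_miles_per_crash = 6_690_000   # Tesla Autopilot, Q2 2025 (claimed)
us_average_miles_per_crash = 702_000    # U.S. national average, 2023 (claimed)

# Ratio of crashes per mile: how many times less likely a crash is
# per mile under Autopilot, if the quoted figures hold.
ratio = autopilot_miles_per_crash / us_average_miles_per_crash
print(round(ratio, 1))  # → 9.5, consistent with the "about 9–10 times" claim
```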
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwV5UQXPw2H4R9Uzqt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwKR2UGzjgsCR85Oal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnQ3HCy1z-6qClJq14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXgHJvTtHvLLYiIb54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-hsjLkvAau0I8DlZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugysg6rwsyzaIoJhnu14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxE0nyxr64eRE3ZhQl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwloWy2D0uUw94JxyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQhUzoOQhvb6E4Uv14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx1jTYI9x8D9xXm5Ml4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
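The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response could be parsed and validated before storage — the allowed values per dimension are inferred only from the codes visible in this document and the result table above; the actual codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the codes visible above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "fear", "indifference", "resignation", "outrage"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every comment ID shown in this dataset carries the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Keep the record only if every dimension has a recognized value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Run against the array above, all ten records pass; a record with a malformed ID or an unrecognized category value would be dropped rather than stored.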