Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
*AI isn’t killing the planet.*
– GPT-3 training = ~550kg CO₂ — less than one fl…
ytc_UgwCOxCNM…
You’re. It an expert fit you’ve been working on an AI governance role since 2020…
ytr_Ugw9Qf0Ds…
There is nothing funny about AI access. Nothing!!! Take this “news” piece as a w…
ytc_UgzxxEaz-…
AI might plateau around human intelligence... sure, perhaps. But you can still a…
ytc_Ugx-MyIK8…
Agree with the analysis of this guy.
That said, I might have a “new pro-AI argu…
ytc_UgwTaLCZR…
if you shared it on instagram, technically you consented having your data being …
ytc_UgzmkYHeF…
it's not just empathy they need to program into the AI, it's responsibility and …
ytc_UgyQAvlDQ…
Step 1: Artist post their drawing
Step 2: AI scrape data from that drawing and r…
ytc_Ugyxee2-i…
Comment
Calling it autopilot is the worst part. I do know that this technology has probably saved more lives than it has ended, just look up videos about close calls where no normal human would be able to react to a situation while driving. However people still need to pay attention while using it as it's not real full self driving. As in said the video the guy wasn't paying attention despite being told to by the car. I still think that something needs to change with tesla vehicles.
Platform: youtube
Category: AI Harm Incident
Posted: 2025-01-21T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
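A coded row like the one above can be sanity-checked before it is stored. The allowed-value sets in this sketch are inferred only from the values visible on this page; the full codebook may define more, and the `SCHEMA` and `validate` names are illustrative, not part of the pipeline.

```python
# Allowed values per dimension, inferred from the codes shown on this page.
# Assumption: the real codebook may contain additional values.
SCHEMA = {
    "responsibility": {"user", "company", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"mixed", "outrage", "approval", "indifference", "resignation"},
}

def validate(code: dict) -> list:
    """Return a list of problems with one coded row (empty if it passes)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = code.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coded result shown in the table above:
row = {"responsibility": "user", "reasoning": "consequentialist",
       "policy": "none", "emotion": "mixed"}
print(validate(row))  # → []
```

A non-empty return value flags rows that should be re-coded or inspected by hand.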
Raw LLM Response
```json
[
  {"id":"ytc_UgzZ-0lUFQrxdAMS9Bp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy2UX64H1Mo4c0vgt54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxApkn0WMMqvL5pWX14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx0swZ4kh5iRug0c9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0DYmkwE5ehsW0JoN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyyJWaVDd20D2MIxxR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxGMAWY7O7_4eg2v_d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz3of2MQmWnLP309o14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7uO5xSoYu8S5aAnB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz9qbULdEK1QWoqTih4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
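The raw response above is a JSON array of per-comment codes, so looking up a single comment by ID (as this page does) reduces to parsing the array and building an index. A minimal sketch, using only the field names visible in the response; the variable names are illustrative.

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_UgzZ-0lUFQrxdAMS9Bp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxGMAWY7O7_4eg2v_d4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]'''

codes = json.loads(raw_response)

# Index by comment ID so one coded row can be fetched directly.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgxGMAWY7O7_4eg2v_d4AaABAg"]
print(row["responsibility"], row["emotion"])  # → user mixed
```

Because the model returns one array per batch, the same index can be rebuilt from every stored raw response to audit any individual code against its source comment.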