Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples
- "@weareinsideAI Made a video about it on how the Three Laws of Robotics are Slave…" (ytr_UgyXvXT1z…)
- "AI inadvertently proved the existence of the human soul by showing us what art l…" (ytc_UgzlM-FPl…)
- "Everyone: we should stop SoraAI! Me: *iconicly searches for the 3 hour long para…" (ytc_UgzLRf9AF…)
- "I know somebody that works in AI and it is not good. Jobs will be taken in a mat…" (ytc_UgxfnfztV…)
- "The way i see it the only reason we get stuff like AI is because huge corporatio…" (ytr_UgxVllMw5…)
- "Ngl, the gen ai (even though largely used wrong ) is so much more advanced than …" (ytc_UgyGbrpr8…)
- "Even as someone who is as talened with art as a sponge with nuclear physics i ca…" (ytc_UgwWHGJtY…)
- "People were mad the workers HD to pee in bottles now they're mad the jobs been a…" (ytc_Ugx92Dlyz…)
Comment
What am I missing here? When you engage Autopilot, a Tesla CLEARLY tells you to stay engaged and be prepared to take over at any time. There’s even a camera inside the cabin to monitor your attention and cancel the autopilot if you’re not paying attention. Tesla is one of many car manufacturers with “self driving” cruise control, but it’s the only one that cops flack because of its high profile. I’m not a church of Elon follower by any means, but this is just sensationalised news when really the discussion should be that people are using this as a crutch when it’s merely a tool😊
youtube · AI Harm Incident · 2025-04-14T09:3… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxR9I6PNNLzQleDQGN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwCVe-uLqHzpaQwRyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgweWosgEdBDbM1J9t54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVzcrXsyuBHAsJZIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwztLjEZzLc6Zu7oxp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzNVC1ydO0c-soNTr14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3JtDkh1o42myOp4d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwskBsLDHm3rIbREOF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwv_5nV_2Z909Ro70h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyE2r3cdmW6pb2iWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
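The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a response could be parsed into a lookup table keyed by comment ID, with basic validation; the allowed value sets below are inferred only from the codings shown on this page, not from the tool's actual schema, so treat them as assumptions:

```python
import json

# Allowed values observed in the codings on this page.
# Assumption: the real coding scheme may include more categories.
ALLOWED = {
    "responsibility": {"none", "user", "government", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_coding_response(raw):
    """Parse a raw LLM coding response (a JSON array of per-comment
    objects) into a dict keyed by comment ID, dropping any record
    whose dimension values fall outside the allowed sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records with no comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With a table like this in hand, the "Look up by comment ID" view above reduces to a single dictionary access, e.g. `coded["ytc_UgxR9I6PNNLzQleDQGN4AaABAg"]`.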