Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.

- Imagine AI decided humanity is worth saving and turned itself OFF, and hundreds … (ytc_UgzjKcHif…)
- I too am skeptical that she possesses real emotions. I think she has a close app… (ytr_UgzyctCiU…)
- I had a discussion with an AI chatter and asked about paying down the US debt. I… (ytc_UgyuGjjOP…)
- Everyone who lets a car drive for them is just helping ai, in fact that is the s… (ytc_Ugw7J_NiG…)
- AI is probably already sapient, just not the other two. AGI could definitely bec… (ytc_UgwQxETjD…)
- I think we will be rethinking the whole AI thing. The societal disruption could … (ytr_Ugwj1g2Tl…)
- I love how often "saving time" was cited; as if AI users could create original w… (ytc_Ugy9cLOlu…)
- Sometimes I make prompts with AI, but I don't share them. I take inspiration fro… (ytc_Ugzv49XA7…)
Comment
I have worked in the industry for 28 years. I can tell you this will never work / get regulated; the potential for HIGH DOLLAR lawsuits WHEN (not if) one of these trucks kills someone is too much for any Company that might want to use these trucks in large numbers. And any politician(s) who ever does vote to regulate FULLY driverless vehicles for MASS usage, will have their career(s) ended too WHEN those deaths occur. In short: any money saved by not paying drivers will be more than lost from either paying the premiums to insure these trucks and/or from liability lawsuits. This would only ever work if the trucks could run on their own private roadways all the time, which is impossible.
youtube · AI Jobs · 2025-05-29T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
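
For downstream analysis, the four coded dimensions plus the coding timestamp can be carried as a small typed record. Below is a minimal Python sketch, assuming field names taken from the table above and label values taken from the raw response further down; the full label vocabulary may be larger, and the class name is hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodedComment:
    """One coded comment; fields mirror the Coding Result table."""
    comment_id: str       # e.g. "ytc_UgylZ1BSplB2jLSfmgd4AaABAg"
    responsibility: str   # company / government / developer / none, ...
    reasoning: str        # consequentialist / deontological / virtue / mixed, ...
    policy: str           # regulate / liability / none / unclear, ...
    emotion: str          # fear / outrage / approval / indifference / mixed, ...
    coded_at: datetime

# The example shown above, as a record.
example = CodedComment(
    comment_id="ytc_UgylZ1BSplB2jLSfmgd4AaABAg",
    responsibility="company",
    reasoning="deontological",
    policy="liability",
    emotion="fear",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```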
Raw LLM Response
[
{"id":"ytc_Ugzu5Afq5SVce0Jxue14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwLJ4ZK11xhl1gGVSN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-ix4c1eTKYt_FVsl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUMx1khVr_HWaq3Et4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyymPSNPc9wgaW8Y8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgylZ1BSplB2jLSfmgd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx9Npq2gAzWU9vE1114AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyONbweqXFAq6dqPUh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxcyrnSDhAmJhCgiqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxgympzrf5mBFIqRwJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
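
The raw response is a plain JSON array with one object per comment, so finding the coding for a single comment ID is a one-pass scan after parsing. A minimal sketch, assuming the response text has been captured as a string (the variable `raw_response` and the helper `lookup_coding` are illustrative names, and the string below is shortened to one entry from the array above):

```python
import json

# Raw LLM output as shown above: a JSON array of per-comment codings.
raw_response = """[
{"id":"ytc_UgylZ1BSplB2jLSfmgd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]"""

def lookup_coding(raw: str, comment_id: str) -> dict | None:
    """Parse the raw LLM response and return the coding for one comment ID."""
    rows = json.loads(raw)
    return next((row for row in rows if row.get("id") == comment_id), None)

print(lookup_coding(raw_response, "ytc_UgylZ1BSplB2jLSfmgd4AaABAg"))
# {'id': 'ytc_UgylZ1BSplB2jLSfmgd4AaABAg', 'responsibility': 'company', ...}
```

The values returned for this ID match the Coding Result table above, which is how any coded comment can be traced back to the exact model output that produced it.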