Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I'd love an AI doctor with a real doctor there to write the prescriptions and au…
ytc_Ugz-P94RB…
As a geotechnical engineer, this surprised me as well. Only thing I can think of…
rdc_nnvg2fr
Silicon Valley has eliminated more jobs and done more harm to the economy with t…
ytc_UgyqOb52E…
[Translated from Hindi] I feel like a time will come when AI will dominate the world and human beings…
ytc_Ugzt5hr11…
A big difference is the resource pull for regular data centers vs AI data center…
ytr_UgxLU3vRO…
@heyaisdabomb without the Tesla Autopilot or full self driving feature, there pr…
ytr_Ugzw0LL9F…
Chatgpt using language algorythm words that it knows will make you feel somethin…
ytc_UgzCbNb4C…
God created man in His Image. Sacrifices His own Son to atone for the sins of m…
ytc_UgzFO34Mg…
Comment
Yes, self driving trucks are here. But, they are years from being financially viable and truly self driving. Most of us will be retired and dead before they are somewhat common. My question is this....what will society do, when a 4 wheeler causes a major crash with one, and gets people killed? Yes, technology is great. But again, nowhere near viable, or able to predict human behavior.
youtube
2019-08-23T00:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwYawdNpfubNbnonTx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYNZIj2apIh62K1qd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxM1aSNIJ9aqRZwZWR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwP5ytDEXsFz3s2zbV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy3y2FOQ-80e6DPttJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxlg8-JOBN5wTI9E554AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyhixFp5RlJZSjXOKl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwLDUENDSDVKrcwif14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxYNahWCWDtY8u0i2Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3VxLY2-YzFbMsOFd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
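A minimal sketch of how a raw response like the one above could be parsed and indexed to support the "Look up by comment ID" view. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the sample JSON; the inline `raw_response` string and the lookup logic are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array with one
# coded object per comment ID (field names follow the sample above).
raw_response = '''[
{"id":"ytc_UgwLDUENDSDVKrcwif14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy3y2FOQ-80e6DPttJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Parse the array and index it by comment ID for O(1) lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

# Look up a single coded comment by its ID.
record = coded["ytc_UgwLDUENDSDVKrcwif14AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

Keying the parsed rows by `id` mirrors how the coding result for a single comment (as in the table above) can be pulled out of a batch response.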