Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples
- `ytc_UgyHg_xwJ…`: “Yee hah” said the robot as he put on his cow boy hat and was handed a machine g…
- `ytc_UgzVi9jkj…`: She's perfect never ever let the women of the neighborhood get around around thi…
- `rdc_n5gdrox`: Does nobody realize that to get 4th year associates you need 1-3rd year associat…
- `ytc_UgzZBKaI0…`: ...most people work at art to get better~ just saying there is no such thing as …
- `ytc_Ugys0phjn…`: Karen Hao's book is really good, I'd recommend everyone read it, especially thos…
- `ytc_UgwDqiN0t…`: If the greatest neural nets work so well with just 1% of the neural connections …
- `ytc_UgxGOkEhO…`: The algorithm drives me mad. All I see is Meidas Touch, various heavy metal revi…
- `ytr_Ugz8ReCLb…`: I meant to say AI art like drawing characters and I kinda know how to draw hands…
Comment
How will autonomous cars see a small child playing with a ball on the side of the road, or a dog?
Will it be affected by adverse weather conditions?
How will the performance of the radar behave over time and with lack of maintenance?
Who bears the legal costs of accidents?
Can a passenger really relax while the car is driving itself?
How easily will the systems be hackable and exploited?
Full autonomous vehicles are still decades away, if ever.
youtube · AI Jobs · 2016-12-27T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
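The table above is one record from the batch response, rendered per dimension. A minimal sketch of that rendering step, assuming the record is a decoded entry from the JSON shown under "Raw LLM Response" (the `to_markdown_table` helper is illustrative, not part of the tool):

```python
# One coded record, as decoded from the model's JSON output above.
record = {
    "id": "ytc_UggihVqE_kUp1ngCoAEC",
    "responsibility": "distributed",
    "reasoning": "deontological",
    "policy": "liability",
    "emotion": "fear",
}

def to_markdown_table(rec: dict) -> str:
    """Render the coding dimensions (everything except the id) as a
    two-column markdown table like the one shown above."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for key, value in rec.items():
        if key == "id":
            continue
        rows.append(f"| {key.capitalize()} | {value} |")
    return "\n".join(rows)

print(to_markdown_table(record))
```

The "Coded at" timestamp row comes from the coding run's metadata, not from the model record itself, so it is not reproduced here.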
Raw LLM Response
[{"id":"ytc_UghgcFaTWevmEXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiWeIUN8L3ttXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugi1_GvFa4AunXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggTPnwXvcMUtXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggihVqE_kUp1ngCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjRWr86xeA9N3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggHi6aThWs9CXgCoAEC","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjDFUPhV6KdyngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggtyRTpvkVrM3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughj9sxA69eOAXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]
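A response like the one above is a JSON array of records keyed by comment ID, so lookup by ID reduces to parsing it and indexing it into a dict. A minimal sketch, using two of the ten records shown (IDs copied verbatim from the response):

```python
import json

# Raw response as returned by the model, truncated to two records here.
raw = '''[
 {"id": "ytc_UggihVqE_kUp1ngCoAEC", "responsibility": "distributed",
  "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
 {"id": "ytc_UghgcFaTWevmEXgCoAEC", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coding for the autonomous-vehicle comment displayed above.
coding = by_id["ytc_UggihVqE_kUp1ngCoAEC"]
print(coding["responsibility"], coding["emotion"])  # distributed fear
```

Indexing once and looking up by key keeps inspection O(1) per comment, which matters when a run produces thousands of coded records.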