Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The way we train AI on the open web is almost like letting a child be raised in …" (ytc_Ugwm7BCar…)
- "This guy worked on neural networks in the 80s & 90s long before the processing p…" (ytc_Ugxf4hRQT…)
- "AI is no where near better than humans now. That guy was very awkward to watch. …" (ytc_UgzBcpjpR…)
- "Give it 10 years and you won’t be able to tell what is made by a human or AI…" (ytc_UgwJHPZyr…)
- "I know someone that made it his mission to personally “rizz up all his ai dates”…" (ytc_UgyK3EV_V…)
- "Ive seen people in the industry spoken up about have workers right now who use a…" (ytc_UgwiIJQ93…)
- "your entire life would be just tending to this farm. One bad season and you star…" (ytr_UgxYuO5Al…)
- "I've watched a fair number of content creators who make their own art and are di…" (ytc_UgxiOcsNI…)
Comment
He forgot to mention that most of those 5000 incidents are also attributed to the driver not paying attention because they put too much trust in the automatic systems. The law, as it stands, still requires someone to actively be in control of the vehicle. It doesn't matter if the vehicle is self driving. In the case of self driving cars the human at the wheel is the backup for most of the drive. On top of that, if you're the one who initiated the travel of the vehicle you are the one who is responsible if anything goes wrong when there's no driver to take over.
Platform: youtube · Posted: 2023-07-31T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxukQjqRR6f6Xre5sF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxgySTwFUJ43HVWlvt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxJqYuRi0G1MGQMPix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMJz-IxeP6ueISXph4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzkSf9SF8XFo-Dn6xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx_UNVwacIfOiICyyN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwutRFxldrAFzPqnBV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxbghNUGweGjTHowWd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxyGYXtuJJyi7iO8_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```