Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_ohxtzf8`: The absolute limit of fucking madness and stupidity mixed together. Biometric an…
- `ytc_UgwEV2sdc…`: well ai doesnt have personal life problem emotions so if we get rid of it we cou…
- `ytc_UgxiAg8Ux…`: Facial recognition technology was imminent and *will* exist in developing and 3r…
- `ytr_UgzAMEDNs…`: Also you can do temporary chats with chatgpt which doesn't feed into its trainin…
- `ytc_UgzophaCC…`: You nailed it. AI is just another tool to help you do your job. The issue is tha…
- `ytc_UgyKPnpm9…`: I’m literally stuck in the between the pros and cons of Ai art but I have to say…
- `ytc_UgybuTd9I…`: yall need to leave my ai alone 😂 Im just tryna make music man. i write the songs…
- `ytc_UgwChvjtY…`: to see what AI is going to do, i think all we need to do is look at our own past…
Comment
This might be a dangerous road for human drivers - but that seems to be mostly down to people going too fast and over-estimating themselves. I don't think this road is particularly taxing for self-driving cars though. They usually don't feel tempted to go too fast.
I think the big challenges for self-driving cars lie elsewhere entirely - in weird intersections, bad or missing street markings, confusing lane changes, bad behavior of other drivers, and visually challenging conditions (think night driving in heavy snow fall, or something). Things like odd angled crossings of more than just two roads - or unmarked pedestrian crossings - or nearing a crash-site on a highway, etc.
youtube
2022-07-07T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxdP0EGbd2Wl3jPYSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgycVT7Ucpf_4N3qteN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwaAsvJZU34Y4K5Pih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxcTz4z5IvaS6M7qOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw1OnYwOnNzasPCFD94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKA6rKlMNK4GWNGe94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxSl1-MCkydfjPmphd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzE8KaJoBTqz84QerB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyCfhi0vpWVIrcAakt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgxVteF9r2N6NjwU7GF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
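A raw response like the one above can be parsed into a lookup table keyed by comment ID, which is what the "Look up by comment ID" view relies on. Below is a minimal sketch in Python. The allowed category vocabularies are an assumption inferred only from the values visible in this sample; the actual codebook may define more categories.

```python
import json

# Assumed vocabularies, inferred from the sample output above;
# the real codebook may permit additional values per dimension.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "skepticism"},
}


def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return a dict keyed by comment ID, rejecting malformed records."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        # Keep only the coded dimensions, dropping anything extra the model emitted.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded


# Usage with a one-record response (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codings = index_codings(raw)
```

Validating against a closed vocabulary at parse time catches the common failure mode where the model invents a new label, rather than letting it silently enter the coded dataset.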