Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Don't worry about the electricity, don't worry about the oil, worry about the fr…" (ytc_UgzRj_d6w…)
- "Exactly. for example using AI to make a video game boss that learns your moves e…" (ytr_Ugwf2n7VL…)
- "[Glue pizza and eat rocks: Google AI search errors go viral](https://www.bbc.com…" (rdc_n8lrj0v)
- "My favourite AI Safety channel 👍. Thanks for the video. It's like... the curren…" (ytc_UgxHys95L…)
- "I feel like you’re dancing around the biggest point and you refuse to address it…" (ytc_UgwCG3Lwm…)
- "As someone also learning game dev I am so glad a large portion of the game dev c…" (ytr_UgyxvLE3j…)
- "Wow. Way to violate our rights!! This is not ok. I do have a ring and it’s not c…" (ytc_UgwTd1Z98…)
- "If everyone loses jobs who will buy products and services if they have no money …" (ytc_UgyRw_WkT…)
Comment
The problem is they are attempting to make it "Unbiased" which is unfortunately impossible and not realistic. If you ask it *in English* (which is a key term here). To show you 100 pictures of doctors, the expected result as per a 2019 report would be 56 photos of white doctors, 17 photos of Asian doctors, 5 Hispanic and 5 black with the rest basically unknown. This unfortunately would be "racist" generation by today's upside down logic. There is more to the story, I believe an AI would be smart enough to understand, and obviously not have stupid human bias to show more white people than anything else, and i do believe some people at Google (when considering refused to show black when even prompted) had good intentions to fix this obvious fault. But also knowing google, oh there def was e a ton of incompetence. So "OH RACIST GOOGLE" is a half truth.
Curiously it has no stats for Indian physicians which we all know are plentiful (okay Indian races are apparently considered "Asian" races? Because continent of Asia? Okay.... I mean "technically" I guess... they don't share much in common with one another so i wouldn't consider it the same race but whatever....)
Source: youtube · 2024-03-25T16:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzF0lOVVic0nTsjrqB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTOY2zGBER34OSCjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwi1JZX6OhH2cSBkRd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzw-7giaSWCdZYUeEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZEigjF46-1wcYeqV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy-ql3fW4uiY-W88Pd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwL1zDiQ4CS5cwfdzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxwekT3JDYvHpBsl_54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzV54YoSwAAp44On8l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2kcI_ac1BjlRYOP54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
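Responses in this shape can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming a Python pipeline and a codebook limited to the category values visible in the examples on this page (the real codebook may define more categories); the function name and the `ALLOWED` table are illustrative, not part of the actual system.

```python
import json

# Allowed values per coding dimension, inferred from the examples above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"outrage", "resignation", "indifference", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}: {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"approval"}]')
print(len(validate_codes(raw)))  # → 1
```

Rejecting malformed or out-of-codebook values at parse time keeps a single bad model output from silently corrupting the coded dataset.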