Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We are literally 100% beyond the point of stopping AI from being autonomous mili…" (ytc_UgzEP8QtA…)
- "No, you can have UBI and real people can still contribute to monitoring social i…" (ytc_UgypWgVYP…)
- "This happened in Canada and here there needs to be more than just a disclaimer w…" (rdc_ks8u3sm)
- "21:19 this sounds like all AI will eventually become the same. Just one big supe…" (ytc_UgyowVuOk…)
- "@nidadursunoglu6663 Wow. YT really doesn't want me talking about this (my last t…" (ytr_UgyUmVNE1…)
- "yeah like sucks to suck if you can't see or you're so mentally fvck'd that you c…" (ytr_UgyArMN3x…)
- "AI is just like the event horizon 1997 film. There is no escape you're coming wi…" (ytr_UgwsylluL…)
- "AI 'artists' don't seem to know what sentimental value is. Saying AI is also an …" (ytc_UgzStHHqi…)
Comment
<hate capture portals>
PRIME INTELLIGENCE
Amazon built an AI tool to hire people but had to shut it down because it was discriminating against women
Isobel Asher Hamilton
Amazon CEO Jeff Bezos. David Ryder/Getty Images
Amazon tried building an artificial-intelligence tool to help with recruiting, but it showed a bias against women, Reuters reports.
Engineers reportedly found the AI was unfavorable toward female candidates because it had combed through male-dominated résumés to accrue its data.
Amazon reportedly abandoned the project at the beginning of 2017.
Amazon worked on building an artificial-intelligence tool to help with hiring, but the plans backfired when the company discovered the system discriminated against women, Reuters reports.
Citing five sources, Reuters said Amazon set up an engineering team in Edinburgh, Scotland, in 2014 to find a way to automate its recruitment.
The company created 500 computer models to trawl through past candidates' résumés and pick up on about 50,000 key terms. The system would crawl the web to recommend candidates.
"They literally wanted it to be an engine where I'm going to give you 100 résumés, it will spit out the top five, and we'll hire those," one source told Reuters.
A year later, however, the engineers reportedly noticed something troubling about their engine: it didn't like women. This was apparently because the AI had been trained on résumés submitted to Amazon over a 10-year period, most of which came from men.
Consequently, the AI concluded that men were preferable. It reportedly downgraded résumés containing the word "women's" and marked down graduates of two women-only colleges.
Amazon's engineers apparently tweaked the system to remedy these particular forms of bias but couldn't be sure the AI wouldn't find new ways to unfairly discriminate against candidates.
Gender bias was not the only problem, Reuters' sources said. The computer programs al
Source: reddit · Cross-Cultural · 1539208935.0 (2018-10-10 UTC) · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_e7jfxbi","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"rdc_e7j8m7u","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_e7j4eex","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"rdc_e7jrf20","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"rdc_e7iswsg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
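As a minimal sketch of how a raw response like the one above might be parsed downstream: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON itself, but the full set of allowed code values is an assumption here, inferred only from the values that appear in this batch. The real codebook would supply the authoritative lists.

```python
import json

# Dimensions and values observed in the raw response above; the complete
# allowed-value sets are an assumption standing in for the real codebook.
DIMENSIONS = {
    "responsibility": {"company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Rows without an id are dropped; any missing or out-of-codebook
    value falls back to "unclear" rather than failing the batch.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for dim, value in codes.items():
            if value not in DIMENSIONS[dim]:
                codes[dim] = "unclear"  # out-of-codebook value
        coded[cid] = codes
    return coded

raw = '[{"id":"rdc_e7jfxbi","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"unclear"}]'
print(parse_coding_response(raw)["rdc_e7jfxbi"]["responsibility"])  # prints "company"
```

Treating malformed values as "unclear" instead of raising keeps one bad row from discarding an entire coded batch, which matches how the table above already uses "unclear" as its fallback code.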