Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
>A year later, however, the engineers reportedly noticed something troubling about their engine – it didn’t like women. This was apparently because the AI combed through predominantly male résumés submitted to Amazon over a 10-year period to accrue data about whom to hire.
>
>Consequently, the AI concluded that men were preferable. It reportedly downgraded résumés containing the words “women’s” and filtered out candidates who had attended two women-only colleges.
I imagine they didn't implement a gender check, rather they probably started by hand-picking the best résumés and then letting the A.I. determine what it was that made them good. "women's" was apparently not a winning word.
I don't know a thing about A.I. development though, so I could be way off the mark.
reddit · Cross-Cultural · 2018-10-11 (Unix timestamp 1539253685) · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_e7keifq","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_e7keda1","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_e7koo9j","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_e7irmvj","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"rdc_e7jqq89","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
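The raw response is a JSON array with one coding object per comment. A minimal Python sketch of how such a batch could be indexed for lookup by comment ID (the variable names and the two-row sample are illustrative, not part of the tool; only the field names come from the response above):

```python
import json

# Two rows copied from the raw LLM response above, used as sample input.
raw_response = """[
  {"id":"rdc_e7keifq","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_e7koo9j","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the batch by comment ID so individual codings can be retrieved.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

coding = codes_by_id["rdc_e7keifq"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

Keying on `id` assumes IDs are unique within a batch; a duplicate ID would silently keep only the last row.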