
Amazon AI provokes public anger: it actually learned to favor men over women

via: 博客园 · time: 2018/10/11 16:33:03 · read: 77

Amazon has a headache.

For a long time, this e-commerce giant has been applying AI to improve efficiency. But who would have expected that AI, which ought to be fair and unbiased, could also go astray…

Of all the things it could have learned, it learned sexism and discrimination against women.

And this internal secret has just been exposed to the world by Reuters.

When the news broke, it set off an uproar.

Netizens began to vent, recounting unfair treatment they had experienced and the absurd computer systems behind it. Others set about analyzing the cause of the problem.

Some also took the opportunity to mock Amazon:

“It could be said that this AI is very successful, because it faithfully mimics Amazon's current hiring practices.”

An AI wearing tinted glasses


Screening resumes is one of HR's core tasks. At a large company like Amazon, there are not only many open positions but also huge numbers of applicants for each one, so endlessly browsing resumes becomes HR's daily routine.

Moreover, among the resumes received for any given position, there are always some that fail to meet the employer's standards or are simply off-topic. Even the suitable ones vary in quality: some candidates are excellent, and some are better still. Choosing the most suitable person from among all the applicants is a headache for HR.

Could an AI be found to screen resumes for HR automatically?

So, as an e-commerce company, Amazon borrowed the star-rating mechanism its users apply to products every day and had an AI rate job seekers' resumes from one to five stars. That way, HR could happily say: "I only look at five-star resumes; let the AI pick them out for me."

The project began in 2014, when Amazon's machine learning team started developing a resume-screening AI.

But by 2015, Amazon discovered something was wrong: this AI had picked up bad habits, learning the same gender discrimination found in humans.

When given resumes for "software development engineer" or other technical positions to score, the AI gave low scores to resumes containing the word "women's", while "men's" was unaffected.

In addition, if a resume showed that the applicant had graduated from one of two all-women's colleges, the AI would also dock stars. Reuters' informants did not say which two schools they were.

Must humans now depend on an AI's whims to find a job? As a female job seeker, why should an AI pass judgment on me? In the United States, suspected discrimination like this, if left unaddressed, attracts a huge wave of opposition and very bad press. Amazon hurried to modify the AI to keep it neutral toward the words "male" and "female".

But such patching is never complete. The AI does not discriminate against women today; will it discriminate against Black people tomorrow? Against LGBT people the day after? Will it start making sweeping attacks on entire groups after that? None of this can be guaranteed.

In the end, Amazon abandoned the project.

It's all you humans' fault!

How did a perfectly good AI learn gender discrimination all by itself as it grew up?

It depends on what it was "fed" growing up, that is, what data was used to train it.

Unfortunately, the blame still falls on Amazon itself, because what this AI "ate" was Amazon's own recruitment data: the resumes Amazon had received over the past 10 years. From these resumes, Amazon's AI learned to pick out male job seekers.

Just how male-dominated is that data?

What is wrong with this data? Reuters' analysis suggests it may be because most technical jobs in the tech industry are held by men.

△ Gender ratio of all employees at well-known US technology companies

△ Gender ratio of technical staff at well-known US technology companies

Reuters compiled data released by these companies since 2017. At companies such as Google, Apple, Microsoft, and Facebook, men account for about two-thirds of the total workforce; in technical positions alone, the figure is nearly four-fifths.

AI learns about human discrimination

But does being a minority in the data necessarily mean being discriminated against? Surely Amazon's AI was not so naive as to learn only the traits shared by the majority of resumes; wouldn't that mean missing out on a few geniuses?

In the comment sections of Hacker News and Reddit, the more technically minded netizens pointed straight at the gender discrimination already present in Amazon's hiring.

Technically speaking, this AI could be called a success, because it faithfully mimics Amazon's current hiring practices.

And they spelled out the logic behind it:


The machine learning process itself does not introduce any bias, but any bias in the training data will be faithfully reflected in the algorithm.


In other words, AI itself is an innocent "child" that knows neither prejudice nor discrimination. But if the "teacher" tutoring this child, Amazon's recruitment data, carries prejudice of its own, then those prejudices are passed on, by word and example, to the innocent AI.

Put another way, AI learns human prejudice and discrimination from human society itself.
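The point can be sketched with a toy example. All the resumes, labels, and keywords below are hypothetical, and the scorer is a deliberately simple stand-in for a real model: a keyword weighter trained on past hiring decisions. Because the historical labels happen to correlate with one word, the "trained" scorer reproduces the bias even though nobody programmed it in.

```python
# A minimal sketch (hypothetical data) of how bias in training labels
# leaks into a learned keyword scorer.

from collections import Counter

# Toy historical data: (keywords on the resume, was the candidate hired?)
# The past decisions happen to penalize the word "women's".
history = [
    ({"java", "chess"}, True),
    ({"java", "leadership"}, True),
    ({"python", "captain"}, True),
    ({"java", "women's", "chess"}, False),
    ({"python", "women's", "captain"}, False),
]

def keyword_weights(history):
    """Weight each keyword by hired-count minus rejected-count."""
    hired, rejected = Counter(), Counter()
    for words, was_hired in history:
        (hired if was_hired else rejected).update(words)
    vocab = set(hired) | set(rejected)
    return {w: hired[w] - rejected[w] for w in vocab}

weights = keyword_weights(history)

def score(resume_words):
    """Score a new resume with the learned weights."""
    return sum(weights.get(w, 0) for w in resume_words)

# Two otherwise identical resumes, differing by a single word:
print(score({"java", "chess"}))             # -> 1
print(score({"java", "chess", "women's"}))  # -> -1
```

Nothing in the code mentions gender as a feature; the penalty emerges entirely from the labels it was trained on, which is the mechanism the commenters describe.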

We don't want AI to discriminate against women, but that is no easy task, because AI cannot simply ignore the discrimination embedded in the human society it learns from. This is a genuinely hard problem: not just a technical one, but a philosophical one.


This is not the first time AI has inadvertently picked up humanity's vices.

Previously, Microsoft's chatbot Tay learned extremist remarks from humans and cursed feminists and Jews on Twitter.

And this time, with recruiting AI, human error has once again led AI to repeat the same mistakes.

“Obviously, we have not learned any lessons from Microsoft's Tay,” one netizen commented.


Does this AI just rely on keywords?

It is not just a question of training data. The Reuters report also revealed details of how Amazon trained this AI:

· developed 500 models for specific job functions and positions.

· Train each model to identify nearly 50,000 keywords that appeared in past job seekers' resumes.

· The models' algorithms ranked job seekers' skills according to the importance of these keywords.

So much of this AI's work amounted to catching keywords. For example, it favored verbs such as "executed" and "captured", which appear more often in male job seekers' resumes, putting female candidates at a disadvantage along yet another dimension.

This, then, is another source of the gender discrimination. Worse, it may give people an opportunity to game the system.
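The loophole can be illustrated with a toy scorer. The keywords and weights below are hypothetical, not Amazon's: a scorer that simply sums a weight for every keyword occurrence, with no cap and no sense of context, can be inflated by sheer repetition.

```python
# A minimal sketch (hypothetical keywords and weights) of why a pure
# keyword scorer can be gamed by stuffing resumes with valued words.

PREFERRED = {"executed": 2, "captured": 2, "java": 1, "leadership": 1}

def naive_score(resume_text):
    """Sum a weight for every keyword occurrence -- no cap, no context."""
    return sum(PREFERRED.get(word, 0) for word in resume_text.lower().split())

honest = "led a team and shipped a java service"
stuffed = "executed executed captured captured java java java"

print(naive_score(honest))   # -> 1
print(naive_score(stuffed))  # -> 11
```

The stuffed resume says nothing more about the candidate's actual ability, yet it scores far higher, which is exactly the unfairness the next comment points at.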

A netizen on Reddit commented:

“Amazon's resume-reading AI was doomed from the start, because anyone can learn how to write a resume. Let me write a doctor's resume, and I bet I'd do better than a real doctor.”


Think about those "resume writing skills" courses. Don't they all tell you that HR spends only twenty or thirty seconds on each resume, and that a few keywords and striking numbers are what catch HR's eye?

So this keyword-catching mechanism lets many people earn higher star ratings by forcing keywords into their resumes, creating another kind of unfairness.

AI recruitment: a long and difficult road ahead

According to a 2017 survey by the US recruiting company CareerBuilder, 55% of HR managers said they would adopt AI as a tool in their daily work within the next five years.

Some "radical" companies, or those with massive hiring needs, have already applied AI to the recruitment process. Hilton, for example, first has a chatbot interview job seekers and match them to positions before they move on to the next round of interviews.

In an interview, Sarah Smart, deputy director of recruitment at Hilton, said: "The AI analyzes a job seeker's tone of voice, eye movements, and answers to judge whether they are passionate about the job, helping us screen candidates."


What is the experience actually like? Japser Rey, who went through a chatbot interview, said: "When talking to a chatbot, I don't have to worry about getting distracted, and the robot doesn't judge people through tinted glasses; it's relatively fair and just."

That said, most companies do not let AI make actual hiring decisions; they use it only as an aid.

· Baidu: Applies AI to recruitment. In this year's campus recruiting, AI analyzed resumes and recommended positions.

· Goldman Sachs: Developed a resume analysis tool that would match job seekers with the “best fit” department.

· LinkedIn: Uses algorithms to provide employers with rankings of job seekers based on job postings published on the site.

People may not be eager to be recruited by AI, but the trend is unstoppable: sooner or later, AI will be looking over your job application.

It may arrive late, but it will not be absent.
