TheJakartaPost


Will AI-based recruitment process lead to gender discrimination?

Yohana Belinda (The Jakarta Post)
Jakarta
Wed, July 27, 2022 (first published July 26, 2022)


Does artificial intelligence-assisted recruitment lead to gender-based bias?

Gender equality in the workplace is an issue that most countries, including Indonesia, still struggle with. The Indonesian Central Bureau of Statistics’ 2019-2021 data on formal workforce participation by gender show that men participate in the formal workforce more than women by an average of 7.75 percent.

In Indonesia, such biases are often explicitly displayed in job listings. Gender preferences that seem unrelated to the advertised job descriptions are a common sight.

Acknowledging these biases, some companies are moving toward adopting technologies for recruitment, in the hope that hiring decisions separated from human biases can be made more objectively. Advanced technology, however, does not mean gender biases have completely disappeared.

In Indonesia, a number of major companies, including those in the property and finance sectors, already utilize such systems.

Sheilla Njoto, who is currently pursuing her PhD in sociology and digital ethics at the University of Melbourne, found in her research the possibility of gender bias during the artificial intelligence (AI)-assisted recruitment process. Resumes from male candidates tend to receive more positive feedback, such as more curriculum vitae downloads and profile views from hirers, than those from female candidates. Using the fake profiles “Jane” and “James”, each with identical experiences and qualifications, James received 36.8 percent positive feedback for management positions while Jane received none.

“Indonesia has yet to experience this level of recruitment technology, though we are on a trajectory toward that advancement. Eventually, without proper mitigation, recruitment technologies could widen the gap that already exists, especially in industries that are considered more ‘masculine’,” Sheilla shared. She added that AI intervention in recruitment could very well amplify the gender gap in certain roles.

Searching: Algorithms used for job recruitment are choosing men over more qualified women, especially if they have taken maternity leave, according to a University of Melbourne study. (Pexel/Pixabay)

How does AI play a role in gender discrimination in recruitment? 

Most AI systems work based on a certain sequence: they learn to make predictions (or decisions) from existing data, which often includes unbalanced gender representation, missing data and previous subconscious human biases. If a pattern of bias exists in that data, the AI reproduces it when executing the subsequent selection process.
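As an illustration of how a pattern in historical data becomes a pattern in decisions, the loop can be sketched in a few lines of Python. All figures below are hypothetical, chosen only to mirror the kind of imbalance described above; they come from no real recruitment system.

```python
# Hypothetical historical hiring records, as (gender, hired) pairs.
# Men dominate past hires, mirroring earlier human decisions.
history = ([("M", True)] * 70 + [("M", False)] * 30 +
           [("F", True)] * 20 + [("F", False)] * 30)

def hire_rate(records, gender):
    """P(hired | gender) as 'learned' from the historical data."""
    outcomes = [hired for g, hired in records if g == gender]
    return sum(outcomes) / len(outcomes)

# A naive scorer that rates candidates by their group's past hire
# rate simply reproduces the historical imbalance.
score_m = hire_rate(history, "M")  # 0.7
score_f = hire_rate(history, "F")  # 0.4
```

Nothing in this sketch "decides" to discriminate; the imbalance in the records alone is enough to produce unequal scores.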

An example of this is Amazon. As Reuters wrote in its 2018 article about Amazon scrapping its “secret AI recruiting tool” after it proved to be biased against female job seekers, the system had been trained on resumes submitted to the company over a ten-year period, most of which came from men. In effect, Amazon’s system taught itself that male candidates were preferable. Amazon has since revised the system.

What does this mean? There could be further gender discrimination against women pursuing careers in the science, technology, engineering and mathematics (STEM) fields or seeking higher positions in a company.

Unconscious gender bias 

Valencia Gabriella, an Indonesian entrepreneur who holds a bachelor's degree in chemical and biomolecular engineering from Hong Kong University and a master's in entrepreneurship from the University of Melbourne, raises the issue of perceived biological preference.

“In my previous employment, most tech companies, or the marketing sector, preferred to hire men candidates over women candidates. One of the strongest reasons is because men are less likely to take leave during pregnancy or resign after giving birth,” she elaborated. 

Syahid Deradjat, a labor practitioner, shared that women suffer from the stereotypes attached to them by society: they are considered physically weak and unfit for fields that require physical strength and fitness, and are perceived as less independent and unable to be posted in remote areas far from family. Women also need accommodations that many employers find difficult to provide, such as separate bedrooms and bathrooms in the field, a lactation room for breastfeeding mothers and months of maternity leave, while men can work all the time.

“We see problems even long before the recruitment process. Many companies find it difficult to build a decent pool of women talent for hire. Women are unlikely to enter or survive in STEM programs because of the stereotypical culture,” Syahid explained, adding that it could also affect people’s decision-making in the future.

“Parents don’t want women to enter STEM programs because it is not for women. Parents are not sure if women will get a decent STEM job after graduation because many employers prefer men. Parents, who are the main sponsor of education, don’t want to waste their money on STEM programs for their daughters unless there are scholarships. We don't have enough representation of successful women in the STEM fields to show society that women have a future in STEM careers,” Syahid further added. 

An article posted on the UNESCO website puts forth that while women make up almost half the world’s population, they account for only an estimated 30 percent of STEM researchers globally. In Asia, only three of 18 countries had an equal or higher proportion of women researchers: the Philippines (52 percent), Thailand (51 percent) and Kazakhstan (50 percent).

In Indonesia, the gap is still wide. According to research from the University of Melbourne, only 18 percent of working women were employed in the industry sector, encompassing mining, manufacturing, utilities, construction, and information technology and communications. For men, the figure was 28 percent in 2019. The difference was even more evident in high-tech industries such as information technology and communications, where only 14 percent of women working in that sector held positions as professionals or technicians, compared to 31 percent of men.

Human-taught bias

Sunu Wibirama, an AI practitioner and lecturer at Gadjah Mada University, explained that gender bias in algorithms can arise from two sources: bias in the training data and bias in human decisions.

Bias in the training data means an unbalanced representation of protected groups, and algorithms may include protected attributes among the variables that define “success”. Bias in human decisions arises because algorithms are programmed by humans and trained on datasets full of human decisions; they tend to mirror human behavior patterns and are therefore prone to repeating human biases.
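One reason simply removing the protected attribute is not enough is that other variables can act as proxies for it. A minimal sketch, using an invented proxy feature ("uninterrupted years of employment", which career breaks such as maternity leave reduce), with purely illustrative numbers:

```python
# Hypothetical applicant records: a protected attribute (gender) plus a
# correlated proxy feature. Career breaks lower "uninterrupted_years",
# so the proxy encodes gender indirectly.
applicants = [
    {"gender": "M", "uninterrupted_years": 8, "hired": 1},
    {"gender": "M", "uninterrupted_years": 7, "hired": 1},
    {"gender": "F", "uninterrupted_years": 3, "hired": 0},
    {"gender": "F", "uninterrupted_years": 4, "hired": 0},
]

# Even with "gender" dropped from the inputs, a model that learns a
# simple threshold on the proxy reproduces the biased historical labels.
THRESHOLD = 5  # cut-off separating past hires from past rejections

def predict(applicant):
    return 1 if applicant["uninterrupted_years"] >= THRESHOLD else 0

predictions = [predict(a) for a in applicants]
```

The model never sees gender, yet its predictions match the biased hiring history exactly, because the proxy carries the same signal.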

“We can actually create our own biases unintentionally. For example, we gave input on 70 characteristics of men and 30 characteristics of women. The automation will suggest the output of men instead of women. But, in fact, when we face more complicated variables, the system won’t learn by itself. Especially when humans also teach them to do so,” Sunu explained. 

Choosing candidates: Sheilla explained that the algorithm detected men had more relevant experience than women, even in the small sample tested, but women had a better match on the keyword requirement. (Courtesy of Sheilla Njoto)

Additionally, Sheilla noted that over time, due to the unbalanced representation, the program also taught itself to favor masculine vocabulary. This is because men and women often describe themselves in different ways even when they convey the same meaning. For example, in describing a leadership role, men tend to use the words “direct a team” whereas women tend to use the words “facilitate a team”.
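The vocabulary effect can be sketched the same way: if a scorer has learned, from past male-dominated hires, to weight "masculine" verbs more heavily, two resumes describing identical experience receive different scores. The resume lines and weights below are invented for illustration, not drawn from any real system.

```python
# Two hypothetical resume lines describing the same leadership
# experience with different verbs, as in the "direct" vs.
# "facilitate" example above.
resumes = {
    "james": "directed a team of engineers and executed the launch",
    "jane": "facilitated a team of engineers and coordinated the launch",
}

# Invented weights standing in for what a scorer might learn from
# past, male-dominated hires.
learned_weights = {"directed": 2.0, "executed": 1.5,
                   "facilitated": 0.5, "coordinated": 0.5, "team": 1.0}

def score(text):
    """Sum the learned weight of each word in the resume text."""
    return sum(learned_weights.get(word, 0.0) for word in text.split())

# Identical experience, different word choice, different ranking.
scores = {name: score(text) for name, text in resumes.items()}
```

Word choice alone is enough to separate the two candidates, which is how vocabulary becomes a proxy for gender.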

“Moreover, if we’re looking at unspecific companies, there are more men than women workers. Thus, it can also affect the data-recruitment process,” Sunu continued, explaining that gender bias in AI can eventually worsen when humans prefer men for specific positions in a company.

He added that algorithms should be programmed not to discriminate in the future.

“It’s going to be difficult to determine the gender discrimination in the future if the AI does it because they learned from the previous pattern,” Sunu said. 

Sheilla further explained that AI’s influence on the recruitment process might grow, and so might its impact on the role of women in the workforce and the economy. AI is nothing more than a system making decisions based on data. Unfortunately, if data is meant to reflect society, it will never be immune to the reality of current inequality.

“Technological advancements are to be celebrated -- but to improve equality, what we need isn’t necessarily another set of technologies; rather, our conscious acknowledgment of a biased system and the empathetic endeavors towards a more equitable future -- precisely those [factors] that technologies have yet to understand,” she closed. 
