The Jakarta Post


AI worsens the burden for Indonesia’s female gig workers

Women pay the highest price because algorithms fail to recognize the realities of care work, safety concerns and social norms.

Suci Lestari Yuana (The Jakarta Post)
The Conversation
Published on April 19, 2026

Female ojol (online motorbike transportation) drivers participate in a motorcade on April 21, 2024, organized by the East Java provincial administration in Surabaya as part of Kartini Day, a day dedicated to the fight for gender equality. (Antara/Trisnadi)

Artificial intelligence is often celebrated as the future of work. It is efficient, innovative and neutral. Yet, for many women in Indonesia’s gig economy, AI feels like a source of mounting pressure.

In my recent research on female gig workers in Indonesia, I examine what I call AI colonialism. This term describes how colonial influence persists today through technology and digital systems that maintain control.

This concept captures how powerful actors use AI, often based in the Global North, to exploit workers in the Global South. Much like historical colonialism, this digital iteration relies on the extraction of data, labor and resources to cement unequal power relations.

In Indonesia, AI-driven platforms like ride-hailing and e-commerce draw on informal labor but push the risks and responsibilities back onto workers. Women pay the highest price because algorithms fail to recognize the realities of care work, safety concerns and social norms.

Indonesia’s labor market has long been defined by informality. Millions work without formal contracts or social protections. Tech companies like Gojek, Grab, Maxim and Shopee did not formalize this workforce; they only digitized it.

Drivers are classified as partners rather than employees. This means no minimum wage, no sick pay and no maternity leave. Income is dictated entirely by completed tasks and algorithmic ratings.


For women, this structure collides with the so-called “double burden”: responsibility for both paid work and unpaid care.

Lia, a 33-year-old food delivery rider, wakes before sunrise to cook and get her children ready for school. It is only after she has cleared her domestic duties that she finally logs into the app.

“The system doesn’t know I have children,” she told me. “It only knows whether I am online.”

Platform algorithms reward constant, uninterrupted availability. Incentive schemes demand a specific number of trips within narrow time windows, a high bar for those with domestic ties.

If Lia logs off to pick up her children, she risks losing potential bonuses. If she reduces her hours due to menstrual pain or fatigue, her performance metrics drop.

Neoliberal capitalism relies on a massive amount of unpaid “invisible labor”, such as childcare and housework, but refuses to pay for it or provide a safety net for those who do it. Far from correcting this imbalance, AI systems make things worse.

When Cinthia, a female food delivery rider and a single mother of a one-year-old, fell ill and turned off her app for several days, she noticed fewer job offers upon returning. “It felt like the system punished me,” she said. “Now I’m afraid to stop working.”

The algorithm does not explicitly discriminate. However, it operates on the assumption of a worker without caregiving constraints, a norm that systematically disadvantages women.

The digital economy often claims neutrality. But gender bias persists.

Yanti, a 43-year-old ride-hailing driver in Yogyakarta, regularly messages male passengers before pickup: “I am a woman driver. Is that okay?”

Many cancel immediately.

The app records cancellations. It does not record gender bias.

Because Yanti avoids working late at night for safety reasons, she misses out on rush-hour incentives. The system, however, doesn’t account for safety; it simply interprets her absence as lower productivity.

Scholars such as Virginia Eubanks have pointed out that automated systems often mirror and amplify social inequalities rather than eliminate them.

In Indonesia’s platform economy, discrimination isn’t necessarily hard-coded. It is a byproduct of a design logic that favors efficiency over equity.

In India, women drivers also report earning less on average than their male counterparts, partly due to safety-driven choices regarding timing and route selection. The algorithm does not account for risk in its calculations. It only measures raw output.

For women drivers, safety is a constant negotiation.

Around 90 percent of the women in our focus group discussions chose food delivery because it felt safer than ride-hailing. Even so, harassment persists in delivery work.

Lia shared how a male colleague targeted her with inappropriate comments as they waited for orders. “It’s not only customers,” she said. “Sometimes it’s other drivers.”

During the COVID-19 pandemic, gig workers were labeled “essential”. Yet their income dropped dramatically, by as much as 67 percent in early 2020. To cover the loss, many worked 13 or more hours per day.

Platforms maintained their rigid performance metrics throughout the crisis. Drivers forced to stop working due to illness often saw their ratings decline. Health vulnerability was translated directly into an algorithmic penalty.

This reflects labor discipline through digital infrastructure: control shifting from foreman to code.

AI colonialism is more than just foreign ownership. It is about the way extractive logics are woven into everyday digital systems. Workers bear the burden of labor, data, time and risk, yet the platforms hold all the power over algorithmic governance.

Female gig workers have built dense networks of solidarity through WhatsApp and Telegram groups. They share information about policy changes, warn each other about unsafe customers and exchange strategies for navigating algorithmic shifts.

If an account becomes “gagu” (silent), receiving few orders, experienced drivers “warm it up” by temporarily boosting its activity. They lend money for fuel. They pool resources for vehicle repairs.

When someone faces harassment, others circulate the information quickly to protect fellow drivers. When a member was suspended, they visited the platform office together.

Rather than waiting to be formally acknowledged as employees, these women build protection among themselves. This “solidarity over recognition” emerges from shared vulnerability as mothers, caregivers and workers in male-dominated spaces.

Their mutual aid turns care into a strategy and a form of “everyday resistance”, subtle acts that challenge dominant systems, while reflecting a distinctly feminist ethic of survival through relational solidarity.

AI is not colonial by design. But when embedded in platform capitalism within unequal societies, it can reproduce colonial patterns of exploitation and loss of ownership.

If we are serious about building just digital futures, we must move beyond innovation narratives and listen to workers, especially women and vulnerable groups in the Global South.

Their stories are a vital reminder that behind every “efficient” algorithm is a human being navigating the delicate balance of survival, dignity and hope.

---

The writer is a lecturer at the Faculty of Social and Political Sciences, Gadjah Mada University. This article is republished under a Creative Commons license.
