Blog

5 October 2020

Platform Economy

Platform Drivers: From Algorithmizing Humans to Humanizing Algorithms

How are platforms using algorithms to control and surveil their driver-partners, and what needs to be done to address workers' concerns in the gig economy? Find out in this FemLab.co blog as Pallavi Bansal gathers voices from both platforms and drivers.

I remember getting stranded in the middle of the road a few years ago when an Ola cab driver remarked that my trip had ended abruptly and he could not take me to my destination. Frantic, I still requested him to drop me home, but he refused, saying he could not complete the ride since the app had stopped working. On another unfortunate day, I was unable to find a cab back home as drivers kept refusing to take up what they saw as a long ride. When I eventually found a cab, the driver complained throughout about how multiple short rides benefit him more than a single long one. I tried to tip him after he finished the ride, but he instead requested me to book the same cab again, for a few kilometres, as that would reap more rewards. While I wanted to oblige, I couldn’t find the same driver, even though he had parked his car right outside my house. In yet another incident, I spent the entire night at the airport, as I was terrified to book a cab at that late hour. I regretted not checking the flight timings before confirming the booking, having overlooked the fact that women need to be cautious about these things.

Image credit: Pixabay / Pexels

Although my first response was to blame the cab drivers for what I saw as an unprofessional attitude, it slowly dawned on me that they have their own constraints. In the first scenario, the app had actually stopped working, so the driver could not complete the ride for fear of getting penalized; the episode also earned him a bad rating from me. In the second situation, I wondered why the algorithms reward shorter rides rather than longer ones. Moreover, how do they assign drivers if proximity isn’t the only factor, and why was my driver not aware of that? In the third instance, why couldn’t I be assigned a woman driver to make me feel safer when travelling late at night?

I spoke to a few senior managers and executives working at popular ride-sharing apps in India to find the answers.

Constant tracking

A senior manager of a well-known ride-sharing platform explained their tracking practices on condition of anonymity:

“The location of driver-partners is tracked every two-three seconds and if they deviate from their assigned destination, our system detects it immediately. Besides ensuring safety, this is done so that the drivers do not spoof their locations. It has been noticed that some drivers use counterfeit location technology to give fake information about their location – they could be sitting at their homes and their location would be miles away. If the system identifies anomalies in their geo-ping, we block the payment of the drivers.”

While this appears to be a legitimate strategy to address fraud, there is no clarity on how a driver can produce evidence when there is a genuine GPS malfunction. Another interviewee, who holds a top management position at a ride-sharing company, said, “it is difficult to establish trust between platform companies and driver-partners, especially when we hear about drivers coming up with new strategies to outwit the system every second day.” For instance, some drivers brought a technical hacker on board so that bookings could be made via a computer rather than a smartphone, while others artificially surged prices by collaborating with fellow drivers to turn their apps off and on again simultaneously.
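How might such geo-ping anomaly detection work? Below is a minimal Python sketch built on one common heuristic: flagging consecutive pings that imply a physically implausible travel speed. The 150 km/h cap and the function names are my illustrative assumptions, not any platform’s actual logic.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def flag_suspicious_pings(pings, max_speed_kmh=150):
    """Return indices of pings that imply an implausible speed.

    `pings` is a time-ordered list of (timestamp_seconds, lat, lon).
    The 150 km/h cap is an illustrative assumption, not a platform value.
    """
    flagged = []
    for i in range(1, len(pings)):
        t0, lat0, lon0 = pings[i - 1]
        t1, lat1, lon1 = pings[i]
        hours = max(t1 - t0, 1) / 3600.0  # guard against zero elapsed time
        speed_kmh = haversine_km(lat0, lon0, lat1, lon1) / hours
        if speed_kmh > max_speed_kmh:
            flagged.append(i)
    return flagged
```

A production system would also have to account for GPS accuracy and known signal dead zones, which is precisely why an honest driver with a malfunctioning GPS can be hard to distinguish from a spoofer, and why blocked payments are so contentious.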

Though the ‘frauds’ committed by drivers are out in the public domain, it is seldom discussed how constant surveillance reduces productivity and amplifies frustration, prompting ‘clever ways’ to fight back. Drivers are continuously tracked by ride-sharing apps, and if they fail to follow any of the instructions these apps provide, they are either penalized or banned from the platform. This technology-mediated scrutiny can intensify drivers’ stress and have adverse effects on their mental health and psychological well-being.

Algorithmic management

Algorithms control several aspects of the job for the drivers – from allocating rides to tracking workers’ behaviour and evaluating their performance. This lack of personal contact with the supervisors and other colleagues can be dehumanizing and disempowering and can result in the weakening of worker solidarities.

When asked if the algorithms can adjust the route for the drivers, especially for women, if they need to use the restroom, a platform executive said, “They always have the option not to accept the ride if there is a need to use the washroom. The customers cannot wait if the driver stops the car for restroom break and at the same time, who will pay for the waiting time?”

Image credit: Antonio Batinić / Pexels

While this makes sense at first glance, in reality the algorithms of some ride-sharing platforms, such as Lyft, penalize drivers in such cases: every declined request lowers the driver’s assignment acceptance rate (the number of ride requests accepted divided by the total number of requests received), and a low rate invites penalties. Min Kyung Lee and her colleagues, human-computer interaction (HCI) scholars at Carnegie Mellon University, explored the impact of algorithmic management on human workers in the context of ride-sharing platforms and found:

 “The regulation of the acceptance rate threshold encouraged drivers to accept most requests, enabling more passengers to get rides. Keeping the assignment acceptance rate high was important, placing pressure on drivers. For example, P13 [one of the drivers] stated in response to why he accepted a particular request: ‘Because my acceptance rating has to be really high, and there’s lots of pressure to do that. […] I had no reason not to accept it, so […] I did. Because if, you know, you miss those pings, it kind of really affects that rating and Lyft doesn’t like that.’”

Uber no longer displays the assignment acceptance rate in the app and states that it has no impact on drivers’ promotions. Ola India’s terms and conditions state that “the driver has sole and complete discretion to accept or reject each request for Service”, without mentioning the acceptance rate. However, Ola Australia indicates the following on its website: “Build your acceptance rate quickly to get prioritised for booking! The sooner and more often you accept rides (as soon as you are on-boarded), the greater the priority and access to MORE ride bookings!”
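The acceptance-rate mechanic that drivers describe is simple enough to state in code. The sketch below is purely illustrative: the 90% threshold and the deprioritization rule are hypothetical values of my own choosing, since the platforms publish neither.

```python
def acceptance_rate(accepted: int, received: int) -> float:
    """Requests accepted divided by total requests received."""
    return accepted / received if received else 1.0

def loses_priority(accepted: int, received: int, threshold: float = 0.9) -> bool:
    """Hypothetical rule: drivers below the threshold lose dispatch priority."""
    return acceptance_rate(accepted, received) < threshold

# Declining just 3 of 20 pings drops a driver to 85%, which under this
# hypothetical 90% threshold already costs them priority.
print(loses_priority(accepted=17, received=20))  # True
```

The arithmetic is steep: declining even a handful of pings can push a driver below a high threshold, which is exactly the pressure P13 describes above.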

This lack of information, coupled with ambiguity, complicates the situation for drivers, who end up trying not to reject rides under any circumstances. Moreover, the algorithms are designed to keep drivers under persistent pressure through psychological tricks, as Noam Scheiber pointed out in an article for The New York Times:

“To keep drivers on the road, the company has exploited some people’s tendency to set earnings goals — alerting them that they are ever so close to hitting a precious target when they try to log off. It has even concocted an algorithm similar to a Netflix feature that automatically loads the next program, which many experts believe encourages binge-watching. In Uber’s case, this means sending drivers their next fare opportunity before their current ride is even over.”

Algorithmic decision-making also directs our attention to how rides are allocated. A product manager at a popular ride-sharing app said:

“Apart from proximity, the algorithms keep in mind various parameters for assigning rides, such as past performance of the drivers, their loyalty towards the platform, feedback from the customers, if the drivers made enough money during the day etc. The weightage of these parameters keep changing and hence cannot be revealed.”
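To make this concrete, here is a hypothetical Python sketch of a weighted driver-scoring function using the parameters the product manager names; every weight, formula, and name below is an assumption for illustration, not the platform’s actual model.

```python
def driver_score(distance_km, rating, loyalty, earnings_today, w):
    """Hypothetical weighted score over the parameters named in the interview.

    Nearer drivers, better-rated drivers, loyal drivers, and drivers who
    have earned less so far today all score higher. Weights are illustrative.
    """
    return (
        w["proximity"] * (1.0 / (1.0 + distance_km))
        + w["rating"] * (rating / 5.0)
        + w["loyalty"] * loyalty
        + w["need"] * (1.0 / (1.0 + earnings_today / 100.0))
    )

# Illustrative weights; per the interview, the real weightage keeps changing.
w = {"proximity": 0.5, "rating": 0.2, "loyalty": 0.2, "need": 0.1}
scores = {
    "driver_a": driver_score(1.2, 4.8, loyalty=0.9, earnings_today=1500, w=w),
    "driver_b": driver_score(0.4, 4.2, loyalty=0.5, earnings_today=300, w=w),
}
print(max(scores, key=scores.get))  # the ride goes to the higher-scoring driver
```

Because the weights can be re-tuned at any time, as the interviewee notes, drivers have no stable mental model of why one of them got a ride and another did not.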

All four interviewees said that the number of women driving professionally is considerably low. This makes it difficult for the algorithms to match women passengers with women drivers; it may also delay ride allocation for women passengers, as the algorithms would first try to locate a woman driver.

A lack of understanding of how algorithms assign tasks makes it difficult to hold these systems accountable. Consequently, a group of UK Uber drivers has launched a legal bid to uncover how the app’s algorithms work: how rides are allocated, and who gets the short or the desirable rides. According to a piece in The Guardian, the drivers’ claim says:

“Uber uses tags on drivers’ profiles, for example ‘inappropriate behaviour’ or simply ‘police tag’. Reports relate to ‘navigation – late arrival / missed ETA’ and ‘professionalism – cancelled on rider, inappropriate behaviour, attitude’. The drivers complain they were not being provided with this data or information on the underlying logic of how it was used. They want to [know] how that processing affects them, including on their driver score.”

Multiple, conflicting algorithms also erode drivers’ trust, as elaborated in an ongoing study of ‘human-algorithm’ relationships. The researchers discovered that Uber’s algorithms often conflict with one another when assigning tasks: for example, drivers were expected to cover the airport area but at the same time received requests from a 20-mile radius. “The algorithm that emphasizes the driver’s role to cover the airport was at odds with the algorithm that emphasizes the driver’s duty to help all customers, resulting in a tug o’ war shuffling drivers back and forth.” A similar conflict arises when drivers are in a surge area but get pings to serve customers somewhere out of the way.

Ultimately, we need to shift from self-optimization as the end goal for workers to humane algorithms: algorithms that centre workers’ pressures, stress, and concerns in the gig economy. This shift would also change the attitudes of passengers, who need to see platform drivers as human beings facing challenges at work, like the rest of us.

——-

This blog was initially published on 2 October 2020 on the website of Feminist Approaches to Labour Collectives, FemLab.co – a project that is part of the research initiative Women, Work, and the Gig Economy. Follow the project on Twitter and Facebook.
by Pallavi Bansal
Bennett University, FemLab.co
Algorithms, Digital Platforms, Employment Data, Gig Economy, Platform Workers
