Uber, Lyft algorithms charged users more for trips to non-white neighborhoods: study

A new study suggests that Uber's and Lyft's algorithms charge higher rates to customers in non-white neighborhoods

By Matthew Rozsa

Staff Writer

Published June 21, 2020 8:00AM (EDT)

The Lyft logo is displayed on a car on March 11, 2019 in San Francisco, California. On-demand transportation company Lyft has filed paperwork for its initial public offering that is expected to value the company at up to $25 billion. (Justin Sullivan/Getty Images)

A recent study suggests that the pricing algorithms used by the popular ride-hailing companies Uber and Lyft may discriminate against customers seeking transportation in predominantly non-white neighborhoods.

Aylin Caliskan and Akshat Pandey at George Washington University in Washington, D.C. analyzed transportation and census data in Chicago in a paper that assessed whether there was a racial disparity in how much passengers were charged based on location. Their data set included more than 100 million trips taken between November 2018 and December 2019, 68 million of which were made by individual riders.

They found that the ride-hailing companies charged a higher price per mile for a trip if either the pick-up point or the destination was in a neighborhood with a higher percentage of non-white residents, low-income residents or highly educated residents.

"While demand and speed have the highest correlation with ride-hailing fares, analysis shows that users of ride-hailing applications in the city of Chicago may be experiencing social bias with regard to fare prices when they are picked up or dropped off in neighborhoods with a low percentage of individuals over 40 or a low percentage of individuals with a high school diploma or less," the authors wrote in their conclusion.

They also pointed out in their introduction why this study is particularly important.

"Unlike traditional taxi services, fare prices for ride-hailing services are dynamic, calculated using both the length of the requested trip as well as the demand for ride-hailing services in the area," the authors explained. "Uber determines demand for rides using machine learning models, using forecasting based on prior demand to determine which areas drivers will be needed most at a given time. While the use of machine learning to forecast demand may improve ride-hailing applications' ability to provide services to their riders, machine learning methods have been known to adopt policies that display demographic disparity in online recruitment, online advertisements, and recidivism prediction."

This is not the first time that Uber and Lyft have been accused of algorithmic bias — or in-person human bias, for that matter. A 2016 study found that racial and gender discrimination were pronounced among drivers for Uber, Lyft and Flywheel. Male customers with names that sounded African American were more than twice as likely to have drivers cancel their rides as white passengers (11.2 percent to 4.5 percent), and women with African American-sounding names experienced similar results (8.4 percent to 5.4 percent). In Seattle, African American passengers waited an average of 8 percent longer than their white counterparts, while women in Boston were driven an average of 6 percent farther than men.

In addition, female researchers in Boston reported that drivers were more likely to force conversation with them, with one of the study participants noting that the drivers seemed to do so out of "a combination of profiteering and flirting to a captive audience." Drivers were also more likely to take female passengers on longer rides, even though the participants made a point of not choosing routes longer than a mile or two, and one participant described a driver passing through the same intersection three times during a single trip.

Update: A representative from Lyft reached out to Salon with a statement saying, "This analysis is deeply flawed. The researcher acknowledges that the study was not based on actual demographic data of rideshare users. In fact, the study makes clear that speed and demand have the highest correlation with algorithmically generated fares and that individual demographic data is neither available to rideshare companies nor used in the algorithms that determine pricing. There are many factors that go into dynamic pricing — race is not one of them. We appreciate the researchers' attempt to study unintentional bias, but this study misses the mark."


By Matthew Rozsa

Matthew Rozsa is a staff writer at Salon. He received a Master's Degree in History from Rutgers-Newark in 2012 and was awarded a science journalism fellowship from the Metcalf Institute in 2022.
