The sharing economy has a race problem

Heralded as the future of commerce, apps like Uber and Airbnb still aren't immune to the same old prejudices

By Nancy Leong
November 2, 2014 4:58PM (UTC)

The sharing economy is here to stay. Defined as a socioeconomic ecosystem built around the shared creation, production, distribution, trade and consumption of goods and services, the sharing economy offers new ways of filling human needs as basic as housing and transportation. The success of businesses such as Airbnb (vacation rentals), Uber and Lyft (car service), TaskRabbit (errands and household tasks), Homejoy (cleaning) and many others shows how the sharing economy can make our lives easier and more efficient. Last year the sharing economy was predicted to generate $3.5 billion in annual revenue, with growth exceeding 25 percent.

The sharing economy’s impact is more than economic. Some have lauded it as a potential cure for race discrimination in the marketplace. And in many situations, it may well improve upon the status quo. But the mere existence of the sharing economy isn’t a panacea for race discrimination. Rather, as we applaud the sharing economy’s marketplace innovations, we should also work to ensure that it does not replicate and perpetuate age-old biases that plague existing economic relationships.


In particular, we should guard against implicit bias — attitudes that affect our behavior without our awareness. Implicit bias is common, affecting everything from job applications to employee evaluation to professional mentorship to driving behavior. Implicit bias is also notably difficult to counteract because people genuinely don’t mean to discriminate—the whole point of implicit bias is that it affects our thinking and behavior at an unconscious level.

Thus far, almost no research has examined implicit bias within the sharing economy, though some evidence suggests that such bias might exist. For example, two professors at Harvard Business School conducted a study that found that Airbnb properties offered by non-black hosts earned, on average, 12 percent more than properties offered by black hosts. Some anecdotal evidence suggests similar conclusions.

I offer no opinion as to whether Airbnb or any other specific business in the sharing economy is affected by implicit bias. But given that implicit bias is pervasive in nearly every other area of life, it would be surprising if the sharing economy bore no trace of such bias. And certain features of the sharing economy raise particular concerns.


One concern is that many businesses in the sharing economy rely on user-created profiles designed to let the parties to a transaction learn more about each other. The profiles generally include a photograph and other personal information.

Certainly profiles perform a valuable function. The sharing economy often involves people who don’t meet in person or talk by phone before they agree to a transaction. Profiles are a way of humanizing a physically distant person on the other side of an Internet transaction (for example, when renting through Airbnb) or allowing two parties to recognize one another (for example, when a Lyft driver picks up a passenger).

Yet research suggests that such profiles can also trigger implicit racial bias. Consider the power of a photo: one study found that identical baseball cards listed on eBay sold for 20 percent more if the hand holding the cards in the listing photograph was white rather than black. Another study of online iPod sales yielded similar results. Such research suggests that sharing-economy platforms that prominently feature user photos may facilitate worse treatment of non-white participants.


Another concern is that the rating systems pervasive in the sharing economy risk facilitating and aggregating the expression of implicit bias. All the sharing-economy businesses I’ve mentioned have rating systems, and in many instances the systems are bidirectional: Airbnb hosts and renters rate one another, as do Uber drivers and passengers.

Although rating systems are of course not intended to express implicit bias, the research I’ve mentioned suggests that they may indeed do so by giving individual participants an avenue to express it. The rating system then aggregates these individual scores, resulting in a composite that reflects the net effect of many biased ratings. And negative ratings can become self-perpetuating. If a passenger sees that an Uber driver has a low rating, the passenger may be primed to view the driver negatively. Such priming may lead to interpreting ambiguous conduct more negatively and, ultimately, to more negative ratings. The result is a vicious cycle of self-reinforcing bias.


Together, the substitution of an online profile for personal contact and the use of rating systems create worrisome possibilities for the expression of implicit bias. And these concerns are exacerbated by the way that the sharing economy filters out human contact. There is no consequence for leaving a bad rating for a renter you have never met or a passenger you will never see again.

As I’ve already noted, it’s important to remember that the sharing economy also has the potential to reduce race discrimination. For example, Latoya Peterson has explained that, for a black person, ordering a car through Uber can be dramatically easier than hailing a cab. But this doesn’t mean the sharing economy is free from bias. The fact that a black person finds it easier to summon an Uber than to hail a cab doesn’t mean that black people and white people find it equally easy to participate in the sharing economy. In other words, Uber may be better, but that doesn’t mean it’s perfect. The goal of the sharing economy should be the elimination of bias, not merely its reduction.

So how do we build a sharing economy free from bias? As we consider this question, it’s worth remembering that bias in the sharing economy is a new version of an old problem. For years, lawmakers have grappled with the question of how to prevent public accommodations—private entities used by the public, such as hotels, restaurants, stores, theaters and recreational facilities—from engaging in discrimination. Our laws already address some manifestations of the problem; for instance, Congress decades ago outlawed racial discrimination in forming contracts and banned public accommodations from discriminating on the basis of race, color, religion, or national origin.


Sharing economy businesses fall squarely within the concerns of these existing laws. A business’s online platform shouldn’t make it more difficult for a non-white person to summon a car because of her low passenger ratings, nor should it force a non-white host to accept a lower rate for a comparable property. A sharing economy business that designs and maintains such a platform denies non-white people the equal opportunity to access public accommodations. While the format is different, the resulting inequality is the same.

While sharing economy business platforms that cause discrimination have not yet faced legal challenge, courts should interpret existing laws to include such businesses. Better yet, rather than shoehorning new situations into decades-old laws, legislatures should enact new statutes directly targeting discrimination in the sharing economy. Congress could, for instance, require businesses to report the results of their ratings systems, including any aggregate racial disparities, or to monitor the effect of changes in platform design. Such information could guide further regulatory efforts to address the means by which new technologies perpetuate old biases.

Moreover, sharing-economy businesses can take proactive steps to reduce the extent to which their platforms perpetuate bias. One possibility is to minimize or eliminate the use of photographs and other signifiers that trigger implicit bias. For instance, platforms could be redesigned so that users see photos of one another only after they have agreed to a transaction. Likewise, platforms could require participants to provide specific written feedback in addition to, or instead of, numerical ratings. Requiring people to articulate why they are assigning a particular rating both forces them to confront their own reasons and provides more useful information for other customers. And businesses can voluntarily make their practices transparent by disclosing data relating to race, including unexplained racial and other disparities. When people are aware that bias exists in a particular setting, that awareness can prompt them to examine their own behavior more closely.


Participants in the sharing economy may be tempted to dismiss the topic of racial bias with reflexively defensive statements such as “Our service prohibits discrimination.” But the more laudable course is for a business to acknowledge that implicit bias affects almost everyone and to consider how to eliminate it. The real shame in implicit bias is not that it exists, but that some people pretend that it doesn’t.
