By: Zeynep Tufekci and Brayden King
Uber, the popular car-service app that allows you to hail a cab from your smartphone, shows your assigned car as a moving dot on a map as it makes its way toward you. It’s reassuring, especially as you wait on a rainy street corner. Less reassuring, though, was the apparent threat from a senior vice president of Uber to spend “a million dollars” looking into the personal lives of journalists who wrote critically about Uber. The problem wasn’t just that a representative of a powerful corporation was contemplating opposition research on reporters; the problem was that Uber already had sensitive data on journalists who used it for rides.
BuzzFeed reported that one of Uber’s executives had already looked up, without permission, rides taken by one of BuzzFeed’s own journalists. And according to The Washington Post, the company was so lax about such sensitive data that it even allowed a job applicant to view people’s rides, including those of a family member of a prominent politician. (The app is popular with members of Congress, among others.)
After the Uber executive’s statements, many took note of a 2012 post on the company’s blog that boasted of how Uber had tracked the rides of users who went somewhere other than home on Friday or Saturday nights, and left from the same address the next morning. It identified these “rides of glory” as potential one-night stands. (The blog post was later removed.) Uber had just told all its users that if they were having an affair, it knew about it. Rides to Planned Parenthood? Regular rides to a cancer hospital? Interviews at a rival company? Uber knows about them, too.
Uber isn’t alone. Numerous companies, from social media sites like Facebook to dating sites like OKCupid, make it their business to track what we do, whom we know and what our typical behaviours and preferences are. OKCupid unashamedly announced that it experimented on its users, sometimes matching them with incompatible dates, just to see what happened.
The data collection gets more extensive at every turn. Facebook is updating its terms of service as of January 1. The new terms state more clearly that Facebook will be tracking your location (unless you disable it), vacuuming up data that other people provide about you and even contacts from your phone’s address book (if you sync it to your account) — important provisions many of Facebook’s 1.35 billion users may not even notice when they click “accept”.
We use these apps and websites because of their benefits. We discover new music, restaurants and movies; we meet new friends and reconnect with old ones; we trade goods and services. The paradox of this situation is that while we gain from digital connectivity, the accompanying invasion into our private lives makes our personal data ripe for abuse — revealing things we thought we had not even disclosed.
The retailer Target, for example, started sending coupons for baby gear to customers who, sales data told them, were likely to be pregnant. Researchers in Cambridge, England, found that merely knowing a Facebook user’s likes was enough to predict attributes such as gender, race, sexual orientation, political party, potential drug use and personality traits — even if the user had shared none of that information.
Facebook says that it conducts not one but “over a thousand experiments each day”, and a former Facebook data scientist recently revealed that “experiments are run on every user at some point”. A 2012 study in Nature showed that a single tweak modifying an “I voted” button on Facebook increased turnout in the 2010 congressional elections by about 340,000 votes. That is enormous power.
What’s rare is not the kind of analysis Uber can do with sensitive data, but that it was publicly disclosed. Because of the user backlash, companies are moving toward secrecy. That would be detrimental to the public interest.
Uber argues that it’s doing only what other technology companies regularly do. That may be true, but it only underlines why we need oversight mechanisms that cover all of them. Reputational penalties have not been sufficient incentives to encourage more responsible use of data and algorithms, especially because almost all the big players engage in similar behaviour — and Uber has just been rewarded by its investors to the tune of $1.2 billion.
Codes of conduct developed by companies are a start, but we need information fiduciaries: independent, external bodies that oversee how data is used, backed by laws that ensure that individuals can see, correct and opt out of data collection. The European Union has established strict controls on personal data that include provisions for privacy, for limited and legitimate use, and for user access to their own data. That shows that accountability is possible.
We already regulate sensitive data, ranging from health records to financial information. We must update oversight for 21st-century data as well. When we’re picked up on a rainy street corner, it’s not enough to know where the car is going. We need to know where our data is going, and how it’s used.
Tufekci is an assistant professor at the School of Information at the University of North Carolina. King is an associate professor of management and organisations at the Kellogg School of Management at Northwestern University.