Surya Mattu. Pic/Twitter/@suryamattu
Surya Mattu never imagined that a "story based around math" would be in the reckoning for the prestigious Pulitzer Prize. The 29-year-old Indian-origin data researcher, along with his team at ProPublica - an independent American non-profit newsroom that produces investigative journalism in the public interest - was a finalist for this year's Pulitzer Prize in Explanatory Reporting, announced earlier this week.
The award went to the Panama Papers, a series of stories that exposed the global scale of offshore tax havens.
ProPublica won in another category - Public Service - along with the New York Daily News.
The ProPublica team nominated for Explanatory Reporting - comprising Julia Angwin (senior reporter), Jeff Larson (data editor), Lauren Kirchner (senior reporting fellow), and Mattu - published a series of stories from 2014-16, using rigorous data journalism to "make tangible the abstract world of algorithms, and how they shape our lives in realms as disparate as criminal justice, online shopping, and social media".
Mattu reveals, over a phone call from Italy, where he is attending a wedding, that Angwin - "a sort of leader of the team" - was the one who started it all. "She had earlier done a series of stories on surveillance tools used by ad agencies."
Uncovering systemic bias
Born and raised in Delhi, Mattu moved to the UK for his undergrad, where he worked for a couple of years before moving to New York for grad school. At ProPublica, his role was that of a data researcher. "All of us on the team have always been trying to analyse technology from a critical angle and understand the numerous ways it reflects systemic bias that already exists in society. This series (the one nominated for the Pulitzer) came about as a follow-up to what Julia [Angwin] had already done," he says.
Machine Bias, one of the articles in the series, highlights how software used across the US to identify and predict future criminals is biased against blacks. Another talks about Facebook categorising its users based on data it had bought, and then showing these users advertisements based on their 'preferences'.
In November 2016, after ProPublica bought a Facebook ad in its housing categories that excluded African-Americans, Hispanics and Asian-Americans, the company said it would build an automated system to help it spot ads that illegally discriminate.
Algorithm is king
"Whether you are buying a plane ticket, deciding what to watch on TV, or trying to figure out where to order food from, all of this is driven by an algorithm," explains Mattu. "This is how we interact on a consumer level. But on a broader scale, there are many different parts of our lives where algorithms play an important role in influencing and shaping our decisions." He says the problem arises when capitalism enters this picture.
Machine Bias was followed by another 7-8 articles. "We did one on Amazon where we showed how it changes its prices based on the user who is browsing, and another on Facebook that shows how its advertisement categories violate the US' Fair Housing Act," says Mattu. "Basically, you could put out an ad for a house on FB, and you could target specific groups such as first-time renters, and at the same time, exclude other specific groups that are ethnic in nature, like African-Americans. They (FB) didn't intend to do that, but that is one of the big problems with algorithms."
Facebook: the biggie
A second FB article - Facebook doesn't tell users everything it really knows about them - explains how the company obtains third-party data and uses it to influence the types of ads you see. The article states that Facebook says it gets information about its users 'from a few different sources'. "What the page doesn't say is that those sources include detailed dossiers obtained from commercial data brokers on users' offline lives. Nor does Facebook show users any of the often remarkably detailed information it gets from those brokers."
Mattu says this finding was "big". "The US Congress wrote to FB, saying it could not do this anymore, after Facebook had said it wasn't its fault."
Much of the data behind the ProPublica team's interactive tool for this article was crowdsourced. "The tool was downloaded and used by 20,000 people. Thus, we managed to get 60,000 unique FB categories by getting users to share what FB says about them."
Asked whether algorithms play as important a role in India as in the West, he says, "I would say it is equal, especially with systems like Aadhaar being put into place, which are centralising unique information of each citizen. We might not have as many computers in India, but we definitely have as many smartphones."