In an exclusive chat, Nabanita De and Anant Goel speak about how they found a solution to the social media giant’s biggest problem
Anant Goel and Nabanita De
Even as Mark Zuckerberg scratches his head over ways to tackle the flaw in Facebook’s algorithm that creates ‘echo chambers’ for its users and helps viral, fake content spread, four students have already found a way to plug it: through a simple browser plug-in. Anant Goel and Nabanita De were part of the quartet that turned the brainwave into reality during a 36-hour hackathon at Princeton University. mid-day speaks to them about their invention.
Qinglin Chen, Mark Craft, Anant Goel and Nabanita De made FiB
The group of college students created the plug-in at HackPrinceton last week, held at Princeton University. Ironically, Facebook was one of the sponsors of the event. “The hackathon is a 36-hour event where teams of up to four people work on a project from scratch,” said 18-year-old Anant Goel, a Mumbaikar who is currently in his freshman year studying computer science at Purdue University in West Lafayette.
“No team is assigned to a task in the hackathon; you are free to build what you like during these 36 hours. After the time is up, the teams show their projects to a panel of judges, which decides the winners.”
It was during those 36 hours that Goel along with Nabanita De, Qinglin Chen and Mark Craft created the Google Chrome extension ‘FiB: Stop living a lie’, which is capable of quickly and accurately identifying fake news stories on Facebook.
Assembling the team
“I spoke to Anant on the Slack channel (a messaging app for teams) on the first day of the hackathon and he introduced me to Mark and Qinglin [sophomores at the University of Illinois at Urbana-Champaign], whom he had just met, and added me to their team,” said De, who is pursuing a Masters degree in computer science at the University of Massachusetts, Amherst. When asked how such an idea was executed by four students who barely had any time to brainstorm before the event, let alone develop the algorithm for it in 36 hours, she said: “I was reading this news of people blaming Facebook for the [US] election results and felt pretty bad about it while I was on my way to Princeton. I brought this idea up with my team and we brainstormed together on how to tackle the issue.”
How FiB works
“This extension adds a verified or non-verified tag over every post containing links or pictures, while you are scrolling on social media,” explains De.
“For links, we obtain a credibility score for each one. If that score is good enough, we tag it as verified. If we can’t verify the link, our algorithm takes keywords from the link’s content and searches Bing (a web search engine) for more credible, summarised content (content from links that show up in searches and are assigned good scores by our system).
It also works for pictures. Given a snapshot of a tweet, our algorithm verifies with Twitter whether that tweet was really posted by the user. It also tags adult/pornographic content as unverified.”
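The score-then-tag flow De describes can be sketched roughly as follows. This is a minimal illustration, not FiB’s actual code: the domain list, scores and threshold below are invented assumptions, and the real extension would query external services (such as Bing) rather than a hard-coded table.

```python
from urllib.parse import urlparse

# Illustrative stand-in for a real credibility source; FiB itself
# scores links dynamically rather than from a fixed list.
TRUSTED_DOMAINS = {"reuters.com", "apnews.com", "bbc.com"}

def credibility_score(url: str) -> float:
    """Toy credibility score: known domains score high, unknown ones low."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    return 0.9 if domain in TRUSTED_DOMAINS else 0.2

def tag_post(url: str, threshold: float = 0.5) -> str:
    """Tag a link 'verified' if its score clears the threshold.

    In FiB, a link that fails this check would trigger a keyword
    search for better-scoring coverage of the same story; here we
    simply tag it 'not verified'.
    """
    return "verified" if credibility_score(url) >= threshold else "not verified"
```

For example, `tag_post("https://www.reuters.com/article/x")` would be tagged verified under these assumptions, while a link from an unknown domain would not.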
Small steps to larger goal
But is it the solution to Facebook’s woes? “I like to think of FiB as a proof of concept. We just wanted to show that there is a way to detect fake news algorithmically,” said Goel. “Obviously, FiB is just a small step towards a much larger goal of disrupting the ‘fake news’ economies that thrive because of social media. The algorithm is not perfect right now; we are continuing to develop it. This is a huge project and will likely require constant improvement.”
Talking about the algorithm, De said, “I believe we have a long way to go. Right now it works decently, but we want it to be perfect. It is important for people to know what is real and what is fake because, often, news plays on people’s confirmation bias. For example, if you don’t like a politician and then read fake news bashing them, you would believe the news and start hating them more. We want to end that confirmation bias.”
The path ahead
De admits the response has been ‘crazy’ ever since the extension went live.
“We have more than 50,000 users trying to access our extension. Since our servers are built for a limited number of users — being students, we have no funding and are using only free resources — they cannot handle the load. Right now, we are more concerned with building a good product than with revenue, so we would love to get funding to buy new servers and API keys to scale our system. A lot of developers are interested in how we built the extension and have been playing with it and suggesting changes,” says De.
“Currently, the only way to access FiB is through the Chrome browser, but we do plan on supporting other browsers and mobile platforms in the future,” said Anant.
Facebook’s ‘echo chamber’
A recent study by researchers from Boston University and Italian institutions examined how Facebook’s algorithm creates a virtual ‘echo chamber’ for its users, showing them only those things on their timeline that support their biases. The researchers found that the algorithm helps users “select and share content related to a specific narrative and to ignore the rest”. This echo chamber has been blamed for the widespread denial of global warming, as well as for skewing the recently concluded US Presidential election, during which several fake news articles were circulated about Democratic Presidential candidate Hillary Clinton, who ended up losing to Donald J Trump.
How the plug-in works
All users need to do is download the plug-in, available on the Google Chrome Web Store; a box in the top-right corner of the browser will then tag pictures, Twitter posts and news links seen on Facebook as ‘verified’ or ‘not verified’.