
Big Brother -- now with added intelligence

Updated on: 14 April 2013, 06:44 AM IST
Christoph Behrens

The American BRS Labs is a company few have heard of. But in one year, its turnover increased tenfold, to $200 million.


The American BRS Labs is a company few have heard of. But in one year, its turnover increased tenfold, to $200 million. It has new offices in Houston, Texas, São Paulo, Brazil, and London, UK. The reason for the success of BRS (short for Behavioural Recognition Systems) is a piece of surveillance software called AISight.


This software is revolutionising surveillance at airports and railway stations worldwide. AISight makes cameras smart. The software watches CCTV footage and can recognise “suspicious or abnormal behaviour” — whether it is somebody leaving a suitcase, getting out of a car where they shouldn’t, or climbing over a fence. As soon as the programme spots the unusual behaviour, it sounds an alarm. This means that a human is no longer needed to monitor surveillance tapes — a dream for national security authorities.



AISight, a piece of surveillance software, is capable of sounding an alarm any time it recognises “suspicious or abnormal behaviour.” Pic/AFP


The thinking behind the programme is simple. It is estimated that there are 50 million surveillance cameras worldwide. In Great Britain alone, CCTV cameras film each citizen an average of 300 times a day. But the footage usually doesn’t serve much purpose — there just aren’t enough personnel to watch and evaluate all the video material, whether live or recorded. A machine, on the other hand, is never distracted, always alert, and doesn’t need a coffee break. So the idea of “intelligent surveillance cameras” answers a huge market demand.

European security experts would love a piece of this action. Germany’s Federal Ministry of Education and Research (BMBF) alone is investing over 21 million euros in nine “pattern recognition” projects. German researchers have been trying for three years to get computers to read faces, recognise weapons, and register suspicious movements.

Results have so far been poor. Take for instance ADIS: this algorithm was supposed to recognise potentially dangerous situations, like muggings or mass gatherings, in subway stations, says project coordinator Oliver Röbke of the Munich-based company Indanet AG. “Originally, we intended to test the system on public transportation,” he says, and one big-city transportation company was up for it. However, because of ethical and data-protection issues, testing had to be conducted behind closed doors. Three months before the project is due to end, the planned demonstration of the system in action still hasn’t taken place.

The American version of this software — AISight — has been on the market for three years. Unlike the German engineers, the Americans didn’t design their system around rules that have to be programmed individually for each camera. AISight is a self-learner: like a neural network, it collects experiences and learns from them — for example, what counts as “normal” in a given situation and what doesn’t, such as which doors passersby are not allowed to use.
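The article does not spell out how BRS Labs’ software works internally. As a purely illustrative sketch (all names, features and numbers below are hypothetical and not drawn from AISight), the general idea of learning a statistical baseline of “normal” behaviour from footage-derived features and raising an alarm on outliers could look like this in Python:

import numpy as np

class NormalityModel:
    # Learns what "normal" feature vectors look like, then flags outliers.
    def __init__(self, threshold=4.0):
        self.threshold = threshold  # distance cutoff beyond which an alarm is raised
        self.mean = None
        self.cov_inv = None

    def fit(self, observations):
        # observations: array of shape (n_samples, n_features), e.g. position and
        # speed of objects tracked in CCTV footage during a "learning" period.
        X = np.asarray(observations, dtype=float)
        self.mean = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularise
        self.cov_inv = np.linalg.inv(cov)

    def is_abnormal(self, observation):
        # Flag anything far (in Mahalanobis distance) from everything seen so far.
        diff = np.asarray(observation, dtype=float) - self.mean
        return float(np.sqrt(diff @ self.cov_inv @ diff)) > self.threshold

# Hypothetical usage: learn a baseline from ordinary foot traffic, then monitor.
baseline = np.random.normal(loc=[1.2, 0.5], scale=[0.1, 0.05], size=(1000, 2))
model = NormalityModel(threshold=4.0)
model.fit(baseline)
print(model.is_abnormal([1.2, 0.5]))   # typical movement: no alarm
print(model.is_abnormal([5.0, 3.0]))   # far outside the learned norm: alarm

A real system would of course have to extract such features from video through object detection and tracking, and would keep updating its model continuously rather than learning once and stopping.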

A video on the company website says that AISight allows “the computer to do the watching, the learning, and yes, also the thinking.” “But if in the course of operations a system learns new things, then, as a technician, I no longer have it under control,” ADIS project manager Zaharya Menevidis, of the Fraunhofer Institute for Production Systems and Design Technology, says critically: “And if control eludes you — that is not progress.” BRS engineers see things differently. They consider the rules-based European systems to be “outdated.”

Pattern recognition is highly sensitive

“Europe lags behind in security technology,” says Ben Hayes, who keeps an eye on the security sector for Statewatch, an organisation that monitors the state and civil liberties in Europe. However, for other reasons, doubts are growing as to whether the high expectations placed on camera-based surveillance can be fulfilled. “The dependability of preventive surveillance is questionable,” says a researcher at the Institute for Legal and Criminal Sociology in Vienna, who points out that environmental factors and changing light can falsify results.

According to this source, this results in an “unacceptably high number of false alarms.” The German federal crime control agency BKA experienced this in 2007, after it installed face-recognition technology at the main train station in Mainz, western Germany. It was mostly too dark to recognise individual faces. “Money would be much better spent on construction — if there was more light in subway stations they’d be safer,” says Stephan Urbach of the Pirate Party.

But security is only the ostensible purpose of intelligent surveillance. Far more appealing are the financial prospects for the technology sector. Market researchers at the Washington DC-based Homeland Security Research Corp are predicting that by 2016 the global market for intelligent surveillance will have quadrupled.

The European Union is also keen to equip itself with intelligent surveillance devices, especially at its outer borders: at least 60 million euros have been spent over the past few years on “intelligent” — which is to say automated — border control under the heading “Smart Borders.” The European Commission is also considering building up an autonomous fleet of drones for border protection, through projects such as Oparus (Open Architecture for UAV-based Surveillance System) and Wimaas (Wide Maritime Area Airborne Surveillance). A parallel project called Talos would develop robots to patrol borders and spot intruders.

All these ideas are just research projects, the European Commission stresses. And indeed, until now the programme has produced expensive misses and little that is concrete. Economics aside, intelligent surveillance also raises some fundamental questions. “Pattern recognition is highly sensitive because you’re marrying analog and digital,” says Thilo Weichert, who is in charge of data protection for the German state of Schleswig-Holstein. People, not machines, should have the decision-making power, he says. Another disadvantage of pattern recognition is that uninvolved third parties captured by the system have to be deleted from it.

“And where do the criteria for pattern recognition come from?” asks Regina Ammicht Quinn of the International Centre for Ethics in the Sciences and Humanities in Tübingen. “If they come from experts, then there is the danger that a specific way of viewing the world is programmed into the system.” However, if the criteria are statistically determined, then the statistically normal, which is to say the behaviour most frequently recorded by the cameras, would become the “desirable” behaviour.

"Exciting news! Mid-day is now on WhatsApp Channels Subscribe today by clicking the link and stay updated with the latest news!" Click here!


