Accuracy concerns could stall facial recognition adoption
With the efficacy of facial recognition technology coming under scrutiny from recent negative news coverage, there are concerns regarding the future of security analytics. Karen Sangha, Field Marketing Manager for Panasonic Security Solutions, discusses.
The advent over recent years of analytic technologies such as facial recognition has been seen as a positive response to the increased need for high-end, proactive security systems. Face matching and behavioural analytics, used in anything from a football stadium to a large transport hub, have the potential to reduce crime and create safer communities.
Yet, in light of recent coverage of the inaccuracy of many facial recognition systems, the industry faces a challenge in convincing end users who are already sceptical about the technology's efficacy and its likelihood of success.
A freedom of information report published by the BBC shows that various police forces have tried and largely failed to implement the technology effectively, with the systems limited by inaccurate facial matching. The Metropolitan Police mismatched 102 people against images of suspected criminals at the Notting Hill Carnival and at a Remembrance Sunday event in 2016 and 2017. Concerned by the results, Lancashire Police has even stopped using facial recognition following tests in 2015. In addition, according to the American Civil Liberties Union (ACLU), Amazon’s facial recognition technology inaccurately identified 28 members of Congress as people who had been arrested for crimes.
The reports also highlighted the concerns of local communities, who were wary of such systems and felt that their individual right to privacy was at risk.
As an industry, improving the accuracy of face matching is essential to tackling these concerns, and it is one of the reasons we carried out live tests of our Deep Learning Facial Recognition System at IFSEC this year. Visitors had their faces registered in the system, some even from ten-year-old photographs. They were then given the opportunity to dress in a disguise to see whether the software would still match them accurately. Over three days and 1,500 attempts, not a single mismatch occurred, giving the Panasonic system a 100% accuracy rating – even for images taken ten years previously.
This success can be attributed to the quality of our system, which uses deep learning technology to enhance the power of analytics over an IP system. The technology identifies faces that are difficult to recognise with conventional techniques, producing accurate matches from faces captured at an angle and achieving a minimum 90% accuracy rate when detecting faces partially hidden by sunglasses or face masks.
Another benefit of the system is the improved cost and workflow achieved when it is operated with our Intelligent Auto mode, technology adapted from our Lumix camera brand. This allows the camera to analyse images internally before sending only the best three to the server. As a result, only the highest quality images take up storage space, and it is also easier for end users to sift through the information.
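To illustrate the general idea behind this kind of edge pre-filtering – not Panasonic's actual algorithm, whose internals are proprietary – a "keep the best three frames" step can be sketched as a simple top-k selection over a quality score (here a stand-in value; in practice it might be sharpness or face-pose confidence):

```python
import heapq

def best_frames(frames, score, k=3):
    """Return the k highest-scoring frames.

    `frames` is an iterable of captured images and `score` is a
    hypothetical quality function returning a float. This is a
    generic sketch of on-camera best-shot selection, not the
    vendor's implementation.
    """
    return heapq.nlargest(k, frames, key=score)

# Fake "frames" scored by a stand-in quality value:
frames = [{"id": 1, "quality": 0.42}, {"id": 2, "quality": 0.91},
          {"id": 3, "quality": 0.77}, {"id": 4, "quality": 0.30}]
top3 = best_frames(frames, score=lambda f: f["quality"])
print([f["id"] for f in top3])  # → [2, 3, 1]
```

Selecting on the camera rather than the server is what yields the storage and workflow savings described above: low-quality frames are dropped before they ever consume bandwidth or disk.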
Ensuring that all data is protected is another important aspect. To mitigate the risk of a cyber attack and protect the personal data of members of the public, all of our systems are implemented with technology that secures data through strong passwords, encrypted firmware and more. The system is GDPR-friendly and can be programmed so that it either stores only valid data, such as known criminals, or stores all faces for a specified retention period.
With all of these features combined, it is unsurprising that the technology has been recognised as the industry’s best performing facial recognition technology by the National Institute of Standards and Technology (NIST), as well as winning Site Protection Software of the Year at the Benchmark Awards.
The restless innovation inherent in Panasonic security technology is proven by its reliability. We hope, in turn, to change the industry’s perspective on facial recognition, establishing it as a powerful technique that safeguards, rather than scares, both end users and local communities.
Call: +44 (0) 2070226530