Commuters at eight train stations around the UK were surveilled with AI-powered CCTV cameras, a report from WIRED has revealed. Based on documents obtained through a Freedom of Information (FOI) request, the report stated that for the past two years, major train stations around the UK have been testing AI surveillance technology with CCTV cameras to alert staff to safety incidents.
According to the report, the cameras utilised object recognition, a type of machine learning that can identify items in video feeds. The documents described various possible use cases, such as detecting trespassing, counting people for crowd management, and flagging unusual behaviour like running, shouting, skateboarding, and smoking.
Perhaps most worryingly, one of the use cases described is “Demographics,” where “potentially the customer emotion metric could be used to measure satisfaction.” The documents also suggested that the data could be utilised to increase advertising and retail revenue. This capability was provided by Amazon’s Rekognition image-analysis system. According to WIRED, images were captured when people crossed a “virtual tripwire” near ticket barriers and were sent to Rekognition, which performs face and object analysis. Notably, the stations’ camera setups do not utilise facial recognition technology (FRT).
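The documents do not include the trial’s code, but Rekognition’s publicly documented face-analysis API gives a sense of what such a pipeline could look like. The Python sketch below, using Amazon’s boto3 SDK, is a minimal illustration assuming a frame captured at the tripwire is sent to Rekognition’s DetectFaces endpoint, which returns estimated age range, gender, and emotion scores for each detected face; the function name, AWS region, and the way a single “emotion metric” is derived are all illustrative assumptions, not details from the trial.

```python
# Hypothetical sketch of the kind of pipeline the documents describe:
# a frame captured at the "virtual tripwire" is sent to Amazon Rekognition
# for face analysis. Names and parameters here are illustrative assumptions.
import boto3

# eu-west-2 (London) is an assumed region, not one confirmed by the report
rekognition = boto3.client("rekognition", region_name="eu-west-2")

def analyse_tripwire_frame(frame_bytes: bytes) -> list[dict]:
    """Send a captured JPEG/PNG frame to Rekognition and return per-face attributes."""
    response = rekognition.detect_faces(
        Image={"Bytes": frame_bytes},   # raw image bytes, up to 5 MB
        Attributes=["ALL"],             # includes AgeRange, Gender, Emotions
    )
    results = []
    for face in response["FaceDetails"]:
        # Rekognition returns a confidence score for each candidate emotion;
        # a "customer emotion metric" could plausibly be derived from the top one.
        top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
        results.append({
            "age_range": face["AgeRange"],
            "gender": face["Gender"]["Value"],
            "emotion": top_emotion["Type"],
            "emotion_confidence": top_emotion["Confidence"],
        })
    return results
```

Note that DetectFaces analyses the attributes of a face without matching it against a database of identities; identification is handled by separate Rekognition operations such as SearchFacesByImage. That distinction is consistent with the report’s note that the setup does not perform facial recognition.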
Gregory Butler, the CEO of data analytics and computer vision company Purple Transform, which has been working on the trials, told WIRED that the emotion recognition capability was discontinued during the tests and that no images were stored when it was active.
Why This Matters
Emotion recognition is a highly controversial use of AI image recognition that has been slowly making its way into India. In 2021, the Uttar Pradesh police announced their decision to install special AI-enabled CCTV cameras that would detect “distressed women” based on their facial expressions. Similarly, Western Railways Rajkot Division announced plans to implement an FRT system capable of detecting emotions.
According to the Internet Freedom Foundation, “Human emotions do not have simple mappings to their facial expressions across individuals and especially cross-culturally. Despite it being baseless and racist, technologies like emotion detection are popular because the spread of FRT makes the acquisition of large datasets of face images possible which is what emotion detection algorithms work on.”
Similarly, researcher Vidushi Marda argues, “Emotion recognition technology is based on a legacy of problematic and discredited science and exacerbates power differentials in multiple ways. No amount of careful data protection practices can legitimise its use.”