Academics and senior officers sat at an uncomfortable table and asked ‘what’s going on ‘ere, then?’
Members of the House of Lords are looking at the role technologies such as AI and facial recognition have in modern policing methods and law enforcement.

The Justice and Home Affairs Committee announced in May that it planned to probe the matter and has already heard one round of oral evidence in June from legal experts as it familiarises itself with the subject.

Today it was the turn of Professor Michael Wooldridge – Head of Computer Science at the University of Oxford – to be quizzed by peers. He echoed many of the concerns previously raised and cited the example of a computer system that advises custody officers whether someone should be detained in a police cell or released, based on a host of data including their criminal history.

His concerns lay not necessarily with the technology itself, but with the possibility that it could lead to some officers “abdicating [their] responsibility” by becoming over-reliant on it.

Wooldridge warned that more needed to be understood about AI and what it can – and can’t – do. And he urged peers not to “humanise it”.

“It’s not like human intelligence,” he said. He then went further, adding that the technology is “brittle” and can fail in unexpected ways.
Peers question experts over UK police use of AI, facial recognition tech