A big biometric security company in the UK, Facewatch, is in hot water after its facial recognition system caused a major snafu: it wrongly identified a 19-year-old woman as a shoplifter.
Wait, really?
The EU AI Act classifies AI systems by risk (i.e. how much harm a mistake could cause), and things like criminality assessment are classed as an unacceptable risk and are therefore prohibited outright.
There’s a great high-level summary of the Act available if you don’t want to read the hundreds of pages of legal text.