Doing away with a phone (or a card or a chip in your head) and just going with biometrics is a different issue. Biometric identification is a much harder problem and is fraught with difficulties. It can work very well with limited populations, which is why it is being installed in airports all over the place. I rather like the system going into Chinese airports where, when you walk up to one of the screens displaying flight information, it switches to displaying your flight only. Very helpful. And earlier this year at KnowID in Las Vegas I saw a super presentation from US Customs and Border Protection on the specific use of biometrics in airports as an interesting example of how to use biometric technologies for security while at the same time delivering convenience to the mass market. The investments made in biometrics to allow paperless travel have obvious benefits in terms of security but, as we have found in our other work on the cross-sector exploitation of digital identity, intelligent use of these new capabilities can also transform the customer experience: the same biometric system that scans your passport picture on entry to the airport and then checks you in for your flight can also be used to direct you through the airport and to drive those smart departure boards.
You can imagine this kind of system being extended to retailers and banks. Having been to the Amazon Go store, I can see how that might work in retail.
When I go to the airport, however, I want to be identified. I am already a member of a subgroup of the general population (i.e., people who are flying from that airport on that day) and I want to co-operate in being identified to make my journey more convenient. It is a different matter when you are dealing with the population as a whole, not a self-selected subgroup, including people who do not want to be identified. The Metropolitan Police have revealed that their facial recognition technology incorrectly identified members of the public in 96% of the matches made between 2016 and 2018. So, to round off, in practical terms almost every match was incorrect.
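A result like that is not necessarily evidence of a terrible algorithm; it is what the base-rate fallacy predicts when you scan a large population for a tiny watchlist. A minimal sketch of the arithmetic (the population size, watchlist size and accuracy figures below are invented for illustration, not the Met's actual parameters):

```python
# Base-rate sketch: why most "matches" are false even with an accurate matcher.
# All numbers below are illustrative assumptions, not real Met Police figures.

def match_precision(population, watchlist, tpr, fpr):
    """Fraction of alerts that are genuine, given true/false positive rates."""
    true_alerts = watchlist * tpr                   # watchlisted people correctly flagged
    false_alerts = (population - watchlist) * fpr   # innocent passers-by wrongly flagged
    return true_alerts / (true_alerts + false_alerts)

# 100,000 faces scanned, 50 of them on the watchlist, and a matcher that
# catches 99% of targets with only a 0.1% false positive rate:
p = match_precision(population=100_000, watchlist=50, tpr=0.99, fpr=0.001)
print(f"Share of alerts that are correct: {p:.1%}")  # roughly a third
```

Even with those flattering assumptions, two out of three alerts point at innocent people, because the innocent vastly outnumber the watchlisted.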
Hhhmmmm…..
One particularly interesting aspect of biometric identification is its amusing susceptibility to what is known as "adversarial" biometrics. If you know how a face recognition algorithm works, for example, then you can deliberately choose to wear make-up or some disguise that exploits the characteristics of that algorithm. In fact, as it turns out, it is all too easy to do this, and to do it in such a way as to give the recognition algorithm high confidence that it has correctly identified something. When it comes to picture recognition, the results can be hilarious (and disturbing). MIT researchers found that Google's AI-powered open source "Inception" picture classifier can be easily fooled: take a picture of a cat, add some "noise" that is imperceptible to people, and the computer thinks it is looking at guacamole (this is a real example). There are even techniques, such as Adversarial Generative Networks (AGNs), that can automatically create images to fool the recognition algorithms!
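The underlying trick is simpler than it sounds. The sketch below applies the fast gradient sign method (the classic adversarial-example recipe) to a made-up two-feature logistic classifier rather than a real face recognizer; the weights, bias and input are all invented for illustration. Nudge each input feature a little in the direction that most hurts the model, and its decision flips even though the input has barely changed:

```python
import math

# Fast-gradient-sign sketch on a toy two-feature logistic classifier.
# Weights, bias and inputs are made up for illustration; the point is that
# a small, targeted perturbation flips the model's decision.

w = [2.0, -3.0]   # fixed "trained" weights (assumed)
b = 0.5

def predict(x):
    """Probability that input x belongs to class 1."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

x = [1.0, 0.4]    # original input: the model says class 1
p = predict(x)

# Gradient of the class-1 loss (-log p) with respect to the input is
# (p - 1) * w; its sign tells us which way to push each feature.
grad = [(p - 1.0) * wi for wi in w]
sign = lambda g: 1.0 if g > 0 else -1.0
eps = 0.3         # perturbation budget per feature

x_adv = [xi + eps * sign(gi) for xi, gi in zip(x, grad)]

print(round(predict(x), 3))      # 0.786 -> confidently class 1
print(round(predict(x_adv), 3))  # 0.45  -> class 0, after a 0.3 nudge per feature
```

Systems like AGNs do the same thing at scale against deep networks, searching for perturbations (a pair of glasses, a pattern of make-up) that stay imperceptible or innocuous to humans while pushing the model across a decision boundary.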
Master criminals may not need to resort to such sophisticated algorithmic skullduggery to get away with it, though.
"Despite all the trouble of updating her ID and registering her new face on all the online platforms she used, Huan said she was very happy with her new nose."

From "Facial recognition technology in China beaten by a nose job" in the South China Morning Post.