Peter Griffin, Editor. 17 June 2022, 2:35 pm
I recently met a man whose job is to buy the beer and wine sold at one of Wellington’s largest supermarkets.
It’s a role with many obvious perks, but one big downside – accounting for the inventory lost to shoplifting, which amounts to hundreds of thousands of dollars every year. I asked him what the store was doing about the theft.
Among the measures he outlined was the use of facial recognition software. If playback of in-store surveillance footage reveals someone pinching something off a shelf, that person will then be flagged in the system. When they enter the store in future, their facial profile triggers an alert sent to security officers who then ask the person to leave the store.
I’d consider that to be a reasonable use of facial recognition technology. But are people aware when they walk through the supermarket doors that their biometric data could be used for that purpose? I doubt it.
Sky City is using facial recognition technology at its casinos in Auckland, Hamilton and Queenstown to screen for known problem gamblers who have volunteered to have their photo added to a database.
Huawei uses facial recognition to monitor crowd composition at sports events in China (Source: Peter Griffin)
If a match comes up, they will be refused entry. As Sky City opened up again in the wake of the initial pandemic lockdown in 2020, it also used its facial recognition system to keep track of gamblers who were members, for contact tracing purposes. Those are also reasonable uses of the technology and the public was made aware of them.
What else is the technology being used for by our retailers and other businesses? It’s hard to know exactly. A backlash against the technology has made its users coy about discussing it, and some tech vendors, such as IBM, have even scrapped facial recognition development entirely, considering it too problematic from an ethical point of view.
An investigation across the Tasman by consumer watchdog magazine Choice this week raised concerns about the use of facial recognition by the likes of Bunnings, Kmart and The Good Guys, which claim they are mainly interested in it for the same purposes as that Wellington supermarket.
“This technology is an important measure that helps us to maintain a safe and secure environment for our team and customers,” Bunnings’ chief operating officer Simon McDowell told Choice.
Choice considers use of facial recognition by the retailers to be “completely inappropriate and unnecessary”.
But does the technology allow Bunnings or Kmart to know if a customer recorded in one store enters another outlet somewhere else? Does it link purchase history to a facial recognition profile to gain insights into shopping preferences? It’s all technically possible. But facial recognition isn’t perfect. It can misidentify people, who may then be incorrectly labelled a security threat by a retailer. That happened in 2018, when a New World shopper was misidentified as a shoplifter. He received an apology from the company.
“There are some really innovative companies creating technologies using facial recognition that makes our lives safer and more productive, we just have to make sure that the way the technology is used lines up with our society’s expectations around privacy,” the AI Forum’s executive director, Madeline Newman, said this week.
The national strategy on artificial intelligence currently in the works has the potential to offer help here in the form of clear guidelines on what can and can’t be done with the technology.
But as the Privacy Commissioner has already pointed out, facial recognition systems must be used in accordance with the principles of the Privacy Act, which requires, among other things, that you notify people that you are using the technology, that you don’t use it in a way that’s unfair or intrusive, and that you accommodate an individual’s right to access the information held about them.
I’m all for innovative uses of facial recognition for improving customer service and security. But now is the time for the business community to start a more open discussion about its use of the technology and to be crystal clear about what it is doing with our data.