
Privacy risks: Tackling facial recognition technology head on

Have you seen any signs about the use of facial recognition technology when you’ve been out shopping? CCTV is one thing, but this controversial technology raises more concerns about breaches of our privacy.

Three Australian retailers were referred to the Office of the Australian Information Commissioner (OAIC) in 2022 for the "completely inappropriate and unnecessary" use of the technology in their stores.

At the time, CHOICE consumer data advocate Kate Bower said it was similar to the retailers collecting your fingerprints or DNA every time you shop.

The controversy prompted Bunnings, Kmart and The Good Guys to suspend their use of the technology after the OAIC announced an investigation, although Bunnings was subsequently accused of reintroducing it.

Commissioner Angelene Falk announced the OAIC had dropped its investigation into The Good Guys in February 2023, saying “the company had suspended their use of facial recognition technology and indicated that they weren’t intending to reinstate it”, and suggesting the report could be completed within a few months. A year later, we’re still waiting.

So what is facial recognition technology, and should you be worried about it creeping into a shopping centre near you?

What is facial recognition technology, and who’s using it?

Facial recognition technology (FRT) is increasingly present in our lives, from unlocking smartphones to securing airports. Its main purposes are identification and verification: identification determines who someone is, while verification confirms that a person is who they claim to be.

 

A woman using FRT to unlock a mobile phone

 

FRT uses cameras and software to analyse and identify individuals based on their unique facial features. Sophisticated algorithms measure features such as the distance between the eyes, nose shape and jawline to create a “faceprint”. This faceprint is then matched against a database, potentially identifying the person.

For identification, the process involves comparing a face against an extensive database of known faces; for verification, it compares a live or captured face to a pre-enrolled image of the same person.
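
To make the distinction between the two modes concrete, here is a minimal sketch in Python. It assumes faceprints are simple numeric vectors compared with a made-up similarity threshold; real systems derive embeddings with hundreds of dimensions from deep-learning models, and every name and value below is purely illustrative.

```python
# A minimal, illustrative sketch of faceprint matching (assumed values only).
# Real FRT systems use deep-learning embeddings with hundreds of dimensions
# and carefully tuned thresholds; these 3-D vectors are purely hypothetical.
from typing import Dict, Optional

import numpy as np


def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two faceprints (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(live: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
    """1:1 verification - does the live face match the pre-enrolled image?"""
    return similarity(live, enrolled) >= threshold


def identify(live: np.ndarray, database: Dict[str, np.ndarray],
             threshold: float = 0.8) -> Optional[str]:
    """1:N identification - find the best match in a database of known faces."""
    best_name, best_score = None, threshold
    for name, faceprint in database.items():
        score = similarity(live, faceprint)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name  # None means "no confident match"


# Hypothetical enrolled faceprints and a live scan.
database = {
    "person_a": np.array([0.9, 0.1, 0.3]),
    "person_b": np.array([0.2, 0.8, 0.5]),
}
live_scan = np.array([0.88, 0.12, 0.31])

print(verify(live_scan, database["person_a"]))  # True: the 1:1 check passes
print(identify(live_scan, database))            # "person_a": closest 1:N match
```

Verification is the one-to-one check against a single enrolled image, while identification is the one-to-many search across a database of known faces, and it is this second mode that raises most of the privacy concerns discussed below.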

Under the Privacy Act, FRT may only be used by organisations “to identify you or as part of an automated biometric verification system, if the law authorises or requires them to collect it or it’s necessary to prevent a serious threat to the life, health or safety of any individual”.

These authorised organisations are generally:

    • Smartphone or app makers who use FRT for unlocking devices or photo tagging, such as Facebook
    • Law enforcement for investigations, border security, and identifying missing persons
    • Immigration for verifying electronic passports

Private companies are also increasingly using FRT for various purposes, raising concerns about its potential misuse. Examples include:

    • Madison Square Garden: Used FRT to prevent a lawyer who sued their parent company from entering the premises
    • 7-Eleven: Used FRT to prevent people from submitting multiple survey responses on the same tablet
    • Kmart, Bunnings, and The Good Guys: Used FRT to identify individuals previously involved in shoplifting or other incidents

Ethical concerns about facial recognition

FRT offers convenience and security benefits but raises significant ethical concerns, particularly around privacy. Who has access to the data? How long is it stored? Can it be used for other purposes? These are crucial questions, as facial data is considered highly sensitive personal information.

Then there’s the potential for bias and discrimination. Facial recognition algorithms can exhibit bias based on factors such as race, gender and age. This can lead to inaccurate identifications and discriminatory treatment. In Detroit, Michigan, Robert Williams, a 45-year-old black man, spent 30 hours in police custody after FRT wrongly identified him as a suspect in a watch theft case.

Lastly, there’s the issue of transparency and consent. Many argue that retailers often lack transparency about their use of FRT. Customers may not be aware they are being scanned, and consent, if obtained, may be unclear or inadequate. This undermines trust and raises concerns about potential misuse.

How retailers are addressing these concerns

When Australian retailers were asked about FRT, they claimed signage was used to inform customers of its use. However, CHOICE said these signs were often small and easily missed, raising concerns about the adequacy of such disclosures.

 

A man shocked to see facial recognition signage in a retail store

 

7-Eleven reportedly used “vague disclosure signs in its window”. They further tried to dodge privacy concerns by claiming they weren't collecting "personal information" through their facial recognition cameras. They argued that the captured faces and their computer-generated "faceprints" didn't directly identify individuals, that the facial data wasn't linked to other customer information, and that only a few people could access the data, none of whom would be able to identify anyone anyway.

However, one can argue that:

    • Even without names, individuals could be identified by linking the data with other sources
    • The very act of capturing and storing facial data is considered collecting personal information
    • Limiting access doesn't guarantee complete security or prevent misuse

Bunnings, Kmart, and The Good Guys halted their facial recognition programs after the OAIC announced its investigation in June 2022. 7-Eleven went a step further, destroying the collected data altogether. However, in July 2023, a customer spotted signage at a Bunnings store suggesting the technology might be back in use, which the retailer denied.

While the OAIC dropped The Good Guys from its investigation after the retailer announced it had no plans to reintroduce FRT, the others have not provided an update, presumably awaiting the outcome of the report.

What Australia’s laws say about FRT

There's no dedicated legislation governing FRT use in Australia. Existing privacy laws such as the Privacy Act 1988 apply, but their interpretation in the context of FRT can be exploited.

For instance, no law explicitly forbids private businesses from using FRT on their own property, even when it is highly intrusive. This means that businesses can, to a large extent, decide what level of surveillance they use. While they are responsible for informing you of their FRT use through clear signage, that still leaves you practically powerless.

The signage serves as a form of implied consent, meaning that by choosing to enter the premises, you agree to the conditions, including being scanned. What happens if you don’t want to be scanned but it’s the only store nearby that sells what you need?

It gets murkier with covert surveillance, where you can be scanned unaware. While some public interest justifications might allow it – such as preventing crime in crowded areas – the lack of transparency and potential for misuse raises ethical concerns.

What you can hold on to, for now, is that the legislation prevents these businesses from sharing your images and other information unless the law requires it. However, they can store the data for specific purposes, again creating concerns about potential misuse. Small businesses (organisations with an annual turnover of less than $3 million) are also currently exempt from these regulations.

Proposed reforms to address privacy issues

The Commonwealth Government has proposed changes, including extending the regulations to small businesses, allowing individuals to sue for serious invasions of privacy related to FRT or biometrics misuse, and requiring organisations using FRT or biometrics to conduct risk assessments to identify and address potential privacy issues before using the technology. But the reforms are yet to be finalised.

The Privacy Act is struggling to keep pace with the ever-evolving threats to our personal data. Issues such as FRT and car data privacy highlight the need for comprehensive reform.

What you can do for now

 

A happy couple taking note of stores without facial recognition

 

While complete avoidance might be challenging, you can take these actions to minimise FRT exposure:

    • Understand how FRT works, its potential risks and benefits, and how it's being used in stores you frequent.
    • Engage with store staff about their FRT practices, including data collection, storage, and purpose.
    • Voice your discomfort with FRT to store management and customer service channels – your voice matters.
    • If available, use opt-out mechanisms provided by some retailers to avoid being scanned.
    • Choose stores that don't use FRT or offer clear opt-out options. While comprehensive lists are still evolving, organisations such as the Australian Privacy Foundation can help identify stores without FRT.
    • Advocate for stricter regulations and legal protections regarding FRT use by businesses.
    • Keep up-to-date on developments in FRT regulations and consumer rights.

Handling breach concerns

While privacy laws are still playing catch-up with facial recognition technology, we as consumers need to be aware of its possible use, and misuse. If you believe your biometric data is being collected without your consent or your scanned information has been mishandled, you can lodge a complaint with the Office of the Australian Information Commissioner or its recognised external dispute resolution schemes.

You can also get in touch with us, and we’ll help you handle it.