Australian Privacy Commissioner Carly Kind has found Bunnings Group Limited breached Australians’ privacy by collecting their personal and sensitive information through a facial recognition technology system.
The system, via CCTV, captured the faces of every person, likely hundreds of thousands of individuals, who entered 63 Bunnings stores in Victoria and New South Wales between November 2018 and November 2021.
“We acknowledge the potential for facial recognition technology to help protect against serious issues, such as crime and violent behaviour,” said Kind. “However, any possible benefits need to be weighed against the impact on privacy rights, as well as our collective values as a society.”
“Facial recognition technology may have been an efficient and cost-effective option available to Bunnings at the time in its well-intentioned efforts to address unlawful activity, which included incidents of violence and aggression,” she added. “However, just because a technology may be helpful or convenient does not mean its use is justifiable. In this instance, deploying facial recognition technology was the most intrusive option, disproportionately interfering with the privacy of everyone who entered its stores, not just high-risk individuals.”
As well as addressing issues of proportionality and necessity, the determination highlighted the lack of transparency around Bunnings’ use of facial recognition technology.
Kind said Bunnings collected individuals’ sensitive information without consent, failed to take reasonable steps to notify individuals that their personal information was being collected, and did not include required information in its privacy policy.
Alex Jenkins, director of the WA Data Science and Innovation Hub at Curtin University, said people need to be informed if their appearance is being captured, as biometric data has become the new fingerprint for online security.
“Facial recognition technology can be a valuable tool in protecting the safety of staff and the public, but it also comes with significant security risks in an increasingly digital world,” said Jenkins. “Unlike passwords, your face cannot be changed, making it a permanent identifier. This opens the door for misuse, including deepfakes used for fraud or misinformation, scammers bypassing facial recognition-based security systems, biometric data being sold or stolen, or manipulated images or videos being used to blackmail people.”
“Protecting your digital identity is no longer just about keeping passwords safe, it’s about securing your unchangeable personal traits. Regulation and ethical safeguards are essential to ensure this technology benefits society without compromising individual security and privacy,” he added.
Bunnings says it will seek a review of the privacy commissioner’s determination. The hardware chain says facial recognition technology was trialled at a limited number of Bunnings stores in Victoria and New South Wales between 2018 and 2021, with strict controls around its use and with the sole and clear intent of keeping team members and customers safe and preventing unlawful activity.
“Our use of facial recognition technology was never about convenience or saving money but was all about safeguarding our business and protecting our team, customers, and suppliers from violent, aggressive behaviour, criminal conduct and preventing them from being physically or mentally harmed by these individuals,” a Bunnings statement reads.
“The trial demonstrated the use of facial recognition technology was effective in creating a safer environment for our team members and customers, with stores participating in the trial having a clear reduction of incidents, compared to stores without facial recognition technology,” the statement added. “We also saw a significant reduction in theft in the stores where facial recognition technology was used. We believe that customer privacy was not at risk. The electronic data was never used for marketing purposes or to track customer behaviour.”