The Barcelona Provincial Court ruled that there was a "violation of privacy" in this project.
It is a legal case that highlights the complexity of surveillance systems.
The Mercadona penalty as a warning sign for systems of this type
As the company itself explains, the system "applied a technological filter, and a second visual verification established that the identified person had a current restraining order from the establishment".
However, the AEPD concluded that the General Data Protection Regulation was violated, in particular Article 6 (lawfulness of processing) and Article 9 (processing of special categories of personal data). For this reason, a fine of two million euros was imposed, accompanied by further amounts for violations of other articles of the GDPR.
This sanction was reduced by 20% because Mercadona opted to make a voluntary payment, with the absence of recidivism or repeat offenses taken into account as a particularly important mitigating factor. Mercadona explains that the company had judicial authorization from the very beginning, maintained close contact with the corresponding authorities and, before the tests started, shared all the procedures with the AEPD. Nevertheless, one of the grounds for the sanction is an incorrect impact assessment.
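For readers who want to check the arithmetic, the sketch below simply applies the 20% voluntary-payment reduction to an assumed total. Only the two-million-euro figure for Articles 6 and 9 and the 20% reduction come from the text; the amount attributed to the other articles is a hypothetical placeholder, chosen so that the result lands near the roughly 2.5 million euros cited in the source article's title.

```python
# Rough sketch of the fine arithmetic described above.
# Known from the text: 2,000,000 EUR for Articles 6 and 9, plus other amounts,
# and a 20% reduction for voluntary payment. OTHER_ARTICLES_FINE is a
# hypothetical placeholder, not a figure taken from the resolution.

ARTICLES_6_AND_9_FINE = 2_000_000   # euros, stated for Articles 6 and 9 of the GDPR
OTHER_ARTICLES_FINE = 1_150_000     # euros, illustrative assumption only
VOLUNTARY_PAYMENT_REDUCTION = 0.20  # 20% discount for voluntary payment

total_before_reduction = ARTICLES_6_AND_9_FINE + OTHER_ARTICLES_FINE
final_fine = total_before_reduction * (1 - VOLUNTARY_PAYMENT_REDUCTION)

print(f"Total before reduction: {total_before_reduction:,} EUR")
print(f"Fine after voluntary-payment reduction: {final_fine:,.0f} EUR")
```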
Mercadona has completed a pilot project with a technology that is in the sights of Data Protection agencies. A test that was not carried out with the necessary rigor, as determined by the AEPD, resulted in a fine, a sanction that the Agency considers "proportionate, effective and dissuasive" and that will serve as a warning to other companies seeking to implement facial recognition systems.
Mercadona states that "the most responsible and rigorous thing to do right now is to terminate this pilot test." The company decided to pay the fine and close the proceedings before the Data Protection Agency.
What aspects of the facial recognition project led to the sanction
Jorge Garcia Herrero, a lawyer specializing in Data Protection, has reviewed the Agency's ruling. Among the proven facts is that Mercadona launched the project in June 2020 and only discontinued it, in its forty establishments, in May 2021. These establishments therefore used facial recognition technology at the entrance for about a year.
How did the Mercadona system identify those who had received a court order? The company relied on its lawsuits against shoplifters and asked the judge to order precisely this measure. The AEPD accuses them of having started before conducting an impact assessment, an assessment in which, according to the AEPD, risks relating to the company's employees and to vulnerable customers, such as minors, were not evaluated.
According to the Agency, biometric data was processed without a sufficient legal basis, nor were the basic public interest requirements met.
One of the deeper debates about these facial recognition systems concerns the difference between using the data of specific people and that of everyone else. The AEPD understands that there is legitimacy in the case of the convicted, but not for the "not convicted".
Another aspect taken into account when using biometric data processing systems is the necessity of the measure. The AEPD explains that "utility" is being confused with "necessity": although these facial recognition systems may be "useful", they are not strictly necessary. The Agency therefore considers that the Data Protection regulation prevents their use in cases such as Mercadona's, where it holds that it is not the public interest that is being protected, but rather private interests.
This news piece is based on the article "Mercadona’s facial recognition ends in a fine of 2.5 million euros: what the Data Protection Agency says and what lessons can be learned".