The Spanish data protection authority (Agencia Española de Protección de Datos, AEPD) imposed a €96,000 fine for privacy violations involving facial recognition technology. The company deployed a facial recognition system as the sole access method to its sports centers without providing alternatives, infringing Article 9 of the GDPR. When a person refused to be subjected to facial recognition, the sports center denied access to its facilities without allowing any identification alternative (such as an ID card or driver's license). The company also failed to conduct a Data Protection Impact Assessment as required by Article 35 and did not adequately inform members about the data processing, violating Article 13. The DPIA is required even in this case, where the processing is for verification/authentication (1:1) rather than biometric identification (1:N).
The AEPD noted that, for the suitability analysis, the controller should first define the effectiveness and error thresholds for each identification method capable of fulfilling the same purpose of the processing, and determine whether identification can actually be performed more effectively than with traditional methods that rely on human intervention. In this case, the fined company did not:
- analyze the effectiveness thresholds or error rates of the various access systems to its centers that it had used in the past (identification cards, fingerprints, facial recognition, and ID display), and
- indicate why biometric verification using facial recognition is more effective than all other methods in verifying members' identity.
The company merely stated that it decided to replace a previous system (fingerprint) with facial recognition due to technical flaws detected in its operation, without specifying or justifying those flaws. It did not include any analysis of the effectiveness or suitability of the non-biometric access methods, such as identification cards and ID display.
Implications for businesses:
1) Conduct a DPIA before implementing any biometric technology.
2) Always provide alternative identification methods when implementing biometric systems.
3) Document and analyze the effectiveness thresholds and error rates of all potential identification methods.
4) Demonstrate why a biometric solution is more effective and necessary than traditional identification methods that use human intervention or non-biometric alternatives.
5) Inform data subjects about the processing, including their right to use alternative identification methods.
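The suitability analysis the AEPD describes can be sketched as a simple comparison of error rates across candidate access methods. The sketch below is purely illustrative: the method names, the error figures, and the `suitability_report` helper are hypothetical stand-ins, not anything the AEPD prescribed or the fined company used. The point it demonstrates is the documentation step the company skipped: you cannot claim a biometric method is "more effective" without numbers showing it beats every non-biometric alternative.

```python
from dataclasses import dataclass

@dataclass
class AccessMethod:
    name: str
    false_accept_rate: float   # chance the wrong person is admitted
    false_reject_rate: float   # chance the right person is denied
    uses_biometrics: bool

def combined_error(m: AccessMethod) -> float:
    return m.false_accept_rate + m.false_reject_rate

def suitability_report(methods):
    """Rank candidate methods by combined error rate and check whether
    the best biometric method actually beats every non-biometric one."""
    ranked = sorted(methods, key=combined_error)
    bio = [m for m in ranked if m.uses_biometrics]
    non_bio = [m for m in ranked if not m.uses_biometrics]
    justified = bool(bio) and bool(non_bio) and all(
        combined_error(bio[0]) < combined_error(m) for m in non_bio
    )
    return ranked, justified

# Illustrative figures only -- a real analysis would have to measure these.
methods = [
    AccessMethod("ID display at the desk", 0.020, 0.010, False),
    AccessMethod("Identification card", 0.005, 0.015, False),
    AccessMethod("Facial recognition (1:1)", 0.001, 0.030, True),
]
ranked, justified = suitability_report(methods)
for m in ranked:
    print(f"{m.name}: combined error {combined_error(m):.3f}")
print("Biometric method demonstrably more effective:", justified)
```

With these (invented) figures the identification card comes out best, so the biometric system would not be justified on effectiveness grounds; that is exactly the kind of finding a controller must document either way before deploying biometrics.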
Biometric Data Privacy in Corporate Settings
Summary
Biometric data privacy in corporate settings refers to how businesses handle, store, and protect sensitive personal information like fingerprints, facial scans, or voice patterns collected for authentication or identification. Ensuring privacy is crucial because, unlike passwords, biometric data cannot be changed if compromised, and mishandling it can lead to legal penalties and loss of trust.
- Offer alternatives: Always give employees and customers the option to use non-biometric identification methods alongside biometric systems.
- Get clear consent: Inform people how their biometric data will be used and ask for freely given permission before collecting it.
- Secure storage: Store biometric data locally on user devices whenever possible, and use strong security controls to limit who can access it within your organization.
-
Biometric data isn't like a password you can reset. Once it's exposed, it's gone forever. That's why laws like Illinois' Biometric Information Privacy Act (BIPA) are so strict—and why founders who are building with fingerprints, face scans, or voice recognition need to think about compliance from day one. I've put together a playbook especially for startups that explains:
➡️ The real risks of non-compliance (including penalties running into millions)
➡️ Why investors and users both care about how you handle sensitive data
➡️ Practical steps founders can take today—policies, consent, retention, and security
➡️ Industry examples from fintech, healthtech, HR tech, gaming, and more
At MOS | CyberDocs, we work with founders and teams to turn data privacy into a strength—not a hurdle. Done right, compliance builds trust, boosts credibility, and helps you scale with confidence. If you're building products that touch biometric data, this guide is a must-read. You can find it attached in this post. I'd love for you to follow me for more practical insights on data privacy and global compliance laws—and if you need support embedding privacy into your startup's DNA, feel free to reach out. #DataPrivacy #BiometricData #Compliance #Startups #PrivacyByDesign #MOS #CyberDocs #TrustByDesign
-
The Role of Trust and Compliance in Employee Data Privacy Amid Remote Work

The shift toward remote work has increased the need for strong employee data privacy practices. Companies must consider state-specific laws governing how to handle, store, and protect personal information. Violations can not only damage the company's reputation but also result in severe financial penalties.

State-Specific Regulations on Employee Data Privacy
California: Employees in California have the right to know how companies collect and use their data under the California Privacy Rights Act (CPRA). Companies must offer a method to opt out of data selling or sharing and provide the ability to correct inaccuracies. Fines for violations can reach up to $7,500 per incident.
New York: The New York SHIELD Act requires companies to implement administrative, technical, and physical safeguards to protect employees' personal information.
Illinois: The Illinois Biometric Information Privacy Act (BIPA) focuses on biometric data, such as fingerprints and facial recognition. Companies must obtain explicit consent from employees before collecting this type of data.
Washington: Companies should stay updated on evolving legislation like the Washington Privacy Act, which is expected to offer privacy rights like those in California and New York.

Proposed Controls Aligned with State Regulations
1. Data Auditing: Conduct audits to identify where employee data resides and who can access it.
2. Access Control: Restrict data access to authorized personnel.
3. Encryption: Use encryption methods for sensitive data both when stored and when transferred.
4. Training: Train employees periodically to understand the importance of data privacy.

Building an Effective Employee Privacy Policy
An employee privacy policy should be distinct from a customer privacy policy. It should explicitly address:
1. Scope: Define whether the policy applies to all employees or only those under specific regulations.
2. Accountability: Assign a specific team or individual for oversight.
3. Employee Rights: Clearly state the rights employees have over their data and how to exercise them.
4. Compliance Controls: Explain the methods used to protect employee data.
5. Review Process: Specify a timeline for periodic reviews and updates to the policy.

By adhering to these guidelines, companies can create a trusting and compliant work environment, which benefits both employees and the business. #cybersecurity #privacy #regulation
-
The Connecticut AG is focused on facial recognition, including its use by retailers for loss prevention purposes. Here's what's expected ⬇️

The Office of the Connecticut Attorney General recently released a report on its enforcement actions and priorities under the state comprehensive #privacy law. In the middle of the report was detailed guidance about how the AG's office views the use of #FacialRecognition technologies under the state privacy law. The guidance was prompted by a #retail use of facial recognition technology for loss prevention purposes. It notes that the state comprehensive privacy law applies to these uses, and that the crime and fraud exception in the law "is not a blanket exception" to the law's requirements. There "is no 'out' on compliance." It also indicates that facial recognition technology necessarily involves collection, use, and sometimes sharing of #biometrics. The guidance says companies should:
1️⃣ clearly disclose use of facial recognition technology and available consumer rights;
2️⃣ obtain informed and freely given consent and allow revocation of consent for facial recognition (biometric) data processing;
3️⃣ conduct and document a data protection assessment per state law requirements (and re-assess when the facial recognition technology changes or there are new trends in observed data);
4️⃣ actively monitor use and performance of the facial recognition technology, including for accuracy in identifications and based on demographic differences;
5️⃣ implement strong policies and procedures for data protection assessments, risk-rating, and facial recognition technology-specific bias and discrimination training;
6️⃣ address data minimization (limiting the amount of data processed), have clear data retention and deletion procedures, and only process the data for the specific purposes it was collected; and
7️⃣ implement a comprehensive #InformationSecurity program with strict access controls, multifactor authentication, and data segmentation for facial recognition data.

If your company uses facial recognition technology in consumer contexts in Connecticut, see how your company's practices stack up to these requirements. While some of these requirements like consent and opt-out processes may not be viable in loss prevention or security contexts, others like data protection assessments, security programs, and data minimization and retention practices can still be addressed. If other states take similar views on the applicability of state privacy laws and exceptions, common uses of facial recognition technology in retail and other public-facing contexts may require fresh looks at compliance and risk acceptance. I'm attaching an excerpt of the report that includes the facial recognition guidance. The full report is at https://lnkd.in/gryWMdFM
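Among the items in the Connecticut guidance, the retention-and-deletion procedure is one that translates directly into code. The sketch below is a minimal illustration of such a procedure, assuming an invented record layout, a 30-day retention window, and a `purge_expired` helper; none of these come from the guidance itself, and a real deployment would take its retention period from documented policy and delete from durable storage.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # assumed policy window, not a legal value

def purge_expired(records, now=None):
    """Return only records still inside the retention window;
    expired facial recognition match records are dropped (deleted)."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]

# Hypothetical match records from a loss prevention system.
now = datetime(2025, 6, 1, tzinfo=timezone.utc)
records = [
    {"subject": "match-0001", "collected_at": now - timedelta(days=5)},
    {"subject": "match-0002", "collected_at": now - timedelta(days=90)},
]
kept = purge_expired(records, now=now)
print([r["subject"] for r in kept])  # the 90-day-old record is removed
```

Running a sweep like this on a schedule, and logging what was deleted and why, is the kind of documented procedure the guidance asks companies to be able to show.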
-
Binding a biometric to a verifiable credential is the best way to ensure that only the right person can use it. Storing the biometric template as a credential makes it easy to keep the biometric data under the user's control. Instead of storing sensitive biometric data in centralized systems, we recommend keeping biometrics locally on user devices and issuing verifiable credentials to prove that biometric matching occurred. Here's how it works:
1. When users first install their wallet, they enrol their biometric (facial scan, palm scan, fingerprint, or even signature) directly on their device.
2. The biometric service issues a cryptographically signed credential into the wallet containing the enrolment data.
3. During authentication, the wallet performs a local biometric check, verifies the credential hasn't been tampered with, and issues a new credential proving the check passed.
4. Any relying party can verify this credential without ever accessing the actual biometric data.
This solves a fundamental problem in digital identity, as it:
> Ensures that only the rightful owner of a credential can use it for verification (even if a bad actor can access the credential wallet)
> Eliminates privacy risks from centralized biometric storage
> Reduces liability for organizations handling biometric data
> Maintains strong authentication without compromising user control
With Dock Labs, you don't have to choose between robust security and user privacy; you can have both.
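The four steps above can be sketched in a few lines. This is a toy model, not Dock's implementation: an HMAC with a shared demo key stands in for the issuer's asymmetric signature, and exact hash comparison stands in for real biometric matching (which is fuzzy and score-based). All names (`sign`, `verify`, `authenticate`, `ISSUER_KEY`) are hypothetical. What it illustrates is the data flow: the raw template stays on the device, and the relying party only ever sees a signed attestation.

```python
import hashlib, hmac, json

ISSUER_KEY = b"issuer-demo-secret"  # stand-in for the issuer's signing key

def sign(payload):
    """Issue a 'credential': payload plus an HMAC standing in for a signature."""
    body = json.dumps(payload, sort_keys=True).encode()
    return {"payload": payload,
            "sig": hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()}

def verify(cred):
    body = json.dumps(cred["payload"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(cred["sig"], expected)

# Steps 1-2: enrolment. The raw template never leaves the device; the
# issued credential commits only to a hash of it.
template = b"on-device-biometric-template"
enrol_cred = sign({"template_hash": hashlib.sha256(template).hexdigest()})

# Step 3: authentication -- local check, then a fresh credential attesting success.
def authenticate(live_sample):
    if not verify(enrol_cred):
        return None  # enrolment credential was tampered with
    if hashlib.sha256(live_sample).hexdigest() != enrol_cred["payload"]["template_hash"]:
        return None  # local biometric check failed
    return sign({"claim": "biometric_check_passed"})

# Step 4: a relying party verifies the attestation without seeing any biometrics.
proof = authenticate(template)
print(proof is not None and verify(proof))
```

In a real system the issuer would sign with a private key and relying parties would verify with the matching public key, so no shared secret ever leaves the issuer.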
-
The Colorado Department of Law announced new rules governing biometric data, employee biometrics, children's data and AG Opinion Letters under the Colorado Privacy Act (CPA). Some of these new requirements include:
🔶 A Biometric Identifier Notice, either separate from the privacy notice or included within the privacy notice, if the collection and use of Biometric Identifiers is clearly labeled and accessible.
🔶 Consent prior to processing personal data of a consumer whom the Controller actually knows or willfully disregards is a minor.
🔶 Consent prior to selling/disseminating Biometric Identifiers.
🔶 Employee consent prior to using new categories of Biometric Identifiers or using Biometric Identifiers for a new purpose.
🔶 New data protection assessment requirements for offering services, products or features to minors.
Timing:
➡️ The Colorado AG will issue a formal opinion on the rules. After that formal opinion, the rules will then be filed with the Secretary of State, and they will become effective 30 days after they are published in the state register.
➡️ Data protection assessment requirements for minors will be effective 10/1/2025.
#Biometrics #HR #GRC #PrivacyLaw #ChildrensPrivacy #EmploymentLaw #Compliance #HowManyNoticesCanYouFitOnOneWebsite?
-
Interested in AI + privacy / personal info / biometric data? Australian hardware retailer Bunnings has got itself into some trouble, with Privacy Commissioner Carly Kind yesterday releasing her findings that the company interfered with Australians' privacy via its AI facial recognition system. What's the deal?
⚫ Bunnings' facial recognition system collected sensitive biometric information from members of the public without notifying them or obtaining their consent
⚫ The system ran from 6 Nov 2018 to 30 Nov 2021 and was installed across 62 retail locations
⚫ Bunnings used this to create a searchable database, which placed people into categories based on whether Bunnings thought they'd committed "Organised Retail Crime", violent or threatening behaviour, or acts of actual or threatened violence
⚫ Bunnings "was unable to provide the total number of enrolled individuals on the Database 'due to [Redacted]'" 🤔
Why's this interesting?
🔵 Australia is currently considering its possible legislative approach to regulating AI use
🔵 Cases like this provide useful insights as to whether our existing legislative framework is robust enough to deal with issues at the intersection of AI + personal privacy
🔵 While people are currently fixated on ChatGPT and large language AI, I think the main area of AI for legislative focus should be systems that use personal or biometric information to assist in decision-making or provide algorithmically determined insights about individuals or cohorts.
-
Italian DPA fines company for unlawful use of #FacialRecognition to determine the presence of employees --> €120,000. The company, which has manufacturing facilities in two municipalities with a total of over 40 employees, had implemented a facial recognition system in its factories in order to determine the presence of its employees and measure their work performance. The recipient of the fine had justified the move by arguing that such systems could identify employees with greater reliability and thus counteract willful absenteeism. However, the Italian data protection authority found that this did not constitute an interest that outweighed the rights and freedoms of the data subjects. The DPA found that the processing of #biometric data associated with facial recognition was carried out without a #legal basis. #gdpr #compliance