Why Privacy Risk Assessments Are No Longer Optional for GCs

By Patrick Zeller and Patrick Burke

May 8, 2026


Patrick E. Zeller is the General Counsel of Legal & Compliance at JetStream Security Inc. He is a globally recognized legal and compliance executive specializing in artificial intelligence governance, privacy, cybersecurity, and data protection. He helps organizations navigate the rapidly escalating legal, regulatory, and enterprise risks associated with deploying AI at scale. He can be reached at patrick.zeller@jetstream.security.

Patrick Burke practices at Burke Data & Privacy, advising on AI, privacy, and cybersecurity compliance. He served as Deputy Superintendent at the New York State Department of Financial Services, where his examiners investigated banks' cyber compliance. He was Chief Data & Privacy Officer for Havas, a top-5 global ad agency. Burke taught at Cardozo Law School and practiced at Norton Rose, Reed Smith, and Paul Weiss. He can be reached at patrick@burkedataprivacy.com.

Too many American corporate legal teams are failing to conduct the privacy risk assessments mandated by new US and European laws and regulations. They are doing so at a time when the stakes are especially high, as companies increasingly deploy artificial intelligence systems that access personal data. As former data-focused government lawyers (a federal prosecutor and a state cybersecurity regulator), we’ve seen firsthand how skipped assessments accumulate into substantial legal risk and liability.

Recently enacted U.S. state privacy legislation mandates risk assessments in certain circumstances in California and seven other states (Colorado, Virginia, Connecticut, Tennessee, Montana, Oregon, and Texas). These laws are similar to others around the world, such as the European Union’s General Data Protection Regulation (GDPR) and its United Kingdom counterpart. So it’s important that legal teams make sure their organizations are in compliance. But, too often, that’s not happening.

Why is there assessment phobia?

So, why are otherwise skilled corporate lawyers struggling with risk assessments? There are three main drivers:

  1. Non-accountability: Risk assessments often remain confidential, in part to encourage honesty, but that confidentiality tends to camouflage non-compliance or even the absence of assessments. In data protection agreements with customers and business partners, companies may erroneously claim that they conduct regular, legally compliant assessments. The lack of transparency extends to internal legal processes, where compliance tracking is often neglected. Under constant time pressure, privacy lawyers are forced to prioritize commercial tasks and contracts over longer-term initiatives like privacy risk assessments.
  2. Hesitation stemming from confusion: Too many American in-house attorneys avoid privacy risk assessments. Some view them as “European” and outside a lawyer’s wheelhouse, believing that assessing technical risks is a non-legal task (it was not in their law school curriculum). Others find technical data flows intimidating and struggle to communicate with IT teams. They should be reminded that a risk assessment is just a fact investigation, as discussed further below.
  3. General counsel prioritization: Too many corporate general counsel rely on outdated views, incorrectly believing that Europe requires risk assessments while the US does not. Others want to avoid overloading their already stretched legal teams with complex compliance work, reasoning that basic actions like mechanically updating privacy notices to superficially comply with new laws and regulations should suffice. Without leadership support, many privacy lawyers deprioritize risk assessments because they believe their boss’s priority is keeping the commercial contract review pipeline moving. The tone at the top shapes the privacy team’s priorities.

What do the regulations require?

The US, United Kingdom, and EU data protection laws all adopt a similar, largely flexible approach to compliance. Organizations must accurately disclose their data processing practices—particularly those involving privacy risks—through a public notice accessible to all affected individuals. In some cases and jurisdictions, individuals must provide their consent based on this notice.

These laws impose few outright bans on the corporate processing of personal information, provided the processing is accurately disclosed in the privacy notice. However, recent restrictions target sensitive information like children’s data, geolocation, biometrics, and automated decisions in certain areas. When businesses provide accurate data notices, these legal frameworks allow them to manage personal data largely as they see fit, provided they have a lawful basis for the processing, such as the individuals’ consent.

How do legal departments write privacy notices with sufficient technical disclosure to effectively shield the company from liability for risky processing? A lawyer can’t do that without accurate, up-to-date information about that processing. Since lawyers can’t be expected to know that information themselves, they need to conduct a fact investigation first. In the privacy world, that is called a risk assessment, or sometimes a data protection impact assessment (DPIA) or a privacy impact assessment (PIA). Whatever it is called, it is nothing more than a confidential internal investigation into data processing practices.

Beyond the regulatory obligations in the US and Europe, most organizations must also comply with data protection requirements in their contracts with business partners (and agree to be audited on their compliance processes). Cybersecurity insurers and audits such as System and Organization Controls 2 (SOC 2) require documented risk assessments. Skipping them can mean lost clients and vendors, failed certifications, and flunked insurer audits.

How should you conduct a risk assessment?

Risk assessments are confidential internal investigations. Full stop. Any in-house lawyer who can conduct an internal investigation can conduct a privacy risk assessment; they’re the same thing, just focused on a technical risk. Remember to think like a lawyer and, as with any fact investigation, start by gathering evidence from knowledgeable witnesses. Identify such witnesses, schedule interviews, collect applicable policies, procedures, and audit reports, and take good notes (about the technologies, data, processes, risks of non-compliance, and controls at issue). Afterward, write it up and send that document to the witness with instructions to correct it, or to certify its accuracy by signature. They are your source of truth.

Here are three examples of different risk assessments and how to conduct them:

  1. Health Insurance Portability and Accountability Act (HIPAA) third-party vendor assessment: Investigate a third-party health benefits vendor managing employee medical claims and therapy records to ensure compliance with HIPAA and data protection standards.
    • Interview experts: Privacy and compliance staff, information security team, legal counsel, and similar representatives from the vendor’s organization
    • Gather evidence: Signed vendor agreement for handling personal health information, independent security audit reports, documentation showing how patient information is protected, records of who can access data, procedures for responding to security incidents, plans for notifying affected parties if data is compromised
    • Evaluate key risks: Unauthorized disclosure of patient health information, security breaches exposing sensitive medical records, weak data protection methods, too many people having access to confidential information, vendor’s partners lacking proper agreements to protect data, failure to notify patients and regulators within the required timeframe
    • Mitigations: Require strong data encryption standards, use two-step verification for system access, schedule regular independent security reviews, limit access based on job responsibilities, maintain detailed activity records to track who accessed what information, create a clear incident response plan requiring vendor to promptly notify you of any security incident
  2. HR AI procurement assessment: Evaluate a proposed AI tool that screens CVs and predicts “cultural fit” to prevent algorithmic bias and legal liability under the EU’s GDPR and AI Act.
    • Interview experts: Privacy officer, employment law attorney, ethics specialist, HR leadership, diversity and inclusion staff, technical experts who understand the AI system
    • Gather evidence: AI system documentation, training data sources, accuracy testing across demographics, privacy impact assessment, decision-making explanations, candidate consent forms, hiring outcome statistics by protected characteristics
    • Evaluate key risks: Discrimination against protected groups, automated decisions without human oversight, inability to justify rejections to candidates, perpetuating historical bias, “cultural fit” excluding diversity, insufficient legal basis for data processing, potential fines of up to 20 million euros or 4% of global annual turnover
    • Mitigations: Eliminate “cultural fit” scoring, conduct quarterly bias testing, require human review of all recommendations, explain rejection reasons clearly, use diverse training data, remove proxy discrimination factors (zip codes, universities), obtain candidate consent with opt-out options, monitor selection rate disparities, establish appeals process
  3. Email marketing platform assessment: Investigate a third-party email marketing platform used for customer newsletters to ensure secure data handling and compliance with international privacy frameworks.
    • Interview experts: Privacy officer, marketing leadership, information security team, legal counsel, compliance staff, customer service manager, vendor account representative and engineer
    • Gather evidence: Vendor’s contract for data handling, independent security certifications, privacy policies, documentation showing how customers consent to emails, records of where customer data is stored and transferred, email authentication settings, unsubscribe processes, security incident history, list of vendor’s subcontractors
    • Evaluate key risks: Unauthorized access to customer contact information, privacy law violations (GDPR, CCPA), improper international data transfers, anti-spam law non-compliance, inadequate consent records, inability to honor customer deletion requests, undisclosed third parties accessing data, spam blacklisting damaging reputation, delayed breach notifications
    • Mitigations: Execute comprehensive data handling agreement, verify customer data stored in appropriate regions, double opt-in for subscriptions, ensure unsubscribe links in all emails, allow customers to manage preferences, require vendor to notify us within 72 hours of breaches, use two-step verification for platform access, collect only necessary information, automatically delete inactive subscribers after set period, maintain records of customer consent, use secure connections for integrations, secure right to review vendor’s partners quarterly

In each of these cases, the lawyer documents the findings and sends the draft risk assessment to the vendor’s representative for correction and certification. The vendor would be required to sign an updated Data Processing Agreement with Standard Contractual Clauses and implement automated deletion schedules.

Conclusion

Many general counsel need to become better informed on the applicable legal requirements, better appreciate data protection risks, and support compliance through leadership and resources. Review the data protection provisions in your agreements with customers, clients, business partners, and insurers. Ensure you understand the potential liability your company faces if it is found non-compliant with the risk assessment requirements in those contracts.

Effective compliance requires a mix of legal expertise and robust privacy software. By investing in these resources, companies prevent costly liabilities and improve their ability to conduct efficient risk assessments. This skill set is especially critical for managing enterprise AI systems, which are subject to data protection laws and require their own risk assessments. Given the increasing reliance on AI, that alone justifies making privacy risk assessments a standard operational practice.
