<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>cyber/data/privacy insights</title>
	<atom:link href="https://cdp.cooley.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://cdp.cooley.com/</link>
	<description>Legal insight for market innovators</description>
	<lastBuildDate>Mon, 06 Apr 2026 21:15:44 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://cdp.cooley.com/wp-content/uploads/2018/10/cropped-privacy-blog-banner-50x50.jpg</url>
	<title>cyber/data/privacy insights</title>
	<link>https://cdp.cooley.com/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">237151628</site>
	<item>
		<title>Part 2: NYDFS Sharpens Its Focus on Multifactor Authentication</title>
		<link>https://cdp.cooley.com/part-2-nydfs-sharpens-its-focus-on-multifactor-authentication/</link>
		
		<dc:creator><![CDATA[Jenna Moore]]></dc:creator>
		<pubDate>Mon, 06 Apr 2026 21:15:41 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4745</guid>

					<description><![CDATA[<p>Financial institutions covered by 23 NYCRR Part 500 (Part 500) (covered entities) must annually certify their compliance with these cybersecurity regulations. As the April 15 date for certifying compliance approaches, the New York Department of Financial Services (NYDFS) has been reinforcing its focus on one particular element of the updated requirements – multifactor authentication (MFA). [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/part-2-nydfs-sharpens-its-focus-on-multifactor-authentication/">Part 2: NYDFS Sharpens Its Focus on Multifactor Authentication</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Financial institutions covered by <a href="https://cdp.cooley.com/new-york-department-of-financial-services-amends-its-cybersecurity-regulations/">23 NYCRR Part 500 (Part 500)</a> (covered entities) must annually certify their compliance with these cybersecurity regulations. As the <a href="https://finsights.cooley.com/nydfs-refresher-series-part-1-what-companies-need-to-know-ahead-of-annual-certifications-of-compliance/">April 15 date for certifying compliance approaches</a>, the New York Department of Financial Services (NYDFS) has been reinforcing its focus on one particular element of the updated requirements – multifactor authentication (MFA). On February 26, 2026, NYDFS hosted a <a href="https://www.dfs.ny.gov/system/files/documents/2026/02/Cyber-Public-Training-Lets-Talk-MFA-2026-02-26.pdf">public cybersecurity presentation</a> called “Let’s Talk MFA,” offering important insight into how NYDFS interprets and supervises the expanded MFA requirements under Part 500. The presentation and corresponding <a href="https://www.dfs.ny.gov/industry_guidance/cybersecurity">Frequently Asked Questions</a> make clear that MFA remains a top supervisory priority – and that covered entities should expect close scrutiny of how their MFA is designed, implemented, documented and governed.</p>



<h2 class="wp-block-heading"><strong>MFA is a baseline requirement, but not a one-size-fits-all control</strong></h2>



<p>Under the amendments to Part 500, MFA is now required for <strong>any</strong> person accessing a covered entity’s information systems, unless an exemption is approved in writing by the chief information security officer (CISO) or, if the covered entity does not have a CISO, by the senior-most executive responsible for cybersecurity. To comply with the requirements, the MFA must consist of at least two distinct authentication factors, each drawn from a different one of three categories: knowledge (something you know), possession (something you have) or inherence (something you are). Using two factors from the same category (for example, a password and a security question – both something you know) does not satisfy the requirement.</p>
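<p>The two-of-three-categories rule above can be expressed as a simple check. The sketch below is illustrative only – the factor names and category mapping are hypothetical examples, not an NYDFS-provided taxonomy.</p>

```python
# Hypothetical illustration of the Part 500 MFA rule: authentication
# factors must span at least two of the three categories (knowledge,
# possession, inherence). Factor names here are invented examples.
FACTOR_CATEGORIES = {
    "password": "knowledge",
    "security_question": "knowledge",
    "hardware_token": "possession",
    "authenticator_app": "possession",
    "fingerprint": "inherence",
}

def satisfies_mfa(factors):
    """True only if the factors draw on two or more distinct categories."""
    categories = {FACTOR_CATEGORIES[f] for f in factors}
    return len(set(factors)) >= 2 and len(categories) >= 2

# A password plus a security question are both "knowledge" - not MFA.
assert not satisfies_mfa(["password", "security_question"])
# A password plus an authenticator app span two categories - compliant.
assert satisfies_mfa(["password", "authenticator_app"])
```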



<p>While NYDFS stated that it is agnostic on specific MFA solutions, it reiterated that covered entities are expected to select MFA solutions and vendors appropriate for their specific risk profile. NYDFS’ “Let’s Talk MFA” presentation emphasized that simply deploying an MFA solution is not sufficient to meet the requirements if the configuration is weak or can be bypassed.</p>



<h2 class="wp-block-heading"><strong>Specific use cases: Single sign-on, cloud platforms and external-facing websites</strong></h2>



<p>NYDFS highlighted a few specific use cases drawn from industry questions it received regarding Part 500’s updated MFA requirements. First, NYDFS confirmed that single sign-on (SSO) solutions are permitted under Part 500, provided that MFA is enforced and cannot be effectively bypassed through SSO.</p>



<p>NYDFS also made explicit that cloud-based email, document hosting and other software as a service (SaaS) platforms are considered part of a covered entity’s “information systems” for purposes of Part 500, even when provided or managed by third parties. The entity must comply with Part 500 with respect to these platforms, and MFA must be enforced consistently on these platforms, including for privileged users. NYDFS stated that covered entities may not rely solely on a provider’s default MFA settings to satisfy Part 500 obligations. Instead, institutions are expected to evaluate whether those controls are compliant with Part 500 and appropriate to the covered entity’s risks, information systems and data.</p>



<p>Lastly, NYDFS addressed external-facing resources, a common question regarding the expansion of Part 500’s requirements. External websites intended solely for public consumption do not require MFA because they do not provide access to nonpublic information (NPI). However, NYDFS cautioned that if an external-facing system hosts NPI or poses a material risk to the covered entity or its customers, MFA to access those pages would be required. In practice, this means customer portals that provide access to NPI or other account information must have compliant MFA.</p>



<h2 class="wp-block-heading"><strong>Privileged access remains a supervisory focus</strong></h2>



<p>NYDFS noted in the webinar that it continues to observe weaknesses where privileged or administrative users are not consistently subject to MFA. Because privileged access is inherently higher risk, NYDFS expects covered entities to address it explicitly in their risk assessments and apply MFA appropriate to that elevated risk. The MFA used for standard access, NYDFS warned, may not be considered compliant for privileged access if privileged access poses significantly more risk to the covered entity’s information systems or NPI.</p>



<h2 class="wp-block-heading"><strong>What NYDFS will look for in examinations</strong></h2>



<p>In the presentation, NYDFS noted that its supervisory exams will focus on:</p>



<ul class="wp-block-list">
<li>Whether MFA is implemented where required.</li>



<li>Whether high-risk systems and users are appropriately protected through the use of MFA.</li>



<li>The configuration of MFA and its effectiveness.</li>



<li>The MFA’s ability to prevent phishing, replay attacks and technical bypasses.</li>



<li>How MFA integrates with the covered entity’s incident detection and response.</li>
</ul>



<p>In short, NYDFS expects MFA to function as a meaningful security control and not a check-the-box exercise.</p>



<h2 class="wp-block-heading"><strong>Practical takeaways</strong></h2>



<p>For covered entities, the “Let’s Talk MFA” presentation reinforces that MFA is now a foundational cybersecurity control under Part 500. Covered entities should ensure that their MFA programs are risk-based, well-documented, consistently enforced (particularly for privileged users and cloud platforms), and supported by strong governance and monitoring.</p>



<p>As NYDFS continues to refine its guidance and enforcement posture, covered entities that can demonstrate thoughtful design and substantive risk analysis will be best positioned in examinations and supervisory inquiries.</p>



<p>Stay tuned for the final installment of our Part 500 refresher series, where we’ll explore how NYDFS has tackled emerging and novel cybersecurity issues.</p>



<h5 class="wp-block-heading">Authors</h5>



<p><a href="https://www.cooley.com/people/mari-dugas">Mari Dugas</a></p>



<p><a href="https://www.cooley.com/people/michael-egan">Mike Egan</a></p>



<p><a href="https://www.cooley.com/people/kate-goodman">Kate Goodman</a></p>



<p><a href="https://www.cooley.com/people/elyse-moyer/in-depth">Elyse Moyer</a></p>



<p><a href="https://www.cooley.com/people/bekah-putz">Bekah Putz</a></p>
<p>The post <a href="https://cdp.cooley.com/part-2-nydfs-sharpens-its-focus-on-multifactor-authentication/">Part 2: NYDFS Sharpens Its Focus on Multifactor Authentication</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4745</post-id>	</item>
		<item>
		<title>NYDFS Refresher Series – Part 1: What Companies Need to Know Ahead of Annual Certifications of Compliance</title>
		<link>https://cdp.cooley.com/nydfs-refresher-series-part-1-what-companies-need-to-know-ahead-of-annual-certifications-of-compliance/</link>
		
		<dc:creator><![CDATA[Jenna Moore]]></dc:creator>
		<pubDate>Wed, 25 Mar 2026 17:25:10 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4736</guid>

					<description><![CDATA[<p>Upcoming compliance certification Every year by April 15, financial entities subject to the New York Department of Financial Services (NYDFS) oversight (covered entities) are required to certify their compliance with the NYDFS’ cybersecurity regulations, 23 NYCRR Part 500 (Part 500). This year’s deadline will be the first time covered entities must certify compliance with all [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/nydfs-refresher-series-part-1-what-companies-need-to-know-ahead-of-annual-certifications-of-compliance/">NYDFS Refresher Series – Part 1: What Companies Need to Know Ahead of Annual Certifications of Compliance</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading"><strong>Upcoming compliance certification</strong></h2>



<p>Every year by April 15, financial entities subject to New York Department of Financial Services (NYDFS) oversight (covered entities) are required to certify their compliance with the NYDFS’ cybersecurity regulations, <a href="https://cdp.cooley.com/new-york-department-of-financial-services-amends-its-cybersecurity-regulations/">23 NYCRR Part 500 (Part 500)</a>. This year’s deadline will be the first time covered entities must certify compliance with all of the amendments to Part 500 that were phased in from November 2023 through November 2025 (Part 500 amendments).</p>



<p>This series will highlight key aspects of Part 500’s amendments, as well as recent NYDFS guidance, and provide insight into how NYDFS may assess compliance with Part 500.</p>



<p>Part 1 addresses asset inventories and risk assessment amendments, Part 2 details updated requirements for multifactor authentication and Part 3 explores the emerging cybersecurity issues that NYDFS has identified as key priority areas.</p>



<h2 class="wp-block-heading"><strong>Certification requirements</strong></h2>



<p>Certifications of compliance are affirmative representations by a covered entity’s chief information security officer (CISO) or senior-most executive responsible for cybersecurity, attesting that the covered entity is in compliance with Part 500 and that the certification has been made upon the certifying individual’s review of the underlying documents and controls. Certifications must be accurate, as making false statements to NYDFS is itself actionable, in addition to any substantive violations of Part 500. Additionally, the certifying individual could be held personally liable for false certifications. NYDFS has made clear through examinations, consent orders and explicit guidance that it expects certifications to be accurate, supportable and grounded in documented controls.</p>



<p>With the Part 500 amendments now in effect, NYDFS provides covered entities with two options: submit a certification of material compliance, or submit an acknowledgment of noncompliance. An acknowledgment of noncompliance must contain:</p>



<ol class="wp-block-list">
<li>An acknowledgment that the covered entity did not materially comply with Part 500.</li>



<li>An identification of all sections of Part 500 that the entity is not in material compliance with.</li>



<li>A description of the nature and extent of noncompliance.</li>



<li>A remediation timeline or confirmation that remediation has been completed for the areas of noncompliance.</li>
</ol>



<h2 class="wp-block-heading"><strong>Part 500.13: Asset inventories</strong></h2>



<p>One of the most significant developments is that the amended Section 500.13, effective November 2025, explicitly requires covered entities to maintain an inventory of all assets, not just those that are material to the covered entity or contain nonpublic information (NPI). This reflects NYDFS’ position that institutions cannot protect systems, devices and data they do not know they have. Numerous other Part 500 requirements rely on functional asset inventories, including risk assessments, access controls, vulnerability management and incident response planning. Deficiencies in asset inventories can cascade into compliance gaps with these provisions of Part 500 as well.</p>



<p>An asset management policy should cover the entire asset life cycle – from onboarding and classification to tracking, support and eventual deprecation. The policy should also document a cadence for reviewing, updating and validating the asset inventory. The asset inventory itself should identify owner, location and recovery time objectives for each asset.</p>
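<p>As a rough illustration of the record-keeping described above, the sketch below models one inventory entry with the attributes named in this section – owner, location and recovery time objective – plus a review date supporting a documented cadence. The field names and values are hypothetical, not drawn from Part 500 or NYDFS guidance.</p>

```python
# Hypothetical asset-inventory record. Field names are illustrative only;
# Part 500 does not prescribe a schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class Asset:
    name: str
    owner: str                          # accountable individual or team
    location: str                       # physical site or cloud region
    recovery_time_objective_hours: int  # target restoration window
    last_validated: date                # when the entry was last reviewed

inventory = [
    Asset("payroll-db", "IT Operations", "us-east-1", 4, date(2026, 3, 1)),
]

# A "living record" rather than a static list: flag entries whose last
# validation is more than a year old relative to a review date.
stale = [a for a in inventory
         if (date(2026, 4, 6) - a.last_validated).days > 365]
assert stale == []
```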



<p>The Part 500 amendments make clear that covered entities cannot treat asset inventories as a static list of systems, devices and data; the inventory is meant to be a living record.</p>



<h2 class="wp-block-heading"><strong>Part 500.9: Risk assessments</strong></h2>



<p>Risk assessments have always been central to Part 500, but the amendments reinforce their role as the driver of the cybersecurity program and the basis on which a program is evaluated. NYDFS now requires covered entities to conduct risk assessments at least annually and whenever material business or technology changes occur, which could include geopolitical events.</p>



<p>This reflects NYDFS’ position that a risk assessment cannot be static, generic or disconnected from operational reality. A risk assessment serves as the evidentiary bridge between hypothetical risk and implemented controls. A covered entity that cannot demonstrate how its cybersecurity measures are appropriate in the context of assessed risks may face questions about the sufficiency of its overall compliance and certification with Part 500.</p>



<h2 class="wp-block-heading"><strong>Looking ahead</strong></h2>



<p>For covered entities, the annual certification should be approached as a governance exercise, not a formality. Individuals responsible for preparing for certifications should take care to review the institution’s compliance posture holistically, building on the asset inventory and risk assessment controls as the key components underpinning compliance.</p>



<p>In our next post, we turn to one of the most heavily scrutinized areas of the amended Part 500: multifactor authentication.</p>



<h4 class="wp-block-heading">Authors</h4>



<p><a href="https://www.cooley.com/people/michelle-rogers">Michelle Rogers</a></p>



<p><a href="https://www.cooley.com/people/michael-egan">Michael Egan</a></p>



<p><a href="https://www.cooley.com/people/kate-goodman">Kate Goodman</a></p>



<p><a href="https://www.cooley.com/people/mari-dugas">Mari Dugas</a></p>
<p>The post <a href="https://cdp.cooley.com/nydfs-refresher-series-part-1-what-companies-need-to-know-ahead-of-annual-certifications-of-compliance/">NYDFS Refresher Series – Part 1: What Companies Need to Know Ahead of Annual Certifications of Compliance</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4736</post-id>	</item>
		<item>
		<title>South Korea’s AI Basic Act: Overview and Key Takeaways</title>
		<link>https://cdp.cooley.com/south-koreas-ai-basic-act-overview-and-key-takeaways/</link>
		
		<dc:creator><![CDATA[Jenna Moore]]></dc:creator>
		<pubDate>Wed, 28 Jan 2026 15:45:28 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4721</guid>

					<description><![CDATA[<p>South Korea’s Act on the Development of Artificial Intelligence and Establishment of Trust (AI Basic Act) took effect on January 22, 2026, joining the European Union AI Act as a comprehensive AI regulatory regime. The AI Basic Act provides high-level requirements for transparency and addressing high-risk AI systems, and confirms its extraterritorial application. It also creates the [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/south-koreas-ai-basic-act-overview-and-key-takeaways/">South Korea’s AI Basic Act: Overview and Key Takeaways</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>South Korea’s <a href="https://www.law.go.kr/LSW/eng/engLsSc.do?menuId=2&amp;query=FRAMEWORK%20ACT%20ON%20THE%20DEVELOPMENT%20OF%20ARTIFICIAL%20INTELLIGENCE%20AND%20THE%20CREATION%20OF%20A%20FOUNDATION%20FOR%20TRUST#liBgcolor0" target="_blank" rel="noreferrer noopener">Act on the Development of Artificial Intelligence and Establishment of Trust</a> (AI Basic Act) took effect on January 22, 2026, joining the European Union AI Act as a comprehensive AI regulatory regime. The AI Basic Act provides high-level requirements for transparency and addressing high-risk AI systems, and confirms its extraterritorial application. It also creates the framework for the development and promulgation of specific requirements via existing and new government organizations. The Ministry of Science and ICT (MSIT) is charged with finalizing the specific enforcement decrees that will provide the technical details for compliance.</p>



<p><a href="https://www.cooley.com/news/insight/2026/2026-01-27-south-koreas-ai-basic-act-overview-and-key-takeaways">Read the full article on Cooley.com</a></p>
<p>The post <a href="https://cdp.cooley.com/south-koreas-ai-basic-act-overview-and-key-takeaways/">South Korea’s AI Basic Act: Overview and Key Takeaways</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4721</post-id>	</item>
		<item>
		<title>EU AI Act: Proposed ‘Digital Omnibus on AI’ Will Impact Businesses’ AI Compliance Roadmaps</title>
		<link>https://cdp.cooley.com/eu-ai-act-proposed-digital-omnibus-on-ai-will-impact-businesses-ai-compliance-roadmaps/</link>
		
		<dc:creator><![CDATA[Jenna Moore]]></dc:creator>
		<pubDate>Mon, 29 Dec 2025 15:56:59 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4700</guid>

					<description><![CDATA[<p>This update covers the European Commission’s proposed “Digital Omnibus on AI”, published 19 November 2025. Part of the European Union’s simplification drive, the proposal aims to streamline the EU Artificial Intelligence (AI) Act’s implementation, ease compliance burdens and adjust compliance deadlines ahead of the AI Act’s full application on 2 August 2026. These changes will [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/eu-ai-act-proposed-digital-omnibus-on-ai-will-impact-businesses-ai-compliance-roadmaps/">EU AI Act: Proposed ‘Digital Omnibus on AI’ Will Impact Businesses’ AI Compliance Roadmaps</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>This update covers the European Commission’s proposed “<a href="https://digital-strategy.ec.europa.eu/en/library/digital-omnibus-ai-regulation-proposal">Digital Omnibus on AI</a>”, published 19 November 2025. Part of the European Union’s simplification drive, the proposal aims to streamline the EU Artificial Intelligence (AI) Act’s implementation, ease compliance burdens and adjust compliance deadlines ahead of the AI Act’s full application on 2 August 2026.</p>

<p>These changes will reshape how organizations manage AI risk, data governance and privacy-by-design obligations.</p>

<p><a href="https://www.cooley.com/news/insight/2025/2025-11-24-eu-ai-act-proposed-digital-omnibus-on-ai-will-impact-businesses-ai-compliance-roadmaps">Read the full article</a> on the Digital Omnibus on AI and explore key changes and simplification measures, including what they mean for businesses’ AI compliance roadmaps.</p>



<p>The post <a href="https://cdp.cooley.com/eu-ai-act-proposed-digital-omnibus-on-ai-will-impact-businesses-ai-compliance-roadmaps/">EU AI Act: Proposed ‘Digital Omnibus on AI’ Will Impact Businesses’ AI Compliance Roadmaps</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4700</post-id>	</item>
		<item>
		<title>China Releases Multiple Key Draft Cyber and Data Security Regulations at Year-End 2025</title>
		<link>https://cdp.cooley.com/china-releases-multiple-key-draft-cyber-and-data-security-regulations-at-year-end-2025/</link>
		
		<dc:creator><![CDATA[Jenna Moore]]></dc:creator>
		<pubDate>Tue, 23 Dec 2025 16:50:28 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4683</guid>

					<description><![CDATA[<p>China is closing out 2025 with significant steps to reinforce its data protection and cybersecurity regime. In the past month, Chinese regulators have unveiled multiple key draft regulations for public comments. These developments underscore China’s efforts to address the increasing data and security risks and the continuous enforcement of its Cybersecurity Law (CSL), Data Security [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/china-releases-multiple-key-draft-cyber-and-data-security-regulations-at-year-end-2025/">China Releases Multiple Key Draft Cyber and Data Security Regulations at Year-End 2025</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>China is closing out 2025 with significant steps to reinforce its data protection and cybersecurity regime. In the past month, Chinese regulators have unveiled multiple key draft regulations for public comments. These developments underscore China’s efforts to address the increasing data and security risks and the continuous enforcement of its Cybersecurity Law (CSL), Data Security Law (DSL) and Personal Information Protection Law (PIPL).</p>



<p>This blog post explores the following three key latest developments and their implications:</p>



<ul class="wp-block-list">
<li><a href="https://www.cac.gov.cn/2025-11/22/c_1765543463511624.htm">Draft Provisions on Personal Information Protection for Large Online Platforms</a> (released for comments on November 22, 2025)</li>



<li><a href="http://www.mps.gov.cn:8080/n2254536/n4904355/c10316016/content.html">Draft Measures for Cyberspace Supervision and Inspection by Public Security Authorities</a> (released for comments on November 29, 2025)</li>



<li><a href="https://www.cac.gov.cn/2025-12/06/c_1766578179367262.htm">Draft Measures for Network Data Security Risk Assessment</a> (released for comments on December 6, 2025)</li>
</ul>



<h4 class="wp-block-heading"><strong>Draft Provisions on Personal Information Protection for Large Online Platforms (LOP Provisions)</strong></h4>



<h5 class="wp-block-heading"><strong>1. Scope of large online platforms (LOPs)</strong></h5>



<p>The LOP Provisions apply to LOPs that are established and operated in China. The Cyberspace Administration of China (CAC), the Ministry of Public Security (MPS) and other competent authorities will designate a platform as a LOP by considering whether such platform:</p>



<ul class="wp-block-list">
<li>Has more than 50 million registered users or more than 10 million monthly active users.</li>



<li>Provides critical network services or operates across multiple types of businesses.</li>



<li>Possesses or processes data that, if leaked, tampered with or damaged, would have a significant impact on national security, economic operations or public welfare.</li>



<li>Falls into the scope of other circumstances as determined by the CAC and the MPS.</li>
</ul>



<p>Designated LOPs will be listed in a catalogue maintained by the CAC, the MPS and other competent authorities.</p>



<h5 class="wp-block-heading">2. Appointment of the “person responsible for personal information protection” (DPO)</h5>



<p>A LOP must appoint a DPO and disclose their contact information. The DPO must be a member of the management level of the LOP, hold the nationality of the People’s Republic of China (PRC) and have no overseas permanent residence or long-term residence permit. In addition, the LOP Provisions also require the DPO to possess professional knowledge in personal information protection and have more than five years of relevant experience.</p>



<p>The DPO’s duties include, without limitation, guiding the LOP’s personal information processing compliance efforts, participating in decision-making related to personal information processing matters and exercising veto rights over such matters, supervising the processing activities and security measures adopted, and leading the development of rules for minors’ privacy protection. Note that the LOP Provisions empower the DPO to report personal information protection matters related to the LOP directly to the CAC and other competent authorities.</p>



<h5 class="wp-block-heading">3. Data localization and cross-border data transfer requirements</h5>



<p>LOPs are required to store personal information collected and generated from their operations in China locally. Cross-border transfers are allowed only if such transfers are necessary and will be conducted by LOPs in compliance with data transfer requirements under Chinese laws. In addition, the LOP Provisions impose specific requirements for data centers in which LOPs store data, including:</p>



<ul class="wp-block-list">
<li>The data center must be located in China.</li>



<li>The person in charge of the data center must hold PRC nationality and have no overseas permanent residence or long-term residence permit.</li>



<li>The data center’s security capabilities must comply with the requirements under applicable national standards in China.</li>
</ul>



<p>LOPs are also obligated to file certain information about the data centers they use with the CAC and other competent authorities, such as the data centers’ management team and organizational structure, internal personal information protection policies, security measures adopted, and contracts signed with the data centers.</p>



<h4 class="wp-block-heading">Draft Measures for Cyberspace Supervision and Inspection by Public Security Authorities (MPS Supervision and Inspection Measures)</h4>



<p>These new draft MPS Supervision and Inspection Measures establish procedural rules and inspection criteria for public security authorities – i.e., China’s police force, the public security bureaus (PSBs) – and are intended to replace the existing Regulations on the Internet Security Supervision and Inspection by Public Security Authorities released in 2018.</p>



<h5 class="wp-block-heading">1. Scope and applicability</h5>



<p>The draft MPS Supervision and Inspection Measures permit PSBs to conduct inspections on the following types of entities:</p>



<ul class="wp-block-list">
<li>Internet service providers offering services, such as internet access, data centers, content delivery services, domain name services and information services.</li>



<li>“Public internet access service providers” (e.g., hotels, hospitals or other public places that provide publicly available Wi-Fi connection).</li>



<li>“Network operators” (i.e., entities that own or use networks to operate or provide services), along with their developers and maintenance providers.</li>



<li>Critical information infrastructure operators, along with their developers and maintenance providers.</li>



<li>Providers of network products and services.</li>



<li>Data handlers and personal information handlers (i.e., entities that independently determine data/personal information processing purposes and means).</li>
</ul>



<h5 class="wp-block-heading">2. Inspection power of PSBs</h5>



<p>Under the draft MPS Supervision and Inspection Measures, PSBs have the power to conduct both online and onsite inspections to assess an entity’s posture in cybersecurity, “information security” (undefined under these measures but likely referring to online content safety) and data security through measures such as “network information patrols,” “information review capability tests” (undefined under these measures but likely referring to content moderation capability), and vulnerability scanning. PSBs must focus their inspections on assessing whether the inspected entity has complied with certain key compliance requirements, including without limitation:</p>



<ul class="wp-block-list">
<li>Developing and implementing cybersecurity, “information security” and data security management programs and operating procedures.</li>



<li>Recording and retaining required user registration information and internet logs.</li>



<li>Compliance with the obligations under China’s cybersecurity multilevel protection scheme (MLPS).</li>



<li>Adopting technical measures to prevent viruses, cyberattacks and network intrusions.</li>



<li>Providing technical support and assistance to PSBs for safeguarding national security, preventing and investigating terrorist activities, and investigating crimes.</li>
</ul>



<h4 class="wp-block-heading">Draft Measures for Network Data Security Risk Assessment (Risk Assessment Measures)</h4>



<p>The Risk Assessment Measures define network data security risk assessment as “the identification, analysis and assessment of the risk associated with network data<a href="#_edn1" id="_ednref1">[i]</a> and network data processing activities.”</p>



<p>Network data handlers<a href="#_edn2" id="_ednref2">[ii]</a> processing “important data”<a href="#_edn3" id="_ednref3">[iii]</a> (important data handlers) are required to proactively conduct the risk assessment on an annual basis. Other data handlers that do not process “important data” are encouraged to conduct the risk assessment at least every three years. Risk assessments can be conducted by network data handlers themselves or third-party institutions engaged by them. In addition to the risk assessment proactively conducted by network data handlers, the CAC and other competent authorities may also require network data handlers to engage a third-party institution to conduct risk assessments under the following circumstances:</p>



<ul class="wp-block-list">
<li>Where network data processing activities pose significant security risks.</li>



<li>Where a network data security incident occurs, resulting in the leakage or theft of “important data” or large-scale personal information.</li>



<li>Where network data processing activities may endanger national security or public interests.</li>



<li>Other circumstances determined by the CAC or other competent authorities.</li>
</ul>



<p>When conducting an annual risk assessment, important data handlers shall prepare an assessment report in accordance with the template attached to the Risk Assessment Measures and file such an assessment report with the competent authority (or the CAC, if the competent authority for an important data handler is unclear). Competent authorities and the CAC at provincial level or above may conduct random inspections and verifications of the authenticity and accuracy of the assessment reports, and network data handlers shall provide assistance.</p>



<h4 class="wp-block-heading">Next Steps</h4>



<p>Violations of these three regulations will be subject to applicable penalties imposed under the CSL, DSL and the PIPL. Companies providing services to Chinese customers and users should assess the applicability of these regulations and closely monitor their developments.</p>



<p>A Chinese translation of this post is available <a href="https://cdp.cooley.com/wp-content/uploads/2025/12/2025-12-23-china-releases-multiple-key-draft-cyber-and-data-security-regulations-at-year-end-2025-chinese-2.pdf" target="_blank" rel="noreferrer noopener">here</a>.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Authors</strong></p>



<p><a href="https://www.cooley.com/people/will-pao"><strong>Will Pao</strong></a>, Partner, Los Angeles</p>



<p><a href="https://www.cooley.com/people/zhijing-yu"><strong>Zhijing Yu</strong></a>, Associate, Singapore</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>Cooley LLP is not licensed to practice the law of the People’s Republic of China (PRC), and nothing herein constitutes an opinion or legal advice by Cooley with respect to PRC laws or otherwise. This blog may not be relied upon, construed as or used as an opinion, interpretation of or legal advice in any respect relating to or arising out of PRC laws or otherwise. This blog, and our review of the information referenced in this blog, is based solely upon our general familiarity with matters of the type referenced in this blog and the consultation with PRC counsel with respect to certain matters of PRC law or practice, as referenced in the blog, provided that notwithstanding such consultation, no opinions or legal advice with respect to PRC law are made herein. Any analysis, conclusion, advice or opinion with regard to PRC laws, or otherwise with regard to any of the matters referenced in this blog, must be obtained from PRC local counsel.</em><a id="_msocom_1"></a></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><a href="#_ednref1" id="_edn1">[i]</a> “Network data” refers to electronic data processed and generated through networks.</p>



<p><a href="#_ednref2" id="_edn2">[ii]</a> “Network data handlers” refers to individuals or organizations that independently determine data processing purposes and means.</p>



<p><a href="#_ednref3" id="_edn3">[iii]</a> “Important data” refers to data in specific fields, specific groups or specific regions, or data that has reached a certain level of accuracy and scale, which, if tampered with, damaged, leaked or illegally obtained or used, may directly endanger national security, economic operations, social stability, public health and safety.</p>
<p>The post <a href="https://cdp.cooley.com/china-releases-multiple-key-draft-cyber-and-data-security-regulations-at-year-end-2025/">China Releases Multiple Key Draft Cyber and Data Security Regulations at Year-End 2025</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4683</post-id>	</item>
		<item>
		<title>ICO Updates Guidance on Encryption</title>
		<link>https://cdp.cooley.com/ico-updates-guidance-on-encryption/</link>
		
		<dc:creator><![CDATA[Georgia Grisaffe]]></dc:creator>
		<pubDate>Mon, 03 Nov 2025 09:11:37 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4670</guid>

					<description><![CDATA[<p>The UK Information Commissioner’s Office (ICO) has released updated guidance on encryption following a recent consultation.</p>
<p>The revised guidance provides a framework outlining when and how organisations should consider implementing encryption to protect personal data. </p>
<p>The post <a href="https://cdp.cooley.com/ico-updates-guidance-on-encryption/">ICO Updates Guidance on Encryption</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><strong>What happened?</strong></p>



<p>The UK Information Commissioner’s Office (ICO) has released&nbsp;<a href="https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/security/encryption/encryption-and-data-protection/">updated guidance on encryption</a> following a recent consultation.</p>



<p>The revised guidance provides a framework outlining when and how organisations should consider implementing encryption to protect personal data. The guidance does not cover end-to-end encryption, privacy-enhancing technologies or the potential implications of quantum computing.&nbsp;</p>



<span id="more-4670"></span>



<p>Although the UK General Data Protection Regulation (GDPR) does not specifically require companies to use encryption or encrypt all personal data they hold, the ICO strongly recommends implementing encryption as a robust technical measure to support the secure processing of data.</p>



<p>The ICO’s updated encryption guidance adopts its “must, should, could” framework: “<strong>must</strong>” denotes legal obligations, “<strong>should</strong>” reflects strong expectations for compliance and “<strong>could</strong>” offers optional best practices. This article focuses on the non-negotiable <strong>musts</strong>, because understanding and implementing these legal requirements is essential for organisations aiming to avoid regulatory risk.</p>



<p><strong>What must companies do?</strong></p>



<p>Although encryption is not mandatory, the ICO advises that it should be widely used – even in lower-risk situations – alongside other appropriate measures. Encryption is now well established, widely available and low cost, making it an appropriate and practical measure to support organisations’ compliance with data protection legislation.</p>



<p>However, there are a number of non-negotiable requirements for using encryption tools under the new guidance. Organisations&nbsp;must:</p>



<ul class="wp-block-list">
<li>At a general level, put in place appropriate technical and organisational measures to uphold data protection principles and integrate necessary safeguards into organisations’ processing activities. This includes the use of any encryption tools, and measures must be considered both at the design phase and throughout the life cycle of the processing.</li>



<li>Consider the state of the art of technology and the cost of implementing a given measure. This is required when assessing whether a technical or organisational measure is appropriate, and encryption implicitly falls within its scope. As technology evolves, so must organisations’ encryption standards.</li>



<li>Consider the necessity of encryption at the design phase of any processing activity.</li>



<li>Avoid the use of SSL. The guidance notes SSL’s known vulnerabilities and its potential to compromise the security of personal data. Using SSL may result in noncompliance with UK GDPR security obligations, and it must not be used under any circumstances, including in public-facing HTTPS implementations.</li>



<li>Ensure compliance with legal obligations when processing encrypted data. This includes setting an appropriate review period for encryption use and assessing whether a personal data breach involving encrypted data must be reported to the ICO.</li>



<li>Use in-transit encryption for your online applications (e.g. TLS) to prevent unauthorised access to data if it is intercepted during transmission.</li>



<li>Implement robust user authentication mechanisms for accessing encrypted personal data.</li>



<li>Ensure technical measures are in place to restore availability and access to encrypted personal data promptly in the event of an incident.</li>



<li>When determining encryption use and backup retention periods, consider the right to erasure under Article 17 of the UK GDPR and how it may apply.</li>
</ul>
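<p>By way of illustration only (this sketch is ours, not part of the ICO guidance, and the details are assumptions): the “avoid SSL, use TLS in transit” expectations above can be approximated in code by refusing legacy protocol versions outright rather than relying on server configuration alone. In Python’s standard library, for example:</p>

```python
import ssl

# Illustrative only (our assumption, not language from the ICO guidance):
# build a client-side TLS context that refuses legacy SSL/early-TLS handshakes.
def strict_tls_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()  # certificate verification on by default
    # SSLv2/SSLv3 are already disabled in modern Python; additionally require
    # TLS 1.2 or newer so deprecated TLS 1.0/1.1 handshakes are also rejected.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

ctx = strict_tls_context()
```

<p>Setting a protocol floor of TLS 1.2 means a connection to an endpoint offering only SSL or early TLS fails at the handshake, which is the behaviour the “must not use SSL” requirement points towards.</p>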



<p><strong>Next steps</strong></p>



<p>In light of the guidance, companies should review their encryption practices and broader data security policies to ensure alignment with UK data protection law. For support with auditing your encryption measures, drafting a tailored encryption policy or any wider queries around compliance with UK data protection legislation, please contact the Cooley team below.</p>



<p><strong>Authors</strong></p>



<p><a href="https://www.cooley.com/people/guadalupe-sampedro">Guadalupe Sampedro</a>, Partner, London</p>



<p><a href="https://www.cooley.com/people/daniel-millard">Dan Millard</a>, Associate, London</p>
<p>The post <a href="https://cdp.cooley.com/ico-updates-guidance-on-encryption/">ICO Updates Guidance on Encryption</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4670</post-id>	</item>
		<item>
		<title>English Court of Appeal Rules on Compensation for Data Breaches</title>
		<link>https://cdp.cooley.com/english-court-of-appeal-rules-on-compensation-for-data-breaches/</link>
		
		<dc:creator><![CDATA[Georgia Grisaffe]]></dc:creator>
		<pubDate>Thu, 04 Sep 2025 11:03:12 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4660</guid>

					<description><![CDATA[<p>The English Court of Appeal has handed down an important judgment in Farley v. Paymaster (Equiniti)[1] on when compensation may be claimed for nonmaterial damage (such as distress or anxiety) arising out of breaches of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA).</p>
<p>The post <a href="https://cdp.cooley.com/english-court-of-appeal-rules-on-compensation-for-data-breaches/">English Court of Appeal Rules on Compensation for Data Breaches</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The English Court of Appeal has handed down an important judgment in <em>Farley v. Paymaster</em> (Equiniti) <a id="_ftnref1" href="#_ftn1">[1]</a> on when compensation may be claimed for nonmaterial damage (such as distress or anxiety) arising out of breaches of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA).</p>



<p>The case arose from misaddressed annual pension benefit statements sent to current and former Sussex police officers. The High Court had previously struck out the claims on the basis that there was no evidence that the statements were ever opened or read by third parties. The Court of Appeal confirmed both that disclosure was not essential for a GDPR infringement, and that claimants could recover compensation for fear of the consequences of an infringement if that fear was objectively well-founded, rather than hypothetical or speculative.</p>



<span id="more-4660"></span>



<p><strong>Note:</strong> The breach occurred in 2019, before the end of the Brexit transition period (31 December 2020). At that time, the European Union GDPR applied directly in the UK, so claims were assessed under the EU GDPR rather than the UK GDPR. However, the Court of Appeal noted that there are no material differences between the two regimes for these purposes<em>.</em></p>



<p style="font-size:19px"><strong>Case background</strong></p>



<p>In 2019, Equiniti, acting as administrator of the Sussex Police pension scheme, posted pension statements in window envelopes to more than 750 out-of-date residential addresses. The statements contained personal details, including dates of birth, national insurance numbers and information on salaries and accrued benefits. Sussex Police had provided Equiniti with up-to-date addresses, which were uploaded to Equiniti’s database, but when the statements were produced, Equiniti’s system used the out-of-date addresses in error.</p>



<p>The Information Commissioner’s Office (ICO) was notified and concluded that it was unlikely individuals would suffer significant consequences. It took no enforcement action. A total of 474 officers brought claims, each seeking £1,250. They alleged:</p>



<ol style="list-style-type:lower-roman" class="wp-block-list">
<li>Breaches of statutory duties under the GDPR/DPA, focusing on data minimisation, accuracy, fairness, integrity and confidentiality (Article 5) and appropriate technical/organisational measures (Articles 24, 25 and 32).</li>



<li>Misuse of private information, centred on “anxiety, alarm, distress and embarrassment” amounting to nonmaterial damage, with some claimants also alleging aggravation of preexisting medical conditions.</li>
</ol>



<p>At first instance, the High Court struck out most claims on the basis that, unless a claimant could show that the statement was opened/read by a third party, there was no viable case, as there was no “processing” under the GDPR.</p>



<p style="font-size:19px"><strong>Court of Appeal decision</strong></p>



<p><strong>GDPR claim – Processing without disclosure</strong></p>



<p>The Court of Appeal held that the judge was wrong to require the statements to have been opened/read by a third party. Mailing statements to the wrong addresses was itself “processing” under the GDPR, which covers any operation on personal data, not just disclosure. Equiniti’s database handling, printing and posting all fell within the definition of “processing”.</p>



<p><strong>Compensation principles</strong></p>



<ul class="wp-block-list">
<li><strong>No threshold of seriousness</strong>. The Court of Appeal confirmed that there is no “de minimis” threshold for compensation under Article 82 of the GDPR. Following EU case law on this topic (e.g., Austrian Post), compensation cannot be denied simply because harm is modest.</li>



<li><strong>Distress not the only label</strong>. Nonmaterial damage is broader than “distress” alone. While Section 168 DPA makes clear that “non-material damage” under Article 82 GDPR includes distress, this is an umbrella term for various forms of emotional harm, including those listed in Recital 85 GDPR. Claims framed as “stress” or “anxiety” are not automatically out of scope.</li>



<li><strong>Fear of misuse must be “well-founded”</strong>. Claims based on fear of identity theft or misuse can succeed, but only if fears are objectively reasonable in the circumstances. Purely speculative or hypothetical risks will not qualify.</li>



<li><strong>Psychiatric injury</strong>. Where well-founded fears lead to recognisable psychiatric conditions, compensation is also recoverable in principle.</li>
</ul>



<p style="font-size:19px"><strong>What this means for businesses</strong></p>



<ul class="wp-block-list">
<li>Misaddressing or misdirecting personal data is still “processing” under the GDPR and may be an infringement even if nobody opens or reads the communication.</li>



<li>Claims for fear/anxiety can proceed if objectively reasonable, with the “well-founded fear” test as the filter.</li>



<li>Organisations cannot argue that a breach is “too minor” to generate liability under the GDPR.</li>



<li>Where anxiety escalates into psychiatric injury, compensation may be recoverable (subject to the “well-founded fear” test).</li>
</ul>



<p style="font-size:19px"><strong>Notification and litigation risk</strong></p>



<p>A paradox highlighted by this case is that breach notification itself can create liabilities and generate claims. Informing individuals of a breach may give rise to anxiety, distress or other nonmaterial damage based on well-founded fears. In <em>Farley</em>, many officers said the notification letters triggered their concerns about identity theft or misuse.</p>



<p style="font-size:19px"><strong>Bottom line</strong></p>



<p>The Court of Appeal did not decide whether these claims were successful; instead, it remitted them to the High Court for a detailed review. Some may ultimately fall away, and even successful claims are likely to result in modest awards.</p>



<p>However, <em>Farley</em> confirms that organisations may face litigation risk for data breaches even where disclosure never occurs and the alleged harm is modest. Businesses should maintain robust accuracy and security controls, consider their communications carefully when breaches arise and be prepared to defend claims based on well-founded fears.</p>



<p><a id="_ftn1" href="#_ftnref1">[1]</a> [2025] EWCA Civ 1117.</p>



<p><strong>Authors</strong></p>



<p><a href="https://www.cooley.com/people/ann-bevitt">Ann Bevitt</a>, Partner, London</p>



<p><a href="https://www.cooley.com/people/morgan-mccormack">Morgan McCormack</a>, Associate, London</p>



<p>The post <a href="https://cdp.cooley.com/english-court-of-appeal-rules-on-compensation-for-data-breaches/">English Court of Appeal Rules on Compensation for Data Breaches</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4660</post-id>	</item>
		<item>
		<title>What the UK’s New Data (Use and Access) Act Means for Your Business</title>
		<link>https://cdp.cooley.com/what-the-uks-new-data-use-and-access-act-means-for-your-business/</link>
		
		<dc:creator><![CDATA[Paula Witt]]></dc:creator>
		<pubDate>Thu, 03 Jul 2025 21:28:38 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4630</guid>

					<description><![CDATA[<p>Event summary The UK’s Data (Use and Access) Act 2025 has now received royal assent. This landmark legislation introduces targeted updates to the UK’s data protection framework, impacting everything from automated decision-making and scientific research to marketing practices and cookie compliance. Please join our partners for a concise 30-minute webinar as they highlight the keys [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/what-the-uks-new-data-use-and-access-act-means-for-your-business/">What the UK’s New Data (Use and Access) Act Means for Your Business</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">Event summary</h2>



<div class="wp-block-group"><div class="wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained">
<div class="wp-block-group"><div class="wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained">
<div class="wp-block-group has-medium-font-size" style="font-style:normal;font-weight:400"><div class="wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained">
<p style="font-size:17px;font-style:normal;font-weight:400">The UK’s Data (Use and Access) Act 2025 has now received royal assent. This landmark legislation introduces targeted updates to the UK’s data protection framework, impacting everything from automated decision-making and scientific research to marketing practices and cookie compliance.</p>



<p style="font-size:17px;font-style:normal;font-weight:400">Please join our partners for a concise 30-minute webinar as they highlight the key issues for businesses operating in the UK, along with practical steps to prepare for this new legislation.</p>



<p style="font-size:17px;font-style:normal;font-weight:400"><strong>Thursday, 10 July 2025</strong></p>



<p style="font-size:17px;font-style:normal;font-weight:400">8:30 am PDT // 9:30 am MDT // 10:30 am CDT // 11:30 am EDT // 4:30 pm BST // 5:30 pm CEST</p>



<p style="font-size:17px;font-style:normal;font-weight:400">For more information, please email <a href="mailto:ggrisaffe@cooley.com">Georgia-Rose Grisaffe</a>.</p>



<p><strong>Our capabilities</strong></p>



<ul style="font-size:17px" class="wp-block-list">
<li style="font-size:17px;font-style:normal;font-weight:400">Learn about Cooley’s <a href="https://www.cooley.com/services/practice/cyber-data-privacy" target="_blank" rel="noreferrer noopener">cyber/data/privacy team</a></li>
</ul>



<p><strong>Our updates and insights</strong></p>



<ul class="wp-block-list">
<li style="font-size:17px;font-style:normal;font-weight:400"><a href="https://uklitigation.cooley.com/" target="_blank" rel="noreferrer noopener">On the Record</a> – Key insights on disputes and the issues that drive them</li>



<li style="font-size:17px;font-style:normal;font-weight:400"><a href="https://cdp.cooley.com/" target="_blank" rel="noreferrer noopener">Cyber/data/privacy insights</a> – Legal insight for market innovators</li>
</ul>



<p class="has-medium-font-size"><strong><a href="https://i.cooley.com/l/708103/2025-06-30/28nxw4" target="_blank" rel="noreferrer noopener">Details and registration</a></strong></p>
</div></div>
</div></div>
</div></div>



<p>The post <a href="https://cdp.cooley.com/what-the-uks-new-data-use-and-access-act-means-for-your-business/">What the UK’s New Data (Use and Access) Act Means for Your Business</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4630</post-id>	</item>
		<item>
		<title>Comparing New Neural Data Privacy Laws in 4 US States</title>
		<link>https://cdp.cooley.com/comparing-new-neural-data-privacy-laws-in-4-us-states/</link>
		
		<dc:creator><![CDATA[Paula Witt]]></dc:creator>
		<pubDate>Tue, 01 Jul 2025 16:50:18 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4617</guid>

					<description><![CDATA[<p>Cooley partner Kristen Mathews&#8217; Law360 article argues that protecting neural privacy is essential – for both businesses and the human mind. Examining the evolving legal landscape surrounding neural data privacy in the United States, Mathews highlights recent legislation in Colorado, California, Montana and Connecticut regulating the handling of neural data as sensitive personal information. She [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/comparing-new-neural-data-privacy-laws-in-4-us-states/">Comparing New Neural Data Privacy Laws in 4 US States</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Cooley partner <a href="https://www.cooley.com/people/kristen-mathews">Kristen Mathews</a>&#8217; <a href="https://cdp.cooley.com/wp-content/uploads/2025/06/Law360-Comparing-New-Neural-Data-Privacy-Laws-In-4-States.pdf">Law360 article</a> argues that protecting neural privacy is essential – for both businesses and the human mind. Examining the evolving legal landscape surrounding neural data privacy in the United States, Mathews highlights recent legislation in Colorado, California, Montana and Connecticut regulating the handling of neural data as sensitive personal information. She discusses the unique privacy concerns associated with neural data, the varying consent models and transparency requirements across the states that have enacted regulatory statutes, and the operational implications for businesses. She also underscores the importance of neurotechnology companies adopting robust self-regulatory practices to stay ahead of regulations while continuing to innovate.</p>



<p>The post <a href="https://cdp.cooley.com/comparing-new-neural-data-privacy-laws-in-4-us-states/">Comparing New Neural Data Privacy Laws in 4 US States</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4617</post-id>	</item>
		<item>
		<title>The DOJ&#8217;s Data Security Program &#8211; Understanding and Complying with the New Bulk Data Transfer Rule</title>
		<link>https://cdp.cooley.com/understanding-and-complying-with-the-dojs-bulk-data-rule/</link>
		
		<dc:creator><![CDATA[Jenna Moore]]></dc:creator>
		<pubDate>Mon, 23 Jun 2025 19:16:39 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4580</guid>

					<description><![CDATA[<p>This post is one in a series where we discuss the US Department of Justice’s (DOJ’s) data security program, commonly known as the bulk data transfer rule, which prohibits individuals or entities from certain foreign countries, including China, from accessing certain types of sensitive data, and imposes onerous privacy and cybersecurity obligations for accessing other [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/understanding-and-complying-with-the-dojs-bulk-data-rule/">The DOJ&#8217;s Data Security Program &#8211; Understanding and Complying with the New Bulk Data Transfer Rule</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>This post is one in a series where we discuss the US Department of Justice’s (DOJ’s) <a href="https://www.govinfo.gov/content/pkg/FR-2025-01-08/pdf/2024-31486.pdf">data security program</a>, commonly known as the bulk data transfer rule, which prohibits individuals or entities from certain foreign countries, including China, from accessing certain types of sensitive data, and imposes onerous privacy and cybersecurity obligations for accessing other types of data. In April 2025, we <a href="https://www.cooley.com/news/insight/2025/2025-04-02-the-dojs-bulk-sensitive-personal-data-rules-imminent-relevance-to-life-sciences-companies" target="_blank" rel="noreferrer noopener">discussed the rule from a life sciences perspective</a>. In light of the DOJ’s recent guidance in its <a href="https://www.justice.gov/opa/media/1396351/dl" target="_blank" rel="noreferrer noopener">Frequently Asked Questions</a> and <a href="https://www.justice.gov/opa/media/1396356/dl" target="_blank" rel="noreferrer noopener">Compliance Guide</a>, this post addresses the rule more broadly and:</p>



<ul class="wp-block-list">
<li>Describes what constitutes a “covered data transaction” under the rule.</li>



<li>Summarizes the two kinds of covered data transactions subject to the rule – those that are prohibited versus merely restricted.</li>



<li>Describes the rule’s privacy and cybersecurity requirements for restricted transactions, which are numerous and challenging to implement.</li>



<li>Includes a checklist of next steps for companies to assess their exposure to the rule and resulting compliance obligations.</li>
</ul>



<p>For quick reference, <a href="https://cdp.cooley.com/wp-content/uploads/2025/06/cdp-blogpost-chart-2025-06-v2es.pdf" target="_blank" rel="noreferrer noopener">this flowchart</a> can help assess if a transaction might be subject to the rule.</p>



<p>The rule took effect on April 8, 2025, but enforcement was temporarily deprioritized in a <a href="https://www.justice.gov/opa/pr/justice-department-implements-critical-national-security-program-protect-americans-sensitive" target="_blank" rel="noreferrer noopener">reprieve by the DOJ</a> that expires <strong>July 8, 2025</strong>. This deadline is rapidly approaching, and companies should promptly assess whether they have any current or anticipated covered data transactions. If so, they should not delay implementing the rule’s privacy and cybersecurity requirements, which will take time, as well as both legal and technical resources. The consequences of violating the rule can be severe: Violations carry civil penalties (fines of the greater of $368,136 or twice the value of the transaction) and can also incur criminal penalties (fines of up to $1,000,000 and up to 20 years in prison).</p>



<h3 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-35e2391cf999a516a7e11589d2da35fb">What transactions are covered by the DOJ’s bulk data transfer rule? </h3>



<p>The rule applies to “covered data transactions,” which may be either prohibited or restricted depending on the type and quantity of data and processing involved.&nbsp;</p>



<p>A covered data transaction is a transaction that involves <strong>access</strong> by a <strong>country of concern</strong> or <strong>covered person</strong> to <strong>bulk US sensitive personal data</strong> or <strong>government-related data</strong>, and that involves a <strong>data brokerage</strong>, <strong>investment agreement</strong>, <strong>employment agreement</strong> or <strong>vendor agreement</strong>. <strong>Exempt transactions</strong> avoid much of the rule. We discuss the bolded terms below.</p>



<h5 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-2f2faac61492570a731beb30ebb0e9d0">Access</h5>



<p>Access is defined broadly under the rule to mean any logical or physical access, without regard to whether security measures, such as access controls, actually deny access. For example, a person located in China who has access to a database containing bulk US sensitive personal data is still considered to have “access” to the data for purposes of determining whether the rule applies, even if access controls prevent that person from actually reaching the data.</p>



<h5 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-98cfe36169c9a7889ea9826a2d16e768">Country of concern or covered person</h5>



<ul class="wp-block-list">
<li>Countries of concern include China (including Hong Kong and Macau), Cuba, Iran, North Korea, Russia and Venezuela.&nbsp;</li>



<li>Covered persons include entities and individuals in four categories:
<ul class="wp-block-list">
<li>Foreign individuals primarily resident in countries of concern.</li>



<li>Foreign entities that are 50% or more owned (directly or indirectly) by a country of concern, organized under the laws of a country of concern or have their principal place of business in a country of concern (including, potentially, a foreign subsidiary of a US company).</li>



<li>Foreign entities that are 50% or more owned (directly or indirectly) by a covered person.</li>



<li>Foreign employees or contractors of countries of concern, or of entities that are covered persons.</li>
</ul>
</li>
</ul>
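<p>The four categories above ultimately require fact-specific legal analysis, but the mechanical parts of the entity test (jurisdiction, principal place of business and the 50% direct-ownership prong) can be sketched as a rough screen. Everything below – the country codes, the Entity fields, the function name – is our hypothetical shorthand, not the rule’s text:</p>

```python
from dataclasses import dataclass, field

# Hypothetical shorthand for the six countries of concern (HK/Macau fall under CN)
COUNTRIES_OF_CONCERN = {"CN", "CU", "IR", "KP", "RU", "VE"}

@dataclass
class Entity:
    jurisdiction: str                  # where the entity is organized
    principal_place_of_business: str
    ownership: dict = field(default_factory=dict)  # owner label -> fraction held

def is_covered_entity(e: Entity) -> bool:
    """Rough screen for the entity prongs of the 'covered person' definition.

    Deliberately ignores indirect ownership chains and the
    employee/contractor prong, which need more than arithmetic.
    """
    if e.jurisdiction in COUNTRIES_OF_CONCERN:
        return True
    if e.principal_place_of_business in COUNTRIES_OF_CONCERN:
        return True
    # 50% or more owned, directly, by countries of concern
    concern_share = sum(frac for owner, frac in e.ownership.items()
                        if owner in COUNTRIES_OF_CONCERN)
    return concern_share >= 0.5
```

<p>Note that the 50% prong is “50% or more”, so an exactly half-owned entity is captured – and, in the real rule, indirect ownership and covered-person owners count too, which this simplification does not model.</p>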



<h5 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-ce514f72ff2832100ee6b1c33f663b6f">Bulk US sensitive personal data and government-related data</h5>



<ul class="wp-block-list">
<li>Bulk US sensitive personal data is sensitive data exceeding certain volume thresholds, which vary by data type, regardless of whether the data is anonymized, pseudonymized, de-identified or encrypted:
<ul class="wp-block-list">
<li><span style="color: initial;">Human genomic data on more than 100 US persons</span></li>



<li><span style="color: initial;">Other human ‘omic data on more than 1,000 US persons. </span></li>



<li><span style="color: initial;">Biometric identifiers on more than 1,000 US persons. </span></li>



<li><span style="color: initial;">Precise geolocation data on more than 1,000 US devices.  </span></li>



<li><span style="color: initial;">Personal health data on more than 10,000 US persons. </span></li>



<li><span style="color: initial;">Personal finance data on more than 10,000 US persons.</span></li>



<li><span style="color: initial;">Covered personal identifiers on more than 100,000 US persons.&nbsp;</span></li>
</ul>
</li>



<li>Government-related data includes certain types of data related to sensitive locations (such as those relating to national security or intelligence), military installations, or current or former government employees or contractors.</li>
</ul>
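<p>The bulk thresholds above amount to a simple lookup table. As an illustration only – the category keys and counts below are shorthand, not defined terms from the rule, and this is not legal advice – the threshold check can be sketched as:</p>

```python
# Illustrative sketch of the bulk-data volume thresholds described above.
# Category keys are hypothetical shorthand, not defined terms from the rule.
BULK_THRESHOLDS = {
    "human_genomic": 100,           # US persons
    "other_omic": 1_000,            # US persons
    "biometric_identifiers": 1_000, # US persons
    "precise_geolocation": 1_000,   # US devices
    "personal_health": 10_000,      # US persons
    "personal_finance": 10_000,     # US persons
    "covered_identifiers": 100_000, # US persons
}

def meets_bulk_threshold(category: str, count: int) -> bool:
    """True if the data volume exceeds the bulk threshold for its category.

    Note: the rule applies regardless of anonymization, pseudonymization,
    de-identification or encryption, so no adjustment is made for those.
    """
    return count > BULK_THRESHOLDS[category]

print(meets_bulk_threshold("human_genomic", 150))  # True: more than 100 US persons
print(meets_bulk_threshold("personal_health", 10_000))  # False: not *more than* 10,000
```

Note that the thresholds are "more than" limits, so a dataset sitting exactly at a threshold would not itself be bulk data under this sketch.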



<h5 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-0082e7786e8f66a0f5e531d6a9b70bde">Investment agreement, employment agreement, vendor agreement and data brokerage</h5>



<ul class="wp-block-list">
<li>Investment agreements are agreements in which a person or entity obtains direct or indirect ownership rights in US real estate or a US legal entity, with some exceptions for passive investments.</li>



<li>Employment agreements involve typical workforce arrangements.</li>



<li>Vendor agreements involve arrangements where a person or entity provides goods or services to another for payment or other consideration.</li>



<li>Data brokerage means selling (or licensing access to) data, where the recipient of the data did not collect the data directly from the individuals associated with the data.&nbsp;</li>
</ul>



<h3 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-1feac75d101ff736bec31a85b7ba2811">Do any exemptions apply to DOJ&#8217;s bulk data transfer rule?</h3>



<p>Even if a transaction is a covered data transaction as described above, it may nevertheless be exempt from certain obligations in the rule if it falls within one or more exemptions. Exemptions include:</p>



<ul class="wp-block-list">
<li><strong>Personal communications</strong>. Data transactions such as postal and telephonic communications, provided the communication does not transfer anything of value.</li>



<li><strong>Informational materials</strong>. Data transactions that involve importing or exporting information or informational materials to or from any country.</li>



<li><strong>Travel</strong>. Data transactions that are ordinarily incident to travel between countries for personal purposes.</li>



<li><strong>Financial services</strong>. Data transactions ordinarily incident to financial services, such as those provided by financial institutions (e.g., banking, capital markets and financial-insurance services), the transfer of personal financial data incidental to the purchase and sale of goods, the provision of payments or funds transfers involving personal financial data or covered personal identifiers, and the provision of investment management services that manage or provide advice on investments for compensation.</li>



<li><strong>Corporate group transactions</strong>. Data transactions between US companies and subsidiaries or affiliates located in countries of concern that are ordinarily incident to business operations, such as human resources, payroll, risk management and customer support. This exemption is narrower than it sounds, as it is limited to transactions that are incidental to standard business operations.</li>



<li><strong>Investment agreements subject to a Committee on Foreign Investment in the United States (CFIUS) agreement or condition</strong> to resolve a national security risk.</li>



<li><strong>Telecommunication services.</strong> Data transactions ordinarily incident to and part of providing telecommunications services.&nbsp;&nbsp;</li>



<li><strong>Certain exemptions relevant to life sciences companies</strong>, such as exemptions for data transactions that are part of clinical investigations, involve regulatory approval data and/or are conducted pursuant to federally funded research, are discussed in more detail in our <a href="https://www.cooley.com/news/insight/2025/2025-04-02-the-dojs-bulk-sensitive-personal-data-rules-imminent-relevance-to-life-sciences-companies" target="_blank" rel="noreferrer noopener">previous post on the rule</a>.</li>
</ul>



<p>The exemptions are informed by examples in the rule and the DOJ’s FAQs. Commentary from the DOJ suggests that it views the exemptions narrowly. Given the rule’s novelty, the breadth of these exemptions in practice remains to be seen.</p>



<h3 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-c504750c00d9ead26879688373a9d49b">Is a covered data transaction prohibited or restricted?</h3>



<p>A covered data transaction is prohibited if it involves a US person engaging in a transaction that involves:</p>



<ul class="wp-block-list">
<li>Data brokerage with covered persons or countries of concern.</li>



<li>Data brokerage with foreign parties that are not covered persons or countries of concern, unless there are certain contractual protections in place.</li>



<li>Access to bulk human ‘omic data (including genomic, epigenomic, proteomic or transcriptomic data) or human biospecimens from which bulk human ‘omic data can be derived.&nbsp;</li>
</ul>



<p>A prohibited transaction means just that: it is simply prohibited.</p>



<p>A covered data transaction is restricted (rather than prohibited) if it involves the following:</p>



<ul class="wp-block-list">
<li>An employment agreement.</li>



<li>A vendor agreement, wherein a covered person or a country of concern is providing goods or services to a US person in exchange for compensation.</li>



<li>An investment agreement.</li>
</ul>



<p>See the above section on which transactions are covered by the rule for more details on these types of agreements.</p>



<p>Restricted transactions are allowed to proceed under the rule, so long as the company implements privacy and cybersecurity measures.&nbsp;</p>



<p>Companies also are prohibited from making arrangements that have the purpose of evading or avoiding the rule.</p>



<h3 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-71420760cb6fc92ee73c2e6fef7cb5be">What are the compliance requirements for restricted transactions?</h3>



<p>As noted above, restricted transactions are permitted so long as the company engaging in such transactions implements a rigorous data compliance program and the <a href="https://www.cisa.gov/sites/default/files/2025-01/Security_Requirements_for_Restricted_Transaction-EO_14117_Implementation508.pdf" target="_blank" rel="noreferrer noopener">security requirements</a> issued by the Cybersecurity and Infrastructure Security Agency (CISA). An overview of each component is provided below.</p>



<h5 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-79dc88b4535954c99501786b7f566c00">CISA security requirements</h5>



<ul class="wp-block-list">
<li>Organizational- and system-level requirements:
<ul class="wp-block-list">
<li>Implement basic organizational cybersecurity policies, practices and requirements.</li>



<li>Implement logical and physical access controls.</li>



<li>Conduct an internal data risk assessment.</li>
</ul>
</li>



<li>Data-level mitigations – a combination of the following measures that, taken together, prevents covered persons and countries of concern from accessing covered data in a form that is linkable, identifiable, unencrypted or decryptable using commonly available technology:
<ul class="wp-block-list">
<li>Apply data minimization and data masking strategies.</li>



<li>Apply encryption techniques.</li>



<li>Apply privacy enhancing technologies.</li>



<li>Configure identity and access management techniques.</li>
</ul>
</li>
</ul>
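<p>To make the data-level mitigations above more concrete, here is a minimal, purely illustrative sketch of data minimization and identifier masking – the field names, allowlist and salted-hash approach are assumptions for demonstration, not measures prescribed by the CISA security requirements:</p>

```python
import hashlib

# Purely illustrative sketch of data minimization and masking.
# Field names and the salted-hash approach are demonstration assumptions;
# the CISA security requirements do not prescribe specific code.
ALLOWED_FIELDS = {"record_id", "region", "account_type"}  # minimization allowlist
SALT = b"example-salt-manage-secrets-properly"  # hypothetical; rotate and protect

def mask_identifier(value: str) -> str:
    """Replace a direct identifier with a truncated salted one-way hash."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

def minimize_record(record: dict) -> dict:
    """Drop fields not on the allowlist, then mask the record identifier."""
    kept = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    if "record_id" in kept:
        kept["record_id"] = mask_identifier(str(kept["record_id"]))
    return kept

raw = {"record_id": "12345", "name": "Jane Doe", "region": "US-East", "ssn": "000-00-0000"}
print(minimize_record(raw))  # name and ssn dropped; record_id masked
```

In practice such measures would sit alongside encryption, privacy-enhancing technologies, and identity and access management controls rather than substituting for them.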



<h5 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-d23fbf35917a68034b566fe44dcbf06a">Data compliance program requirements</h5>



<p>A data compliance program must include:</p>



<ul class="wp-block-list">
<li>Risk-based procedures for verifying data flows in restricted transactions and the identity of vendors.</li>



<li>A written policy describing the program, certified annually by the senior employee responsible for compliance.</li>



<li>A written information security policy (including description of implementation of the CISA security requirements), certified annually by the senior employee responsible for compliance.</li>
</ul>



<h5 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-3ad6ee3299dab260fed62327836e7b11">Recordkeeping and audit requirements</h5>



<ul class="wp-block-list">
<li>Recordkeeping under the rule requires the following records and documentation to be kept for at least 10 years:
<ul class="wp-block-list">
<li>The written data compliance program and information security policy.</li>



<li>Results of the annual restricted transaction compliance audit.</li>



<li>Due diligence.</li>



<li>Transfer details.&nbsp;</li>



<li>Licenses or advisory opinions.</li>



<li>An annual certification of the completeness and accuracy of such records by the senior employee responsible for compliance.</li>
</ul>
</li>



<li>Audit
<ul class="wp-block-list">
<li>An audit must be conducted by an independent individual and use a reliable method that examines all restricted transactions, the data compliance program, required recordkeeping and the implementation of the CISA security requirements.</li>



<li>The audit must result in a written report that is retained for at least 10 years.</li>
</ul>
</li>
</ul>



<h3 class="wp-block-heading has-black-color has-text-color has-link-color wp-elements-a8f7657ee5e0568f0fa6890b9869f250">What should all companies do now to address the DOJ’s bulk data transfer rule?</h3>



<p>Companies should first assess whether they have any current or anticipated transactions subject to the rule, which may not be an easy task given that a “transaction” is loosely and broadly defined. Next, companies should determine whether their transactions are simply prohibited under the rule, or merely restricted. Companies also should see if they can take advantage of any exemptions to mitigate exposure under the rule. Finally, companies with restricted transactions should promptly implement the privacy and cybersecurity measures required under the rule. Contact a member of <a href="https://www.cooley.com/services/practice/cyber-data-privacy" target="_blank" rel="noreferrer noopener">Cooley’s cyber/data/privacy team</a> to leverage our cross-industry experience and existing compliance materials to help you get ahead of the rule.</p>



<ol class="wp-block-list">
<li><strong>Conduct diligence to determine whether you currently engage, or expect to engage, in covered data transactions.</strong> For example:
<ul class="wp-block-list">
<li>Analyze a data map to determine the types and quantity of data handled and whether they meet the rule’s thresholds.</li>



<li>Analyze data flows and recipients to determine whether a country of concern or covered person has access to such data.</li>



<li>Analyze contractual arrangements with corporate affiliates/subsidiaries, partners and vendors to determine if a transaction involves a data brokerage, investment agreement, employment agreement or vendor agreement.</li>
</ul>
</li>



<li><strong>Consider ways to mitigate exposure to the rule</strong>, such as by applying exemptions or recharacterizing/revising data flows, but without violating the rule’s prohibition on acts designed to evade the rule.</li>



<li><strong>Determine whether the covered data transactions are prohibited or restricted</strong>. Undertake a review of any data brokerage, vendor, employment and investment agreements to determine whether the rule may apply to such transactions.</li>



<li><strong>Implement a data compliance program</strong>. Draft or update a written information security plan, including supporting policies and procedures for understanding data flows and downstream recipients of data, as well as vendor management. Designate a senior employee responsible for such program.</li>



<li><strong>Implement CISA’s security requirements</strong>. Identify stakeholders, conduct a risk assessment and work with technical personnel to implement organizational-, system- and data-level security measures and policies.</li>



<li><strong>Prepare to comply with recordkeeping requirements</strong>, including records on restricted transactions and their details, results of compliance audits and annual certification.</li>



<li><strong>Prepare to comply with audit requirements</strong>, including identifying an independent auditor and determining a methodology to conduct an audit by reference to the company’s data compliance and security requirements.</li>
</ol>



<h6 class="wp-block-heading">Authors</h6>



<p><a href="https://www.cooley.com/people/michael-egan" target="_blank" rel="noreferrer noopener">Michael Egan</a></p>



<p><a href="https://www.cooley.com/people/christian-lee" target="_blank" rel="noreferrer noopener">Christian Lee</a></p>



<p><a href="https://www.cooley.com/people/emma-plankey" target="_blank" rel="noreferrer noopener">Emma Plankey</a></p>
<p>The post <a href="https://cdp.cooley.com/understanding-and-complying-with-the-dojs-bulk-data-rule/">The DOJ&#8217;s Data Security Program &#8211; Understanding and Complying with the New Bulk Data Transfer Rule</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4580</post-id>	</item>
		<item>
		<title>The Data (Use and Access) Act: What Businesses Need to Know</title>
		<link>https://cdp.cooley.com/the-data-use-and-access-bill-what-businesses-need-to-know/</link>
		
		<dc:creator><![CDATA[Georgia Grisaffe]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 15:05:16 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4568</guid>

					<description><![CDATA[<p>The UK’s Data (Use and Access) Act (DUA Act) has now received Royal Assent, introducing a series of targeted updates to the UK’s data protection framework in areas like artificial intelligence (AI) and research, while preserving alignment with core UK General Data Protection Regulation (GDPR) principles. The DUA Act is wide-ranging – covering everything from [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/the-data-use-and-access-bill-what-businesses-need-to-know/">The Data (Use and Access) Act: What Businesses Need to Know</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The UK’s Data (Use and Access) Act (DUA Act) has now received Royal Assent, introducing a series of targeted updates to the UK’s data protection framework in areas like artificial intelligence (AI) and research, while preserving alignment with core UK General Data Protection Regulation (GDPR) principles.</p>



<span id="more-4568"></span>



<p>The DUA Act is wide-ranging – covering everything from smart data sharing initiatives to digital identity services – but deliberately avoids many of the more controversial proposals found in its predecessor (the Data Protection and Digital Information Bill), such as redefining ‘personal data’ or changing the requirement to maintain records of processing activities.</p>



<p>This post explores the most impactful privacy-related reforms for companies handling UK personal data.</p>



<p><strong>Key changes</strong></p>



<p><strong>1. Automated decision-making (ADM)</strong></p>



<p>Under the UK GDPR, individuals had a right not to be subject to decisions based solely on automated processing that produced legal/similarly significant effects. This meant ADM was generally prohibited unless a specific exception applied (such as consent, contractual necessity or legal obligation) – and even then, safeguards had to be implemented.</p>



<p>For ADM involving nonsensitive personal data, the DUA Act removes the default prohibition where the processing meets certain conditions and specified safeguards are in place. This allows such processing to proceed without the need for a specific exception – including, potentially, on the basis of legitimate interests.</p>



<p>However, the existing safeguards still apply, including the rights of individuals to obtain human review, express their view and contest the decision. In addition, the DUA Act introduces a new requirement to provide affected individuals with information about automated decisions, building on a similar obligation under Article 13 UK GDPR to inform individuals about the existence of ADM.</p>



<p><strong>2. Scientific research provisions</strong></p>



<p>The definition of scientific research has been clarified so that it explicitly includes:</p>



<ul class="wp-block-list">
<li>Any research that can reasonably be described as scientific, including for the purposes of technological development.</li>



<li>Commercial and privately funded projects.</li>
</ul>



<p>The DUA Act also introduces more flexible rules on further processing for scientific purposes, allowing companies to rely on an individual’s initial consent for future, unspecified research uses provided certain conditions apply. This is likely to benefit research activities where precise future uses may not be known at the outset, such as longitudinal studies or AI model training.</p>



<p><strong>3. Recognised legitimate interests</strong></p>



<p>The DUA Act introduces a new list of ‘recognised legitimate interests’, which do not require a balancing test to be carried out. However, these mainly relate to activities which are unlikely to be relevant to many commercial businesses, such as safeguarding national security or detecting crime – although the list may later be expanded by the UK government.</p>



<p>The DUA Act also clarifies that processing for direct marketing, intra-group transfers and network security can be based on ordinary legitimate interests, subject to the usual balancing test.</p>



<p><strong>4. The UK’s Privacy and Electronic Communications Regulations (PECR)</strong></p>



<p>Previously capped at 500,000 pounds, fines under the PECR – a complementary regime governing direct marketing, cookies and electronic communications – will now be aligned with the UK GDPR, rising to a maximum of 17.5 million pounds or 4% of global turnover. This is significant, as Information Commissioner’s Office (ICO) enforcement has historically focused heavily on PECR breaches.</p>
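<p>As rough arithmetic (the turnover figures below are hypothetical), the revised cap mirrors the UK GDPR’s “whichever is higher” formulation:</p>

```python
# Illustrative greater-of calculation for the revised PECR fine cap.
# Turnover figures are hypothetical; this is not legal advice.
def pecr_fine_cap(global_turnover_gbp: float) -> float:
    """Maximum fine: the higher of 17.5 million pounds or 4% of global turnover."""
    return max(17_500_000.0, 0.04 * global_turnover_gbp)

print(pecr_fine_cap(1_000_000_000))  # 40000000.0 – 4% of 1bn exceeds the fixed cap
print(pecr_fine_cap(100_000_000))    # 17500000.0 – the 17.5m fixed cap applies
```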



<p>Additionally, minor changes have been made to cookie consent rules, clarifying that certain low-risk cookies (e.g., those used to detect fraud or authenticate users’ identities) will not require user consent.</p>



<p><strong>Business implications</strong></p>



<p><strong>1. Data strategy and research</strong></p>



<p>For companies in research-intensive sectors, the broadened definition of scientific research and expanded allowances for further processing should help reduce compliance friction across commercial research and development and AI.</p>



<ul class="wp-block-list">
<li>Commercial research: Clearer recognition that private research qualifies as ‘scientific’. This had previously been assumed in practice, but the specific recognition provides greater certainty.</li>



<li>Further processing: Individuals can give consent even if the purpose of data use evolves over time, thereby supporting multiphase research studies.</li>
</ul>



<p><strong>2. Compliance updates</strong></p>



<p>Several operational policies and notices may need updating in light of the DUA Act:</p>



<ul class="wp-block-list">
<li>Marketing: Businesses should review their marketing practices to ensure compliance with requirements under PECR. The significantly increased fine cap, coupled with the ICO’s historical enforcement in this area, substantially increases the stakes for PECR violations.</li>



<li>ADM: Businesses using ADM tools should ensure that appropriate safeguards are in place and review their privacy notices to ensure that transparency requirements are covered.</li>



<li>Cookies: Businesses should reassess cookie classifications and consider removing consent prompts for cookies that fall under the exemption.</li>



<li>Governance documents: Where businesses define ‘UK GDPR’ or reference applicable laws in contracts, data protection agreements or policies, these may need slight adjustments to incorporate the DUA Act.</li>
</ul>



<p><strong>3. Cross-border data transfers</strong></p>



<p>The UK’s European Union adequacy status under EU GDPR has been extended until 27 December 2025 but remains under scrutiny. While signals from Brussels are positive, businesses that rely heavily on EU-UK personal data flows should review their transfer mechanisms and ensure contingency measures (such as standard contractual clauses) are in place in case of any future adequacy lapse.</p>



<p>Please reach out to the Cooley team for more information and assistance in respect of the implementation of the DUA Act.</p>



<p><strong>Authors</strong></p>



<p><a href="https://www.cooley.com/people/guadalupe-sampedro">Guadalupe Sampedro</a>, Partner, London</p>



<p><a href="https://www.cooley.com/people/morgan-mccormack">Morgan McCormack</a>, Associate, London</p>



<p><a href="https://www.cooley.com/people/daniel-millard">Daniel Millard</a>, Associate, London</p>



<p>Emerald Hockley, Trainee, London</p>



<p>The post <a href="https://cdp.cooley.com/the-data-use-and-access-bill-what-businesses-need-to-know/">The Data (Use and Access) Act: What Businesses Need to Know</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4568</post-id>	</item>
		<item>
		<title>The EU AI Act: Key Milestones, Compliance Challenges and the Road Ahead</title>
		<link>https://cdp.cooley.com/the-eu-ai-act-key-milestones-compliance-challenges-and-the-road-ahead/</link>
		
		<dc:creator><![CDATA[Georgia Grisaffe]]></dc:creator>
		<pubDate>Mon, 19 May 2025 10:49:16 +0000</pubDate>
				<category><![CDATA[Compliance, Risk & Strategy]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4555</guid>

					<description><![CDATA[<p>The European Union Artificial Intelligence Act (EU AI Act) is rapidly reshaping the regulatory landscape for AI development and deployment, both within Europe and globally. In a recent Cooley webinar, partner Patrick Van Eecke and associate Bartholomäus Regenhardt, members of the firm’s cyber/data/privacy practice, provided an overview of the EU AI Act’s phased implementation, compliance hurdles and the much-anticipated Code of Practice for general-purpose AI (GPAI) models. Here’s what you need to know.</p>
<p>The post <a href="https://cdp.cooley.com/the-eu-ai-act-key-milestones-compliance-challenges-and-the-road-ahead/">The EU AI Act: Key Milestones, Compliance Challenges and the Road Ahead</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The European Union Artificial Intelligence Act (EU AI Act) is rapidly reshaping the regulatory landscape for AI development and deployment, both within Europe and globally. <a href="https://www.cooley.com/events/2025/2025-04-30-ai-talks-understanding-the-eu-ai-act--what-it-means-for-companies-worldwide">In a recent Cooley webinar</a>, partner Patrick Van Eecke and associate Bartholomäus Regenhardt, members of the firm’s cyber/data/privacy practice, provided an overview of the EU AI Act’s phased implementation, compliance hurdles and the much-anticipated Code of Practice for general-purpose AI (GPAI) models. Here’s what you need to know.</p>



<span id="more-4555"></span>



<p><strong>Phased rollout: Understanding the timeline</strong></p>



<p>The EU AI Act is being implemented in several key stages:</p>



<ul class="wp-block-list">
<li><strong>February 2, 2025</strong>: The first obligations took effect, focusing on AI literacy and prohibiting certain high-risk AI practices.</li>



<li><strong>May 2, 2025</strong>: The (delayed) publication of the Code of Practice for GPAI models was expected, though pushback from major industry players and international stakeholders has postponed its finalization.</li>



<li><strong>August 2, 2025</strong>: GPAI governance rules and obligations that apply to GPAI models on the market after this date come into force.</li>



<li><strong>August 2, 2026</strong>: The majority of the EU AI Act’s requirements become fully enforceable.</li>



<li><strong>2030</strong>: Final implementation steps, especially for the public sector.</li>
</ul>



<p>This phased approach allows organizations time to adapt but also creates a complex compliance environment.</p>



<p><strong>The EU AI Act in a nutshell</strong></p>



<ul class="wp-block-list">
<li><strong>World’s first comprehensive AI regulation</strong>: The EU AI Act sets a global precedent, though its ultimate impact – akin to the “Brussels Effect” of the EU General Data Protection Regulation (GDPR) – remains to be seen.</li>



<li><strong>Dense legislation</strong>: 450+ pages, 68 new definitions, nearly 200 recitals and multiple annexes, with additional guidance and soft law expected.</li>



<li><strong>Risk-based approach</strong>: Obligations scale with the risk level of the AI system, from prohibited practices to high-risk and low-risk categories.</li>



<li><strong>Wide applicability</strong>: The EU AI Act applies to developers (providers), deployers (users), affected individuals, importers and distributors, regardless of whether they are based in the EU or abroad, due to its extraterritorial reach.</li>



<li><strong>Severe sanctions</strong>: Fines can reach up to 7% of global turnover or 35 million euros, surpassing even GDPR penalties.</li>



<li><strong>Dual enforcement</strong>: Both national supervisory authorities and the new EU AI Office will have enforcement powers, especially for GPAI models.</li>
</ul>



<p><strong>Early compliance: What’s happened since February 2025?</strong></p>



<p>The first two obligations – AI literacy and prohibition of certain practices – have triggered a flurry of activity.</p>



<ul class="wp-block-list">
<li><strong>AI literacy</strong>: Companies have launched training programs to ensure staff understand AI risks and regulatory requirements. The European Commission’s best practices repository, fueled by the AI Pact, offers practical examples, though following these does not guarantee compliance.</li>



<li><strong>Prohibited practices</strong>: Organizations have begun mapping and assessing their AI systems to ensure they are not engaging in prohibited activities. The European Commission has issued detailed (though nonbinding) guidance to clarify what constitutes a prohibited practice.</li>
</ul>



<p><strong>Defining ‘AI system’: Persistent challenges</strong></p>



<p>A recurring challenge is determining whether a solution qualifies as an “AI system” under the EU AI Act. The European Commission’s recent guidelines emphasize a holistic, case-by-case assessment based on seven criteria, acknowledging that not every system marketed as “AI” actually falls within its scope. This has led to concerns about “AI washing”: the overlabeling of products as AI-enabled for marketing purposes.</p>



<p><strong>GPAI models and the Code of Practice</strong></p>



<p>A major focus now is the regulation of GPAI models, such as large language models. The EU AI Act distinguishes between:</p>



<ul class="wp-block-list">
<li><strong>GPAI models</strong>: Core AI technologies (e.g., GPT-4, Mistral) capable of a broad range of tasks.</li>



<li><strong>AI systems</strong>: Applications built on top of GPAI models, with user interfaces and specific use cases (e.g., ChatGPT, Le Chat).</li>
</ul>



<p>Obligations differ for GPAI model providers versus AI system providers. The Code of Practice, currently still under negotiation, is designed to bridge the gap between legal requirements and practical implementation for GPAI model providers. While voluntary, signing up to the Code may help demonstrate compliance and could influence enforcement decisions.</p>



<p>However, industry resistance, particularly from major US tech firms, and pressure from the US administration have delayed its adoption. The final content and legal effect of the Code remain uncertain, but it is expected to focus on:</p>



<ul class="wp-block-list">
<li><strong>Transparency</strong>: Such as documentation and disclosure requirements, both to regulators and downstream AI system providers.</li>



<li><strong>Copyright</strong>: Such as ensuring web-crawled data does not infringe on intellectual property rights.</li>



<li><strong>Systemic risk</strong>: Such as additional safeguards for GPAI models with the potential for significant societal impact.</li>
</ul>



<p><strong>Transparency obligations: A shared responsibility</strong></p>



<p>Transparency is a cornerstone of the EU AI Act. GPAI model providers must maintain up-to-date documentation and share it with both the EU AI Office and downstream system providers. In turn, system providers must inform users about the AI’s capabilities and limitations, echoing GDPR-style privacy notices.</p>



<p><strong>Enforcement: When do the teeth come out?</strong></p>



<p>While compliance is already required for certain obligations, enforcement mechanisms, including fines and penalties, will only become active from August 2025 (August 2026 for GPAI models). National authorities are still being designated, but affected individuals and entities can already seek injunctions in national courts.</p>



<p><strong>Key takeaways</strong></p>



<ul class="wp-block-list">
<li>The EU AI Act is complex, far-reaching and still evolving.</li>



<li>Early obligations focus on AI literacy and prohibiting harmful practices.</li>



<li>Defining what counts as an “AI system” remains challenging and requires multidisciplinary input.</li>



<li>The upcoming Code of Practice for GPAI models is a critical but currently delayed piece of the puzzle.</li>



<li>Transparency obligations affect both GPAI model and AI system providers.</li>



<li>Enforcement will ramp up significantly from mid-2025.</li>
</ul>



<p>Stay tuned for further developments, especially as the Code of Practice on GPAI models is finalized and the EU AI Act’s next milestones approach. For organizations operating in or with customers in the EU, proactive engagement and cross-functional compliance efforts are essential to navigate this new regulatory era.</p>



<p>Listen to a recording of the webinar, “<a href="https://cooley.zoom.us/rec/play/0_vLXrpeihBRIUIR0qjvdoBMwiWlD6CQ6fhGp9KmOM1XJl59EUhajTEDmyjkj-dj6inoa7ZwjYgcA40S.Y24RsNphmlLyn0Kl?accessLevel=meeting&amp;hasValidToken=false&amp;canPlayFromShare=true&amp;from=share_recording_detail&amp;continueMode=true&amp;componentName=rec-play&amp;originRequestUrl=https%3A%2F%2Fcooley.zoom.us%2Frec%2Fshare%2FCiq-I2ul0Mn0iyZXMRnCdMy0DYB5HThoEDwqZ7eqVcTNTu1WSigXgJli_4ev6lMK.z9j5XEJgufa19kBa">AI Talks: Understanding the EU AI Act – What It Means for Companies Worldwide</a>.”</p>



<p><strong>Disclaimer:</strong> This blog post was generated with the assistance of AI based on the transcript of the webinar, and was subsequently reviewed by a lawyer.</p>



<p><strong>Authors</strong></p>



<p><a href="https://www.cooley.com/people/patrick-van-eecke">Patrick Van Eecke</a>, Partner, Brussels</p>



<p><a href="https://www.cooley.com/people/bartholomaus-regenhardt">Bartholomäus Regenhardt</a>, Associate, Brussels</p>



<p></p>
<p>The post <a href="https://cdp.cooley.com/the-eu-ai-act-key-milestones-compliance-challenges-and-the-road-ahead/">The EU AI Act: Key Milestones, Compliance Challenges and the Road Ahead</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4555</post-id>	</item>
		<item>
		<title>Virginia Enacts New Broad Consent Requirement for Collection of Reproductive and Sexual Health Information</title>
		<link>https://cdp.cooley.com/virginia-enacts-new-broad-consent-requirement-for-collection-of-reproductive-and-sexual-health-information/</link>
		
		<dc:creator><![CDATA[Cooley]]></dc:creator>
		<pubDate>Tue, 08 Apr 2025 19:57:18 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4500</guid>

					<description><![CDATA[<p>On March 24, 2025, Virginia Gov. Glenn Youngkin signed into law SB 754, amending the state’s Consumer Protection Act to prohibit businesses from “[o]btaining, disclosing, selling, or disseminating any personally identifiable reproductive or sexual health information without the consent of the consumer.” The amendment, which takes effect on July 1, 2025, could have significant implications [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/virginia-enacts-new-broad-consent-requirement-for-collection-of-reproductive-and-sexual-health-information/">Virginia Enacts New Broad Consent Requirement for Collection of Reproductive and Sexual Health Information</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>On March 24, 2025, Virginia Gov. Glenn Youngkin signed into law <a href="https://lis.virginia.gov/bill-details/20251/SB754" target="_blank" rel="noreferrer noopener">SB 754</a>, amending the state’s Consumer Protection Act to prohibit businesses from “[o]btaining, disclosing, selling, or disseminating any personally identifiable reproductive or sexual health information without the consent of the consumer.” The amendment, which takes effect on July 1, 2025, could have significant implications for companies that do business in Virginia due to its broad scope and definitions, affirmative consent requirement, and the fact that it is subject to a private right of action.</p>



<p><strong>Broad scope and definitions</strong></p>



<p>The amendment defines “reproductive or sexual health information” very broadly, encompassing any “information relating to the past, present, or future reproductive or sexual health of an individual.” This definition expressly includes, but is not limited to, information relating to the following, among other listed examples:</p>



<ul class="wp-block-list">
<li>“Efforts to research or obtain reproductive or sexual health information services or supplies, including location information that may indicate an attempt to acquire such services or supplies”</li>



<li>“Reproductive or sexual health conditions, status, diseases, or diagnoses, including pregnancy, menstruation, ovulation, ability to conceive a pregnancy, whether an individual is sexually active, and whether an individual is engaging in unprotected sex”</li>



<li>“Use or purchase of contraceptives, birth control, or other medication related to reproductive health, including abortifacients”</li>
</ul>



<p>The breadth of “reproductive or sexual health information” means that it could encompass activities of many businesses that may not think of themselves as collecting such information. For example, it could include retailers’ collection of transaction records for consumers’ purchase of products, such as condoms or tampons. Similarly, the collection of precise geolocation by mobile applications (even ones whose purpose is unrelated to health) could fall under the definition if such location data “may indicate an attempt to acquire [reproductive or sexual health] services or supplies,” for instance by visiting a clinic or pharmacy.</p>



<p>Further expanding its potential scope, the amendment applies to any entity that is subject to the Virginia Consumer Protection Act. As a result, businesses may be subject to this new requirement even if they do not meet the (relatively high) data processing volume thresholds for applicability of Virginia’s general consumer privacy law, the Virginia Consumer Data Protection Act (VCDPA).</p>



<p><strong>Affirmative consent</strong></p>



<p>The amendment’s consent requirement utilizes the definition of “consent” from the VCDPA, meaning “a clear affirmative act signifying a consumer&#8217;s freely given, specific, informed, and unambiguous agreement to process personal data relating to the consumer.” This consent requirement means that businesses collecting these categories of data will likely have to implement new, appropriately designed consent flows, rather than relying on implied consent or disclosures buried in a privacy policy or terms of service. However, unlike other consumer health data privacy laws, such as those of Washington state and Nevada, the new Virginia requirement does not expressly mandate that businesses provide a dedicated health data privacy notice that is separate from the business’s regular privacy policy.</p>



<p><strong>Private right of action</strong></p>



<p>In addition to enforcement by the Virginia attorney general, the new requirement is subject to a private right of action. It remains to be seen how attractive the new requirement will be as a basis for demand letters and lawsuits by plaintiffs’ firms. However, the existence of a private right of action undoubtedly increases businesses’ potential risks. This, together with the short timeline until the new requirement comes into force on July 1, 2025, makes it especially important for businesses to start assessing their potential exposure and compliance strategy.</p>



<h2 class="wp-block-heading">Authors</h2>



<p><a href="https://www.cooley.com/people/michael-egan" target="_blank" rel="noreferrer noopener">Michael Egan</a></p>



<p><a href="https://www.cooley.com/people/christopher-suhler" target="_blank" rel="noreferrer noopener">Christopher Suhler</a></p>



<p></p>
<p>The post <a href="https://cdp.cooley.com/virginia-enacts-new-broad-consent-requirement-for-collection-of-reproductive-and-sexual-health-information/">Virginia Enacts New Broad Consent Requirement for Collection of Reproductive and Sexual Health Information</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4500</post-id>	</item>
		<item>
		<title>UK Data Privacy Litigation: What’s New?</title>
		<link>https://cdp.cooley.com/uk-data-privacy-litigation-whats-new/</link>
		
		<dc:creator><![CDATA[Georgia Grisaffe]]></dc:creator>
		<pubDate>Mon, 07 Apr 2025 16:52:49 +0000</pubDate>
				<category><![CDATA[Litigation & Regulator Actions]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4498</guid>

					<description><![CDATA[<p>In honour of the International Association of Privacy Professionals (IAPP) London 2025 conference, we hosted a webinar on European privacy litigation. This post summarises some of the key UK privacy cases we covered in that webinar. Over the past six months, the UK High Court has handed down a number of decisions with important implications for businesses, data controllers and individuals.</p>
<p>The post <a href="https://cdp.cooley.com/uk-data-privacy-litigation-whats-new/">UK Data Privacy Litigation: What’s New?</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In honour of the International Association of Privacy Professionals (IAPP) London 2025 conference, we hosted a webinar on European privacy litigation. This post summarises some of the key UK privacy cases we covered in that webinar. Over the past six months, the UK High Court has handed down a number of decisions with important implications for businesses, data controllers and individuals.</p>



<span id="more-4498"></span>



<p><strong><em>Duke v. Moores &amp; Ors</em> [2024] EWHC 2746 (KB)</strong></p>



<p><strong>Key issues</strong>: The claimant, a teacher, alleged misuse of private information and breaches of data protection laws after a disciplinary investigation. This was in relation to four categories of information: Facebook messages, WhatsApp messages, references from past employers, and alleged unlawful monitoring and surveillance. The court was asked to decide on an application for summary judgment made by the defendants in respect of a claim for misuse of private information and UK General Data Protection Regulation (GDPR) infringements.</p>



<p><strong>Key decision</strong>: The court granted the application for summary judgment, on the basis that the claimant’s case had no real prospect of success. The court found that any reasonable expectation of privacy was significantly outweighed by the need for investigation in the disciplinary process.</p>



<p><strong>Key takeaways</strong>: The case serves as a reminder of the courts’ willingness to strike out privacy and data cases which they feel do not have a prospect of success. Viable claims need to pass a ‘threshold of seriousness’ test, which was introduced into GDPR cases by the UK Supreme Court in a seminal privacy case in 2021, and since then has been used as an important filter in damages claims in respect of alleged GDPR breaches. One question relating to the threshold which remains open – to be determined this year by the Court of Appeal in the case of <em>Farley v. Paymaster</em> – is whether or not fear of adverse consequences, without the occurrence of actual adverse consequences, can constitute harm serious enough to warrant the payment of compensation.</p>



<p><strong><em>Pacini v. Dow Jones &amp; Co Inc</em> [2024] EWHC 2714 (KB)</strong></p>



<p><strong>Key issues</strong>: The claimants, two former investment bankers, brought a data protection claim against Dow Jones, the publisher of The Wall Street Journal. They alleged that two articles published by Dow Jones contained inaccurate and misleading information that caused them reputational damage. The decision concerned preliminary determinations as to whether personal information processed by the defendant was inaccurate, as alleged by the claimants. There were two central issues: the meaning of any personal data within the articles and whether any such data was criminal offence data within the meaning of Article 10 of the UK GDPR.</p>



<p><strong>Key decision</strong>: In determining the first issue regarding the definition of personal data, the court implemented principles from defamation law. The court first applied the ‘single meaning rule’, considering each published article as a whole and interpreting each element in its full context. The court then used this to determine whether the meaning constituted ‘personal data’ under the GDPR. The court also then applied the repetition rule, which treats a party who repeats a defamatory statement as if they made the original statement, to assist with determining whether the publishers were responsible for a breach of the GDPR. With regards to the second issue, the court held that the personal data was not ‘criminal offence’ data within the meaning of Article 10 UK GDPR.</p>



<p><strong>Key takeaways</strong>: This is not the first time that a judge deciding a GDPR case which crosses over with media publication has borrowed concepts from defamation law. The judge in this case went to great lengths to make clear that the approach required to interpret meaning might differ significantly in defamation law and data protection law, although it is interesting that a common approach to this was taken here.</p>



<p><strong><em>RTM v. Bonne Terre Ltd &amp; Hestview Ltd</em></strong><strong> [2025] EWHC 111 (KB)</strong></p>



<p><strong>Key issues</strong>: RTM, an online gambler, sued Bonne Terre, a gambling operator, for sending him direct marketing materials encouraging him to gamble more. RTM claimed he had not consented to the processing of his personal data for this purpose, and that the unlawful processing for marketing purposes had caused him to suffer harm (namely, financial losses and distress).<br><br><strong>Key decision</strong>: The court concluded that the defendant had not obtained valid consent from the claimant. Although the claimant had not presented argument on this specific point, the court held that the claimant’s consent to the processing of his personal data for marketing purposes could not be valid, because it was clear from the evidence that he had a gambling problem. This meant that the claimant’s ability to give valid consent was impaired. The defendant argued that it used the personal data it collected from its customers to assess gambling addiction, in compliance with its safer gambling obligations, and that it had not concluded that RTM was a problem gambler and so had not excluded him from marketing lists. The court dismissed the relevance of this.</p>



<p><strong>Key takeaways</strong>: This case is a good reminder of the need for ‘informed’ and ‘freely given’ consent to data processing, although arguably it sets the bar extremely high for data controllers to meet. The net effect of this decision appears to be that, if a data controller seeks consent from customers to process their data, including for marketing purposes, and vulnerable individuals are within the customer group, then there is a risk that their consent will be invalidated by their vulnerability. This in turn would result in unlawful data processing. That risk apparently lies entirely with the data controller, even if they are completely unaware of the vulnerability in question. This has potentially wide ramifications for the entire online marketing ecosystem.</p>



<p><strong><em>Ashley v. HMRC</em> [2025] EWHC 134 (KB)</strong></p>



<p><strong>Key issues</strong>: The claimant, businessman Mike Ashley, was involved in a tax dispute with HM Revenue &amp; Customs (HMRC) and issued a data subject access request&nbsp;(DSAR) to find out which of his personal data HMRC processed. This case explored the meaning of personal data under Article 4(1) of the GDPR, the extent of the search a controller must conduct for it to be considered proportionate, and the rules on what context needs to be given around the personal data of a data subject.</p>



<p><strong>Key decision</strong>: The court found in favour of the claimant regarding HMRC’s data processing failings, but rejected the wider argument that personal data included all data relating to HMRC’s tax enquiry assessment. The lengthy judgment provided a number of insights as to the meaning of personal data in the context of a DSAR:</p>



<ul class="wp-block-list">
<li>The court held that information that is ‘linked’ to an individual should be construed in a broad way, although there should be a ‘continuum of relevance’ (accordingly, a link which is indirect or tenuous ‘at several removes’ is unlikely to make the grade). It also confirmed that data can concern an object rather than an individual, and that subjective opinions, reasoning and assessments concerning an individual can be personal data where interlinked with or connected to information that more specifically relates to the individual. </li>



<li>For a ‘reasonable and proportionate’ search, the court made clear that it is up to a data controller to demonstrate a search would not be proportionate, and that, where a controller processes large amounts of data, it is their obligation under GDPR to design systems which can cope with DSARs in such circumstances.</li>



<li>On the provision of data itself, the court emphasised the need to do so in a transparent and intelligible manner, noting that decontextualised snippets (e.g. in a schedule of extracts, which is becoming standard practice) are unlikely to be adequate. It concluded that a data controller does not have to provide whole documents, but does have to provide enough additional information to enable the data subject to understand the context of the processing. However, it underlined that what is provided should be no more than what is<strong> necessary</strong> to achieve this.</li>
</ul>



<p><strong>Key takeaways</strong>: Businesses need to be mindful of the approaches they are taking in answering DSAR requests and should ensure their teams are trained on the most up-to-date guidance as to what constitutes personal data.</p>



<p>For a deeper dive into these cases, please check out our recent <a href="https://cooley.zoom.us/webinar/register/rec/WN_fMOnCdHORDShCi-wo5LOMA?meetingId=E4SQnESwRzDwAY_2Wy_4W59k7zycPynfy3amd3PT0UWoZgzUqVHgXosxER8Ns_Ym.nBlMVTO193msZk7q&amp;playId=&amp;action=play?accessLevel=meeting&amp;hasValidToken=false&amp;originRequestUrl=https%3A%2F%2Fcooley.zoom.us%2Frec%2Fshare%2FhOCFtaANCsWG_3xljosvCMqfkwnOpb8vWHqiMkBtiVTzxTsiB0NrUbQJcRvP2rtx.yB84hABMVOJGUh2y%3FstartTime%3D1741710613000#/registration">Privacy Litigation webinar</a> and, as always, reach out if you have any questions about how these developments might affect your business.</p>



<p><strong>Authors</strong></p>



<p><a href="https://www.cooley.com/people/bryony-hurst">Bryony Hurst</a>, Partner, London</p>



<p><a href="https://www.cooley.com/people/enrique-gallego-capdevila">Enrique Capdevila</a>, Special Counsel, London</p>



<p></p>
<p>The post <a href="https://cdp.cooley.com/uk-data-privacy-litigation-whats-new/">UK Data Privacy Litigation: What’s New?</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4498</post-id>	</item>
		<item>
		<title>The DOJ’s Bulk Sensitive Personal Data Rule’s Imminent Relevance to Life Sciences Companies</title>
		<link>https://cdp.cooley.com/the-dojs-bulk-sensitive-personal-data-rules-imminent-relevance-to-life-sciences-companies/</link>
		
		<dc:creator><![CDATA[Cooley]]></dc:creator>
		<pubDate>Fri, 04 Apr 2025 12:32:42 +0000</pubDate>
				<category><![CDATA[Compliance, Risk & Strategy]]></category>
		<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4481</guid>

					<description><![CDATA[<p>A new US Department of Justice (DOJ) rule on “Preventing Access to US Sensitive Personal Data and Government-Related Data by Countries of Concern (including China) or Covered Persons” (rule) prohibits and restricts certain covered data transactions that result in the transfer or access to bulk US sensitive personal data by countries of concern or covered [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/the-dojs-bulk-sensitive-personal-data-rules-imminent-relevance-to-life-sciences-companies/">The DOJ’s Bulk Sensitive Personal Data Rule’s Imminent Relevance to Life Sciences Companies</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>A new US Department of Justice (DOJ) rule on “Preventing Access to US Sensitive Personal Data and Government-Related Data by Countries of Concern (including China) or Covered Persons” (rule) prohibits and restricts certain covered data transactions that result in the transfer or access to bulk US sensitive personal data by countries of concern or covered persons. The rule will take effect <strong>April 8, 2025</strong>.</p>



<h2 class="wp-block-heading"><strong>Initial considerations for life sciences companies</strong></h2>



<p>To determine whether data transactions trigger the “bulk” thresholds, the rule aggregates transactions over the preceding 12 months to determine the number of US persons’ data implicated. In other words, it is a rolling assessment of whether a particular transaction crosses the relevant bulk thresholds. Different categories of sensitive personal data are associated with different bulk thresholds.&nbsp;Unlike with privacy-focused laws, the thresholds apply regardless of whether the data is anonymized, key-coded, pseudonymized, de-identified or encrypted, which presents significant challenges for life sciences companies. Of particular relevance for life sciences companies are the following:</p>



<h4 class="wp-block-heading">Potentially relevant bulk thresholds</h4>



<figure class="wp-block-table"><table class="has-fixed-layout"><tbody><tr><td><strong>Sensitive personal data category</strong></td><td><strong>Bulk Threshold</strong></td></tr><tr><td><strong>Human genomic data</strong> (data representing the nucleic acid sequences that constitute the entire set or a subset of the genetic instructions found in a human cell) and/or <strong>biospecimen data</strong> (any quantity of tissue, blood, urine or other human-derived material from which human genomic data could be derived)</td><td>More than 100 US persons</td></tr><tr><td><strong>Human ‘omic data other than genomic data</strong> (e.g., human epigenomic data, human proteomic data and human transcriptomic data)</td><td>More than 1,000 US persons</td></tr><tr><td><strong>Personal health data</strong> (health information that indicates, reveals or describes the past, present or future physical or mental health or condition of an individual; the provision of healthcare to an individual; or the past, present or future payment for the provision of healthcare to an individual)</td><td>More than 10,000 US persons</td></tr></tbody></table></figure>



<h4 class="wp-block-heading">Countries of concern or covered persons</h4>



<p>The rule prohibits or restricts bulk sensitive personal data transactions with countries of concern or covered persons. The rule, while providing for future executive branch flexibility, defines countries of concern to include:</p>



<ul class="wp-block-list">
<li>The People’s Republic of China (including Hong Kong and Macau)</li>



<li>The Republic of Cuba</li>



<li>The Islamic Republic of Iran</li>



<li>The Democratic People’s Republic of North Korea</li>



<li>The Russian Federation</li>



<li>The Bolivarian Republic of Venezuela</li>
</ul>



<p>The rule creates four general categories of covered persons:</p>



<ul class="wp-block-list">
<li>Foreign entities that are 50% or more owned (directly or indirectly) by a country of concern, organized under the laws of a country of concern or have their principal place of business in a country of concern (including, potentially, a foreign subsidiary of a US company).</li>



<li>Foreign entities that are 50% or more owned (directly or indirectly) by a covered person.</li>



<li>Foreign employees or contractors of countries of concern, or of entities that are covered persons.</li>



<li>Foreign individuals primarily resident in countries of concern.</li>
</ul>



<h2 class="wp-block-heading">The rule&#8217;s impacts</h2>



<p>Given the rule’s breadth, its departure from existing US data privacy-focused laws, and significant civil and criminal fines and penalties, life sciences companies potentially within the rule’s scope should consider how to minimize risks associated with “prohibited” and “restricted” transactions.</p>



<h5 class="wp-block-heading">Prohibited transactions</h5>



<p class="has-medium-font-size">In relation to bulk US sensitive personal data, the rule generally prohibits a few types of transactions that may result in foreign access to bulk US sensitive personal data.</p>



<ul class="wp-block-list">
<li><strong>Data brokerage transactions</strong>: The rule prohibits “data brokerage” transactions, which include not only transactions that would typically be thought of as “data brokerage,” i.e., the sale, in exchange for money, of data that was not collected directly from the individual to whom the data relates, but also any other transactions (excluding an employment agreement, investment agreement or a vendor agreement) involving the sale, licensing or similar commercial transactions of bulk sensitive personal data with countries of concern or covered persons.
<ul class="wp-block-list">
<li>To avoid circumvention of this requirement, the rule provides that data brokerage transactions with any other foreign person (i.e., not a covered person) must include a contractual provision requiring the foreign person to refrain from subsequent data brokerage transactions with countries of concern or covered persons.</li>
</ul>
</li>



<li><strong>Human ‘omic data and human biospecimen transactions</strong>: The rule prohibits covered data transactions with a country of concern or covered person that involve access by that country of concern or covered person to bulk US sensitive personal data where such sensitive personal data involves human ‘omic data or human biospecimens from which bulk human ‘omic data could be derived. This second prohibition, absent the potentially relevant exemptions, could significantly impact life sciences companies given the low thresholds for human genomic data or human biospecimens to qualify as “bulk” and the broad definition of “access” under the rule. This prohibition has particular relevance for life sciences companies looking for investments from, or to use vendors or employees in, countries of concern or those who may qualify as covered persons.</li>
</ul>



<h2 class="wp-block-heading">Restricted transactions</h2>



<p>The rule imposes restrictions on (but does not prohibit) covered data transactions involving certain vendor agreements, employment agreements or investment agreements with a country of concern or covered person, unless they involve bulk human ‘omic data or human biospecimens from which such data could be derived.</p>



<p>The rule permits restricted transactions only if the US person complies with Cybersecurity and Infrastructure Security Agency (CISA) security requirements (effective October 6, 2025) and otherwise maintains a data compliance program that, in relevant part, establishes:</p>



<ul class="wp-block-list">
<li>Risk-based procedures for data flows.</li>



<li>Risk-based procedures for vendor identity verification.</li>



<li>An annual certification process of its data compliance program.</li>



<li>An annual certification process of its data security program.</li>
</ul>



<h2 class="wp-block-heading"><strong>Potential exemptions for life sciences data transactions</strong></h2>



<p class="has-medium-font-size">In its background on the rule, the DOJ said it intends to address concerns about the rule’s effects on drug development and biomedical innovation. To that end, the rule exempts certain data transactions from its prohibitions and restrictions, including several exemptions potentially relevant to life sciences companies. These exemptions include:</p>



<ul class="wp-block-list">
<li><strong>Clinical and surveillance exemption</strong>. Data transactions incident to and part of clinical investigations regulated by the FDA, or clinical investigations that support applications to the FDA for research and marketing permits (this includes post-marketing surveillance data, including pharmacovigilance and post-marketing studies for already approved therapies), provided that the clinical data is de-identified or pseudonymized in accordance with applicable FDA regulations.</li>



<li><strong>Regulatory approval exemption</strong>. Data transactions that involve “regulatory approval data,” which are necessary to obtain or maintain regulatory approval to research or market a pharmaceutical product or medical device, provided that such data is de-identified or pseudonymized in accordance with applicable FDA regulations and is required to be submitted to a regulatory entity.</li>



<li><strong>Federally funded research exemption</strong>. Data transactions conducted pursuant to a US grant, contract or other agreement.</li>
</ul>



<p>The breadth of these exemptions remains to be determined as adjudicatory bodies have yet to publicly interpret the rule’s provisions.</p>



<h2 class="wp-block-heading"><strong>Implications for life sciences transactions</strong></h2>



<p>The rule could apply to a variety of transactions involving life sciences companies. Below are just a few examples of scenarios in which life sciences companies (and their data transactions) could be within the rule’s scope, and may or may not fall within the rule’s exemptions:</p>



<ul class="wp-block-list">
<li>License or collaboration agreements between US entities and covered persons during which one of the parties conducts clinical trials in the United States and wants to transfer clinical data and/or biospecimens to a country of concern or covered person.</li>



<li>M&amp;A deals involving covered persons where one or more of the parties conducted clinical trials in the US.</li>



<li>Vendor agreements (such as those with contract research organizations, contract manufacturing organizations or data-hosting providers) and employment agreements in which US sensitive personal data is shared with a country of concern or covered person.</li>



<li>Intra-company sensitive personal data transactions.</li>



<li>Investment agreements with investors who are in a country of concern or are otherwise covered persons.</li>
</ul>



<h2 class="wp-block-heading"><strong>What should life sciences companies do next?</strong></h2>



<p>Given that the rule will soon take effect, life sciences companies should evaluate their exposure to the rule, take advantage of potential rule exemptions and, as appropriate, implement compliance strategies to address their obligations under the rule.</p>



<ul class="wp-block-list">
<li><strong>Determine whether you process bulk US sensitive data</strong>. Evaluate whether the relevant data that you process (collect, transfer or receive) falls within the rule’s scope.</li>



<li><strong>Identify potential covered data transactions</strong>. Undertake a review of any data brokerage, vendor, employment and investment agreements to determine whether the rule may apply to such transactions.</li>



<li><strong>Know your company’s data flows and conduct recipient diligence</strong>. Know to whom and for what purposes you will transfer data/biospecimens and whether the recipient will engage in any further transfers. Conduct “know-your-recipient” diligence to assess whether they fall within the scope of the rule’s definitions of countries of concern or covered persons.</li>



<li><strong>Implement compliance strategies</strong>. Update policies to identify potentially covered data transactions as part of the diligence process and implement and maintain:
<ul class="wp-block-list">
<li>Appropriate contractual protections on data transactions (aligned with good general data hygiene practices).</li>



<li>Internal policies, procedures and measures designed to limit access to data (particularly if personnel are in countries of concern or are otherwise covered persons).</li>



<li>Appropriate security measures for the sensitive personal data.</li>
</ul>
</li>
</ul>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong><a href="https://www.cooley.com/people/michael-egan">Michael Egan</a></strong>, Partner, Washington, DC</p>



<p><strong><a href="https://www.cooley.com/people/daniel-grooms">Daniel Grooms</a></strong>, Partner, Washington, DC</p>



<p><strong><a href="https://www.cooley.com/people/alan-tamarelli">Alan Tamarelli</a>,</strong> Partner, New York</p>



<p><strong><a href="https://www.cooley.com/people/andrew-epstein">Andrew Epstein</a></strong>, Special Counsel, Seattle</p>



<p><strong><a href="https://www.cooley.com/people/carlton-forbes">Carlton Forbes</a></strong>, Special Counsel</p>



<p><strong><a href="https://www.cooley.com/people/navya-dasari">Navya Dasari</a></strong>, Associate, New York</p>



<p><strong><a href="https://www.cooley.com/people/richard-koch">Richard Koch</a></strong>, Associate, Washington, DC</p>
<p>The post <a href="https://cdp.cooley.com/the-dojs-bulk-sensitive-personal-data-rules-imminent-relevance-to-life-sciences-companies/">The DOJ’s Bulk Sensitive Personal Data Rule’s Imminent Relevance to Life Sciences Companies</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4481</post-id>	</item>
		<item>
		<title>Model Contractual Clauses for AI Procurement in the EU: Key Takeaways for AI Companies</title>
		<link>https://cdp.cooley.com/model-contractual-clauses-for-ai-procurement-in-the-eu-key-takeaways-for-ai-companies/</link>
		
		<dc:creator><![CDATA[Georgia Grisaffe]]></dc:creator>
		<pubDate>Thu, 20 Mar 2025 12:20:51 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4475</guid>

					<description><![CDATA[<p>The European Commission (EC) has released an updated version of the Model Contractual Clauses for AI Procurement (MCC-AI), providing further guidance for public-sector buyers navigating AI procurement under the European Union Artificial Intelligence Act (EU AI Act). However, these clauses also serve as a practical tool to help any private organisation meet their legal obligations when providing or procuring AI systems, particularly high-risk AI solutions.</p>
<p>The post <a href="https://cdp.cooley.com/model-contractual-clauses-for-ai-procurement-in-the-eu-key-takeaways-for-ai-companies/">Model Contractual Clauses for AI Procurement in the EU: Key Takeaways for AI Companies</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The European Commission (EC) has released an updated version of the <a href="https://public-buyers-community.ec.europa.eu/communities/procurement-ai/resources/updated-eu-ai-model-contractual-clauses">Model Contractual Clauses for AI Procurement (MCC-AI)</a>, providing further guidance for public-sector buyers navigating AI procurement under the European Union Artificial Intelligence Act (EU AI Act). However, these clauses also serve as a practical tool to help <strong>any private organisation</strong> meet their legal obligations when providing or procuring AI systems, particularly high-risk AI solutions.</p>



<span id="more-4475"></span>



<p><strong>Background</strong></p>



<p>The first version of the MCC-AI was published in September 2023 in anticipation of the EU AI Act, offering a structured approach to AI procurement. With the EU AI Act officially enacted on 13 June 2024, the EC has now refined these model clauses to ensure greater alignment with regulatory requirements. The new publication includes:</p>



<ul class="wp-block-list">
<li>A <strong>full version</strong> for <strong>high-risk AI</strong> systems.</li>



<li>A <strong>light version</strong> for <strong>non-high-risk AI</strong> systems.</li>



<li>A <strong>commentary</strong> explaining how to adapt and implement the clauses.</li>
</ul>



<p><strong>Why should companies get acquainted with the MCC-AI?</strong></p>



<p>The MCC-AI provides a valuable framework for companies procuring or providing AI services by establishing a common, minimum standard of obligations. These clauses help ensure that both parties align on key compliance aspects – such as transparency, risk management and accountability – in line with the EU AI Act.</p>



<p>Organisations incorporating MCC-AI clauses tailored to their needs, contracts and businesses can streamline negotiations, reduce legal uncertainties and demonstrate regulatory readiness.</p>



<p>This is particularly beneficial in an evolving legal landscape where AI governance requirements are still developing, as it helps companies proactively address potential risks and responsibilities.</p>



<p><strong>Who has issued the MCC-AI?</strong></p>



<p>The MCC-AI have been issued via the Public Buyers Community Platform, which is designed to foster collaboration in public procurement across the EU and serves as a dedicated space where European public procurers and the EC can connect, share insights and drive innovation in public purchasing. The clauses should be treated as a working document and do not reflect an official position of the EC.</p>



<p><strong>Who should use the MCC-AI?</strong></p>



<p>The MCC-AI are designed for public-sector organisations procuring AI solutions, but they can be selectively adapted by private entities on a clause-by-clause basis.</p>



<ul class="wp-block-list">
<li>The full version applies to high-risk AI systems as defined in Chapter III of the EU AI Act – AI systems that pose significant risks to health, safety or fundamental rights.</li>



<li>The light version is tailored for non-high-risk AI systems, but still addresses key procurement considerations, such as transparency, risk management and data governance.</li>
</ul>



<p>Even in cases where the AI system poses no clear risks, the MCC-AI commentary suggests that contracting authorities include contractual safeguards around:</p>



<ul class="wp-block-list">
<li>Risk management frameworks</li>



<li>Data governance and usage rights</li>



<li>Technical documentation and audit mechanisms</li>



<li>AI registers for accountability</li>
</ul>



<p><strong>How should the MCC-AI be executed?</strong></p>



<p>The clauses are designed to be annexed to procurement contracts rather than functioning as stand-alone agreements. The MCC-AI includes only provisions specific to AI systems and issues covered by the EU AI Act. It does not address obligations or requirements arising from other applicable legislation. For instance, it does not cover intellectual property, acceptance, payment, delivery deadlines, applicable law or liability.</p>



<p><strong>What do the MCC-AI cover?</strong></p>



<p>The MCC-AI are structured around key legal and operational obligations, including:</p>



<ul class="wp-block-list">
<li><strong>AI system requirements:</strong> Ensuring compliance with fundamental legal and ethical standards.</li>



<li><strong>Supplier obligations:</strong> Defining transparency, risk management and compliance expectations.</li>



<li><strong>Data governance:</strong> Establishing rights over data sets used in AI development.</li>



<li><strong>Audit and accountability:</strong> Setting up mechanisms for AI system monitoring.</li>



<li><strong>Costs and liabilities:</strong> Clarifying financial responsibilities for implementation and compliance.</li>
</ul>



<p>Additionally, the annexes provide templates for describing AI system use cases, defining data governance frameworks and documenting compliance measures.</p>



<p><strong>What are the differences between the European Commission standard contractual clauses and the MCC-AI?</strong></p>



<p>The EU standard contractual clauses (SCCs) are legally binding contract templates issued by the EC to ensure that personal data transferred outside the European Economic Area (EEA) complies with the General Data Protection Regulation (GDPR). They impose specific data protection obligations on the parties involved.</p>



<p>The table below outlines the key differences between model contractual clauses (MCCs) and SCCs for data transfers. Although they serve different purposes, they may be included in the same agreement:</p>



<figure class="wp-block-table"><table class="has-fixed-layout"><tbody><tr><td><strong>Criteria</strong></td><td><strong>Model contractual clauses (MCCs)</strong></td><td><strong>Standard contractual clauses (SCCs)</strong></td></tr><tr><td><strong>Purpose</strong></td><td>Provide a contractual framework for industry-specific regulations, such as AI governance</td><td>Ensure GDPR compliance for international data transfers</td></tr><tr><td><strong>Legal basis</strong></td><td>Based on industry best practices or regulatory guidance (e.g., EU AI Act)</td><td>Required under Article 46 of the GDPR for data transfers outside the EEA</td></tr><tr><td><strong>Mandatory use</strong></td><td>Optional, used as guidance or as an annex to an existing contract</td><td>Mandatory for data transfers to third countries without an adequacy decision</td></tr><tr><td><strong>Regulatory scope</strong></td><td>Covers obligations related to the procurement of AI services</td><td>Exclusively focuses on personal data protection and GDPR compliance</td></tr><tr><td><strong>Applicability</strong></td><td>Can be used in various industries (e.g., AI contracts, software agreements, provision of AI-powered solutions)</td><td>Applies only to cross-border personal data transfers outside the EEA</td></tr><tr><td><strong>Enforceability</strong></td><td>Only binding if included in a contract between parties</td><td>Legally binding under the GDPR when used for data transfers</td></tr><tr><td><strong>Key provisions</strong></td><td>Covers AI ethics, liability, transparency and compliance</td><td>Covers data security, third-party obligations, audit rights and data subject rights</td></tr><tr><td><strong>Flexibility</strong></td><td>Can be customized or supplemented by other contract terms</td><td>Must be used as-is, with limited modifications allowed</td></tr><tr><td><strong>Annexed to contracts?</strong></td><td>Yes, typically annexed to broader agreements</td><td>Yes, attached to contracts governing data transfers</td></tr></tbody></table></figure>



<p><strong>Key takeaways</strong></p>



<p>For organisations providing AI systems, tailoring the MCC-AI to their business enhances credibility and trust with customers by showing a commitment to responsible AI practices.</p>



<p>For buyers, these clauses offer a baseline level of protection, ensuring that the procured AI solutions meet essential ethical and legal standards. Additionally, since the MCC-AI can be annexed to existing agreements, they provide flexibility while maintaining consistency across contracts. This not only facilitates smoother transactions but also minimizes disputes, as both parties operate under a shared understanding of AI-related obligations from the outset.</p>



<p>For further insights on AI contracting and compliance, please reach out to your Cooley team.</p>



<p><strong>Authors </strong></p>



<p><a href="https://www.cooley.com/people/patrick-van-eecke"><strong>Patrick Van Eecke</strong></a>, Partner, Brussels</p>



<p><a href="https://www.cooley.com/people/enrique-gallego-capdevila"><strong>Enrique Capdevila</strong></a>, Special Counsel, Brussels</p>
<p>The post <a href="https://cdp.cooley.com/model-contractual-clauses-for-ai-procurement-in-the-eu-key-takeaways-for-ai-companies/">Model Contractual Clauses for AI Procurement in the EU: Key Takeaways for AI Companies</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4475</post-id>	</item>
		<item>
		<title>ICO Releases ‘Consent or Pay’ Guidance</title>
		<link>https://cdp.cooley.com/icoico-releases-consent-or-pay-guidance/</link>
		
		<dc:creator><![CDATA[Cooley]]></dc:creator>
		<pubDate>Fri, 21 Feb 2025 18:25:13 +0000</pubDate>
				<category><![CDATA[Compliance, Risk & Strategy]]></category>
		<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4464</guid>

					<description><![CDATA[<p>What happened? The UK Information Commissioner’s Office (ICO) has released updated guidance on ‘consent or pay’ business models. These models present users with a choice to either consent to the processing of their personal data for purposes like personalised advertising in return for access to a product or service, or pay a fee to access [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/icoico-releases-consent-or-pay-guidance/">ICO Releases ‘Consent or Pay’ Guidance</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><strong>What happened?</strong></p>



<p>The UK Information Commissioner’s Office (ICO) has released <a href="https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/online-tracking/consent-or-pay/">updated guidance on ‘consent or pay’ business models</a>. These models present users with a choice to either consent to the processing of their personal data for purposes like personalised advertising in return for access to a product or service, <strong>or</strong> pay a fee to access the product or service without personalised ads.</p>



<p>For many online services, the consent or pay business model provides an important way of monetising their product or service, generating essential revenue streams. However, there has been uncertainty about whether companies could obtain valid consent from users through these models under UK data protection laws – and, consequently, whether they could establish a legal basis for the processing of personal data for personalised ads.</p>



<p>The ICO&#8217;s guidance therefore aims to help companies navigate the complex intersection between UK data protection laws and online monetisation. It shows that companies may be able to operate a consent or pay business model in compliance with applicable UK data protection laws; however, some types of companies (such as large social media platforms) may struggle to satisfy the necessary criteria without offering a third option, such as contextual advertisements.</p>



<p><strong>What does the guidance say?</strong></p>



<p>In order to operate a consent or pay business model, companies must assess whether they can demonstrate that their users’ consent is ‘freely given’. The standard for freely given consent is set out in the UK General Data Protection Regulation (GDPR). In the context of consent or pay business models, freely given consent means that users must have a genuine, voluntary choice to consent (or refuse to consent) to personalised ads. If users feel compelled to provide their consent, it will be invalid.</p>



<p>This means that before companies implement a consent or pay model, they must conduct a data protection impact assessment (DPIA) to:</p>



<ul class="wp-block-list">
<li>Assess the validity of consent.</li>



<li>Identify any risks.</li>



<li>Take necessary steps to mitigate risk or bring the model into compliance.</li>
</ul>



<p>The guidance sets out various issues to consider in the DPIA, such as:</p>



<figure class="wp-block-table"><table class="has-fixed-layout"><tbody><tr><td><strong>Issue</strong></td><td><strong>Action</strong></td></tr><tr><td><strong>Power imbalance between users and service providers</strong>: This can arise from a variety of factors that influence the relationship between a service provider and its users. For example, a power imbalance could occur if a social media user:<br>&#8211; Spends time building a social media profile.<br>&#8211; Relies on that social media network to connect with family and friends.</td><td>Services should consider providing an alternative option – such as ‘consent to contextual ads’ – whereby advertising is targeted based on the content of the page that the user is currently viewing rather than their behavioural profile history. Users who choose this option should be allowed to access the core product or service without being required to consent to personalised ads or paying to avoid personalised ads.</td></tr><tr><td><strong>Inappropriately high fees for the paid option</strong>: This relates to the amount of money that people can pay while freely providing their consent. For example, a service might be charging an inappropriate fee if the fee for the ‘paid’ option is so high that users feel they can only afford the ‘consent’ option.</td><td>Services should consider their pricing structure and keep their company’s specific context in mind when setting their fees, such as the company’s:<br><br>&#8211; Size<br>&#8211; Market position<br>&#8211; Nature of processing<br><br>As above, providing an additional option, such as contextual ads, could be an effective mitigation strategy.</td></tr><tr><td><strong>Lack of equivalent core services between consenting and paying users</strong>: Services do not have to be identical but should be broadly the same under both the ‘consent’ and ‘pay’ options. If a service offers ‘paid’ users a materially worse or completely different core service from ‘consenting’ users, it may not be able to demonstrate equivalence.<br><br>For example, a social media company could meet this requirement if it allows users who choose contextual ads to access core features, such as the ability to post information and connect with family and friends, but not extra features, such as photo editing or avatars.</td><td>Assess the quality of the services you offer, including functionality, features, content, personalisation and user control over personal data.<br><br>Ensure that at least one other option:<br>&#8211; Provides the core product or service.<br>&#8211; Does not require consent to personalised ads.<br>&#8211; Does not unnecessarily reduce the overall product or service quality.<br>&#8211; Does not have an inappropriately high fee.<br><br>Keep your assessment under review over time to ensure equivalence is maintained as the core product develops.</td></tr></tbody></table></figure>



<p><strong>What should companies do?</strong></p>



<p>To avoid enquiries from the ICO or complaints from UK individuals about their consent or pay business models, companies subject to UK data protection law should:</p>



<ul class="wp-block-list">
<li>Conduct a DPIA to review current practices and compare them against the ICO’s guidance.</li>



<li>If the DPIA identifies any compliance gaps or risks in relation to the company’s model, take any necessary steps to mitigate or resolve such gaps and risks. This may require offering an alternative option, such as contextual advertising.</li>



<li>Keep the consent or pay model under regular review as the company’s product or service develops over time.</li>
</ul>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Authors</strong></p>



<p><a href="https://www.cooley.com/people/ann-bevitt"><strong>Ann Bevitt</strong></a>, Partner, London</p>



<p><strong><a href="https://www.cooley.com/people/morgan-mccormack">Morgan McCormack</a></strong>, Associate, London</p>



<p>The post <a href="https://cdp.cooley.com/icoico-releases-consent-or-pay-guidance/">ICO Releases ‘Consent or Pay’ Guidance</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4464</post-id>	</item>
		<item>
		<title>AI Talks: Understanding the EU AI Act – AI Literacy Obligations and Prohibited Practices</title>
		<link>https://cdp.cooley.com/ai-talks-understanding-the-eu-ai-act-ai-literacy-obligations-and-prohibited-practices/</link>
		
		<dc:creator><![CDATA[Elena Potan]]></dc:creator>
		<pubDate>Wed, 19 Feb 2025 09:50:55 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4459</guid>

					<description><![CDATA[<p>Welcome to our latest blog post, where we present the key insights from our first webinar of the series, “AI Talks: Understanding the EU AI Act.” This virtual series is designed to help businesses navigate the complexities of the European Union’s Artificial Intelligence Act (EU AI Act), which is set to revolutionize the regulatory landscape [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/ai-talks-understanding-the-eu-ai-act-ai-literacy-obligations-and-prohibited-practices/">AI Talks: Understanding the EU AI Act – AI Literacy Obligations and Prohibited Practices</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Welcome to our latest blog post, where we present the key insights from our first webinar of the series, “<a href="https://www.cooley.com/events/2025/2025-01-30-ai-talks-understanding-the-eu-ai-act--what-it-means-for-companies-worldwide">AI Talks: Understanding the EU AI Act.</a>” This virtual series is designed to help businesses navigate the complexities of the European Union’s Artificial Intelligence Act (EU AI Act), which is set to revolutionize the regulatory landscape for artificial intelligence (AI) in Europe.</p>



<p>As the first comprehensive AI law of its kind, the EU AI Act introduces strict compliance requirements, including AI literacy obligations, prohibited uses of AI systems and a phased rollout of enforcement mechanisms. Here&#8217;s what businesses need to know to stay ahead of the curve.</p>



<h3 class="wp-block-heading"><strong>1. The AI Act at a glance</strong></h3>



<p>The EU AI Act aims to establish a harmonized framework for AI systems, categorizing them based on risk levels. Notably, the EU AI Act introduces:</p>



<ul class="wp-block-list">
<li><strong>A broad scope</strong> covering providers, deployers, manufacturers, importers and distributors of AI systems, even if they are based outside of the EU but operate within its market. Check our blog post, ‘<a href="https://cdp.cooley.com/eu-ai-act-does-it-affect-your-organization-or-not/">EU AI Act: Does It Affect Your Organization or Not?</a>’ to learn more about your role and responsibilities under the EU AI Act.</li>



<li><strong>A definition</strong> for AI systems.</li>



<li><strong>A phased enforcement schedule</strong>, with certain obligations taking effect immediately.</li>



<li><strong>Severe penalties for noncompliance</strong>, potentially exceeding those imposed under the General Data Protection Regulation (GDPR).</li>
</ul>



<h3 class="wp-block-heading"><strong>What is an AI system under the EU AI Act?</strong></h3>



<p>The EU AI Act does not apply to all AI systems but specifically targets those that meet the definition of an ‘AI system’ as outlined in Article 3(1) of the AI Act: ‘“AI system” means a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments’.</p>



<p>To clarify the scope of this definition, the European Commission (EC) has issued <a href="https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-ai-system-definition-facilitate-first-ai-acts-rules-application">guidelines on the definition of an AI system</a>, which aim to assist providers and other relevant persons – including market and institutional stakeholders – in determining whether a system constitutes an AI system within the meaning of the EU AI Act.</p>



<p>The EC acknowledges in the guidelines that, due to the rapidly evolving nature of AI technology, it is impossible to provide an exhaustive list of systems that either fall within or outside the definition of an AI system. This may pose challenges for companies when assessing whether their systems are within scope of the EU AI Act and understanding their obligations under it.</p>



<p>While the guidelines provide valuable insight, they are nonbinding, and any definitive interpretation of the EU AI Act will be determined by the Court of Justice of the European Union (CJEU).</p>



<h3 class="wp-block-heading"><strong>2. Immediate compliance obligations as of 2 February 2025</strong></h3>



<h3 class="wp-block-heading"><strong>a. AI literacy (Article 4)</strong></h3>



<h3 class="wp-block-heading"><strong>i. Concept</strong></h3>



<p>Article 4 of the EU AI Act requires providers and deployers of AI systems to ensure a sufficient level of AI literacy for their staff and any other users who are interacting with AI systems.</p>



<p>The EU AI Act does not specify how AI literacy should be achieved by providers and deployers. However, to support this requirement, the EU AI Office has created a living repository of ongoing practices among AI system providers and deployers. The purpose of the repository is to share examples of AI literacy initiatives that organizations are implementing. These practices can help others understand how to build their own strategies for meeting the literacy obligations in Article 4.</p>



<p>The AI literacy repository compiles practices gathered through a survey shared exclusively with <a href="https://digital-strategy.ec.europa.eu/en/policies/ai-pact">AI Pact</a> pledgers, at least for now. As such, the list of practices provided is not exhaustive and will be updated regularly.</p>



<p>The EU AI Office notes that adopting the practices listed in the repository does not automatically guarantee compliance with Article 4 of the AI Act.</p>



<h3 class="wp-block-heading"><strong>ii. What should you do now to prepare for compliance with your literacy obligations?</strong></h3>



<ul class="wp-block-list">
<li><strong>Evaluate current practices</strong>: Assess the current AI literacy levels within your organization. Are your staff and users equipped with the right knowledge and training?</li>



<li><strong>Leverage the repository of AI practices</strong>: Consult the living repository to find practices that could be relevant to your organization. Even if your AI literacy efforts are still in the planning phase, it is useful to learn from others.</li>



<li><strong>Assess whether you should engage in the AI Pact</strong>.</li>



<li><strong>Regularly update practices</strong>: Stay updated with evolving AI literacy standards, be proactive in improving your organization’s literacy efforts and make sure you document them to demonstrate compliance.</li>
</ul>



<h3 class="wp-block-heading"><strong>b. Prohibited AI practices</strong></h3>



<h3 class="wp-block-heading"><strong>i. Concept</strong></h3>



<p>Article 5 of the EU AI Act prohibits the placing on the EU market, putting into service or use of certain AI systems which pose an ‘unacceptable risk’, including AI systems used for the following purposes:</p>



<ul class="wp-block-list">
<li>Harmful AI-based manipulation and deception.</li>



<li>Harmful AI-based exploitation of vulnerabilities.</li>



<li>Social scoring.</li>



<li>Individual criminal offence risk assessment or prediction.</li>



<li>Untargeted scraping of the internet or CCTV material to create or expand facial recognition databases.</li>



<li>Emotion recognition in workplaces and education institutions.</li>



<li>Biometric categorisation to deduce certain protected characteristics.</li>



<li>Real-time remote biometric identification for law enforcement purposes in publicly accessible spaces.</li>
</ul>



<p>These prohibitions took effect on 2 February 2025, six months after the EU AI Act entered into force.</p>



<p>The EC recently adopted the <a href="https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-prohibited-artificial-intelligence-ai-practices-defined-ai-act">guidelines on the practical implementation of the practices prohibited under Article 5 of the AI Act</a>. These guidelines aim to enhance legal clarity and offer insights into the EC’s interpretation of the prohibitions in Article 5 of the EU AI Act, ensuring their consistent, effective and uniform application. They are intended to serve as practical guidance for competent authorities enforcing the AI Act, as well as for providers and deployers of AI systems to help ensure compliance with their obligations. The guidelines also provide some clarity around exclusions from the scope of the EU AI Act – such as national security, defence and military purpose, judicial and law enforcement cooperation with third countries, research and development, or personal nonprofessional activity.</p>



<p>It is important to note that these guidelines are nonbinding. Any definitive interpretation of the AI Act can only be provided by the CJEU.</p>



<h3 class="wp-block-heading"><strong>c. Is your compliance documentation covered by legal privilege?</strong></h3>



<p>The EU AI Act grants regulators broad investigative powers, including access to AI risk assessments and compliance documentation. However, legal privilege can protect certain internal communications from disclosure. Businesses should consider consulting legal counsel to ensure compliance strategies remain privileged where applicable.</p>



<h3 class="wp-block-heading"><strong>3. Next steps for businesses</strong></h3>



<p>For businesses with a global presence, the guidance issued by the EC is a crucial resource for managing the complexities of complying with the EU AI Act. Below you will find some recommended practices to enhance your compliance strategy:</p>



<ul class="wp-block-list">
<li><strong>Assess whether your AI systems fall under the EU AI Act’s scope.</strong></li>



<li><strong>Identify whether your company is a provider, deployer or both under the EU AI Act.</strong></li>



<li><strong>Review and mitigate risks associated with AI deployments.</strong></li>



<li><strong>Ensure compliance with AI literacy obligations through staff training.</strong></li>



<li><strong>Consult legal counsel to ensure your AI compliance strategies remain privileged.</strong></li>



<li><strong>Monitor regulatory updates, including forthcoming EU guidelines.</strong></li>
</ul>



<p>With the EU AI Act’s phased implementation, now is the time for businesses to align their AI strategies with regulatory expectations. Compliance today will help organizations mitigate risks and avoid significant penalties in the future.</p>



<p>For further legal insights on AI compliance, please do not hesitate to contact us.</p>



<p><strong>Authors</strong></p>



<p><strong><a href="https://www.cooley.com/people/patrick-van-eecke">Patrick Van Eecke</a></strong>, Partner, Brussels</p>



<p><strong><a href="https://www.cooley.com/people/enrique-gallego-capdevila">Enrique Gallego Capdevila</a></strong>, Special Counsel, Brussels</p>



<p>The post <a href="https://cdp.cooley.com/ai-talks-understanding-the-eu-ai-act-ai-literacy-obligations-and-prohibited-practices/">AI Talks: Understanding the EU AI Act – AI Literacy Obligations and Prohibited Practices</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4459</post-id>	</item>
		<item>
		<title>ICO Updates Position on Web-Scraping for AI Development</title>
		<link>https://cdp.cooley.com/ico-updates-position-on-web-scraping-for-ai-development/</link>
		
		<dc:creator><![CDATA[Laura Kemp]]></dc:creator>
		<pubDate>Mon, 23 Dec 2024 17:18:08 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4448</guid>

					<description><![CDATA[<p>What happened? In an attempt to address ongoing regulatory uncertainty about how the UK General Data Protection Regulation (UK GDPR) and UK Data Protection Act 2018 apply to the development and use of generative artificial intelligence (AI), the UK Information Commissioner’s Office (ICO) has released its initial response to its five-part consultation series on the [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/ico-updates-position-on-web-scraping-for-ai-development/">ICO Updates Position on Web-Scraping for AI Development</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><strong>What happened?</strong></p>



<p>In an attempt to address ongoing regulatory uncertainty about how the UK General Data Protection Regulation (UK GDPR) and UK Data Protection Act 2018 apply to the development and use of generative artificial intelligence (AI), the UK Information Commissioner’s Office (ICO) has <a href="https://ico.org.uk/media/about-the-ico/what-we-do/our-work-on-artificial-intelligence/response-to-the-consultation-series-on-generative-ai-0-0.pdf" target="_blank" rel="noreferrer noopener">released its initial response</a> to its <a href="https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/ico-consultation-series-on-generative-ai-and-data-protection/" target="_blank" rel="noreferrer noopener">five-part consultation series</a> on the topic, which it conducted in 2024. The series covered the following areas:</p>



<span id="more-4448"></span>



<ol class="wp-block-list">
<li>The legal basis for web-scraping to train generative AI models.<br></li>



<li>Purpose limitation – i.e., having a specified, explicit and legitimate purpose – throughout the generative AI life cycle.<br></li>



<li>Accuracy of training data and model outputs.<br></li>



<li>Respecting individual rights in the training and fine-tuning of generative AI models.<br></li>



<li>Allocating controllership across the generative AI supply chain.</li>
</ol>



<p><strong>What has changed?</strong></p>



<p>In its initial response, the ICO updated its seemingly highly permissive position on ‘legitimate interests’ as a legal basis for web-scraping under the UK GDPR. In <a href="https://ico.org.uk/about-the-ico/what-we-do/our-work-on-artificial-intelligence/generative-ai-first-call-for-evidence/" target="_blank" rel="noreferrer noopener">previous draft guidance</a>, the ICO’s position was that data controllers could rely on legitimate interests as a legal basis for training AI models on web-scraped data, provided that they could pass the ‘three-part’ test by demonstrating:</p>



<ol class="wp-block-list">
<li>The purpose of the processing is legitimate.<br></li>



<li>The processing is necessary for the purpose (the ‘<strong>necessity test</strong>’).<br></li>



<li>The individual’s interests do not override the developer’s interests being pursued (the ‘<strong>balancing test</strong>’).</li>
</ol>



<p>Following consultation on the initial draft guidance, the ICO has refined its position somewhat – and highlighted some specific considerations which organisations need to bear in mind if they want to rely on legitimate interests for training on web-scraped data.</p>



<p><strong>Increase transparency</strong></p>



<ul class="wp-block-list">
<li>The ICO says that web-scraping often occurs without people being aware of it (so-called ‘invisible processing’).<br></li>



<li>The ICO feels that this type of ‘invisible processing’ creates challenges for the purposes of the balancing test – primarily because where people are unaware that their data is being processed, they are unable to exercise their rights under the UK GDPR.<br></li>



<li>Under its updated position, the ICO now expects generative AI developers to significantly improve their approach to transparency.</li>
</ul>



<p><strong>Making sure scraping is necessary</strong></p>



<ul class="wp-block-list">
<li>The ICO questioned whether AI developers really need to use web-scraping to collect training data – i.e., whether they can satisfy the necessity test – when alternative methods of data collection exist.<br></li>



<li>For example, the ICO seems to believe that developers could effectively train models by licensing personal data from organisations that specialise in collecting such data in a transparent way and in accordance with the UK GDPR.<br></li>



<li>It is not clear on what basis the ICO reached the conclusion that the relatively limited personal data available via the nascent training data licensing market would be sufficient for the purposes of training latest-generation models. It remains to be seen whether AI model developers would share this view &#8230;</li>
</ul>



<p><strong>Recommendations</strong></p>



<p>Given this emerging line of thinking from the ICO, AI model developers relying on legitimate interests as their legal basis for web-scraping under the UK GDPR must assess (and document such assessment):</p>



<ul class="wp-block-list">
<li>Whether they really need to use web-scraping to collect personal data for development of their AI model, or whether an alternative approach (such as licensing personal data) could realistically satisfy their needs.<br></li>



<li>How they intend to address their transparency obligations under the UK GDPR – taking appropriate steps to increase transparency and controls for data subjects will be a key part of getting this right.</li>
</ul>



<p><strong>What happens next?</strong></p>



<p>The ICO’s initial response informs its <a href="https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/artificial-intelligence/guidance-on-ai-and-data-protection/" target="_blank" rel="noreferrer noopener">current core guidance</a> on AI and data protection. This guidance is expected to be formally updated once the UK’s data protection reform legislation – the Data (Use and Access) Bill – is passed into law. This is expected to happen around Easter 2025.</p>



<p class="has-medium-font-size"><strong>Authors</strong></p>



<p><strong><a href="https://www.cooley.com/people/leo-spicerphelps" target="_blank" rel="noreferrer noopener">Leo Spicer-Phelps</a></strong>, Associate, London</p>



<p><strong><a href="https://www.cooley.com/people/morgan-mccormack" target="_blank" rel="noreferrer noopener">Morgan McCormack</a></strong>, Associate, London</p>



<p></p>
<p>The post <a href="https://cdp.cooley.com/ico-updates-position-on-web-scraping-for-ai-development/">ICO Updates Position on Web-Scraping for AI Development</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4448</post-id>	</item>
		<item>
		<title>Guidelines 02/2024 on Article 48 of the GDPR: EDPB Clarifies Rules for Data Sharing With Third-Country Authorities</title>
		<link>https://cdp.cooley.com/guidelines-02-2024-on-article-48-of-the-gdpr-edpb-clarifies-rules-for-data-sharing-with-third-country-authorities/</link>
		
		<dc:creator><![CDATA[Elena Potan]]></dc:creator>
		<pubDate>Tue, 10 Dec 2024 09:28:09 +0000</pubDate>
				<category><![CDATA[Policy & Legislation]]></category>
		<guid isPermaLink="false">https://cdp.cooley.com/?p=4441</guid>

					<description><![CDATA[<p>In the ever-evolving landscape of data protection and privacy, the General Data Protection Regulation (GDPR) stands as the most significant legislative framework for processing personal data. Known for its extraterritorial reach, the GDPR sets out the rules for transferring personal data from private organizations established in the European Economic Area (EEA) to authorities in third [&#8230;]</p>
<p>The post <a href="https://cdp.cooley.com/guidelines-02-2024-on-article-48-of-the-gdpr-edpb-clarifies-rules-for-data-sharing-with-third-country-authorities/">Guidelines 02/2024 on Article 48 of the GDPR: EDPB Clarifies Rules for Data Sharing With Third-Country Authorities</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In the ever-evolving landscape of data protection and privacy, the General Data Protection Regulation (GDPR) stands as the most significant legislative framework for processing personal data. Known for its extraterritorial reach, the GDPR sets out the rules for transferring personal data from private organizations established in the European Economic Area (EEA) to authorities in third countries.</p>



<p>Private organizations in the EEA have long faced challenges in managing such requests while ensuring compliance with the GDPR. To address these issues, the European Data Protection Board (EDPB) issued Guidelines 02/2024 on 2 December 2024, offering clarity on the interpretation and application of Article 48. These guidelines, open for public consultation until 27 January 2025, are essential for businesses, legal professionals and privacy experts navigating cross-border data transfers under the GDPR.</p>



<p><strong>Background on Article 48 of the GDPR</strong></p>



<p>Article 48 of the GDPR states that ‘any judgment of a court or tribunal and any decision of an administrative authority of a third country requiring a controller or processor to transfer or disclose personal data may only be recognized or enforceable in any manner if based on an international agreement, such as a mutual legal assistance treaty, in force between the requesting third country and the Union or a Member State, without prejudice to other grounds for transfer pursuant to this Chapter’.</p>



<p>This provision restricts the transfer of personal data to third countries that may not comply with GDPR standards: even where a court decision or administrative order requires a controller or processor established in the EEA to disclose personal data, the transfer may not take place unless certain conditions are met.</p>



<p>Private organizations established in the EEA face a dilemma between complying with third-country requests – often related to national security or law enforcement orders – and adhering to the requirements of the GDPR.</p>



<p><strong>Scope of the Guidelines 02/2024</strong></p>



<p>These guidelines focus on requests for direct cooperation between a third-country public authority and a private entity in the EEA, rather than scenarios where personal data is exchanged directly between public authorities in the EEA and third countries (for example, under mutual legal assistance treaties). These requests may come from a wide range of public authorities, including those regulating the private sector, such as banking regulators and tax authorities, as well as law enforcement and national security agencies.</p>



<p>The guidelines specifically address situations where such requests are directed to controllers or processors in the EEA, whose processing activities are subject to Article 3(1) of the GDPR.</p>



<p><strong>‘Two-step test’ must be fulfilled when responding to a third-country public authority request</strong></p>



<p>The EDPB recalls in these guidelines that the ‘two-step test’ must be applied when transferring personal data to third-country public authorities. First, there must be a legal basis for the processing under Article 6, and all other relevant provisions of the GDPR must be observed. Second, the conditions of Chapter V (‘Transfers of personal data to third countries or international organizations’) must be complied with.</p>



<p><strong>1. Identification of a legal basis under Article 6 of the GDPR</strong></p>



<ul class="wp-block-list">
<li><strong>Compliance with a legal obligation (Article 6(1)(c)):</strong> Article 48 contemplates a situation where a court ruling, tribunal decision or administrative order from a third-country authority requires a controller or processor in the EEA to transfer personal data based on an international agreement, which could establish the request as a legal obligation with potential legal consequences for noncompliance. If the processing of personal data is necessary to fulfill a legal obligation, Article 6(1)(c) of the GDPR provides a clear legal basis for the transfer.</li>
</ul>



<p>In a scenario where there is no legal obligation arising from an international agreement for the EEA organization, the use of other legal bases must be assessed on a case-by-case basis.</p>



<ul class="wp-block-list">
<li><strong>Consent (Article 6(1)(a)):</strong> In principle, a data subject’s consent could serve as a legal basis for transferring data to a third-country authority. However, the EDPB considers that relying on consent in this context would generally be inappropriate, as the data transfer is linked to the exercise of authoritative powers, creating an imbalance between the parties.</li>
</ul>



<ul class="wp-block-list">
<li><strong>Vital interests (Article 6(1)(d)):</strong> The EDPB acknowledges that in certain established situations, the vital interests of the data subject could be used as a legal basis for a data transfer triggered by a third-country request, provided that the conditions outlined in international law are met. Regarding the vital interests of other individuals, the EDPB emphasizes that the processing of personal data based on the vital interests of another person should – in principle – occur only when it cannot clearly be justified by another legal basis.</li>
</ul>



<ul class="wp-block-list">
<li><strong>Legitimate interests (Article 6(1)(f)): </strong>The EDPB recalls that any processing based on the legitimate interests of the controller or third parties must be necessary and balanced against the interests, fundamental rights and freedoms of the data subject. The outcome of this balancing test, which is subject to an individual assessment, determines whether the legitimate interest legal basis can be relied upon for transferring personal data to a third-country authority. The EDPB states that, although a controller may, in some cases, have a legitimate interest in complying with a request to disclose personal data to a third-country authority, a private business acting as a controller cannot rely on Article 6(1)(f) for the collection and storage of personal data in a preventive manner.</li>
</ul>



<p>Furthermore, these guidelines recall that the EDPB has previously held that, in certain situations, the interests or fundamental rights and freedoms of the data subject would override the controller’s interest in adhering to a third-country law enforcement authority’s request to avoid sanctions for noncompliance.</p>



<p><strong>2. Compliance with Chapter V of the GDPR</strong></p>



<p>Article 48 itself contains no data protection safeguards for data transfers; rather, it clarifies that decisions or judgments from third-country authorities cannot be directly recognized or enforced in the EEA unless an international agreement provides for this.</p>



<p>Therefore, before responding to a request from a third-country public authority falling under Article 48 of the GDPR, the controller or processor in the EEA must identify an applicable ground for the transfer.</p>



<p>If an international agreement governs cooperation between an EEA controller or processor and a third-country public authority, it can serve as a legal basis for a data transfer, provided that the agreement includes appropriate safeguards in accordance with Article 46(2)(a).</p>



<p>If there is no international agreement that would bind the EEA controller or processor and the third-country public authority, or the agreement does not provide adequate safeguards for the transfer, the transfer must be based on another ground for transfer under Chapter V – e.g., standard contractual clauses or binding corporate rules – or rely on any of the derogations of Article 49, such as when necessary for important public interest reasons or for the establishment, exercise or defense of legal claims. However, as the EDPB has previously stated, the derogations in Article 49 must be narrowly interpreted and are primarily intended for occasional, nonrepetitive processing activities.</p>



<p><strong>Impact on businesses with an entity in the EEA</strong></p>



<p>For businesses operating globally, the guidance set out in Guidelines 02/2024 is an essential tool for navigating the complexities of handling third-country requests. Companies must remain vigilant in assessing the legal environments of the countries to which they transfer data, ensuring that EEA-established controllers and processors rely on an adequate legal basis and on a ground for transfer under Chapter V when responding to requests from third-country authorities in compliance with the GDPR.</p>



<p>If your organization requires skilled advice and support on how to handle third-country public requests, such as subpoenas, please do not hesitate to contact us.</p>



<p><strong>Authors</strong></p>



<p><strong><a href="https://www.cooley.com/people/patrick-van-eecke">Patrick Van Eecke</a></strong>, Partner, Brussels</p>



<p><strong><a href="https://www.cooley.com/people/enrique-gallego-capdevila">Enrique Gallego Capdevila</a></strong>, Special Counsel, Brussels</p>



<p></p>
<p>The post <a href="https://cdp.cooley.com/guidelines-02-2024-on-article-48-of-the-gdpr-edpb-clarifies-rules-for-data-sharing-with-third-country-authorities/">Guidelines 02/2024 on Article 48 of the GDPR: EDPB Clarifies Rules for Data Sharing With Third-Country Authorities</a> appeared first on <a href="https://cdp.cooley.com">cyber/data/privacy insights</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4441</post-id>	</item>
	</channel>
</rss>
