
What 2025 Taught Us About Data Privacy and How to Mitigate Risk in 2026

By: Lauretta Otoo, Edited by: Ambur C. Smith, Esq.


Privacy is often treated as a policy, a banner pop-up or a clause in a vendor contract. In day-to-day operations, it shows up in far more practical places: how an email list is built, what a donation platform collects, whether a CRM is configured correctly, which analytics tools are turned on by default and what happens to personal data when teams adopt artificial intelligence (AI) tools to write, automate or scale.


That reality became harder to ignore in 2025. Regulators tightened expectations, enforcement actions continued to set benchmarks and new rules and rulemaking made it clear that “privacy compliance” increasingly means being able to explain how your systems work, how you use data and why your practices are reasonable for the risks involved.


For creatives, startups and nonprofit organizations with lean teams, limited budgets or fast-moving operations, the goal heading into 2026 is not perfection but control, clarity and defensible decision-making. Understand your data flows, reduce unnecessary collection, tighten vendor and tool use and communicate plainly about what you do.


Key Data Privacy Developments in 2025

1. Increased Regulatory Activity and Enforcement


AI governance moved closer to privacy compliance: One of the most significant developments was the EU’s Artificial Intelligence Act, which continues its phased implementation and requires organizations operating in or serving the EU to align data governance and transparency practices with its risk categories and its record-keeping and documentation requirements. The bulk of the AI Act becomes applicable in August 2026, with certain provisions taking effect earlier as part of its staged rollout.

Check out our blog recapping our visit to the UK and live discussions surrounding AI governance on the European continent: All Rights Reserved: Protecting IP in the AI Era (UK Edition) 


State privacy laws kept expanding across the US: A patchwork of requirements is now the practical reality for many organizations, especially those collecting data through websites, mailing lists, membership programs, donation platforms, mobile apps and online communities. For example, the Maryland Online Data Privacy Act (MODPA) went into effect on October 1, 2025, and enforcement begins April 1, 2026. MODPA establishes consumer rights and business obligations related to personal data processing, including requirements for data protection assessments and expanded consumer rights to opt out, delete, correct and know.


California’s latest CCPA regulations have moved beyond “notices and rights” into governance: The California Privacy Protection Agency completed rulemaking on updates to existing CCPA regulations and new requirements covering cybersecurity audits, risk assessments and automated decision-making technology. The regulations take effect January 1, 2026, with additional time built into compliance timelines for certain obligations.


Enforcement penalties remained visible and costly: Across jurisdictions, regulators also applied meaningful financial penalties for violations of existing data protection requirements. A high-profile example in 2025 was the €530 million fine imposed by Ireland’s Data Protection Commission on TikTok for transferring European user data to China without appropriate safeguards and for transparency deficiencies in its privacy disclosures.


2. Growing Expectations Around Transparency and User Control

Most organizations disclose privacy information somewhere. The growing question is whether the disclosures help people understand what is happening with their data.

In 2025, regulators and consumers paid closer attention to the gap between what privacy notices say and what actually happens across products, marketing stacks and vendor tools. Where notices are vague, overly dense, inconsistent with actual processing or difficult to navigate, organizations take on two problems at once: users lose confidence and regulators become more skeptical.


For organizations that depend on community trust, donor confidence or brand reputation, clarity is not a “nice to have” but a core part of managing risk.


3. Privacy by Design as an Operational Standard

Another notable development in 2025 was the growing expectation that privacy considerations be addressed early in the lifecycle of products, services and internal systems. Privacy by design is no longer treated as a best practice reserved for regulated industries. Instead, it is increasingly viewed as a baseline operational standard.

Practically, privacy by design looks like:


  • collecting only what you need and being able to explain why you need it

  • limiting access internally and across vendors

  • encrypting and securing personal data appropriately

  • setting retention periods rather than defaulting to indefinite storage

  • running a privacy review when a new tool, feature or campaign changes how data is used


These measures matter because many modern privacy laws and enforcement patterns expect organizations to evaluate risk proactively, especially where processing is sensitive, automated or difficult for users to understand.
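For teams that manage their own databases or exports, even a short script can turn “set retention periods” from a policy statement into a routine check. The sketch below is a minimal, hypothetical Python example; the data categories, field names and retention periods are illustrative assumptions for the example, not recommendations, and actual periods should reflect your own legal and operational requirements.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention rules, in days, per data category.
# Categories and periods are illustrative only, not legal advice.
RETENTION_PERIODS = {
    "newsletter_signup": 365 * 2,   # e.g. two years after last engagement
    "donation_record": 365 * 7,     # e.g. seven years for financial records
    "event_registration": 180,      # e.g. six months after the event
}

def is_past_retention(category: str, last_activity: datetime) -> bool:
    """Return True if a record's retention window has elapsed."""
    days = RETENTION_PERIODS.get(category)
    if days is None:
        # Unknown categories are flagged for review rather than kept forever.
        return True
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    return last_activity < cutoff

# Example: an old newsletter signup that should be deleted or anonymized.
signup_date = datetime(2022, 1, 15, tzinfo=timezone.utc)
print(is_past_retention("newsletter_signup", signup_date))  # True
```

A scheduled job that flags or purges records past their window is generally easier to explain to a regulator than an ad hoc cleanup.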


Looking Ahead to 2026


1. Increased Emphasis on AI Governance and Privacy Compliance

In 2026, the relationship between AI governance and data privacy will become more difficult to separate. As the EU AI Act moves toward full enforcement and other jurisdictions explore similar frameworks, organizations will be expected to demonstrate how personal data is used within automated systems, how risks are mitigated and how oversight is maintained.


Preparing now by documenting data sources, assessing risk and establishing internal governance processes can reduce uncertainty and help organizations respond more effectively to regulatory inquiries.


2. Privacy as a Business Consideration, Not Just a Legal One

Privacy is increasingly tied to reputation, trust and long-term sustainability. Organizations that take a clear and consistent approach to data practices are better positioned to build credibility with users, partners and regulators. In practice, this often shows up in clearer disclosures, more limited data collection and user controls that are built into products rather than added later.


3. Continued Focus on Sensitive and Biometric Data

Sensitive categories of data, including biometric information, continue to receive heightened regulatory attention. Organizations that collect or rely on this type of data should expect ongoing scrutiny related to consent, purpose limitation, retention and security safeguards. As enforcement patterns continue to develop, this area remains one of the higher-risk aspects of data processing.


Practical Considerations for Organizations

As organizations prepare for 2026, several steps may help reduce uncertainty and risk:

  • Map your data flows at a practical level: what you collect, where it goes, who receives it and how long you keep it (a simple sketch of such an inventory follows this list).

  • Monitor new and upcoming state privacy laws and assess whether they apply to existing practices.

  • Review vendors and tools that collect personal data, including CRMs, marketing platforms, donation tools, analytics tools, email services and AI assistants.

  • Ensure privacy and AI disclosures match actual practice and update disclosures when workflows change.

  • Conduct risk assessments before major launches or campaigns, especially when introducing AI tools, targeted advertising or sensitive data collection.
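A data flow map does not require special software; a spreadsheet or a short script can capture the essentials. The Python sketch below is a hypothetical illustration of the kind of inventory the first bullet describes; the systems, data elements and retention values are assumptions for the example, not a template for any particular organization.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class DataFlow:
    """One row in a lightweight data inventory."""
    system: str                  # where the data lives (CRM, email tool, etc.)
    data_elements: list[str]     # what is collected
    purpose: str                 # why it is collected
    recipients: list[str] = field(default_factory=list)  # vendors or teams with access
    retention: str = "undefined" # how long it is kept

# Hypothetical entries for a small organization's stack.
inventory = [
    DataFlow(
        system="Donation platform",
        data_elements=["name", "email", "payment details"],
        purpose="Process donations and issue receipts",
        recipients=["payment processor", "finance team"],
        retention="7 years",
    ),
    DataFlow(
        system="Email marketing tool",
        data_elements=["name", "email", "open/click activity"],
        purpose="Newsletter and campaign updates",
        recipients=["marketing team"],
        retention="Until unsubscribe + 2 years",
    ),
    DataFlow(
        system="Website analytics",
        data_elements=["IP address", "page views"],
        purpose="Understand site traffic",
        recipients=["analytics vendor"],
    ),
]

# Flag anything with no defined retention period for follow-up.
for row in inventory:
    if row.retention == "undefined":
        print(f"Review retention for: {row.system}")
```

Even a simple inventory like this makes it easier to answer the questions regulators and users are asking: what you collect, why, who can see it and when it goes away.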


The developments seen in 2025 illustrate how quickly the data privacy landscape continues to evolve. For organizations processing personal data or deploying data-intensive technologies, understanding these trends is essential for maintaining trust, managing risk and positioning for growth.


As 2026 approaches, organizations that treat privacy as a core operational consideration, rather than a reactive obligation, will be better equipped to navigate regulatory change, manage risk and protect the interests of users and stakeholders.


A Call to Action

Ready to ensure you and your organization are prepared for the evolving data privacy landscape of 2026?


Book a strategy session with us today to map your data flows, assess your risk, and establish defensible privacy and AI governance processes.




Read the article on data ownership written by Attorney Smith and Daphne Ekpe for the Journal of IP Professionals, published earlier this year, starting on page 9, here:


