Interview with Brian McGinnis – Data as a Strategic Asset, Not a Compliance Burden – AI Governance and the Acceptable Use Policy – Website Tracking Tools and the Wiretapping Litigation Wave – IP Fridays Podcast – Episode 174

My co-host Ken Suzan and I welcome you to episode 174 of our podcast, IP Fridays!

In today’s interview, Ken Suzan interviews Brian McGinnis, partner at Barnes & Thornburg and co-chair of the firm’s data security and privacy practice, about why companies need to stop treating data privacy as a compliance burden and start treating it as a core business asset. McGinnis argues that data is either a managed asset or an unmanaged liability, with no middle ground.

But before we jump into this interview, I have news for you!

EPO Sees Record Year with 200,000+ Patent Applications in 2025: German filings dropped 2.2% while China grew 9.7%, overtaking Japan for the first time. Germany remains Europe’s top patent nation but loses ground globally. SMEs and universities now account for nearly half of all Unitary Patents granted to European innovators.

News from the UPC Court of Appeal: Non-Technical Features Count for Inventive Step. An April 17 ruling clarifies that all claim features must be evaluated in their combined effect, including non-technical ones. Companies with software-related or mixed-technology inventions pending at the EPO or UPC should reassess recent inventive step objections in light of this decision.

Nokia Withdraws UPC and Munich Suits After Global FRAND Settlement: Following a global FRAND rate-setting decision by the UK High Court, Nokia withdrew parallel suits against Warner Bros. and Paramount at the UPC and in Munich. One UK ruling resolved litigation spanning Germany, the UPC, the US, and Brazil simultaneously.

China Abandons Anti-Suit Injunctions in SEP Disputes: After a WTO arbitration ruling from July 2025, China withdrew its practice of blocking SEP holders from filing suits abroad. The EU Commission continues monitoring compliance, since the former policy was largely informal rather than codified in statute.

Trump Administration Imposes 100% Tariffs on Imported Patented Pharmaceuticals: Based on Section 232, the Trump administration imposed 100% tariffs on patented drugs and biologics effective April 2, 2026, with a 120-day transition period until July 31. EU member states face a reduced rate of 15%. Generics and biosimilars are explicitly excluded.

China Rejects 1.27 Million Trademark Applications in Three-Year Crackdown: China’s CNIPA rejected over 1.27 million trademark applications and invalidated more than 3,300 marks, targeting so-called “edge-ball” marks designed to mislead consumers about product quality or origin. The announcement was made at an official press conference on April 23, 2026.

Now let’s jump into the interview with Brian McGinnis!

Brian McGinnis is a partner at Barnes & Thornburg and co-chair of the firm’s data security and privacy practice. In this episode of IP Fridays, he argues that companies treating data privacy as a compliance burden are missing the point entirely and leaving significant value on the table.

Data Is Either an Asset or a Liability

Most companies still treat their data as invisible and costless. They do not manage it the way they would manage a patent portfolio or a trademark. That, McGinnis argues, is a fundamental strategic error. Data is either a managed asset or an unmanaged liability. There is no middle ground.

When companies invest in understanding what data they collect, how it is used, and who has access to it, they unlock opportunities to drive real revenue and growth. Done right, a data governance program is not a cost center. It is a foundation for trust, operational efficiency, and competitive advantage.

One Program, Not Twenty

With more than 20 US state privacy laws now in effect, and major economies worldwide introducing their own frameworks, building separate compliance programs for each jurisdiction is neither practical nor smart. McGinnis recommends a single, comprehensive governance framework designed around the core purpose and intent of privacy law, flexible enough to absorb new requirements as they emerge.

Companies that threw together a quick program when California’s CCPA came into force in 2020 are now overdue for an upgrade. The goal is to move from reactive compliance to a mature, proactive program that positions the company ahead of the regulatory curve rather than perpetually catching up.

Website Tracking Tools: An Underestimated Risk

One of the fastest-growing areas of privacy litigation involves tracking technologies built into company websites: pixels, session replay tools, analytics scripts, and chat widgets. Legal teams are often entirely unaware of what IT or marketing has deployed. That gap is expensive.

Plaintiffs’ attorneys are applying 1970s-era telephone wiretapping statutes, including the California Invasion of Privacy Act, to argue that collecting any personal information, including IP addresses, before a user has consented constitutes illegal interception. Demand letters are being sent at industrial scale, with settlements typically running between $10,000 and $20,000 per case. What makes this particularly difficult is that a company can be fully compliant with statutory privacy law and still face these wiretapping claims, because the legal theory turns on the timing of data collection rather than the existence of a privacy notice.
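
The timing point can be made concrete. A minimal server-side sketch of the defensive posture described above: no tracking tags are emitted until an affirmative opt-in has been recorded. The cookie name and function names here are illustrative, not a real consent-management API.

```python
# Sketch: decide server-side whether a rendered page may include tracking
# tags. CONSENT_COOKIE and both function names are hypothetical; the point
# is the default behavior: no recorded consent means no trackers, so
# nothing fires on a first visit before the user opts in.

CONSENT_COOKIE = "tracking_consent"  # hypothetical cookie set by a consent banner


def may_load_trackers(cookies: dict[str, str]) -> bool:
    """Return True only if the visitor has affirmatively opted in."""
    return cookies.get(CONSENT_COOKIE) == "granted"


def render_head(cookies: dict[str, str]) -> str:
    """Emit analytics/pixel tags only after a recorded opt-in."""
    if may_load_trackers(cookies):
        return '<script src="/analytics.js"></script>'
    # First visit, or consent declined: serve a tracker-free page.
    return ""
```

Gating the tags server-side (or deferring script injection client-side until the banner is accepted) addresses the wiretapping theory at its root, because nothing is collected before consent rather than merely being disclosed in a notice.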

Vendor Contracts: The Hidden Exposure

Marketing and technology agreements are another major source of unmanaged data risk. When a company deploys a third-party tool that handles personal data, the underlying contract needs to define precisely who owns that data, what the vendor is permitted to do with it, and what obligations flow down to any sub-processors involved.

McGinnis draws a direct parallel to IP licensing: owning valuable data and then handing it to a vendor under a poorly drafted agreement is the equivalent of signing a bad IP license. Data processing agreements need to cover ownership, use restrictions, sub-processor obligations, breach notification timelines, audit rights, and deletion obligations. Many companies simply do not have these terms in place. Without them, a vendor who suffers a breach of non-personal business information has no contractual obligation to disclose it.

Consumer Rights Requests: Process Matters

Privacy laws give individuals the right to access, correct, delete, and opt out of the use of their personal data. Responding to these requests effectively requires pre-built processes, trained staff, and the technical ability to locate and act on individual data across all systems and sub-processors. Most companies, before engaging in formal data mapping, are not in a position to do this reliably.

Staff failing to recognize a deletion request as a legal data subject request and routing it through a standard customer service queue instead is one of the most common failures McGinnis sees. The consequences can include regulatory complaints and class action lawsuits, particularly when a company continues to send emails to someone who has already requested deletion of their data.
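
One operational mitigation is a first-pass filter that flags inbound messages which might be data subject requests for privacy-team review. The sketch below is purely illustrative: the phrase list is a hypothetical starting point, and a heuristic like this supplements staff training, it does not replace it.

```python
import re

# Illustrative only: crude keyword triage that flags messages that *might*
# be data subject requests (DSRs) so they are routed to the privacy team
# instead of the ordinary customer-service queue. The patterns are a
# hypothetical starting point, not a comprehensive list.
DSR_PATTERNS = [
    r"\bdelete (my|all my) (data|information|account)\b",
    r"\bright to (access|erasure|deletion|be forgotten)\b",
    r"\bopt (me )?out\b",
    r"\bdo not sell\b",
]


def flag_possible_dsr(message: str) -> bool:
    """True if the message matches any pattern suggesting a DSR."""
    text = message.lower()
    return any(re.search(p, text) for p in DSR_PATTERNS)
```

Anything flagged would still need human review and routing into the formal DSR workflow with its statutory response deadlines; the filter only reduces the chance a legal request disappears into the general queue.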

A newer risk involves Global Privacy Controls: browser-level opt-out signals that regulators and courts are now treating as legally binding deletion and non-collection requests. Companies receiving these signals daily without acting on them face growing exposure under several state laws.
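
Detecting the signal itself is straightforward: under the Global Privacy Control proposal, participating browsers send a `Sec-GPC: 1` request header (and expose `navigator.globalPrivacyControl` client-side). A minimal server-side check, with the function name illustrative:

```python
def gpc_opt_out_requested(headers: dict[str, str]) -> bool:
    """True if the request carries a Global Privacy Control signal.

    Per the GPC proposal, the request header is 'Sec-GPC' with the
    value '1'. Header names are matched case-insensitively.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

The hard part is not the check but the plumbing behind it: when the function returns True, the visitor must actually be treated as opted out of sale/sharing before any tracking or ad-tech data flows fire.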

AI Governance: Policy Before Tools

Generative AI tools are now embedded across business functions, from contract review and customer service to content creation and internal search. McGinnis is direct: every company needs an AI acceptable-use policy, and the absence of one is not a neutral position. Without clear rules, employees will use unapproved or publicly available tools regardless, feeding proprietary and sensitive information into open models with no control over how that data is used or retained.

He draws a precise parallel to patent law. Posting proprietary information into an open AI system carries the same risk as publishing it publicly, potentially destroying patentability. The distinction between closed, organization-specific AI systems and open, publicly accessible ones is something employees need to understand explicitly. Making compliance easier than non-compliance is the practical goal.

The Regulatory Outlook: More Laws, More Enforcement

McGinnis expects the regulatory landscape to continue expanding. The EU AI Act is already setting the direction, and several US states have introduced or are developing AI-specific legislation. The pattern mirrors what happened with data privacy: Europe leads, US states follow in a patchwork, and federal legislation remains uncertain.

Enforcement of existing privacy laws is also intensifying. GDPR has been in force since 2018, CCPA since 2020, and regulators are now past the period of extended tolerance for companies that are still catching up. Companies with immature compliance programs should expect less patience from regulators going forward.

McGinnis closes with a clear point of view: if you have to comply anyway, get credit for it. A well-built governance program is a trust signal to customers, a sales asset, and a foundation for responsible AI use. Compliance done right is not a tax. It is a differentiator.

The Full Transcript:

Ken Suzan: Our guest today on the IP Fridays podcast is Brian McGinnis. Brian is a partner with Barnes & Thornburg and a founding member and co-chair of the firm’s data security and privacy law practice group. Brian serves as a member of the intellectual property department and the internet and technology practice. Brian is a Chambers Global and nationally ranked privacy and data security attorney, a certified information privacy professional, and the firm’s chief privacy officer. Brian brings nearly two decades of experience at the intersection of law and technology. Brian advises on a wide range of technology-driven legal matters, including privacy and data security, intellectual property, artificial intelligence, corporate transactions, software, and internet law. His deep understanding of privacy and technology law enables him to guide clients through rapidly evolving regulatory and operational challenges. Welcome, Brian, to the IP Fridays podcast.

Brian McGinnis: Hey, thanks Ken. I appreciate it. Great to be here and thanks for having me.

Ken Suzan: Excellent. Brian, the C-suite tends to treat data privacy as a compliance tax, something to hand off to legal and forget about. But when you see how companies actually get into serious trouble, what’s really going on?

Brian McGinnis: Yeah, well, it’s a great place to start, Ken, and I’m looking forward to the conversation today covering some of these privacy issues and AI issues, which I’ve found in my own practice have really bled into the straight privacy stuff. Companies can’t really handle these things in a silo anymore. It’s really about managing these together as a coherent governance program for the organization. I think if you do that right, the good news is we can become revenue generators and show growth for the company and not just compliance centers and a compliance tax. But I think the core problem that we face in working with most companies is that a lot of companies still treat their data as invisible, costless. They don’t treat it, in other words, like they would a patent portfolio or trademark or other IP portfolio. It’s just not managed as an asset in the ways that we’ve seen more sophistication around IP. And it really should be. Data is either a managed asset for the company or it’s an unmanaged liability. There’s really not an in-between. And so for those companies that haven’t gotten their arms around all this data and what can be done with it, I think they’re really missing an opportunity. Having an understanding of what data the organization is collecting, how it’s being used, and having the proper governance around it really unlocks a lot of opportunity for use of that data in new ways — ways that can drive revenue and growth for the company. So I approach privacy not just as compliance, not just as avoiding penalties or doing it because some law out there says that we have to do it. It’s really about knowing and controlling one of the company’s core assets. And if you’re not doing that, you’ve got unmanaged data that you’re not getting value out of and that potentially could be a huge liability for the company. Managed well, it really supports trust, efficiency, and growth of the organization. Otherwise, I think it’s a missed opportunity.

Ken Suzan: Yes, well said. Now let’s talk about state laws. With 20-plus state privacy laws now in effect, how should companies build a program that actually works across the board without starting over every time a new state law kicks in?

Brian McGinnis: Yeah, so the first answer is don’t build 20 separate programs. This really goes back to having a comprehensive, sophisticated, well thought out program that really takes into account not only the 20 state laws, but obviously we’ve got international exposure with laws like GDPR and upcoming privacy laws internationally. Most of the larger economies in the world have some form of laws around privacy and AI. So you can’t really anymore build programs that account for the one, two, three, four, five different laws that in the past we had experience with — where you could just treat California as its own thing, treat New York as something else, and treat Europe as something else. The laws and the pace of these have really forced companies into having comprehensive programs. I don’t expect to see fewer laws. You’re only looking at potentially additional state laws, additional federal laws here in the US, and then certainly additional laws throughout the world. So a lot of the strategy these days is not only where are we today with these laws, but how do we set up our governance program in a way that really cuts to the core of the purpose and intent behind these laws so that we can be better prepared when new laws come about in the future. Historically, at least in the US, most companies just haven’t had laws that force them into compliance postures. As these laws have started to come along, a lot of companies have been playing from behind and saying, oh, the California Consumer Privacy Act, I just read about it and it goes into effect next week — let’s throw something together and call that our compliance program. We’ve now got years of these laws being in place, CCPA came into effect in 2020, and what we’re seeing much more of are companies looking to get more sophisticated in their programs and stop feeling like they’re always rushing to catch up. 
The goal is to level up their program, going from level one — constantly playing from behind — to level two and then level three, so that they really feel like they’re on top of it and have a sophisticated program that not only accounts for all the various privacy requirements that come at them, but also positions them to take advantage of the data and all the things that come along with having a good governance program.

Ken Suzan: Brian, there’s an explosion of litigation targeting something most companies barely think about — the tracking tools baked into their own websites: pixels, session replay tools, analytics scripts, chat widgets, the list goes on and on. What’s happening, Brian, and what should companies do?

Brian McGinnis: Yeah, and I think a lot of companies — the executives, the business teams — don’t even realize a lot of these tools are on their sites. IT deployed them years ago, the web team deployed them, marketing teams are constantly using them and certainly have a good understanding of it. But in a lot of cases, legal has never touched them and has no idea what’s happening on the website. We also see a lot of cases of companies who, even if they’re generally aware these tools are in use, aren’t aware what other teams are putting on the site or what those pieces of technology are tracking. And that gap can be really expensive. What we’re seeing right now — and this has been a trend for a number of months now and is really continuing to pick up steam — is a series of what I call gotcha lawsuits, where you have some enterprising plaintiffs’ counsel who have taken a look at some 1970s-era telephone wiretapping laws, including a law called CIPA, the California Invasion of Privacy Act, passed in the 70s with the idea that you shouldn’t be able to wiretap people’s telephone conversations. They’ve taken that and applied that theory to the internet. The way it works is: if a website has some sort of cookie, pixel, or other tracking technology on it that collects personal information about an individual — and that can be as simple as an IP address and device ID — and if that collection occurs as soon as the individual shows up at the website, prior to them being able to have notice provided to them or opt in and consent to that collection, then the theory under these lawsuits is that it constitutes wiretapping. We see a lot of this with the Meta pixel, with LinkedIn pixels, and the like. What they’re doing is effectively showing up and suing, threatening to sue, trying to take you to arbitration, depending upon what’s included in the company’s existing privacy notice. 
If you don’t have a cookie banner, if you don’t have a cookie notice, if you’re not getting opt-in on these things, they’re leaning on those failures and effectively forcing you into a position where you have to settle. Because the cost to litigate one of these to its conclusion would be expensive, whereas a lot of these cases will settle for $10,000 to $15,000, somewhere in that range. They’ve got technology crawling the internet looking for websites that don’t have these risks covered, sending demand letters and then collecting settlements, $10,000 to $20,000 at a time. It’s been very profitable for them and a very dangerous thing for our clients. And it’s a bit unusual because you can be fully compliant with the statutory privacy laws that require notification of the use of tracking technologies and cookies and banners — and still be subject to these lawsuits because of the wiretapping arguments being made. The timing at which the data is collected from the individual can still subject you to these lawsuits. So it’s a tricky problem, one that I hate seeing companies get hit with and one that we spend a lot of time helping companies avoid.

Ken Suzan: Yes, let’s talk about contracts, Brian, because I know you work with contracts probably on a daily basis. A lot of data risk lives inside vendor and technology agreements — the contracts companies sign with marketing platforms, analytics providers, cloud infrastructure, and SaaS tools. What should those agreements actually contain?

Brian McGinnis: Yeah, so there’s quite a lot of things. You’ve got a world where marketing is constantly under pressure to learn more about their customers. The way they can do that is through any number of different tools and data gathering techniques, and we have all this technology available to help marketing and sales do better at their jobs. But we, at least in this country, got to a position where people really felt like they lost control of their information and their data. And so these privacy laws came along and really started to provide more rights to individuals — to have an understanding of what data exists within various companies that they do business with, who they’re sharing it with, trading it with, selling it to for advertising purposes; to have the right to opt out; the right to delete their information. Not checking through the agreements by which these teams are implementing these tools is a huge issue for companies. As part of an overall compliance program, having some kind of process where people who are aware of the growing numbers of privacy laws are reviewing these marketing contracts to make sure they are aligned with that program and aligned with those laws is absolutely critical. To talk about IP, given the IP Fridays audience: it’s kind of the equivalent of having really bad IP licenses. In other words, you own and control this information and data, and you need to control what the other side can do with one of your most valuable assets — or you’ve effectively given it away. So thinking about it in that way could be useful. In terms of more specifics: a big one is ownership of the data. The agreement itself may or may not have anything that addresses data. 
If there’s personal information involved, you probably need what we call a data processing agreement or addendum — a DPA — that specifically controls what that third party is able to do with that data, how they’re able to use it, whether they’re able to share it, whether they’re able to get value out of it on their own, or if they’re only allowed to be what we call a service provider, just providing services to the business that hired them. There needs to be an explicit prohibition on retaining, using, and disclosing personal information for any purpose other than performing the exact services in the contract. Whether or not they’re permitted to sell or share data under CCPA terms is another key point. You also want certification that the provider will comply with any restrictions and security requirements you have on your data, and to make sure those obligations flow down to any sub-processors they might use. You hire Company A, but Company A works with Company B and C to provide parts of their service. You’re effectively responsible for the protection of personal information throughout its lifecycle. A couple of other key provisions: breach notification triggers and timeline. It’s very possible under a lot of agreements that one of your vendors can suffer the world’s worst hacker breach and have no legal obligation to tell the company that hired them about it — unless there’s personal information involved. State data breach laws apply to personal information, not to other types of sensitive business information. Unless you have a contract that explicitly requires notification, there’s a good chance that vendor may not want to disclose it. And then other things like audit rights and deletion obligations go in there as well.

Ken Suzan: Certainly a lot to cover. Let’s talk about privacy laws and consumer rights. Privacy laws give consumers real rights — to access their data, correct it, delete it, and opt out of how it’s being used. Most companies have a process for this on paper. What does it actually take to get it right, and what happens when it breaks down?

Brian McGinnis: Yeah, it takes pre-planning. It takes a process. Some companies receive many more of these requests than others — some B2B companies receive none or a couple per year, while companies heavily involved in marketing to consumers might receive tens or hundreds a day. To be able to respond to these effectively and efficiently requires some forethought. It requires policy and procedure internally to be set up, and it requires the education of the team. Some of the common ways we see this go wrong: staff isn’t trained to know the difference between what we call a DSR — data subject request — versus a regular customer service inquiry. Maybe somebody submits what would be construed by law to be a deletion request and you just put it into your normal customer service response flow — and then you’re potentially missing timelines and the like. There also need to be systems in place to respond in accordance with the individual’s rights. Somebody submits a request saying, you have my information — what information do you have about me? Can your company determine that right now? Can you look through all your systems and down the line to all the processors and sub-processors you’ve worked with and hired, and identify what information you have about that individual? Most companies, until they engage in a governance program and data mapping, are at a real disadvantage to be able to do that. Why is that a problem? Because two weeks from now your company could be sending emails to the individual who just told you to delete their data, and they get really upset. That’s when they go and complain to regulators or start class action lawsuits. The lack of planning can be really, really expensive for a lot of companies. 
Making sure you’ve got some kind of process to understand what’s coming in, that the people receiving those requests know the difference between a regular customer service request and a data subject request, and that it gets to the appropriate parties for action — all of that is really, really key. Another one that we’re seeing pop up is what we call GPC, or Global Privacy Controls. It used to be that people would say “do not track” in their browser and most companies would ignore those signals. Now we’ve got advancements in law and browser technology where the browser you’re using to visit a company’s website sends a signal saying, opt me out of this. Regulators and courts are construing those as deletion requests, as opt-out requests that companies are now required to respond to. If your company hasn’t gone through an exercise to understand that, and is probably receiving GPC opt-out requests on a daily basis without acting on them, there’s some exposure there. At the end of the day, a lot of this really is about getting the appropriate people from across the organization — really each department — around a table, figuring out what data you collect, how you use it, who you share it with, where it comes from. That starts the process of your data map. Then you set about mapping that to the various legal requirements and figuring out how to respond, how to make it easy for people to exercise their rights so they’re not complaining, not suing, not going to regulators. Letting these squeaky wheels out of the process — the ones who don’t want you to be processing their information any longer — is really key.

Ken Suzan: Let’s switch gears a bit and talk about AI. I know we’re hearing about it every day. Generative AI tools are now embedded in how companies work — contract review, customer service, content creation, internal search. Before employees start using these tools with customer data, confidential business information, or proprietary content, what has to be in place first?

Brian McGinnis: Yeah. I think we’re long past the days when companies provided individuals access to corporate technology — computers, devices, and the like — without having some kind of acceptable use policy that governs that. We don’t want you downloading stuff that could harm our network or create security issues. We don’t want you using our technology in certain ways, whether that’s a BYOD policy or just general use of company internet or company devices. An AI acceptable use policy is really a continuation of those. Every company needs to have an AI acceptable use policy. Period. In my opinion, things like that are as important as the fire escape policy out in the hallways for these companies. I can tell you with absolute certainty: if your organization has not provided rules to your employees and personnel about the use of AI, what they can and can’t use — or if you’ve said you can’t use any AI — your personnel are still using AI. They’re just not using any approved tools. They’re probably using their own private tools that they subscribe to, or even worse, tools they don’t pay for, in which case they’re putting company information into a wide open public model. The more companies can do to think through this ahead of time, reduce it to policy, and then train and educate people on that company’s particular policy, the better. You need to make it easier for people to comply than not comply. An acceptable use policy should talk about: here’s how we can and can’t use it, here’s the data that should and should not go into the system, here’s some proper uses of AI, here’s some data that’s on the fringe that we need to keep out — more sensitive information, proprietary information, etc. Making sure you’re funneling people toward approved tools and educating them about the difference between closed systems and open systems. 
In other words, this is a tool that only looks at our organization, only uses the data within a certain box, and is not publicly available — the AI system is not training on our data. You have more leeway to put more sensitive information into those types of systems than you do with open systems, where you potentially lose control of your data. It’s almost like a patent consideration in terms of keeping information secret. If something potentially has some patentability that you want to seek to file in the future, you can’t just go out and post it publicly and use public search engines and all this other stuff without the risk of exposing it. Similar concepts here — really getting a handle and control over what tools people can use and providing some education to them about how the company wants to think about what’s acceptable and what’s not in those uses is really the key starting point.

Ken Suzan: Very useful information. Indeed, we’re coming towards the end of today’s episode. One final question for you, Brian. Where do you think we’ll be two years from now in this developing field, and how best for companies to stay ahead of the curve?

Brian McGinnis: Yeah, this kind of takes us full circle, Ken. I think it’s kind of back to the beginning comments about the privacy space — and we’ve only got more of these laws coming. It’s still a developing field. We’re still really in the early days of enforcement. I mean, GDPR has been around since 2018, CCPA in the US really kicked us off in about 2020, and so there’s been a settling-in period as companies adjust and get used to having these laws and get compliance programs in place at various levels — from not at all prepared to highly sophisticated. We’re still pretty early on in terms of enforcement of these things. We’re already starting to see enforcement of more egregious violations of these various laws, and we’ll only continue to see more enforcement as the laws exist currently and as they continue to come along. The days of not having to pay attention to this are kind of over. And I always tell clients: if you’re going to have to do these things, you’re going to have to be compliant — you might as well get credit for it. By which I mean, let’s put all the policies in place, let’s do all the compliance activities, let’s have a sophisticated governance program, but then let’s also use that as a sales tool, as a way to help grow the company, as a way to sell new products and earn trust with our customers — so that they know when they’re doing business with us, or when they’re giving us information, or when they’re using our AI tool, that we respect that and are going to take care of their information and have the structure in place internally to be able to do that. With respect to AI, what I’m seeing is very similar to what we have seen with the growth of privacy law — again led by Europe, with the EU AI Act in this case. Now you’ve got a handful of states in the US that already have AI laws, and others that are interested in continuing to roll those out. 
There’s friction with the federal government around whether there’s going to be a comprehensive law there. Like the privacy space, you’ve got varying factions — some of which want to develop really quickly with very few guardrails, others that say we’re threatening the future of humanity if we don’t get those guardrails in place. I think ultimately, at least in the US, we’re going to end up with another patchwork of AI laws for the foreseeable future that we’ll have to navigate. So really having a company position, a company philosophy of how do we handle all these various laws, how do we treat people’s data, how do we get our arms around it, how do we respond to whatever legal rights they currently have, and what principles do we put in place so that we can adapt for the future — and then, once we’ve done those things, how do we actually get value out of this and move the business forward. So it’s not a compliance tax, but a benefit to the business. That’s the end goal here, and I think the North Star for us.

Ken Suzan: Fantastic, Brian. This has certainly been a very comprehensive interview. Really appreciate you taking the time to talk about it with us here on the IP Fridays podcast.

Brian McGinnis: Happy to do it, Ken. Thanks for asking me and good to see you. Thank you.