Nixon Law Group


Episode 33: Privacy and Security Frameworks for Connected Devices and The Internet of Medical Things

What healthcare innovators need to know about the Internet of Medical Things and the evolving privacy and security frameworks for digital health connected devices


Subscribe on Apple Podcasts, Spotify, Buzzsprout, or follow the podcast on LinkedIn for new episode drops.

In this episode you’ll discover:

  • How to define digital health-connected devices

  • How the privacy, security, and data use challenges for these companies are distinct from the challenges facing other digital health companies

  • What trends our guests are seeing among external bad actors in the space

  • What the biggest threat to digital health companies is right now from a security/privacy perspective—and what’s the worst that could happen

  • How companies make the risk/benefit tradeoff

Click here to watch it on YouTube.

Learn more from Bethany and Ashleigh

Connect with Bethany Corbin, Esq. on LinkedIn

Connect with Ashleigh Giovannini, Esq. on LinkedIn


Learn more from Carrie and Rebecca: 

Healthcare insights (monthly email) | Telehealth/Virtual Care Mgmt Update (biweekly LinkedIn update)

Website | Carrie on LinkedIn | Rebecca on LinkedIn | NGL on LinkedIn


Read the transcript here:

Rebecca Gwilt (00:15):

Welcome back to Decoding Healthcare Innovation. Today I am excited to be joined by Bethany Corbin and Ashleigh Giovannini, two absolute powerhouse attorneys at Nixon Gwilt Law. Bethany and Ashleigh serve digital health innovators worldwide in a number of areas, but they are our firm's go-to experts for data use, privacy, and security. I've asked them to join me on today's podcast, and we don't usually do substantive law topics, but this is an area of expertise that's unique in that it touches nearly every single healthcare innovator out there, and it's a landscape that's rapidly changing. In particular today, we're going to be chatting about unique privacy and security risks for digital health connected device companies. So welcome, Bethany and Ashleigh.

Bethany Corbin (01:07):

Thank you.

Rebecca Gwilt (01:08):

Yeah, I see you in other environments throughout the day, but it's nice to be able to really grill you on camera. I know that's your favorite. I'd like to start with a little bit of level setting, just jumping right in. Bethany, can you help me define connected health devices? Because I know this is a subset of the larger digital health sector.

Bethany Corbin (01:33):

Yeah, absolutely. So whenever we think about connected health devices, we're thinking about healthcare technology that has the ability to seamlessly transfer data across, say, a wifi connection, an internet connection, or a sensor, anything that doesn't require manual human input. These would be devices like a connected pacemaker, or a blood pressure cuff that can send data directly to your provider. These devices can vary in terms of their complexity and how they're used by healthcare providers and healthcare organizations, but the main goal here is the transfer of data from the patient to an endpoint without human involvement.

Rebecca Gwilt (02:15):

This kind of technology has been around forever. I remember when I was in college, I was part of a psychology department experiment for some extra cash where they strapped me into a heart rate monitor. Actually, it wasn't heart rate, it was blood pressure. And I would walk around class as if it was a normal day, but it would be off and on taking my blood pressure throughout the day. It actually went terribly; I was a horrible test subject. But in any case, that was enough years ago that I won't name the number. So this technology, where you can remotely transfer information from a device connected to a person back to a healthcare provider or others, has been around for a while, but we've seen a lot of activity in this market in the last couple of years. Am I right?

Bethany Corbin (03:07):

That's absolutely right. Even though there are a lot of legacy devices out there that do transfer data, we've seen a lot of transformation in terms of their complexity and the types of conditions we can now monitor. I mean, we have devices now that can actually be implanted inside the human body and regulate organ function. Those pacemakers, for instance, can have wireless sensors attached to the actual heart muscle that trigger when the heart should beat. We also have things like infusion pumps that can be connected to the patient to administer things like insulin. So the complexity of these devices has just skyrocketed over the past couple of years in conjunction with the rise of digital health technology.

Rebecca Gwilt (03:51):

So Ashleigh, now that we have a good level set for what we're talking about here, and to Bethany's point, there's a ton of this out there, and the indications and uses for these kinds of devices are expanding fairly rapidly, which is super exciting. But you and Bethany are also keeping a close eye on the risks that are proliferating alongside. Can you tell me a little bit about how you perceive the privacy, security, and data use challenges for these connected health companies as distinct from the challenges facing other digital health companies in general?

Ashleigh Giovannini (04:27):

Definitely. I think it's important to reiterate that there are a lot of data entry and exit points when it comes to connected health devices. There are also a lot of moving pieces and parts that come with developing a connected health device. Traditionally, when you build a platform, you may use a vendor to provide you with certain functionalities. That is especially true for connected health devices. And with those additional vendors and third parties comes an increased risk of privacy or security issues that you have to have heightened awareness about.

Rebecca Gwilt (05:05):

And for some of these, listening to the examples that Bethany gave, it's not just loss of data that's a threat; these devices are monitoring pretty serious vital signs for some patients. So this is one of the areas where privacy and security is not only important because we want to protect our own confidential health information, but also because of the potential for outside threats infiltrating a system that's actually monitoring or maintaining a person's health.

Ashleigh Giovannini (05:45):

Definitely. And I think it's important to note that this data will only continue to become more lucrative for those who may have less than wonderful intentions for using it. We are headed towards that data becoming a commodity. And with that commodity comes motivation for bad actors to try to gain access to the data, or to limit the ability to use connected devices for patients who rely on them for critical healthcare services.

Rebecca Gwilt (06:20):

Yeah. So let's talk about that. What are the trends you're seeing among these external bad actors in the connected device space?

Ashleigh Giovannini (06:32):

Definitely. Bethany, would you like to begin? I know you have a number of points here.

Bethany Corbin (06:40):

Yeah, so we've...

Rebecca Gwilt (06:41):

Oh, a number of points.

Bethany Corbin (06:43):

We've seen a couple of different trends here with respect to what bad actors are doing. It really falls into two categories: you have the external bad actors, and then you have internal bad actors as well. From the external perspective, we're seeing a lot of increased activity with respect to hacking and cybersecurity attacks. These would be things like malware, ransomware, phishing, and denial-of-service attacks. Those types of things have really ramped up, particularly in healthcare, given the value of health data on the dark web. For comparison, a health record can go for up to about $250 on the black market, whereas financial data, a credit card, goes for about $5.40. So the incentive is...

Rebecca Gwilt (07:28):

Really? Why is that, Bethany? I've heard this statistic for several years now, and it might be helpful for people to understand why that is. You would think that your financial information, your social security number, and your bank information would be quite valuable. Why are healthcare records so much more valuable?

Bethany Corbin (07:45):

Healthcare records really derive value from the fact that they contain a massive amount of data. In that health record, you have all of your diagnoses, your treatments, your medications. You could have behavioral health information. You also have very sensitive information like your date of birth, your medical record numbers, and potentially payment information. So the first reason is that the volume of data in one health record is significantly larger than just a credit card number. The other thing that complicates this is the fact that once your health data is out there, you can't change it. With a credit card, for instance, if that number gets stolen, you can easily call your credit card company, shut that card off, get a new card, and the value of that data to the hacker is diminished. With a health record, you can't really go change your health information. It's out there, available now on the dark web for people to use. It also means that identity theft can be triggered and happen a lot more rapidly with health data, given the volume of data involved.

Rebecca Gwilt (08:44):

That's super helpful. And my last question before I move on: one of the misconceptions I see is smaller companies saying, well, the healthcare data I have is not so valuable because I've only got 900 patients, or I'm just a small shop, so nobody's going to be targeting me. They're going to be targeting Anthem, they're going to be targeting Target and Walmart, those big companies that have a ton of data, and they're not going to be targeting me. Is that perception accurate, in your opinion?

Bethany Corbin (09:23):

No. No.

Ashleigh Giovannini (09:24):

Yeah, agreed. I don't think so. It is very much the case now, especially with our current geopolitical climate, that any entity touching data that pertains to health is at risk of experiencing a breach of this nature. In fact, most of these breaches, about 50 to 70%, are experienced by small and mid-sized enterprises rather than large enterprises. A lot of that has to do with planning.

Rebecca Gwilt (09:57):

Wow. 50 to 70%? Wow, okay.

Ashleigh Giovannini (10:00):

Quite a lot of these hackers, or people who come in through ransomware, malware, any kind of attack, understand that smaller or early stage companies may not have all of the security and privacy protections in place that a large organization does. So if you expose your platform to the open web, or it is known that you are using something that's more vulnerable, the attacker has an incentive to hold your health data hostage because you are kind of an easy target. That's why it's important to commit to privacy and security when you're early stage, and throughout the entire life cycle of your system.

Bethany Corbin (10:45):

And if I can add to that, Ashleigh: another aspect is that these hackers see smaller companies as stepping stones into larger healthcare networks. They may not necessarily be targeting your startup company's data; that might not be their end goal. Their end goal may be that they know you have a contract with a large healthcare organization and you feed data into that system. So if your cybersecurity protections are weak, the hacker can target you and use you as an entry point into that larger system.

Rebecca Gwilt (11:16):

Oh man. Okay. So what I'm hearing is: you may not be as lucrative, but you're easier if you're small, because in general the systems are less sophisticated; or you're just as lucrative because you're connected to a much larger entity, and you're small, so you'd be easier to get into. I think it's really important for folks to hear that, because I've heard it enough. People say, oh, we're so early, there's not really anything to get here, we're going to put this off for a while. But there are really simple and very important things you can have in place from the get-go to protect yourself, even when you're a small company. So what would you say is the biggest threat right now to digital health companies, and connected device companies in particular, from a security and privacy perspective?

Ashleigh Giovannini (12:14):

As I mentioned previously, ransomware, malware, and phishing are really crucial threats that digital health companies face right now. My most critical piece of advice on that front is to develop your privacy and security plan before those attacks happen. You should have a deep understanding of your obligations to your users; if you are contracting with any HIPAA covered entities, what your obligations to them are as well; your responsibilities and obligations under federal and state law; and the terms of your own liability insurance policy, or any insurance policy that covers such a breach. And you should be able to identify and address privacy and security concerns quickly. This is an approach I recommend as a proactive thing instead of a reactive thing, and it only comes with practice. Just as large healthcare providers and systems practice their breach response, digital health companies at any stage have to do that as well. That means everybody in your organization, from the CEO to the software developer, should be required to engage in job-specific training to identify cybersecurity attacks, respond to them appropriately, and understand exactly what steps need to be taken to mitigate the harmful effects of those breaches.

Rebecca Gwilt (13:43):

So putting myself in the shoes of a digital health company that's got 17 million priorities: building their product, figuring out how to monetize it, making sure that it's high quality and the patients are cared for and protected. I just know from real world experience that trying to fit in a large scope of work having to do with privacy and security of data is a hard thing to prioritize. It's a hard thing to actually do. It's a hard thing to find resources for. What does this process look like for an early stage, scaling healthcare company? What does the level of effort look like for them to take the kinds of steps you're describing right now?

Bethany Corbin (14:29):

Yeah, I'm happy to jump in here, and Ashleigh can as well. One of the things we always recommend first is: know where all of your data is coming from and where it is going. Ashleigh and I routinely start with a client by saying, where is your data map? Show us your data map. What are all the endpoints that can have data going in or out, or access to that system? A lot of times startup companies don't know where all their data is coming from, and that makes it extremely hard to protect against an external or an insider threat. The next thing Ashleigh and I always recommend is a privacy and security gap analysis: looking at your policies and procedures, what you have in place at this point in time, comparing that to the privacy and security laws, regulations, and best practices you might be subject to, and really identifying the risks we see in your platform. Then we try to prioritize those risks, because we completely understand it is not feasible to address everything at once, especially as a startup company. So we've really got to prioritize into high, medium, and low risk, and make sure we tackle those high risk priorities first. And then Ashleigh, I'll turn it over to you for some additional thoughts.

Ashleigh Giovannini (15:39):

Yeah, I agree, and I think the data map that Bethany mentioned is critically important because it doesn't just set out where the endpoints are for your data. It also helps you understand what your contractual obligations are to each entity with whom you contract, whether that's a vendor or a customer. You need to know who is holding your data at all times. And it's important to understand how that data flows, so that if there is a breach you can notify people within the time period you've agreed to, or you can say: this vendor is holding this data, this is the root of the issue, and this is how we need to proceed with mitigation.
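To make the data map idea concrete, here is a minimal sketch of what a first-pass data map might look like in code. This is purely illustrative, not something prescribed in the episode; every field name, vendor, and notification window below is a hypothetical placeholder.

```python
# Hypothetical first-pass data map: every endpoint where data enters,
# leaves, or sits, plus who holds it and what notice you owe them.
from dataclasses import dataclass

@dataclass
class Endpoint:
    name: str                 # e.g., "Connected BP cuffs"
    direction: str            # "inbound", "outbound", or "storage"
    data_types: list          # what flows through: "PHI", telemetry, etc.
    holder: str               # who actually holds the data
    breach_notice_hours: int  # contractual notification window (0 = n/a)

DATA_MAP = [
    Endpoint("Connected BP cuffs", "inbound",
             ["vitals", "device IDs"], "us", 0),
    Endpoint("Cloud analytics vendor", "outbound",
             ["PHI", "usage data"], "ExampleAnalyticsCo", 72),
    Endpoint("Hospital EHR integration", "outbound",
             ["PHI"], "Customer health system", 24),
]

# The question you need answered fast in a breach: who holds PHI,
# and how quickly must each party be notified?
for ep in DATA_MAP:
    if "PHI" in ep.data_types:
        print(f"{ep.name}: held by {ep.holder}, "
              f"notify within {ep.breach_notice_hours}h")
```

Even a list this small answers the two questions the guests keep returning to: where is the data, and what did you promise each party who touches it.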

Rebecca Gwilt (16:27):

Alright, so I think I have a good handle on the threats as they exist and what the process looks like to work through them. I like the idea of: let's just figure out what the steady state is, what is the truth right now. Maybe there are 10 things that need to happen to get you to platinum level, double thumbs up. Maybe you don't have the resources or the time to do that, but let's figure out what that is. I think that's good for any company, regardless of stage. Now I'm interested in your real life stories, because I've been in too many privacy and security discussions that are a lot of "make sure you do the right thing, this is going to be important, there are risks out there," et cetera. I want to hear some juicy details. I want to hear the worst thing that could happen. What are some things that have stopped companies in their tracks? In real terms, in real life, what are we trying to prevent here? And can we?

Ashleigh Giovannini (17:37):

Yeah, I'll take this one. I have been privy to a number of instances where an early stage or even a mid-stage company had a security issue, and it can pan out in a number of ways. It's important to know that these breaches, these privacy or security incidents, can get very expensive very quickly, and they can even put your business out of business if you maybe don't have...

Rebecca Gwilt (18:08):

That sounds pretty bad.

Ashleigh Giovannini (18:09):

Yeah, that's pretty bad. That is the extreme here, definitely. So let's say you have 15 customers and all of their data is linked together, and that is where you experience the breach. Each of these 15 customers has a thousand lives' worth of data that has been imported into your system, so you're an early stage company with 15,000 lives on your platform. And if you think about how much the records are worth, as Bethany noted, about $250 on the black market, it's going to take about that to recover those records. So when you're mitigating, you have to account for the fact that getting that information back, or just responding to the breach appropriately, can cost between $100 and $200 per record.

Rebecca Gwilt (19:08):

You mean the costs of mitigation teams and lawyers and...

Ashleigh Giovannini (19:14):

Right, mitigation teams,

Rebecca Gwilt (19:15):

Lawyers, et cetera,

Ashleigh Giovannini (19:17):

Breach notification to the users. And if you choose to pay the ransom, which is something we can talk about separately, that also can have a particular impact on how much it costs you to recover your data. It's also important to note that if you do pay the ransom, you may not recover that data anyway, and you're out additional money in that instance. So these breaches can be very unpredictable, and they can get very expensive very quickly. This is where your cyber liability insurance comes into play, and making sure you understand your limits there and the terms of your policy is critically important. But worst case scenario, this can put you out of business.
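As a rough back-of-the-envelope check on the numbers in this exchange (15 customers, roughly 1,000 lives each, and a response cost that Ashleigh puts in the $100 to $200 per record range; all figures are illustrative, not actuarial):

```python
# Back-of-the-envelope breach exposure using the figures from the discussion.
customers = 15
lives_per_customer = 1_000
records = customers * lives_per_customer  # 15,000 records exposed

low, high = 100, 200  # rough per-record response cost range from the episode
print(f"Records exposed: {records:,}")
print(f"Estimated response cost: ${records * low:,} to ${records * high:,}")
# -> Records exposed: 15,000
# -> Estimated response cost: $1,500,000 to $3,000,000
```

Numbers of that size against an early stage balance sheet are why "this can put you out of business" is not hyperbole.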

Bethany Corbin (20:04):

And I'll jump in here too, because Ashleigh's absolutely right about the financial consequences for your business. There are also patient safety consequences should a breach happen and patient data essentially be removed from an organization. Let's say you're a small startup company with a connected health device, you're contracted with a large hospital institution, and a hacker uses you as a stepping stone into that larger organization. That hacker can then take a lot of those healthcare network systems offline, which means that healthcare system is going to be essentially frozen. They're not going to be able to treat the patients they currently have in their hospital. They're not going to be able to access medical records; they can't see what a patient is allergic to or what medication they're on. So there are a lot of patient safety issues that come into play, all because you as a vendor didn't think you needed to prioritize privacy and security at this stage of your company.

Rebecca Gwilt (20:59):

Yeah, it's a really good point. This has happened to a couple of hospitals, and it's been fairly catastrophic for them. If you've got a good backup and redundancy system, it may just be a couple of hours that you're down, trying to treat patients without access to the medical records. But sometimes hospitals don't even have that. And I would add to the potential risk to your business, though I could bucket it in financial consequences, the reputational risk. When things like this happen, an investigation follows, and once it's uncovered that you didn't have the right protocols in place, it's hard to come back from that from a reputation perspective. And I would say, in general, buyers of digital health technologies are becoming much more stringent about what they need to see in place before they'll even go into business with a digital health provider. So this is becoming pretty significant: privacy and security controls implemented within a company are becoming a ticket to ride in a lot of cases. For some folks, they won't even get to the point where they get to have bad security and have a breach; they won't even get in the front door.

Bethany Corbin (22:25):

Actually, right on that...

Rebecca Gwilt (22:27):

Go ahead.

Bethany Corbin (22:27):

No, I just wanted to say that you're right about that, because what can happen, especially if you're a newer startup trying to do business with an established healthcare system, is that the privacy and security controls they're going to require from you before they ever enter into a contract are going to be much more stringent than the ones you would typically be subject to at that stage of your company. So if you don't have enhanced privacy and security protocols in place and you're trying to make those larger industry partnerships and networking connections, you're going to have a six to nine month period where you're rushing to enhance and implement all of the privacy and security controls you need, because you weren't able to build them up front.

Rebecca Gwilt (23:10):

That's right. That's absolutely right. I want to...

Ashleigh Giovannini (23:14):

Oh, sorry. I also had a thought on this.

Rebecca Gwilt (23:16):

No, I love it more and more.

Ashleigh Giovannini (23:18):

I think this manifests in an industry-wide way as well. While you as a digital health innovator may be investing quite heavily in privacy and security, other companies may not. So you may negotiate or try to enter into an agreement with a large healthcare organization that has been burned by a vendor before. Because they've been burned, they know what to look for, they know how to discuss with a vendor what the liability looks like in the instance of a privacy or security breach, and they know how to leverage their experience to get you to agree to some very strict privacy and security measures that you might not have otherwise. So this is a very industry-wide concern, and you as a vendor may also experience some of the fallout from other vendors who did not invest in privacy and security measures in the past.

Rebecca Gwilt (24:23):

Yeah, it's a good point. This is absolutely industry-wide and moving fairly quickly. There are so many players in the market now that buyers can be pretty picky about who they're going to bring on board. And like I said, sometimes this stuff is the ticket to ride. I want to go back to something you said, Ashleigh, which is that it's important to have cyber liability coverage. There are a number of ways to reduce risk for folks in the digital health space as it relates to privacy and security. One of them, surely, is having a very robust privacy and security program, but there are a couple of other things you can do, including getting cyber coverage. I'd love to hear you say a little bit more about that. And then Bethany, I'd love for you to close with some thoughts about the kinds of things that companies listening right now could do next week that would meaningfully move them forward from a privacy and security perspective if they're not ready to fully dive in, because there are smaller steps that can be taken to reduce these risks in the interim.

(25:35):

So go ahead, Ashleigh.

Ashleigh Giovannini (25:37):

Yeah, so cyber liability insurance is critically important. I have noticed in the last few months that a lot of digital health innovators are beginning to realize how critically important cyber liability insurance is for small and mid-sized companies. What you typically hear a broker talk about when it comes to cyber liability insurance is your per-occurrence and your aggregate limits. I'm not going to go super in-depth on that today, but it is important to note that per-occurrence limits typically relate to a single breach. And if you have multiple customers, or other people that you service, or users who make claims based on a breach, you may quickly come up on your per-occurrence limit for your cyber liability insurance. Why is this important? Because if you do have multiple customers, you've agreed to a certain amount of liability for each of those through a contractual arrangement.

(26:44):

It may be the case that you eventually end up paying for a cybersecurity incident out of pocket, when you may not have accounted for that in the growth of your company originally. So it is very important as an early stage or mid-stage company to understand what the terms of your cyber liability insurance are, what that policy requires you to do in the face of a breach, whether you should reach out to your insurer first, and what types of services they provide to help you address a breach. You should also understand, within your contractual arrangements, what types of insurance you are agreeing to have, what proof of insurance you need to provide to your customer or whomever you're contracting with, and how that will play out in the instance there is a privacy or security incident going forward. I highly, highly encourage you to have conversations with multiple brokers that sell cyber liability insurance to figure out what is best for your company at this stage. But you should also revisit that cyber liability insurance policy very frequently, especially as you take in more data, more customers, more patients, more users. It will be important for you to understand your risk tolerance juxtaposed with the amount of coverage you currently have, and will have as you grow.
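To illustrate the per-occurrence versus aggregate distinction with made-up numbers (the policy limits and incident costs below are hypothetical, not a coverage recommendation):

```python
# Hypothetical policy: $1M per occurrence, $3M aggregate for the policy term.
PER_OCCURRENCE_LIMIT = 1_000_000
AGGREGATE_LIMIT = 3_000_000

incidents = [1_400_000, 900_000, 1_200_000]  # costs of three separate breaches

paid_so_far = 0
for i, cost in enumerate(incidents, start=1):
    # Each incident is capped at the per-occurrence limit...
    covered = min(cost, PER_OCCURRENCE_LIMIT)
    # ...and the policy stops paying once the aggregate limit is exhausted.
    covered = min(covered, AGGREGATE_LIMIT - paid_so_far)
    paid_so_far += covered
    print(f"Incident {i}: cost ${cost:,}, insurer pays ${covered:,}, "
          f"out of pocket ${cost - covered:,}")
```

The gap between "cost" and "insurer pays" on each line is what the company carries itself, which is why both limits, not just the premium, belong in the conversation with your broker.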

Rebecca Gwilt (28:09):

Yeah, and I would add to that the importance of finding a broker that gets this industry and understands tech and health records in particular. They'll be able to tell you what a reasonable limit looks like, aggregate and per occurrence, for your company, and they'll be able to tell you about the different kinds of things that can be covered. There's no one-size-fits-all cyber policy. So for instance, say you have a policy that doesn't have a ransomware component and you get hit with a ransomware attack: they can help you on the breach side, but they can't help you pay that ransom. Those are the kinds of things you'll want to talk through with your broker, and Ashleigh's completely right that revisiting it as you grow will be very important, because we've seen clients start with $1 million to $3 million per occurrence and in aggregate, all the way up through many, many millions per occurrence, depending on the size of their company. Alright, thank you so much Ashleigh, and Bethany, I'm going to close with you. What are some things our listeners can do right now, right away, other than call the two of you, of course <laugh>, to make sure they're taking some steps towards protecting their privacy and security? And I will preface this by saying: if you think the answer is going to be "buy HIPAA-compliant platform software," it's not. Right, Bethany?

Bethany Corbin (29:41):

Right. Absolutely right. HIPAA-compliant software is not going to be the answer here. So there's a couple of immediate steps that you can take...

Rebecca Gwilt (29:47):

At least not the whole answer.

Bethany Corbin (29:49):

Not the whole answer, but there are a couple of steps you can take no matter what stage you're at in your company. The first is that I would highly encourage you, as Ashleigh and I have mentioned, to build your data map. That's something you can do at any stage of your company. It's something you should do at the beginning and then continually revisit as your company grows, as you add new products and new employees. That is a very foundational step, because you know what you have to protect against if you know where data is coming from. The next step, I would say, is to take a really hard look at your privacy policy, because this is what says how you're going to be using and disclosing the data you're collecting. We find that a lot of times privacy policies do not get the attention they deserve.

(30:32):

A lot of startup companies think: hey, my competitor is doing this, I'm going to go copy and paste their privacy policy for my website because it's essentially the same product. We very actively discourage that, because how your competitor uses and discloses data is going to be different from how you use and disclose that data. And if you don't have an accurate privacy policy that details what you're doing with data, you open yourself up to investigations, penalties, et cetera, from the Federal Trade Commission. That happened recently with Flo, I think it was in 2021, where their privacy policy didn't match how they were using and disclosing health data. So that would be my second recommendation. My third recommendation is, if you have any employees, make sure they're trained on basic cybersecurity hygiene, because a lot of times these external hackers are gaining access to your systems through phishing attacks and ransomware, and that usually requires somebody on your team to open a suspicious file or click a suspicious link; that's how they get access to your system. So by actively training and educating your workforce members on what to look for and what not to click, you're taking a great step right there towards enhancing and preserving the privacy of your organization.

Rebecca Gwilt (31:52):

And I would add to that: I am a big evangelist around people being your biggest vulnerability. Everybody thinks it's tech that's going to save you, but people are absolutely the largest vulnerability for your company. Not on purpose, of course, usually. But I would add to what Bethany's saying that there are free resources out there on the interwebs, through HHS, I believe, and elsewhere, that have modules already built that you can put in front of your people to teach them about privacy and security. You don't have to create one yourself, and you don't have to buy a super fancy bunch of modules. You can make an incremental step in training your staff with free resources that are available online, and I absolutely recommend that you do so. This has been jam-packed with information. I don't often do podcasts with two different voices. I want to give both of you the option to give some closing thoughts or add something I may have cut you off from sharing, so we can soak in all the knowledge before we close this out. Anything left?

Ashleigh Giovannini (33:03):

Yeah, I did want to say: I think we have become accustomed to aiming after we shoot on a lot of these issues when we're building a company and trying to get things off the ground and into the market very quickly. While privacy and security may seem looming and daunting, it is a very important investment for you to make as an early stage company to get the ball rolling here, and it will pay off for you in the long run, not just in protecting your users and your customers, but also in your ability to contract with bigger entities and close bigger deals. This is an extremely important investment, and we highly encourage you to build a program that is tailored to your needs and to revisit it as you go forward.

Rebecca Gwilt (33:58):

Yeah, I will tell you, I started my career focusing on privacy and security. I wrote a couple of regulations on privacy and security when I was at CMS, so that was where I started as a baby lawyer. And I will tell you, it's quite a thankless area of law to practice, but I would just repeat what Ashleigh said, which is that it's absolutely a great investment. It's not the sexiest thing, it's not the most exciting thing. It isn't closing, it isn't wheeling and dealing, but it absolutely has an impact on your ability to get in the door, to close deals, and to protect patients. These really are the unsung heroes of the healthcare tech world, and I just appreciate your time so much, both of you. If you haven't already, please subscribe to Decoding Healthcare Innovation and follow us on LinkedIn and Twitter. Join us next time, when we'll be speaking with Cynthia Plotch, co-founder of Stix, about reproductive rights in telemedicine in a shifting landscape, and how companies can do well and lead with their values. I'm super excited about that discussion. As always, you can check out the links and resources in our show notes, and you can find out more about our work with healthcare innovators at nixongwiltlaw.com. Thank you again, Ashleigh and Bethany, and that's all for this episode. I'll see y'all next time. Thanks.