Ranica Arrowsmith, Associate Editor
03.13.14
"Is there a doctor on board?"
It’s a phrase we’ve all heard on TV or in a dramatic movie scene, and it actually does get used in real life from time to time. Back in 2011, it was Eric J. Topol, M.D.’s turn to respond, during a flight from Washington, D.C., to San Diego, Calif. He and several colleagues were returning home from a National Institutes of Health conference, so of course, there were several physicians on board that flight when the announcement came over the plane intercom.
“Topol, this is your gig,” his colleagues said to him, according to David Albert, M.D., in a 2011 interview. Topol is a cardiologist, so his Scripps Health surgeon colleagues thought he would be the man for the job. And it turns out, he was. The passenger in distress turned out to have had, years previously, a stent placed for a coronary artery obstruction, and he appeared to be having a heart attack. Enter Albert: Topol had the AliveCor ECG (electrocardiography) for iPhone on him, which happens to be an invention of Albert’s. He was able to use the device, loaded with Albert’s software, to determine that the patient had 4 or 5 millimeters of ST elevation, which is indicative of an acute heart attack. Topol then instructed the plane to land immediately, which it did—somewhere in the vicinity of Cincinnati, Ohio. The patient was tended to and he lived.
The ECG software Topol used was granted 510(k) clearance from the U.S. Food and Drug Administration (FDA) in 2012. The software is now commercialized by AliveCor Inc., and is intended to give anyone the ability to track his or her heart activity at any time. The company provides a sensor that can wrap around most smart phones; it works by detecting heart rate and transmitting it to the software downloaded onto the phone, which then provides a reading. The company also provides a free mobile app for iOS (the platform used by Apple’s iPhone) or Android that works with the heart monitor to store single-channel ECGs. The data can then be stored in the cloud, from which users can access their information at any time, as well as grant access to their physicians, print PDFs, and email readings to caregivers and health professionals.
It’s hard to believe, but the question of whether software can be considered a medical device has already been under examination and discussion for nearly a decade. In 2007, Emergo Group Inc., an Austin, Texas-based medical device consulting services company, added an examination of the question “When is software a medical device?” to its resource library. The answer? Sometimes, but it depends. The more critical the function of the software, the more likely it is to be regulated. For instance, software used to plan cancer treatment doses and to control the settings of oncology treatment devices would and should be regulated as a medical device. However, software used within the overall design and manufacturing processes of medical devices, or software used to transmit administrative data such as a patient’s name and address, would not be regulated as a medical device in and of itself.
Vice President of Quality and Regulatory for Emergo Group Richard Vincins told Medical Product Outsourcing that going forward, it will in fact be industry that dictates how software for medical devices (and software as a medical device) will be regulated, rather than the FDA. And in fact, that really is how medical device regulation has evolved over the years.
“The FDA has reclassified many medical devices from, let’s say, class II to I, and that doesn’t come from the FDA,” Vincins said. “That comes from industry saying, ‘Look, there are all these class II devices out there—pH monitoring catheters and wound care products and so on—that are just not high risk anymore because the risks associated with those products are well known at this point—therefore we want these to be class I,’ and FDA listened. It’s going to be that way with software. Eventually software applications and products are going to become the norm, and the risks and hazards associated with them are going to be better understood. The problem with software nowadays is that FDA still views it as a black box—things happen in the software that nobody really understands well enough to make sure it works all the time—and that’s why FDA has software at a higher classification: because its main focus is to protect patients’ safety.”
Last year, the FDA issued its final guidance on mobile apps. Medical mobile apps are now ubiquitous, from ECG monitors to pain-tracking apps to apps that remind patients to take their medication on time. The guidance noted that the agency would use discretion in regulating health apps because, while such apps do essentially turn a smart phone into a medical device, in most cases they pose minimal risk to patients. The FDA announced it would “focus its regulatory oversight on a subset of mobile medical apps that present a greater risk to patients if they do not work as intended.” In its press announcement, the agency specifically cited ECG apps as an example of an app that turns a mobile device into a high-risk, regulated medical device.
But as Vincins pointed out, the guidance included a veritable “laundry list” of apps that the FDA said it would regulate at discretion, which is not very helpful to software companies attempting to develop medical software.
“It has been difficult for the agency to keep up with [software] technology because it is developing at such a rapid rate,” Vincins said. “However, probably the biggest aspect of regulation is that people don’t want the FDA to regulate it. When you’re talking about an app that can record your glucose measurement, or your weight, or your blood pressure, and then transmit that info to your physician and the physician can provide you some additional information—measurements that you yourself as a person are just taking, as you’re trying to use this app as just a good health benefit—and then you have these regulatory agencies trying to say this is a medical device because a physician uses this to diagnose you and provide treatment, regulation will get pushback.”
But of course, the FDA only has patient safety in mind as it tries to find its way towards an ideal system for medical software regulation. Brian Schmidt, principal engineer at Plexus Corp., a product realization company based in Neenah, Wis., has seen an increase in software development for home-use devices with which patients regularly come into direct contact.
“We’re seeing devices that get the patient out of the hospital and into their home and help them communicate with their doctor so that they don’t need to be monitored on site,” Schmidt said. “We’ve certainly worked on sustaining devices that continuously deliver some sort of treatment, the typical case being wearable drug-dispensing devices. There’s a push to get devices, whether it’s software on your phone or tablet or an actual device, that patients take home with them, giving them more continuous and more accessible treatment with fewer visits to the doctor’s office.”
With the increase in these types of home-use devices intended to decrease hospital readmissions and increase hands-on health, both FDA and medical device manufacturers are “cautious” about regulation.
“Like most everything with the FDA, it’s risk based—it’s about how much harm could be caused,” Schmidt continued. “They will follow up on what happens with the devices when they’re out there. Then if you start seeing incidents, the industry will follow through on it more closely.”
Full Spectrum Software Inc.’s OEM clients have had good experiences with the FDA in getting their software-related devices to market. President and Chief Technology Officer of the analytical software and medical device company Andrew Dallas told MPO, “We’ve had several clients that have gone through the 510(k) clearance process with a great level of satisfaction with the FDA. The FDA responded to their submission and got them through the process in a very timely fashion.
“The area met with the greatest frustration is medical devices as software such as mobile apps. The pace of change is incredible on those platforms and the innovation is remarkable—we are seeing great ideas from our clients. It’s a brave new world out there, and the FDA, like everyone else, is trying to get their head around it and make good decisions. They are taking an avid interest and doing their best to respond.”
Software Security
The draft guidance on cybersecurity released by the FDA last year had three general principles: confidentiality, integrity and availability. Confidentiality means that data, information, or system structures are accessible only to authorized persons and entities and are processed at authorized times and in the authorized manner, thereby helping ensure data and system security. Integrity means that data and information are accurate and complete and have not been improperly modified. Availability means that data, information, and information systems are accessible and usable on a timely basis in the expected manner—or in other words, information is accessible when it’s needed, where it’s needed. The FDA warned that failure to maintain cybersecurity can result in compromised device functionality, loss of data availability or integrity, or exposure of other connected devices or networks to security threats. These, in turn, have the potential to result in patient illness, injury, or death.
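Of the three principles, integrity is the most directly demonstrable in code. A minimal sketch in Python, using only the standard library's HMAC support (the record contents and key handling are illustrative, not from the FDA guidance), shows how improper modification of transmitted data can be detected:

```python
import hmac
import hashlib
import os

# In practice this would be a managed, provisioned key, not a per-run random value.
SECRET_KEY = os.urandom(32)

def protect(record: bytes) -> tuple[bytes, bytes]:
    """Integrity: attach an HMAC tag so any tampering becomes detectable."""
    tag = hmac.new(SECRET_KEY, record, hashlib.sha256).digest()
    return record, tag

def verify(record: bytes, tag: bytes) -> bool:
    """Reject any record whose contents were improperly modified in transit."""
    expected = hmac.new(SECRET_KEY, record, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)  # constant-time comparison

record, tag = protect(b"HR=72bpm")
assert verify(record, tag)            # unmodified data passes
assert not verify(b"HR=172bpm", tag)  # modified data is rejected
```

Confidentiality would layer encryption on top of this, and availability is a systems property (redundancy, offline fallback) rather than something a single function can show.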
The draft document confirms what most of MPO’s sources have suggested: Ultimately, the FDA’s concern lies with patients’ physical safety and wellbeing. Privacy concerns are important, certainly, but they are rightly secondary to a patient’s actual health.
Until this year, there had not been any real-world instances of medical device hacking. The only instance that made news was a non-malicious hack conducted by the computer security firm McAfee in order to prove that hacking was indeed possible. But in February 2014, the San Francisco Chronicle reported that hackers had penetrated the computer networks of three of the world’s largest medical device companies: Medtronic Inc., St. Jude Medical and Boston Scientific Corporation. According to the Chronicle’s unnamed source, the hackers’ access may have lasted as long as several months in the first half of 2013. Signs point to the hack originating in China, the source said.
“Like many companies, Boston Scientific experiences attempts to penetrate our networks and systems and we take such attempts seriously,” Denise Kaigler, senior vice president of corporate affairs and communications, told the newspaper. “We have a dedicated team to detect and mitigate attacks when they occur as well as to implement solutions to prevent future attacks.”
This type of reactive prevention is indicative of an attitude prevalent throughout not just the medical device industry, as Emergo’s Vincins suggested to MPO, but society as a whole. The ubiquity of data sharing and over-sharing via social networking, online shopping, and so on, has made us dependent on a system which we may not know much about. For instance, how much does the layperson really know about the security system of Amazon.com, to which so many freely give their credit card numbers every day? Similarly, how much do diabetes patients know about the security of their smart insulin pump on which they depend to deliver the correct dose of insulin? How much do highly trained, highly educated physicians know about data security and software systems?
“From talking with colleagues, I see that everybody assumes that medical software programs will have SSL and data encryption, etc.; but when I see companies validating software, they are not validating security,” Emergo’s Vincins said. “They’re not looking at the actual security of their programs, so when there is data and information on the cloud and things are being transmitted wirelessly, there’s quite an exposure there. Something drastic might happen and it might become very important at that time, but during validations, people really don’t look at security—rather, security is just assumed. When developers build these programs, whether they are using Linux or Java or some other platform, they all have some inherent security protocols built in, and it’s just assumed that they’re okay.”
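One way to make security part of validation rather than an assumption, as Vincins describes it usually being, is to assert security properties explicitly in the validation suite. A minimal sketch using Python's standard-library `ssl` module (the checks shown are illustrative of the idea, not a complete security validation):

```python
import ssl

# A validation-time check: rather than assuming the transport is secure,
# build the client's TLS context and assert its security properties.
ctx = ssl.create_default_context()            # verifies certificates by default
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions

# These assertions belong in the validation suite, so that a later change
# (e.g. someone disabling verification to "fix" a connection error) fails loudly.
assert ctx.verify_mode == ssl.CERT_REQUIRED, "server certificates must be verified"
assert ctx.check_hostname, "hostname checking must stay enabled"
assert ctx.minimum_version >= ssl.TLSVersion.TLSv1_2, "legacy TLS must be refused"
```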
To illustrate what medical software developers already have to consider before they even begin to consider security, Plexus’ Schmidt outlined the chain of development:
“There are several key elements,” Schmidt said. “First, engaging with your customer and understanding the domain. If it’s an imaging device, for instance, there are terminology and requirements that an imaging device must meet, and there are specific ways in which that device is used by the end customer whether it’s a patient or doctor. So first, it’s getting engaged and immersed in that domain.
“Then, we develop a core set of use cases, requirements, that the device must meet. This would mean: What does it have to do to be effective? What does it have to do to meet the patients’, doctors’ and customers’ needs? And what are the constraints? There are always constraints of schedule, budget and cost, or maybe there are constraints based on size or the environment the device is being placed into.
“Then at Plexus we have a technique called ‘Concept Convergence’ that we apply to try and balance all that out and try to focus on what requirement can be met within which constraints to give the best product to all the stakeholders.
“From there we start designing, implementing and testing. You go through this design phase, and you’re also assessing the risks involved, the hazards that the device could pose, and you’re designing to mitigate those risks.
“Towards the end as the development gets locked down, you will go through a more formal verification phase, and this is probably the heavy process side for medical devices—it’s really to control any change during that process and to do verification.
“Our customers will start taking over at this point. There’s validation to be performed: taking that medical device and applying it in-situ, and making sure it truly is effective. That’s the validation end of it.
“The life cycle doesn’t end there. It goes into production; it starts getting released to patients. There’s still the life cycle of tracking any issues with the device or maintaining its components as things go end of life and so on.”
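The chain Schmidt walks through, requirements, risk analysis, verification, then customer validation, maps naturally onto the traceability records a development team keeps. A minimal Python sketch of that chain (the requirement and hazard text are purely hypothetical, not from Plexus):

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One use case/requirement carried through the lifecycle described above."""
    text: str
    hazards: list[str] = field(default_factory=list)  # risks identified during design
    verified: bool = False    # formal verification, once the design is locked down
    validated: bool = False   # customer validation of the device in situ

req = Requirement("Deliver dose within ±5% of prescription")  # hypothetical
req.hazards.append("over-delivery after occlusion release")   # design-phase risk analysis
req.verified = True    # passed controlled verification testing
req.validated = True   # confirmed effective in its actual use environment

# Release requires the whole chain, not just a passing design review.
assert req.verified and req.validated
```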
Medical device developers and software programmers are focused on saving lives and building good software respectively. Security is not their prime focus, by definition. According to a 2012 paper published by the Ann Arbor Research Center for Medical Device Security, “Designers of [implantable medical devices] already balance safety, reliability, complexity, power consumption, and cost. However, recent research has demonstrated that designers should also consider security and data privacy to protect patients from acts of theft or malice, especially as medical technology becomes increasingly connected to other systems via wireless communications or the Internet.”1
The paper, authored by computer science and engineering experts from the University of Massachusetts-Amherst, outlined security goals for implantable medical device (IMD) design. Designers should, the paper said, encrypt sensitive traffic where possible, authenticate third-party devices where possible, use well-studied cryptographic building blocks instead of ad-hoc designs, assume an adversary can discover your source code and designs, not rely on security through obscurity, use industry-standard source-code analysis techniques at design time, develop a realistic threat model, and defend the most attractive targets first. If that sounds like a lot to consider, it is, which is why, as Vincins has observed, it is easy for device developers to assume the responsibility for security lies outside their design parameters.
“Designing for security has many subtleties,” the paper said. “In the context of IMDs, where devices may be physically inaccessible for years, it is particularly important to avoid design errors that lead to failures or recalls later. One common error is believing in security through obscurity—relying entirely on proprietary ciphers or protocols for secrecy … Sound security principles dictate that a system’s security must not depend on the secrecy of the algorithm or hardware; it is better to use well studied standard ciphers and spend more design effort protecting cryptographic keys.”
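The paper's advice, well-studied standard primitives rather than proprietary ciphers, can be illustrated with a simple challenge-response authentication sketch: the external programmer proves knowledge of a shared key using standard HMAC, and fresh random challenges prevent replay. The key value and message contents below are hypothetical, and a real IMD design would involve far more (key provisioning, power budgets, fail-open emergency access):

```python
import hmac
import hashlib
import os

# Hypothetical shared key provisioned at manufacture time.
DEVICE_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")

def issue_challenge() -> bytes:
    # The implant sends a fresh random nonce; reusing one would permit replay.
    return os.urandom(16)

def respond(challenge: bytes, key: bytes) -> bytes:
    # The programmer answers with a standard MAC over the challenge,
    # not an ad-hoc proprietary cipher.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def authenticate(challenge: bytes, response: bytes, key: bytes) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

c = issue_challenge()
assert authenticate(c, respond(c, DEVICE_KEY), DEVICE_KEY)
assert not authenticate(c, respond(c, b"wrong-key-guess!"), DEVICE_KEY)
```

Note that the scheme's security rests entirely on protecting `DEVICE_KEY`, exactly the trade-off the paper recommends: a public, well-studied algorithm plus design effort spent on key protection.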
“It’s interesting that there is this directive to communicate more openly, and yet the security aspects of it are effectively left to the reader,” said Full Spectrum’s Dallas. “The liability there is left to the providers. Providers are doing their best to make sure encryption is put in place. If they are providing the infrastructure for deployment—that is, if they are actually a cloud service providing software as a service—they have to have their security nailed down through standard firewalls, encryption and security methodologies. It’s a pretty daunting task and requires specialized skills.”
Risk Management and Quality Assurance
Security is just as important for software outside the medical device as for software inside it. Software made for electronic health records, medical device manufacturing processes, and any number of other healthcare systems that don’t come into direct contact with the patient has its own host of security concerns to consider.
“Security for software used in a highly regulated industry such as medical devices is critical, especially as it relates to 21 CFR [Code of Federal Regulations] Part 11,” Deborah Kacera, regulatory & industry strategist at Pilgrim Software Inc., a provider of enterprise quality and compliance management software solutions for highly regulated industries based in Tampa, Fla., told MPO. The code she is referring to is a subchapter of the FDA’s Federal Regulations that deals with the management of electronic records, signatures, and the requirements therein.
“Data as it relates to any customer-protected health information captured in our quality management solution requires security under HITECH/HIPAA2 or the European Union equivalent, and has to be taken into consideration and designed into the software,” Kacera continued. “As our customers are mandated by regulations, quality management solution providers must provide a system that aids them in meeting those regulations. However, as technology capabilities continue to grow, web security and customer privacy are no longer as important as security in the cloud. Now that medical device manufacturers are more willing to adopt cloud-based solutions for applications such as quality—mostly due to ongoing pressures for operational efficiency and application portfolio support reductions—security now has to be scrutinized in layers to ensure no breach or release of data can happen in the cloud.”
Risk management is another topic of serious interest in medical device manufacture, according to Tim Lozier, marketing manager for Farmingdale, N.Y.-based EtQ Inc., which makes quality and compliance management software. At its core, risk management is about identifying hazards, Lozier said in a 2012 interview with Technorazzi. It is a process that is objective, universal, and repeatable. Like security design, risk management is anticipatory, looking ahead to possible problems so they can be addressed preemptively.
“One of the biggest trends in technology adoption around life sciences, especially device, is the concept of risk management,” Lozier told MPO. “Too often, medical device companies are trying to keep up with the pace of the market and release products, while maintaining compliance. The challenge centers on how you can maintain compliance and make better decisions in an objective and systematic method. Risk tools are helping to achieve this. By assigning risk levels to adverse events, companies are able to leverage quantitative risk-based data to help them make a decision on how to handle these events. Building in risk tools is helping to maintain an acceptable level of compliance, while keeping up with the pace of business.”
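Assigning risk levels to adverse events typically reduces to a severity-by-probability matrix. A minimal Python sketch of the idea follows; the scales, index thresholds, and acceptability bands are illustrative assumptions, not EtQ's actual model:

```python
# Illustrative ordinal scales; a real risk program defines and justifies its own.
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "critical": 4}
PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "frequent": 4}

def risk_level(severity: str, probability: str) -> str:
    """Map an adverse event to an acceptability band via a simple risk index."""
    index = SEVERITY[severity] * PROBABILITY[probability]
    if index <= 3:
        return "acceptable"
    if index <= 8:
        return "reduce as far as practicable"
    return "unacceptable"

# The same event always maps to the same band: objective and repeatable.
assert risk_level("negligible", "remote") == "acceptable"
assert risk_level("critical", "frequent") == "unacceptable"
```

The point is less the arithmetic than the property Lozier describes: the mapping is explicit and quantitative, so decisions about two similar adverse events cannot quietly diverge.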
Other areas becoming increasingly important in medical device development software are traceability, reporting, and analytics. It is vital, Lozier said, to be able to trace incidents and events in the manufacturing process, both for compliance and validation reasons. Tying analytics to a compliance system fosters continuous improvement, as it can uncover trends, risks, opportunities and areas of improvement.
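Tying analytics to compliance data can be as simple as counting recurring patterns in traced incidents to surface a trend. A short Python sketch (the lot numbers and failure modes are hypothetical; a real system would query the quality management database):

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Incident:
    lot: str           # traceability back to a manufacturing lot
    failure_mode: str  # what went wrong

# Hypothetical incident log, as traced through the manufacturing process.
log = [
    Incident("LOT-014", "sensor drift"),
    Incident("LOT-014", "sensor drift"),
    Incident("LOT-015", "battery fault"),
    Incident("LOT-014", "sensor drift"),
]

# Analytics over compliance records: surface the most frequent (lot, mode) pair.
trend = Counter((i.lot, i.failure_mode) for i in log)
worst, count = trend.most_common(1)[0]
assert worst == ("LOT-014", "sensor drift") and count == 3
```

Because each incident carries its lot, the trend points straight back to a specific slice of the manufacturing process, which is what makes the analytics actionable for continuous improvement.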
This intangible thing we call “software” now dominates almost every area of medical device development and application. Software systems manage supply chains, validation systems, records, communication systems, and device function itself. As regulation struggles to keep up, innovation forges on, creating more and more ways to make people healthier, save healthcare systems money, and facilitate healthcare itself.
References:
1. “Design Challenges for Secure Implantable Medical Devices” by Wayne Burleson of the department of electrical and computer engineering and Shane S. Clark, Benjamin Ransford and Kevin Fu of the department of computer science, University of Massachusetts-Amherst.
2. The Health Insurance Portability and Accountability Act of 1996. Title II of HIPAA, known as the Administrative Simplification provisions, requires the establishment of national standards for electronic health care transactions and national identifiers for providers, health insurance plans, and employers.
This well-known phrase, a phrase we’ve all heard on TV or in the movie theater during a dramatic scene on screen, actually does get used in real life from time to time. And back in 2011, it was Eric J. Topol, M.D.’s turn to respond during a flight from Washington, D.C., to San Diego, Calif. He and several colleagues were returning home from a National Institutes of Health conference, so of course, there were several physicians on board that flight when the announcement was heard over the plane intercom.
“Topol, this is your gig,” his colleagues said to him, according to David Albert, M.D., in a 2011 interview. Topol is a cardiologist, so his Scripps Health surgeon colleagues thought he would be the man for the job. And it turns out, he was. The passenger in distress turned out to have had, years previously, a stent placed for a coronary artery obstruction, and he appeared to be having a heart attack. Enter Albert: Topol had the AliveCor ECG (electrocardiography) for iPhone on him, which happens to be an invention of Albert’s. He was able to use the device, loaded with Albert’s software, to determine that the patient had 4 or 5 millimeters of ST elevation, which is indicative of an acute heart attack. Topol then instructed the plane to land immediately, which it did—somewhere in the vicinity of Cincinnati, Ohio. The patient was tended to and he lived.
The ECG software Topol used was granted 510(k) clearance from the U.S. Food and Drug Administration (FDA) in 2012. The software now is commercialized by AliveCor Inc., and is intended to give anyone the ability to track his or her heart activity at any time. The company provides a sensor that can wrap around most smart phones, and it works by detecting heart rate and transmitting it to the software downloaded onto the phone, which then provides a reading. The company also provides a free mobile app for iOS (the platform used by Apple’s iPhone) or Android that works with the heart monitor to store single-channel ECGs. The data then can be stored in the cloud from which users can access their information at any times, as well as grant access to their physicians, print PDFs, and email it to caregivers and health professionals.
It’s hard to believe that the question of whether software can be considered a medical device has already been under examination and discussion for almost a decade, give or take a few years. In 2007, Emergo Group Inc., an Austin, Texas-based medical device consulting services company, added an examination of the question “When is software a medical device?” to its resource library. The answer? Sometimes, but it depends. The more critical the function of the software, the more likely it is to be regulated. For instance, software used to plan cancer treatment doses and to control the setting of oncology treatment devices would and should be regulated as a medical device. However, software used within the overall design and manufacturing processes of medical devices, or software used to transmit administrative data such as a patient’s name and address, would not be regulated as a medical device in of itself.
Vice President of Quality and Regulatory for Emergo Group Richard Vincins told Medical Product Outsourcing that going forward, it will in fact be industry that dictates how software for medical devices (and software as a medical device) will be regulated, rather than the FDA. And in fact, that really is how medical device regulation has evolved over the years.
“The FDA has reclassified many medical devices from let’s say class II to I, and that doesn’t come from the FDA,” Vincins said. “That comes from industry saying, “Look, there are all these class II devices out there—pH monitoring catheters and wound care products and so on—that are just not high risk anymore because the risk associated with those products are well known at this point—therefore we want these to be class I,’ and FDA listened. It’s going to be that way with software. Eventually software applications and products are going to become the norm, and the risks and hazards associated with them are going to be better understood. The problem with software nowadays is that FDA still views it as a black box, that things happen in the software that nobody really understands to make sure it works all the time, and that’s why FDA has software at a higher classification: because their main focus is to protect patients’ safety.”
Last year, the FDA issued its final guidance on mobile apps. Medical mobile apps now are ubiquitous, from ECG monitors to pain-tracking apps to apps that remind patients to take their medication on time. The guidance noted that the agency would use discretion in regulating health apps, as while such apps do essentially turn a smart phone into a medical device, in most cases they pose minimal risk to patients. FDA announced it would “focus its regulatory oversight on a subset of mobile medical apps that present a greater risk to patients if they do not work as intended.” In its press announcement, the agency specifically cited ECG apps as an example of an app that turns a mobile device into a high-risk, regulated medical device.
But as Vincins pointed out, the guidance included a veritable “laundry list” of apps that the FDA said it would regulate at discretion, which is not very helpful to software companies attempting to develop medical software.
“It has been difficult for the agency to keep up with [software] technology because it is developing at such a rapid rate,” Vincins said. “However, probably the biggest aspect of regulation is that people don’t want the FDA to regulate it. When you’re talking about an app that can record your glucose measurement, or your weight, or your blood pressure, and then transmit that info to your physician and the physician can provide you some additional information—measurements that you yourself as a person are just taking, as you’re trying to use this app as just a good health benefit—and then you have these regulatory agencies trying to say this is a medical device because a physician uses this to diagnose you and provide treatment, regulation will get pushback.”
But of course, the FDA only has patient safety in mind as it tries to find its way towards an ideal system for medical software regulation. Brian Schmidt, principal engineer at Plexus Corp., a product realization company based in Neenah, Wis., has seen an increase in software development for home-use devices with which patients regularly come into direct contact.
“We’re seeing devices that get the patient out of the hospital and into their home and help them communicate with their doctor so that they don’t need to be monitored on site,” Schmidt said. “We’ve certainly worked on sustaining devices that continuously deliver some sort of treatment, the typical case being wearable drug-dispensing devices. There’s a push to get devices, whether its software on your phone or tablets, or actual devices, that patients take home with them, and give them more continuous and more accessible treatment with less visits to the doctor’s office.”
With the increase in these types of home use devices intended to decrease hospital readmissions and increase hands-on health, both FDA and medical device manufacturers are “cautious” about regulation.
“Like most everything with the FDA, it’s risk based, it’s about how much harm could be caused,” Schmidt continued. They will follow up on what happens with the devices when they’re out there. Then if you start seeing incidences, the industry will follow through on it more closely.”
Full Spectrum Software Inc.’s OEM clients have had good experiences with the FDA in getting their software-related devices to market. President and Chief Technology Officer of the analytical software and medical device company Andrew Dallas told MPO, “We’ve had several clients that have gone through the 510(k) clearance process with a great level of satisfaction with the FDA. The FDA responded to their submission and got them through the process in a very timely fashion.
“The area met with the greatest frustration is medical devices as software such as mobile apps. The pace of change is incredible on those platforms and the innovation is remarkable—we are seeing great ideas from our clients. It’s a brave new world out there, and the FDA, like everyone else, is trying to get their head around it and make good decisions. They are taking an avid interest and doing their best to respond.”
Software Security
The draft guidance on cyber security released by the FDA last year had three general principles: confidentiality, integrity and availability. Confidentiality that data, information, or system structures are accessible only to authorized persons and entities and are processed at authorized times and in the authorized manner, thereby helping ensure data and system security. Integrity means that data and information are accurate and complete and have not been improperly modified. Availability means that data, information, and information systems are accessible and usable on a timely basis in the expected manner—or in other words, information is accessible when it’s needed, where it’s needed. The FDA warned that failure to maintain cybersecurity can result in compromised device functionality, loss of data availability or integrity, or exposure of other connected devices or networks to security threats. These, in turn, have the potential to result in patient illness, injury, or death.
The draft document confirms what most of MPO’s sources have suggested: Ultimately, FDA’s concern lies with patient’s physical safety and wellbeing. Privacy concerns are important, certainly, but they are rightly secondary to a patient’s actual health.
Until this year, there had not been any real-world instances of medical device hacking. The only instance that made news was a non-malicious hack conducted by the computer security firm McAfee in order to prove that hacking was indeed possible. But in February 2014, the San Francisco Chronicle reported that hackers had penetrated the computer networks of three of the worlds largest medical device companies: Medtronic Inc., St. Jude Medical and Boston Scientific Corporation. According to the Chronicle’s unnamed source, the hacker’s access may have lasted as long as several months in the first half of 2013. Signs point to the source of the hack coming from China, the source said.
“Like many companies, Boston Scientific experiences attempts to penetrate our networks and systems and we take such attempts seriously,” Denise Kaigler, senior vice president of corporate affairs and communications, told the newspaper. “We have a dedicated team to detect and mitigate attacks when they occur as well as to implement solutions to prevent future attacks.”
This type of reactive prevention is indicative of an attitude prevalent throughout not just the medical device industry, as Emergo’s Vincins suggested to MPO, but society as a whole. The ubiquity of data sharing and over-sharing via social networking, online shopping, and so on, has made us dependent on systems we may know little about. For instance, how much does the layperson really know about the security system of Amazon.com, to which so many freely give their credit card numbers every day? Similarly, how much do diabetes patients know about the security of the smart insulin pump on which they depend to deliver the correct dose of insulin? How much do highly trained, highly educated physicians know about data security and software systems?
“From talking with colleagues, I see that everybody assumes that medical software programs will have SSL and data encryption, etc.; but when I see companies validating software, they are not validating security,” Emergo’s Vincins said. “They’re not looking at the actual security of their programs, so when there is data and information on the cloud and things are being transmitted wirelessly, there’s quite an exposure there. Something drastic might happen and it might become very important at that time, but during validations, people really don’t look at security—rather, security is just assumed. When developers build these programs, whether they are using Linux or Java or some other platform, they all have some inherent security protocols built in and it’s just assumed that they’re okay.”
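One way a validation effort could probe such assumptions, rather than take them on faith, is to assert security properties directly in an automated test. A minimal sketch in Python, using the standard ssl module (the two checks shown are illustrative examples, not a complete security validation):

```python
import ssl

# Verify that the TLS context the application would use actually enforces
# certificate checking, instead of assuming the platform default is safe.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED, "server certificates must be verified"
assert ctx.check_hostname, "hostname checking must be enabled"
```

Checks like these turn an assumed property (“we use SSL”) into a tested one, and fail loudly if a later code change quietly disables verification.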
To illustrate what medical software developers already have to consider before they even begin to consider security, Plexus’ Schmidt outlined the chain of development:
“There are several key elements,” Schmidt said. “First, engaging with your customer and understanding the domain. If it’s an imaging device, for instance, there are terminology and requirements that an imaging device must meet, and there are specific ways in which that device is used by the end customer whether it’s a patient or doctor. So first, it’s getting engaged and immersed in that domain.
“Then, we develop a core set of use cases—requirements that the device must meet. This would mean, what does it have to do to be effective? What does it have to do to meet the patients’, doctors’ and customers’ needs? And what are the constraints? There are always constraints of schedule, budget and cost—or maybe there are constraints based on size or the environment the device is being placed into.
“Then at Plexus we have a technique called ‘Concept Convergence’ that we apply to try and balance all that out and try to focus on what requirement can be met within which constraints to give the best product to all the stakeholders.
“From there we start designing, implementing and testing. You go through this design phase, and you’re also assessing the risks involved, the hazards that the device could pose, and you’re designing to mitigate those risks.
“Towards the end as the development gets locked down, you will go through a more formal verification phase, and this is probably the heavy process side for medical devices—it’s really to control any change during that process and to do verification.
“Our customers will start taking over at this point. There’s validation to be performed: taking that medical device and applying it in situ, and making sure it truly is effective. That’s the validation end of it.
“The life cycle doesn’t end there. It goes into production; it starts getting released to patients. There’s still the life cycle of tracking any issues with the device or maintaining its components as things go end of life and so on.”
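The formal verification phase Schmidt describes is commonly supported by a traceability matrix linking each requirement to the tests that cover it, so gaps are visible before lockdown. A minimal sketch of that bookkeeping (the requirement and test IDs are invented for illustration):

```python
# Map each requirement ID to the verification tests that cover it.
trace_matrix = {
    "REQ-001": ["TEST-ECG-01", "TEST-ECG-02"],
    "REQ-002": ["TEST-ALARM-01"],
    "REQ-003": [],  # not yet covered by any test
}

def uncovered(matrix: dict) -> list:
    """Return requirement IDs with no covering test -- a verification gap."""
    return sorted(req for req, tests in matrix.items() if not tests)

assert uncovered(trace_matrix) == ["REQ-003"]
```

Running such a check on every change is one way to “control any change during that process,” since a requirement whose tests are deleted immediately reappears as uncovered.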
Medical device developers and software programmers are focused on saving lives and building good software, respectively. Security is not their prime focus, by definition. According to a 2012 paper published by the Ann Arbor Research Center for Medical Device Security, “Designers of [implantable medical devices] already balance safety, reliability, complexity, power consumption, and cost. However, recent research has demonstrated that designers should also consider security and data privacy to protect patients from acts of theft or malice, especially as medical technology becomes increasingly connected to other systems via wireless communications or the Internet.”1
The paper, authored by computer science and engineering experts from the University of Massachusetts-Amherst, outlined security goals for implantable medical device (IMD) design. Designers should, the paper said: encrypt sensitive traffic where possible; authenticate third-party devices where possible; use well-studied cryptographic building blocks instead of ad hoc designs; assume an adversary can discover their source code and designs; not rely on security through obscurity; use industry-standard source-code analysis techniques at design time; develop a realistic threat model; and defend the most attractive targets first. If that sounds like a lot to consider, it is, which is why, as Vincins has observed, it is easy for device developers to assume the responsibility for security lies outside their design parameters.
“Designing for security has many subtleties,” the paper said. “In the context of IMDs, where devices may be physically inaccessible for years, it is particularly important to avoid design errors that lead to failures or recalls later. One common error is believing in security through obscurity—relying entirely on proprietary ciphers or protocols for secrecy … Sound security principles dictate that a system’s security must not depend on the secrecy of the algorithm or hardware; it is better to use well studied standard ciphers and spend more design effort protecting cryptographic keys.”
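The paper’s preference for well-studied primitives over proprietary schemes can be made concrete. Below is a minimal sketch, using Python’s standard hashlib module, of deriving distinct per-device keys from a master secret with PBKDF2-HMAC-SHA256; the secret, serial numbers, and iteration count are illustrative, not from the paper:

```python
import hashlib

# Derive a per-device key from a master secret using a well-studied
# primitive (PBKDF2-HMAC-SHA256) rather than an ad hoc home-grown scheme.
master_secret = b"manufacturer-master-secret"  # illustrative only
iterations = 100_000

device_key = hashlib.pbkdf2_hmac(
    "sha256", master_secret, b"IMD-000123", iterations
)
other_key = hashlib.pbkdf2_hmac(
    "sha256", master_secret, b"IMD-000124", iterations
)

# The algorithm is public; security rests entirely on the secret inputs,
# which is exactly the opposite of security through obscurity.
assert device_key != other_key  # each device serial yields a distinct key
assert len(device_key) == 32    # SHA-256 digest length
```

Because the cipher and derivation scheme are standard and openly analyzed, the design effort shifts to where the paper says it belongs: protecting the cryptographic keys.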
“It’s interesting that there is this directive to communicate more openly, and yet the security aspects of it are effectively left to the reader,” said Full Spectrum’s Dallas. “The liability there is left to the providers. Providers are doing their best to make sure encryption is put in place. If they are providing the infrastructure for deployment—that is, if they are actually a cloud service providing software as a service—they have to have their security nailed down through standard firewalls, encryption and security methodologies. It’s a pretty daunting task and requires specialized skills.”
Risk Management and Quality Assurance
Security is just as important for software outside the medical device as inside it. Software made for electronic health records, medical device manufacturing processes, and any number of other healthcare systems that don’t come into direct contact with the patient has its own host of security concerns to consider.
“Security for software used in a highly regulated industry such as medical devices is critical, especially as it relates to 21 CFR [Code of Federal Regulations] Part 11,” Deborah Kacera, regulatory & industry strategist at Pilgrim Software Inc., a provider of enterprise quality and compliance management software solutions for highly regulated industries based in Tampa, Fla., told MPO. The code she is referring to is the part of FDA regulations that governs electronic records, electronic signatures, and the requirements therein.
“Data as it relates to any customer-protected health information captured in our quality management solution requires security under HITECH and HIPAA2 or the European Union version, and has to be taken into consideration and designed into the software,” Kacera continued. “As our customers are mandated by regulations, quality management solution providers must provide a system that aids them in meeting those regulations. However, security for the web and privacy for customers is no longer as important as security in the cloud as technology capabilities continue to grow. Now that medical device manufacturers are more willing to adopt cloud-based solutions for applications such as quality—mostly due to on-going pressures for operational efficiency and application portfolio support reductions—security now has to be scrutinized in layers to ensure no breach or release of data can happen in the cloud.”
Risk management is another topic of serious interest in medical device manufacture, according to Tim Lozier, marketing manager for Farmingdale, N.Y.-based EtQ Inc., which makes quality and compliance management software. At its core, risk management is about identifying hazards, Lozier said in a 2012 interview with Technorazzi. It is a process that is objective, universal, and repeatable. Similar to security design, risk management is anticipatory, looking forward to possible problems so they can be fixed preemptively.
“One of the biggest trends in technology adoption around life sciences, especially device, is the concept of risk management,” Lozier told MPO. “Too often, medical device companies are trying to keep up with the pace of the market and release products, while maintaining compliance. The challenge centers on how you can maintain compliance and make better decisions in an objective and systematic method. Risk tools are helping to achieve this. By assigning risk levels to adverse events, companies are able to leverage quantitative risk-based data to help them make a decision on how to handle these events. Building in risk tools is helping to maintain an acceptable level of compliance, while keeping up with the pace of business.”
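The quantitative, risk-based approach Lozier describes is often implemented as a simple severity-by-probability matrix that buckets each adverse event into an action category. A minimal sketch (the 1–5 scales, thresholds, and category names are illustrative, not taken from any cited standard or product):

```python
def risk_level(severity: int, probability: int) -> str:
    """Score an adverse event on 1-5 scales and bucket the product."""
    score = severity * probability  # simple risk priority number
    if score >= 15:
        return "unacceptable"   # requires immediate mitigation
    if score >= 6:
        return "investigate"    # mitigate as far as practicable
    return "acceptable"         # document and monitor

# A severe, likely event demands action; a minor, rare one is logged.
assert risk_level(5, 4) == "unacceptable"
assert risk_level(3, 2) == "investigate"
assert risk_level(1, 2) == "acceptable"
```

Because the scoring is explicit and repeatable, two reviewers handling the same event reach the same bucket, which is the objectivity Lozier points to.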
Other areas that are becoming increasingly important in medical device development software are traceability and reporting and analytics. It is vital, Lozier said, to be able to trace incidents and events in the manufacturing process, both for compliance and validation reasons. Tying analytics to a compliance system fosters continuous improvement, as it can uncover trends, risks, opportunities and areas of improvement.
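The trend-spotting Lozier describes can be as simple as counting incident records by failure mode and surfacing the most frequent one. A minimal sketch using Python’s standard collections module (the record fields and values are invented for illustration):

```python
from collections import Counter

# Each complaint record carries a device lot and a failure mode;
# trending them can surface a systemic issue early.
incidents = [
    {"lot": "A17", "mode": "battery depletion"},
    {"lot": "A17", "mode": "battery depletion"},
    {"lot": "B02", "mode": "sensor drift"},
    {"lot": "A17", "mode": "battery depletion"},
]

by_mode = Counter(rec["mode"] for rec in incidents)

# The dominant failure mode is an obvious candidate for a CAPA investigation.
assert by_mode.most_common(1) == [("battery depletion", 3)]
```

In a real compliance system the records would come from the complaint database rather than a literal list, but the principle is the same: traceable records feed analytics, and analytics point back to areas of improvement.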
This intangible thing we call “software” now dominates almost every area of medical device development and application. Software systems manage supply chains, validation systems, records, communication systems, and device function itself. As regulation struggles to keep up, innovation forges on, creating more and more ways to make people healthier, save healthcare systems money, and facilitate healthcare itself.
References:
1. “Design Challenges for Secure Implantable Medical Devices” by Wayne Burleson of the department of electrical and computer engineering and Shane S. Clark, Benjamin Ransford and Kevin Fu of the department of computer science, University of Massachusetts-Amherst.
2. The Health Insurance Portability and Accountability Act of 1996. Title II of HIPAA, known as the Administrative Simplification provisions, requires the establishment of national standards for electronic health care transactions and national identifiers for providers, health insurance plans, and employers.