If you had to name a safe pair of hands, healthcare and medical practitioners would probably rank at the top. However, the medical technology (MedTech) they use is in a vulnerable state. Recent years have proven this extensively, with an unprecedented number of cyberattacks targeting critical medical equipment in hospitals. Here, Neil Oliver, technical marketing manager at professional battery manufacturer Accutronics, explains what this means for MedTech manufacturers.
In 2015, researchers at a hacking convention in Kentucky reported finding a staggering 68,000 vulnerable MedTech systems in just one healthcare organisation. Subsequently, a senior researcher at security company Kaspersky Lab revealed at 2016's Security Analyst Summit that he had easily infiltrated an MRI machine and uncovered sensitive information.
The volume of cyberattacks shows that medical information and technology are valuable resources for those with malicious intent. However, this presents medical original equipment manufacturers (OEMs) with an excellent opportunity to reassess their approach to device security. At the start of 2016, the US Food and Drug Administration (FDA) drafted guidance for securing medical devices. Yet these proposed guidelines only go some way towards solving the issue of MedTech security and do not cover hardware security.
Cybersecurity is certainly a big issue for healthcare, but it is not the only aspect of medical security that practitioners must consider. When medical OEMs are designing their devices, it is critical they keep in mind the safety of the physical device itself.
One of the biggest issues facing devices across all industries is hardware hacking, in which a counterfeit or questionable component or peripheral is used in a device and undermines its integrity. This ranges from a USB memory stick carrying malware into a hospital computer to a counterfeit battery being used in a device, resulting in unexpected and abrupt power failure.
While some of these hardware hacks are the direct result of third parties with malicious intent, many are unintentionally caused by device manufacturers or practitioners. For example, a MedTech OEM under financial constraints might choose a grey-market or unbranded battery to power its device. While this initially reduces costs, these components come with no quality assurances, which increases the risk of sudden device failure or premature depletion.
It is for this reason that recent calls to consider security during design stages offer OEMs an opportunity. While the FDA guidelines are currently recommendations and not legally binding, responsible OEMs will abide by them and pay close attention to a device’s security — both cyber and physical — at each stage of the product development life cycle (PDLC).
For example, an OEM might want to consider algorithmic security to ensure that the only battery compatible with the device is one that the manufacturer itself designates. This would require consulting with a medical battery manufacturer early in the PDLC.
Algorithmic security is essentially software-based authentication. The equipment is assigned a unique hash value that corresponds with that of the battery; when a battery is connected to the device, it must first solve an equation that produces the matching hash value.
If a battery is unable to successfully complete the algorithm, the OEM determines what happens next. Critical devices will likely be programmed to display a warning message to users, while non-critical devices may be programmed not to power up at all. In either case, this mitigates the OEM's liability should a practitioner use a counterfeit battery that subsequently fails during use.
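The exchange described above can be sketched as a simple challenge-response check. This is a minimal illustration only, not a real battery-authentication protocol: the HMAC-SHA256 scheme, the `SHARED_KEY` value, and all function names here are assumptions chosen for clarity, and in practice the battery-side computation would run inside a secure gas-gauge chip rather than in host software.

```python
import hmac
import hashlib
import secrets

# Hypothetical shared secret provisioned into both the device and
# each genuine battery pack at manufacture (illustrative value only).
SHARED_KEY = b"oem-provisioned-secret"

def battery_respond(battery_key: bytes, challenge: bytes) -> bytes:
    """Battery side: compute an HMAC of the device's random challenge
    using whatever key this pack was provisioned with."""
    return hmac.new(battery_key, challenge, hashlib.sha256).digest()

def device_authenticate(battery_key: bytes) -> bool:
    """Device side: issue a fresh random challenge and check that the
    battery's response matches the expected HMAC."""
    challenge = secrets.token_bytes(16)
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    response = battery_respond(battery_key, challenge)
    return hmac.compare_digest(expected, response)

def on_auth_result(ok: bool, critical: bool) -> str:
    """Hypothetical OEM policy mirroring the article: critical devices
    warn the user but keep running; non-critical devices refuse power."""
    if ok:
        return "power on"
    return "warn user" if critical else "refuse power"

# A genuine pack (same key) authenticates; a counterfeit does not.
print(device_authenticate(SHARED_KEY))        # True
print(device_authenticate(b"counterfeit"))    # False
```

Because the device issues a fresh random challenge each time, a counterfeit cannot simply replay a previously observed response; it would need the provisioned key itself to compute the correct answer.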
Following these recent proposals on cybersecurity by the FDA, it is likely we will soon see the organisation push for legal guidelines on MedTech security. Should this be the case, it is vital that the FDA also considers the physical security of devices and rules concerning critical components. Only when both sides are effectively governed can healthcare professionals be the safe pair of hands they need to be.