Security is serious business, and revealing unknown flaws can make or break people and companies. This is especially true in the healthcare industry. As more health issues are solved through implantable technologies, security will only become more important. But when do “announcements” of implant vulnerabilities cross from responsible disclosure into security theater?
When my wife sent me a link to a CNBC article entitled “Security researchers say they can hack Medtronic pacemakers”, I took notice. As posted previously, I have been a cyborg since July 2002. And in 2010, I received a replacement implant. At the time, I wondered whether these devices might be hacked. After all, these devices could be programmed over the air (OTA). Fortunately, their wireless range was (and still is) extremely limited. Indeed, it is fair to say that these devices have only “near-field communications” capability. So unless someone can get close to a patient, the possibility of a wireless attack is quite limited.
But as technology has advanced, so too have the threats of exploitation. Given recent advances, there was a fair chance that my device could be hacked in the same way that the NFC chips in a mobile phone can be hacked. In fact, when I cross-referenced the CNBC article with other articles, I saw a picture of the very same programmer that my cardiologist uses for me. It was the very same picture (from Medtronic) that I had posted on my personal blog over eight years ago. So as I opened the link from my wife, my heart was probably beating just a little more quickly. But I was relieved to see that CNBC was guilty of succumbing to the security theater that is Black Hat Vegas.
In this case, the Black Hat demonstrators had hacked a “programmer” (i.e., a really fancy laptop that loads firmware onto the implantable device). The demonstrators rightly noted that if a ‘bad actor’ wanted to injure a specific person, they could hack the “programmer” in the doctor’s office or at the hospital. Then, when the electrophysiology tech (EPT) performed a “device check”, the implanted device (and the patient) could be harmed.
This is not a new risk. The programmer (i.e., the laptop) could have been hacked from the very start. After all, the programmer is just a laptop with medical programs running on it. There is nothing particularly fancy about it.
The real risk is broader: more and more device-assisted health treatments will emerge, and along with their benefits, these devices will carry some risks. That is true of all new technologies, whether medical or not. There is the risk of bad design, software bugs, poor installation, or inattention to periodic updates. And there is the risk that the technology might be exploited. Of course, the fact that a pacemaker might fail during an EMP does not mean that the device should never be used.
It’s just a risk.
Fortunately, this is no different from the countless risks that we take every day. We trust car designers, driving instructors, other drivers, and even the weather forecasters whenever we drive our cars. And the fact that our cars are run by computers, and can therefore be hacked, doesn’t stop us from driving.
Let’s leave the security theater in Vegas. And let’s leave the paranoia to professionals – like Alex Jones.