I woke up Saturday morning to this headline in CSO magazine:
"The US Food and Drug Administration is urging patients with heart implants from Abbott Laboratories to visit a healthcare centre and install a firmware update that addresses a remote hacking vulnerability and a rapid battery depletion bug."
Oh, crap. This. Is. Not. Good.
While I don't have a pacemaker, my friend Michael does. He's had one for years. Millions of people around the world rely on these devices too. A remote hacking vulnerability could impact every one of them. This is not good.
This is also not a new headline. We've seen this before:
465,000 Pacemakers Recalled on Hacking Fears | Fortune (August 2017)
Over 8,600 Vulnerabilities Found in Pacemakers | The Hacker News (June 2017)
Fatal flaws in ten pacemakers make for Denial of Life attacks | The Register (Dec 2016)
When it comes to data theft, we've almost grown complacent. Hacks happen.
When it comes to healthy people dropping dead one day, that's a different story. That is not good.
It's a story that starts with software that is secure by design. Starting next month across the European Union, the "secure by design" requirement becomes law as GDPR enforcement goes live. This holds just as true for software powering medical devices as it does for the applications powering your car, personal IoT devices, and banking services.
You can read more about the secure by design legal requirements under Article 25 of GDPR. Think that's a European thing? Wrong. It applies to any business that handles the personal data of people in the EU, wherever that business is based.
Secure by design was born into law as part of GDPR, but the U.S. government is not far behind. In October 2017, Dr. Suzanne Schwartz of the FDA commented on medical device vulnerabilities and secure software development practices, saying:
"Because cybersecurity threats are a constant, manufacturers, hospitals, and other facilities must work to prevent them. There is a need to balance protecting patient safety and promoting the development of innovative technologies and improved device performance.
This means taking a total product lifecycle approach, starting at the product design phase when we build in security to help foil potential risks, followed by having a plan in place for managing any risks that might emerge, and planning for how to reduce the likelihood of future risks."
Then, earlier this week, the U.S. Food and Drug Administration announced that it is taking action to mitigate serious cybersecurity threats to internet-connected medical devices that could disrupt their operation. The FDA wants software and firmware in devices directly linked to patient safety (e.g., insulin pumps, pacemakers, cardioverter defibrillators) to be updatable on an ongoing basis so that known vulnerabilities can be patched. The FDA released its plan on April 17.
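Patching devices in the field safely depends on the device refusing any update it cannot authenticate. As a minimal sketch of that idea, the hypothetical verifier below checks an update image against a signature before it would ever be flashed. Real medical devices would use asymmetric signatures (e.g., ECDSA with a vendor public key burned into secure boot hardware); HMAC with a shared key stands in here purely for brevity, and all names and values are illustrative.

```python
import hashlib
import hmac

def verify_firmware(image: bytes, signature: bytes, key: bytes) -> bool:
    """Return True only if the update image carries a valid signature.

    A device applying this check rejects tampered or corrupted images
    before flashing. HMAC-SHA256 is a stand-in for a real asymmetric
    firmware signature scheme.
    """
    expected = hmac.new(key, image, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, signature)

# Hypothetical provisioning key and update payload, for illustration only.
key = b"device-provisioning-secret"
image = b"\x7fFIRMWARE-v2.1-payload"
good_sig = hmac.new(key, image, hashlib.sha256).digest()

assert verify_firmware(image, good_sig, key)                  # genuine update accepted
assert not verify_firmware(image + b"\x00", good_sig, key)    # tampered image rejected
```

The design point is that update capability alone is not enough; without authenticated updates, the very mechanism that delivers security patches becomes another remote attack surface.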
As a software development industry, we must do better to protect the lives of those depending on us. This is of foremost importance when it comes to connected medical devices, but just as critical when it comes to connected cars and other devices we rely upon for safety and security. Time-to-market priorities must not take precedence over life and limb.
Next month, secure by design becomes law. Not far behind that comes software liability for manufacturers. It's time we all raised the bar on the quality, integrity, and security of the software we are building.