
Friday 30 May 2014

Can I drop a pacemaker 0day?

Can I drop a pacemaker 0day at DefCon that is capable of killing people?

Computers now run our cars. It's now possible for a hacker to infect your car with a "virus" that can slam on the brakes in the middle of the freeway. Computers also now run medical devices like pacemakers and insulin pumps, so it's becoming possible to assassinate somebody by stopping their pacemaker with a Bluetooth exploit.

The problem is that manufacturers are 20 years behind in terms of computer "security". They don't just have vulnerabilities, they have obvious vulnerabilities. That means not only can these devices be hacked, they can easily be hacked by teenagers. Vendors do something like put a secret backdoor password in a device, believing nobody is smart enough to find it -- then a kid finds it in under a minute using a simple program like "strings".
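To make the "strings" point concrete, here is a minimal sketch, in Python, of what that utility does: scan a raw binary for runs of printable ASCII, which is exactly where hardcoded backdoor passwords turn up. The filename and the credential keywords below are hypothetical, purely for illustration.

    # A minimal re-implementation of the Unix "strings" utility:
    # scan a binary firmware dump for runs of printable ASCII.
    # Hardcoded backdoor passwords show up as exactly such runs.
    import re
    import sys

    MIN_LEN = 6  # ignore runs shorter than this, like "strings -n 6"

    def printable_strings(path, min_len=MIN_LEN):
        with open(path, "rb") as f:
            data = f.read()
        # Match runs of printable ASCII bytes (0x20-0x7e), min_len or longer
        for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
            yield match.group().decode("ascii")

    if __name__ == "__main__":
        # Usage: python strings.py firmware.bin
        # (the filename is hypothetical; any raw firmware dump works)
        for s in printable_strings(sys.argv[1]):
            # Candidate credentials tend to announce themselves
            if any(hint in s.lower() for hint in ("pass", "admin", "login")):
                print(s)

Point something like this at a firmware dump and a hardcoded password typically falls out in seconds, which is the point: no sophistication required.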

Telling vendors about the problem rarely helps, because vendors don't care. If they cared at all, they wouldn't have put the vulnerabilities in their products to begin with. 30% of such products have easily discovered backdoors, which is something vendors should already care about, so telling one of them that you've discovered they are part of the 30% won't help.

Historically, we've dealt with vendor unresponsiveness through the process of "full disclosure". If a vendor was unresponsive after we gave them a chance to first fix the bug, we simply published the bug ("drop 0day"), either on a mailing list, or during a talk at a hacker convention like DefCon. Only after full disclosure does the company take the problem seriously and fix it.

This process has worked well. Look at the evolution of products from Windows to Chrome: the threat of 0day has forced vendors to vastly improve them. Moreover, vendors now court 0day: Google pays you a bounty for Chrome 0day, with no strings attached on how you might also maliciously use it.

So let's say I've found a pacemaker with an obvious Bluetooth backdoor that allows me to kill a person, and that a year after I notified the vendor, they still ignore the problem and continue shipping vulnerable pacemakers to customers. What should I do? If I do nothing, more and more such pacemakers will ship, endangering more lives. If I disclose the bug, then hackers may use it to kill some people.

The problem is that dropping a pacemaker 0day is so horrific that most people would readily agree it should be outlawed. But, at the same time, without the threat of 0day, vendors will ignore the problem.

This is the question for groups that defend "coders' rights", like the EFF. Will they really defend coders in this hypothetical scenario, declaring that releasing 0day code is free speech that reveals a problem of public concern? Or will they agree that such code should be suppressed in the name of public safety?

I ask this question because right now they are avoiding the issue: whichever stance they take will anger a lot of people. This paper from the EFF on the issue seems to support disclosing 0days, but only in the abstract, not in a concrete scenario like the one I describe. The EFF has a history of backing away from its principles when they become unpopular. For example, they once fought against regulating the Internet as a public utility; now they fight for it in the name of net neutrality. Another example is selling 0days to the government, which the EFF criticizes. I doubt the EFF will continue to support disclosing 0days once those 0days can kill people.

By the way, it should be clear from the above on which side of this question I stand: for coders' rights.
