Tuesday, December 20, 2016

Hacking Complex Systems


Back in the day, you could download a piece of software, reverse engineer / fuzz it, find bugs, notify the vendor, post on Full Disclosure, watch a patch come out, and move on to the next bug.
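
To give a sense of how simple that workflow could be, here is a minimal sketch of the kind of dumb mutation fuzzer that era relied on. The target binary name and seed file are hypothetical placeholders, and this is only an illustration of the general idea, not a tool from any specific engagement:

import random
import shutil
import subprocess
from pathlib import Path

# Hypothetical target binary and seed input -- placeholders for illustration only.
TARGET = "./parser_under_test"
SEED = Path("seed.bin")
OUTDIR = Path("fuzz_out")
OUTDIR.mkdir(exist_ok=True)

def mutate(data, flips=8):
    """Flip a handful of random bytes in the seed input."""
    buf = bytearray(data)
    for _ in range(flips):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

seed = SEED.read_bytes()
for i in range(10000):
    case = OUTDIR / "case_{}.bin".format(i)
    case.write_bytes(mutate(seed))
    try:
        proc = subprocess.run([TARGET, str(case)], stdout=subprocess.DEVNULL,
                              stderr=subprocess.DEVNULL, timeout=5)
    except subprocess.TimeoutExpired:
        continue  # hangs are interesting too, but keep the sketch simple
    # A negative return code means the process died on a signal (e.g. SIGSEGV).
    if proc.returncode < 0:
        shutil.copy(str(case), str(OUTDIR / "crash_{}.bin".format(i)))
        print("crash on iteration {}, signal {}".format(i, -proc.returncode))

One person, one binary, one machine: that was often the whole test setup.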

These days systems have become very complex. A system might include:
  • A HID (Touch screen, keyboard, other devices)
  • Data Inputs (USB key, Bluetooth, Wireless, Satellite, Cell)
  • Firmware (BIOS or other embedded aspects)
  • OS
  • Applications (both OEM and 3rd party)
  • Media Servers
  • Other control systems
  • Telematics interfaces

This collection of components may be very expensive, on the order of $250K in some cases, or $10-20K for a car. These components may be made by multiple different vendors, all with NDAs and MSAs between them.

This whole system is then certified and tested by numerous bodies such as the FAA, TSA, and NHTSA, as well as NAFTA OEMs, avionics manufacturers such as Boeing and Airbus, airlines, etc. There may be regulations and requirements around patch cycle timing, disclosure, and legal liability.

How, in this context, can these systems be tested for security issues in a reliable and effective manner? Right now there are several ways this testing occurs:

1.) Via Testing Contracts.

The vendor puts out a bid or otherwise engages a 3rd party security company to test the system. NDAs and MSAs are exchanged, access to the system is provided, testing is performed, and results are delivered. Fixes are developed and pushed out according to the schedule and requirements agreed upon by all the organizations outlined above.

PROS

The vendor has a level of protection that their reputation won't be tarnished by media disclosures, their IP stolen, etc. The vendor also has some assurance that the testers are competent and that a defined level of service will be delivered.

CONS

This process is not public, and people outside this framework have little to no insight into what is going on, how (or whether) testing is done, who is doing it, what fixes have been put in place, etc. This also limits the number of bright people who can see and test the system, almost ensuring that some bugs will be missed.

2.) Bug Bounties.

Vendors make some aspect of the system publicly available for anyone to test and pay a bounty for valid vulnerabilities discovered. In some special cases the vendor may make an entire system accessible for a limited amount of time (time-limited to offset the cost of the system).

PROS

The process is public and many eyes are on the product. It raises the product's exposure to new testers and approaches, and it builds a level of trust in the vendor and assurance that the vendor "cares about security".

CONS

Costs the vendor time and effort and often produces little more than noise, or bugs already known from internal testing (I'm basing this on my personal discussions with vendors in the real world). Testing quality is often very low. Often the holistic system cannot be tested this way, only individual components.

3.) Rogue Testing.

This is sort of where I came up in the industry initially, before moving more into 1.) above. The way this works is that a researcher (or team of researchers) and/or a security company gains access to a system in some way. Examples include buying a piece of the system on eBay or, in the case of publicly accessible systems such as avionics, testing it live. A car could be bought as well. This is essentially a black box approach, as access to all the back end systems, telematics, source, etc. will not be available.

PROS

A researcher can do more or less whatever they want without constraints. A security company can leverage this for media attention (marketing / sales), and it drums up interest for conference talks. Real bugs are found this way, and the vendor is technically notified, either as a heads-up from the finder or via the media.

CONS

No trust is developed between the vendor and the bug finder; in fact the relationship is almost always adversarial by its nature. The public receives an unclear picture of the true threat: do they trust the finder, who is often over-hyping to get attention, or the vendor, who has a material interest in under-hyping and disproving the bug?

I'm sure I am missing other pros and cons for each of these, so please feel free to send me ideas. I'm also sure there are other approaches to testing, which is why I am making this post. Here are some questions to consider:

  • Are complex systems such as avionics and automotive systems substantially the same, from a testing perspective, as Windows hosts or endpoint software?
  • Is live testing on a passenger vehicle really the right way to do security testing?
  • Should only professional security companies with contracts in hand be allowed to test?
  • Are bug bounties in their current incarnation really effective for these types of systems?
My answer to the above questions is probably no.

I propose that we, the security community, collectively try to come up with a better way or framework for doing this. Any ideas will be appreciated and considered. Are you already doing something in this arena that is better than what I have outlined? Is there something you thought would work but haven't gotten traction on?

I'd love to hear from vendors, sec companies, and researchers alike.

I also propose that unethical behavior in our industry be called out. Every time a company brushes up against extortion, over-hypes a bug, or claims credit for non-employees' work just for short-term sales, it damages the credibility of all of us and makes our jobs harder. Let's require the best of ourselves. Security has become huge, and is about to become bigger. Think how many times hacking has been in the mainstream media over the last year, and contrast that with 10 years ago. This is an industry that is about to explode. Do we really want to be found wanting when the world is finally ready to take us seriously?
valsmith
