Let's Call Stunt Hacking What It Is: Media Whoring
I recently read this article: http://www.foxnews.com/tech/2015/03/17/ground-control-analysts-warn-airplane-communications-systems-vulnerable-to/ and it brought to mind some thoughts that have been percolating for quite a while. Sometime last year, I believe, Dave Aitel coined the term "stunt hacking," which I think is a pretty good way to describe it. We often see these media blitzes about someone hacking a car, or an airplane, or some other device. The public, with a limited understanding of the technology, and the media, with an even worse one, whip themselves into a frenzy of outrage; the security company hopes this translates into sales leads, and the researcher hopes it translates into name recognition leading to jobs, raises, conference talks, etc.
A question that I think we should keep in mind is: why would a company hire someone who just publicly displayed how little they understand about the technology and made their desired potential client look bad?
There are two problems with this: 1.) the research is often FUD or based on a very limited understanding of real-world deployment, or 2.) any genuinely valuable technical research gets lost in the hype.
Let me be clear: I am not saying that researchers like Charlie Miller or Barnaby Jack haven't contributed meaningful or groundbreaking research to the community (they have), but many ride a hype wave that is often unwarranted. Unscrupulous infosec companies take advantage of such researchers' work to drive sales of mediocre consulting services as well.
The practice of companies pushing their best researchers to drop and overhype controversial or gimmicky bugs makes no business sense for either the security vendor or the services purchaser. Who wins in the long run? The vendor loses credibility and the purchaser suffers in the PR space.
Stunt hacking often works something like this:
1.) Purchase, from eBay or elsewhere, some component of a widely used system that doesn't look like a computer but is built on computer-like underlying technology.
2.) Since physical access to the device is guaranteed (unlike in the real world), spend a period of time analyzing and understanding the device.
3.) Develop or acquire some tool set to interact with the device.
4.) Make the device do something that the public perceives as out of the ordinary or unusual.
5.) Issue a number of hype-building press releases. (The media has a vested interest in producing spectacular stories.)
6.) Jump on the security conference talk circuit and present the research as many times as possible.
There are several issues with this, and I can use some real-world examples to explain them. When you claim that you can hack an airplane based on something you saw (or worse, did) on a flight, and that a particular vendor is or is not behaving responsibly about security, you are missing a number of things:
- FAA Involvement - There are processes for approval, auditing, development and release cycles that pass through FAA policies. This affects time frames for patches to be pushed, what kinds of software can be installed, and how things are updated and inter-connected.
- Airline Involvement - What a particular vendor develops is often heavily modified or integrated into an airline's customized product suite. This means that company A could develop a piece of hardware or software for airplanes, the airline buys it, then the airline drastically changes it. It may not be immediately obvious where the responsibility for a security issue lies.
- Aircraft Manufacturer Involvement - Essentially the same as the previous point.
- Air Crews - Maintenance and flight crews have the ability to modify some settings and make changes to the system.
- Product Vendor - The originator of a particular product. If they want to push a change, such as a security fix, all the above stakeholders and more have to be involved in that process. That means that an issue can be known, a fix developed and released, and it can take months or even years while it transitions all the stakeholders and each makes a business decision about applicability and severity before it reaches a particular airplane.
- Safety Concerns - Any technology that goes on to an aircraft is rigorously analyzed and tested for any potential impact to flight safety. Even if this technology doesn't touch the flight systems, its presence on the plane requires that it be checked. This leads to a slow down in the deployment of both new technologies, as well as fixes.
- End of Life Cycles - An airline can purchase a particular system, but that doesn't mean it will purchase a new system or upgrade the old one. Serious fixes will likely be implemented, but as technology changes, older systems may fall by the wayside in security maintenance. It is a valid business decision for an airline or other organization to weigh the cost of general technology upgrades across a fleet.
Just because a company doesn't want to hire YOU in particular, or tell you what they are doing security-wise, doesn't mean that they don't care about security, or that they are doing nothing! For all you know, they have a team of well-credentialed people working on it, and external factors make the release of fixes slower than you would personally prefer. Such hubris in this industry.
Do you want electronics and gear-filled backpacks banned on airplanes? Because that is how you get there. Do you want the adversarial, but slowly healing, relationship between hackers and business to become openly hostile, driving research totally underground? That's how you get there.
Have some professionalism! Try to work with the vendor so that you get a fuller picture and can provide more value to the world. If they don't want to work with you, understand there may be many factors at play that you are unaware of, and rely on the fact that you are creative and move on to a new technology.
The 1990s and early 2000s were a valuable time, when groups such as the L0pht pushed companies to develop security programs and fix bugs. They succeeded, for the most part. We now live in a world with bug bounties, security budgets, and companies that actually care about their security. It's time to evolve our tactics on the researcher side to match the evolution business has made. Unless you are an underground hacker / blackhat. In that case, don't promote yourself as a professional researcher and try to land contracts! Do your thing, but own it; don't pretend to be something you're not.
Let's take another example: ATMs. When you buy a used ATM off eBay or something similar and develop an attack for it, assumptions are being made and important things are left out of the equation.
- What is the physical protection regime and tamper evident posture for a particular location, bank, or deploying maintenance company?
- What vendor modules are enabled or disabled via licensing on the individual ATM?
- What is the middleware in use and how is it configured to protect or configure a particular ATM?
- What are the interconnects to the bank, and what transports are used? Cellular, modem, Ethernet, etc.
- What card tracks are in use?
- Is it running a modified Windows XP, OS/2 Warp, or another OS?
- How and where does an HSM come into play?
All of these things apply or have corollaries in the automotive, satellite, medical, SCADA, and other industries. In the end, they are just computers of one sort or another.
Next we need to discuss what our industry is really doing with all of this. I've seen many researchers feign outrage that something is "so insecure" and claim to want to "protect users". After sitting through 10 years of conference private parties, I have serious doubts that this is always the case. I think fame, media attention, hacker cred, etc., are more frequently the drivers than some sort of user-centric altruism. Not always, but often.
This is exacerbated by the fact that it is a common tactic for security companies to hire one or two "rock star" researchers, have them pull off a bit of stunt hacking, often of dubious impact, and then push the FUD as hard as possible across whatever conferences will take them and whatever news shows will interview them.
I feel I can speak about this because I spent a lot of time speaking at conferences (at one point I think I held the record for the most talks in one week: 7) and I was interviewed by media here and there. This personal experience is how I learned it is a bunch of BS. The media, for the most part, doesn't really care about or understand what you are talking about. They care about viewers for a short news cycle, and FUD is sensational and achieves this goal.

As for the conference circuit, well, that's full of BS as well. I remember attending a highly technical talk on rootkits by Joanna Rutkowska, a brilliant researcher in her own right, so please don't mistake this for me bagging on her; I'm not. The material in the talk was compelling and she broke new ground. However, eavesdropping on other audience members, few knew what she was talking about. Multiple times I heard, "I have no idea what she is talking about, but she's really smart." They paid thousands of dollars for that privilege. And rootkits have little impact on day-to-day security for most businesses. The value of highly technical security conferences is rather low, except to the researchers themselves and to pushing the field forward. But it is a money maker. I think it is rather telling that you don't see many talks from her anymore; perhaps she figured out the same issues I am talking about, I don't know. She does, however, continue to conduct highly technical, academically and commercially valuable work, quietly, without unnecessary hype.
I gave both technical talks and conceptual ones full of pictures. Other researchers somewhat respected the former, while general audiences got little out of them. Audiences enjoyed and found the latter valuable, while researchers couldn't take me seriously. I tested this over 10 years, and my conclusion is that, for me, conferences have little value. Stunt hacking plays deeply into this dysfunction. It generates press for the conference and the researcher, dazzles and outrages attendees, and generates money and fun for many. But is it really helping anything?
If a "researcher" spends all their time on the conference circuit and talking on cable news shows, how much of a researcher are they really, versus a marketing professional? A wise man once told me: "Let your work speak for itself."
And now we can proceed to the darker side of all this: high-pressure sales in infosec. Most of my clients are former clients of big-name, well-known security companies. After a period of trust building, they often show me the reports, deliverables, and emails from the infosec "professionals" they engaged before me. THIS is where the real outrage and disappointment come into play. Extremely poor deliverables for big bucks, arrogant "recommendations" (more like demands) with little business value, and a focus on upselling rather than doing a good job on the current project are the norm. Several times I have seen the following exchange:
Infosec company / individual: "Hire me/us to be your security researcher." Often this comes after an initial gig that didn't work out well.
Potential Client: "No thanks, we already have someone, and we don't like the way you do business." "Doing business" often refers to everything but the technical work: communications, documentation, status reports, pricing, honoring NDAs, etc.
Infosec company / individual: "You'd better hire us, or we will tell everyone how insecure and irresponsible you are!" Telling everyone involves conferences and media.
Potential Client: "That seems like a bad idea, especially since we have someone good working on it and you have an NDA with us, which you would be violating."
Infosec company / researcher: "We don't care, hire us or else!"
I may have oversimplified the exchange slightly in the interest of brevity, but this borders on extortion and is unacceptable. This kind of short-sighted behavior is dragging our industry down and hurting everyone's credibility, especially since it is so common. The focus on short-term, scan-and-bang profits rather than long-term relationship building and iterative, incremental, business-benefiting improvement is damaging the ability of legitimate researchers and companies to engender real change. Organizations are becoming disillusioned with engaging in real infosec, even as it becomes a hot industry.
This must stop. Stunt hacking must die. Researchers must learn to look beyond an overhyped-bug snapshot in time and LEARN the industries and technologies they research. In the old days, hackers knew more about a technology than the people building and maintaining it, not just how to break something and move on to the next trademarked bug. Let's get back there before we lose all credibility.