Last updated at Thu, 31 Aug 2017 13:55:59 GMT

Maybe I'm being cynical, but I suspect that's the reaction many people have when they hear about the two surveys posted online this week to investigate perspectives on vulnerability disclosure and handling. Yet despite my natural cynicism, I believe these surveys are a valuable and important step towards understanding the real status quo around vulnerability disclosure and handling, so that actions taken to drive adoption of best practices are more likely to have an impact.

Hopefully this blog will explain why I feel this way. Before we get into it, here are the surveys:

A little bit of background…

In March 2015, the National Telecommunications and Information Administration (NTIA) issued a request for comment to “identify substantive cybersecurity issues that affect the digital ecosystem and digital economic growth where broad consensus, coordinated action, and the development of best practices could substantially improve security for organizations and consumers.” Based on the responses they received, they then announced that they were convening a “multistakeholder process concerning collaboration between security researchers and software and system developers and owners to address security vulnerability disclosure.”

This announcement was met by the deafening sound of groaning from the security community, many of whom have already participated in countless multistakeholder processes on this topic. The debate around vulnerability disclosure and handling is not new, and it has a tendency to veer towards the religious, with security researchers on one side and technology providers on the other. Despite this, there have been a number of good-faith efforts to develop best practices so researchers and technology providers can work more productively together, reducing the risk on both sides, as well as for end users. This work has even resulted in two ISO standards – ISO 29147 (vulnerability disclosure) and ISO 30111 (vulnerability handling processes) – providing best practices for technology providers and operators. So why did the NTIA receive comments proposing this topic? And of all the things proposed, why did they pick this as their first topic?

In my opinion, it's for two main, connected reasons.

Firstly, despite all the phenomenal work that has gone into developing best practices for vulnerability disclosure and handling, adoption of these practices is still very limited. Rapid7 handles quite a lot of vulnerability disclosures, either for our own researchers or, on occasion, for researchers in the Metasploit community who don't want to deal with the hassle. Anecdotally, we reckon we receive a response to these disclosures maybe 20% of the time. The rest of the time, it's crickets. In fact, at the first meeting of the NTIA process in Berkeley, Art Manion of the CERT Coordination Center commented that they've taken to sending registered snail mail, as it's the only way they can be sure a disclosure has been received. It was hard to tell whether he was joking.

So adoption still seems to be a challenge, and maybe some people (like me) hope this process can help. Of course, the efforts that went before tried to drive adoption, so why should this one be any different?

This brings me to the second of my reasons for this project, namely that the times have changed, and with them the context. In the past five years, we've seen a staggering number of breaches reported in the news; we've seen high-profile branded vulnerability disclosures dominate headlines and put security on the executive team's radar. We've seen bug bounties starting to be adopted by the more security-minded companies. And importantly, we've seen the government start to pay attention to security research – in the recently approved DMCA exemption, the FDA's proposed post-market guidance, the FTC's presence at DEF CON, the Department of Defense's bug bounty, and of course, in the very fact that the NTIA picked this topic. None of these factors alone creates a turn of the tide, but combined, they just might provide an opportunity for us to take a step forward.

And that's what we're talking about here – steps. It's important to remember that complex problems are almost never solved overnight. The work done in this NTIA process builds on work conducted before: for example the development of best practices; the disclosure of vulnerability research; efforts to address or fix those bugs; the adoption of bug bounties. All of these pieces make up a picture that reveals a gradual shift in the culture around vulnerability disclosure and handling. Our efforts, should they yield results, will also not be a panacea, but we hope they will pave the way for other steps forward in the future.

OK, but why do we need surveys?

As I said above, discussions around this tend to become a little heated, and there's not always a lot of empathy between the two sides, which doesn't bode well for finding a resolution. A lot of this dialogue is fueled by assumptions.

My experience and resulting perspective on this topic stems from having worked on both sides of the fence – first as a reputation manager for tech companies (where my reaction to a vulnerability disclosure would have been to try to kill it with fire), and more recently as a partner to researchers trying to get the word out about vulnerabilities, and as coordinator of Rapid7's efforts to respond to major disclosures in the community. At different points I have responded with indignation on behalf of my tech company client, who I saw as being threatened by those Shady Researcher Types, and later on behalf of my researcher friends, who I have seen threatened by those Evil Corporation Types. I say that somewhat tongue-in-cheek, but I do often hear that kind of dialogue coming from the different groups involved, and much worse besides. There are a lot of stereotypes and assumptions in this discussion, and I find they are rarely all that true.

I thought my experience gave me a pretty good handle on the debate and the various points of view I would encounter. I thought I knew the reality behind the hyperbolic discourse, yet I find I am still surprised by the things I hear.

For example, it turns out a lot of technology providers (both big and small) don't think of themselves as such, and so they are in the “don't know what they don't know” bucket. It also turns out a lot of technology operators are terrified of being extorted by researchers. I've been told that a few times, but had initially dismissed it as hyperbole – until an incredibly stressed security professional working at a non-profit contacted me asking for help interpreting an inbound report from a researcher. When I looked at the communication from the researcher, I could absolutely understand his concern.

On the researcher side, I've been saddened by the number of people who tell me they don't want to disclose findings because they're afraid of legal threats from the vendor. Yet more have told me they see no point in disclosing to vendors because they never respond. As I said above, we can relate to that point of view! At the same time, we recently disclosed a vulnerability to Xfinity, and missed disclosing through their preferred reporting route (we disclosed to Xfinity addresses, while their recommendation is to use abuse@comcast.net). When we went public, they pointed this out, and were actually very responsive and engaged regarding the disclosure. We realized that we've become so used to a lack of response from vendors that we stopped pushing ourselves to do everything we can to get one. If we care about reaching the right outcome to improve security – and we do – we can't allow ourselves to become defeatist.

My point here is that assumptions may be based on past experience, but that doesn't mean they are always correct, or even still correct in the current context. Assumptions, particularly erroneous ones, undermine our ability to understand the heart of the problem, which reduces our chances of proposing solutions that will work. Assumptions and stereotypes are also clear signs of a lack of empathy. How will we ever achieve any kind of productive collaboration, compromise, or cultural evolution if we aren't able or willing to empathize with each other? I rarely find that anyone is driven by purely nefarious motives, and understanding what actually does motivate them and why is the key to informing and influencing behavior to effect positive change. Even if, in some instances, it means that it's your own behavior that might change. :)

So, about those surveys…

The group that developed the surveys – the Awareness and Adoption Group participating in the NTIA process (not NTIA itself) – is a mix of security researchers, technology providers, civil liberties advocates, policy makers, and vulnerability disclosure veterans and participants. It's a pretty diverse group, and it's unlikely we all have the same goals or priorities in participating, but I've been very impressed and grateful that everyone has made a real effort to listen to and understand each other's points of view. Our goal with the surveys is to do that on a far bigger scale, so we can really understand a lot more about how people think about this topic. Ideally we will see responses from technology providers and operators, and from security researchers who would not normally participate in something like the NTIA process – they are the vast majority, and we want to understand their (your?!) perspectives. We're hoping you can help us challenge any assumptions we may have; the only hypothesis we hope to prove out here is that we don't know everything and can still learn.

So please do take the survey that relates to you, and please do share them and encourage others to do likewise:

Thank you!

@infosecjen