Yesterday, I had the privilege of attending the Aspen Institute’s Cambridge Cyber Conference. It was put on by CNBC and the Aspen Institute, and was held at the Edward M. Kennedy Institute for the US Senate. It certainly exceeded my expectations.
I don’t think I want to type up all the copious notes I took, but here are nine key takeaways:
Cyber is a noun now
This always makes me squirm. But when industry CISOs and the White House Cybersecurity guru use “cyber” as a noun, I guess I can’t drag my heels and oppose it anymore.
The government is still uneasy with encryption
Several officials from government agencies talked about encryption. They agree that it’s essential to our security and privacy, and that they advocate its use. They also say they don’t advocate for backdoors.
What they do want, though, is a way for companies to be able to bypass encryption with a warrant. I’m not sure how bypassable encryption isn’t, by definition, a backdoor. They complain, not entirely unreasonably, about “warrant-proof encryption,” which they argue has no analogue in the hardware world. (A safe can be drilled, or even blasted with dynamite.)
Nation-state actors are a growing threat
I don’t know the source, but it was stated that North Korea has stolen over a billion dollars through exploits. Russia has been influencing other countries’ elections online. And then there’s China…
Security threats no longer come from dudes sitting in their basement. National governments—including our own—have teams dedicated to offensive cyber. (See, it’s a noun!) While there’s been a call for something like a “Cyber Geneva Convention” to set boundaries—such as not targeting individuals or companies—none exists yet.
Almost by definition, these adversaries are quite well-financed and organized.
The “crown jewels” aren’t always the target
North Korea’s attack against Sony over The Interview caught a lot of people by surprise. We’re used to attacks being against the obvious targets, and the ones we guard carefully.
Leaking emails has become a common tactic, too. Just look at Hillary or Macron—or, again, Sony.
And remember the Target breach? They got in through the Internet-connected HVAC system.
You need a plan!
Rod Rosenstein, Deputy Attorney General at the DOJ, gave a short speech before lunch. Talking about the threat of cyberattacks, he remarked: “If you think it won’t happen to you, you are probably wrong.”
Of course you should seek to do everything you can to prevent an attack in the first place. But speaker after speaker emphasized the importance of planning your response to a compromise ahead of time.
What was interesting to me is that the plan is not a particularly technical thing. Of course it should involve forensics and then audits to ensure whatever was exploited is fixed throughout your company. But the plan is arguably a Crisis Management function, not a technical runbook. It’s critical that it outline communications—both to the media and between departments.
Equifax came up again in this discussion. While their breach was enormous and arguably irresponsible, what set it apart was that their response to the breach was utterly incompetent. Executives sold off stock; public statements came too late; sleazy releases of liability were forced upon those who tried to find out if they had been compromised. If they had had a plan in place that did things like freeze stock trading and put top executives in charge of prompt and honest communication, the incident would have been less of a dumpster fire.
You need to follow the plan!
An executive from Booz Allen Hamilton talked about working with clients on drills and rehearsals. An astonishing number of them, under pressure from the incident, completely failed to follow their plan. He likened it to youth versus professional soccer: at the youth games, all the players would chase the ball wherever it went, which strikes me as a really apt analogy. What’s needed, of course, is to get to the level of the pros, in which everyone knows their role and follows it.
The way to get there isn’t “if something happens, remember the plan!” It’s something acquired through practice, as myriad speakers pointed out. Across industries, across functions, everyone insists that companies should regularly be practicing their incident response.
We need a societal shift, not security policy
“Societal shift” may not be the right term, but security can’t remain an IT policy. It needs to be something everyone in the company is aware of and committed to.
Someone at my small lunch session compared this to company policies on sexual harassment or diversity training. A company should have a sexual harassment policy, sure, but it doesn’t mean much on its own as a written document. The way we moved the needle there was to make awareness of sexual harassment and how to prevent it absolutely mandatory, and violation unacceptable.
Having security policies is only a small step in the right direction. You need to make it a part of your culture before it will be successful.
The government wants to collaborate with industry
As an individual, and perhaps a naïve one, I figured that, aside from enormous exploits, reporting cybercrime to law enforcement was pretty useless.
It was pointed out that about 80% of our infrastructure—things like the power grid or telecom—is run by private industry. The government has substantial resources they can bring to bear, and they implored us to bring them in. Whether it’s sharing knowledge of prevention or having law enforcement investigate attacks, they want to work with us.
Basic “hygiene” remains important
Equifax’s breach is thought to have come from a software vulnerability exploited months after the fix was released. Microsoft still struggles to get everyone to accept the security fixes they try to push out. These sorts of “We fixed that ages ago!” exploits continue to happen with alarming frequency.
A major financial firm talked about their campaign of red-teamed phishing emails. Initially, something like half of recipients fell for it. Education was put in place, and the campaign was repeated periodically. The number of employees clicking got down to about 4%.
But 4% is still unacceptably high. In a massive company, that’s still thousands of employees putting the company at risk. (They apparently ran with the “putting the company at risk” bit, and made clear that they’re willing to start letting go of employees who repeatedly fail to take the most basic of security precautions.)