OPM cybersecurity chief pushes ‘security through visibility’

The government's focus on defense in depth and creating a pristine IT environment is not working, argues OPM's Director of Security Operations Jeff Wagner.

Federal cybersecurity officials should move away from the government’s obsession with creating a “pristine” IT environment from which to construct a security baseline, and should instead assume their networks have been compromised and adopt a “security through visibility” approach to defending their agencies, according to a white paper authored by the Office of Personnel Management’s information security director.

“For the past 20 years, Defense in Depth has proven ineffective,” wrote OPM Director of Security Operations Jeff Wagner, in a white paper released Wednesday by the Bethesda, Maryland, chapter of the Armed Forces Communications and Electronics Association. “Layers of ‘walls’ to let good guys in and keep bad ones out hasn’t worked very well and yet agencies continue to follow many of these ineffective practices to satisfy audits and attempt to prove due diligence. Getting serious about layered defenses requires intimate knowledge of critical data, the paths to that data, and the potential check points and monitoring mechanisms to watch for suspicious activities.”

Wagner argues that agencies should assume the opposite of what is implied when a system is provided an authority to operate. “Given what they’ve seen with regard to highly sophisticated malware that’s been hidden for years (Energetic Bear, Poodle Bug, APT1, and Heartbleed) and the even more insidious activities of trusted insiders, agencies should approach security as if they’ve already been compromised,” Wagner said. “By beginning here we can take a proactive approach to searching for those intruders rather than a reactive approach that focuses on known incidents – government has to start searching for the unknown.”

According to Wagner, this so-called “security through visibility” approach is gaining momentum. But he acknowledges that moving from a reactive incident response approach to a proactive posture that seeks out malicious behavior and software will require dedicated resources and is a significant challenge.

“When you start tracing a user, any user, through the network as if they were the bad guy, it becomes incredibly real and scary when they realize they don’t always know what the user is doing,” Wagner said. “Can agencies effectively say they know the data within each application, each function and how they tie together? Become the enemy and step through the network. Map out what the enemy would see for each user type and each application. This is a massive task and the basis of penetration testing — dedicate some resources to ongoing penetration tests that mimic varying levels of access and locate the critical monitoring junctures along those paths.”
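As a rough illustration of the path-mapping exercise Wagner describes, the Python sketch below enumerates the routes a given user type could take to reach data stores and flags where a monitored checkpoint sits along each route. The topology, user types and monitoring points are entirely hypothetical stand-ins for an agency's real inventory.

from collections import deque

# One-hop reachability between user types, systems and data stores (illustrative only)
reachability = {
    "contractor":  {"vpn_gateway"},
    "hr_analyst":  {"hr_portal"},
    "vpn_gateway": {"hr_portal", "file_share"},
    "hr_portal":   {"personnel_db"},
    "file_share":  {"personnel_db", "payroll_db"},
}

# Junctures where suspicious activity could be observed (assumed, for illustration)
monitored = {"vpn_gateway", "hr_portal"}

def paths_from(user_type):
    """Enumerate every path from a user type down to a terminal data store."""
    queue = deque([[user_type]])
    while queue:
        path = queue.popleft()
        next_hops = reachability.get(path[-1], set())
        if not next_hops:              # terminal node: a data store
            yield path
        for node in next_hops:
            if node not in path:       # avoid revisiting nodes
                queue.append(path + [node])

for role in ("contractor", "hr_analyst"):
    for path in paths_from(role):
        checkpoints = [n for n in path if n in monitored] or ["NONE"]
        print(f"{role}: {' -> '.join(path)} | monitored at: {', '.join(checkpoints)}")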

The traditional “onion model” of defense in depth. (Photo credit: K. Bolino, under a Creative Commons Attribution-ShareAlike 3.0 license.)

The right approach?

Wagner does not propose abandoning the defense-in-depth model altogether. In fact, he urges federal cybersecurity leaders to use security and risk management controls developed by the National Institute of Standards and Technology to ensure whatever baselines are established are as secure as possible.

“Security isn’t just about controls, or defense or protections. It is about all that combined with the knowledge of how they work together and what can be seen, identified, tracked and contained,” Wagner said.

Cybersecurity experts throughout industry tend to agree with Wagner’s approach.

“Jeff Wagner doesn’t recommend discarding defense in depth; in fact, he recommends a decent baseline,” said Steve Riley, technical director at Riverbed. “He acknowledges that this isn’t enough, though. A CISO can spend budget in three areas: prevention, detection, and response. For far too long, most budget has been allocated for prevention—after all, that’s what security vendors like to sell.”

Sol Cates, chief security officer at data security firm Vormetric Inc., said the old model of starting with endpoint security and working backward toward the data is no longer viable. “It’s safe to say this approach doesn’t work, so the movement towards ‘security through visibility’ is a smart one. It recognizes that endpoint protection is full of holes and compliance isn’t enough,” he said.

“I think there is merit in the concept and adoption and success will have to be measured through even better visibility and communications,” said Morey Haber, vice president of technology for Phoenix-based cybersecurity firm BeyondTrust. “Having visibility into an attack can help make more intelligent and thoughtful decisions of how to react versus just being defensive in a layered approach.”

“The bad guys are going to get in, if they’re not already there,” said Jacob Olcott, vice president of business development at BitSight Technologies and a former counsel to the House Homeland Security Committee. “Identifying what’s important and where it is located can help organizations prioritize risks and resources.”

Ivan Shefrin, vice president of security solutions at security vendor TaaSera, said the growing onslaught of data breaches makes it clear a new model for cybersecurity is needed. “One key element is developing a new frame of reference for detection, containment and response,” Shefrin said. “Before a data breach occurs, the most dangerous activities occur after initial intrusion and infection. These post-infection, pre-breach behaviors include staging, reconnaissance, propagation, obfuscation, data acquisition, insider threats, and ultimately, the loss and exfiltration of data.”
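A toy Python correlator illustrates Shefrin's point: rather than alerting only on the initial intrusion, it tracks how far each host has progressed through the post-infection stages he lists. The stage names follow his quote; the event feed, host names and ranking scheme are assumptions for illustration only.

# Post-infection, pre-breach stages, ordered roughly by how close they are to data loss
POST_INFECTION_STAGES = [
    "reconnaissance", "propagation", "obfuscation", "data_acquisition", "exfiltration",
]

def breach_progress(events):
    """Return the furthest post-infection stage observed for each host."""
    progress = {}
    for host, stage in events:
        if stage in POST_INFECTION_STAGES:
            rank = POST_INFECTION_STAGES.index(stage)
            progress[host] = max(progress.get(host, -1), rank)
    return {host: POST_INFECTION_STAGES[rank] for host, rank in progress.items()}

# Hypothetical event feed from endpoint and network sensors
events = [
    ("ws-112", "reconnaissance"),
    ("ws-112", "propagation"),
    ("db-07",  "data_acquisition"),
    ("ws-112", "obfuscation"),
]
for host, stage in sorted(breach_progress(events).items()):
    print(f"{host}: furthest post-infection stage observed = {stage}")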

Dan Lohrmann, chief strategist and chief security officer at Security Mentor, warned that pitting defense in depth against real-time network visibility “is not a winning strategy” for cybersecurity. “Some organizations do a better job at the protection function than others, but the defense-in-depth strategy is still very useful to stop the bad guys,” Lohrmann said. “Timeliness in detecting the intruder and responding in the appropriate way is paramount, and better visibility is certainly important in that process.”

Mav Turner, director of security for SolarWinds, said the real challenge for security professionals is to understand and find the right balance between both approaches.

“Unfortunately, security professionals don’t get to pick a single approach. You can’t ignore the tenets of defense in depth and just focus on finding vulnerabilities and breaches,” Turner said. “The critical idea is the need to balance securing their environment, validating that those measures are effective, and incident response procedures. You need all three.”

“We need pervasive protection across the full attack continuum: before, during, and after an attack,” said Anthony Grieco, principal engineer at Cisco Systems. “This means security technologies moving towards having flight-recorder-like capabilities to reduce the time needed to scope malware breaches.”

This type of visibility amounts to being able to continuously analyze and track files after they’ve entered a company’s network and to never lose sight of where a file goes or what it does, Grieco said. “You can’t have visibility without control, so if a file that passed through, initially thought to be good or unknown, is later identified as malicious, do we have the technology in place to go back in time and retrospectively remediate?”
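A minimal Python sketch of that retrospective idea, using an invented file hash and host names: record every sighting of a file while its verdict is still unknown, so that a later malicious verdict can be traced back across its full history. This illustrates the concept only and is not a description of Cisco's technology.

from collections import defaultdict
from datetime import datetime, timezone

sightings = defaultdict(list)    # file hash -> list of (timestamp, host, action)

def record_sighting(file_hash, host, action):
    """Log a file being seen on a host, regardless of its current verdict."""
    sightings[file_hash].append((datetime.now(timezone.utc), host, action))

def retrospective_lookup(file_hash):
    """Everywhere the file has been, for use when a late verdict arrives."""
    return sightings.get(file_hash, [])

# Hypothetical activity recorded while the file's verdict was still "unknown"
record_sighting("deadbeefcafef00d", "mail-gw", "received")
record_sighting("deadbeefcafef00d", "ws-221", "executed")

# Later, threat intelligence flags the hash as malicious; trace its history
for when, host, action in retrospective_lookup("deadbeefcafef00d"):
    print(f"{when.isoformat()}  {host}  {action}")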

Richard Moulds, vice president of product strategy at Thales e-Security, said that with the rise in popularity of cloud services, the old arguments and approaches to security no longer work.

“Unfortunately as the corporate environment becomes more distributed, more fragmented and more dynamic the first approach becomes less and less practical,” Moulds said. “Couple this with the rise of cloud computing and the attack surface becomes so large and so difficult to control that the concept of a secure perimeter essentially evaporates. If you don’t control the cloud, how are you going to put up defenses? Taking the visibility approach means focusing on agility in engineering, and using a risk-oriented security strategy rather than a compliance-oriented one.”

Although the visibility approach may be more agile, it “relies on a set of capabilities that are far from universal,” Moulds acknowledged.

Caution

But some experts, like Jonathan Sander, strategy and research officer at STEALTHbits Technologies, caution against the perception that, because defense in depth hasn’t been completely effective in stopping attacks, organizations should abandon sound security best practices.

“Security through visibility is a label for emerging threat intelligence and analytics that aim to mine all the data you have to get a better idea about the bad guys you know are attacking and infiltrating your organization,” Sander said. “You absolutely want to employ security through visibility. The recent growth in breach numbers shows us that we all need to do more. So when new methods like security through visibility emerge you want to deploy them [as soon as possible]. Just be careful not to throw out the baby with the bathwater by letting security through visibility replace what worked from defense in depth.”

While intrusion detection remains a critical component for building layered defense in depth, it stops short of helping organizations identify post-infection behaviors as malware or insiders take hold across the network, Shefrin said. “Security analysts are left to fill in the blanks based on analysis after the damage is done. Detecting intrusions at the network perimeter has become today’s Maginot Line.”
