Year One: SWAMP a catalyst for improving cybersecurity

As an exercise in the high-stakes world of software security, Patrick Beyer ran an open-source medical technology software package through the Software Assurance Marketplace (SWAMP) to see what would happen.

The answer: 25,800 vulnerabilities were flagged. Most of the catches were minor language inconsistencies, but some were considered high-priority, and at least one had the potential to crash the system.

Beyer, program manager of the SWAMP, which is run out of the Morgridge Institute for Research, says university researchers are writing this software package to process Magnetic Resonance Imaging (MRI) scans. While the software is not used in patient treatment, Beyer adds: “It’s out there. Anybody can grab this and use this.”

After its first full year in operation, the SWAMP is working to make software security problems like this yesterday’s news. The marketplace is meant to give software developers, especially those in the write-and-share open-source world, a simple, one-stop resource for examining code with a multitude of both open-source and commercial assessment tools.

“Everything right now is software-centric, from your fridge to your toaster to your car, to pacemakers and other medical devices,” says Beyer. “As a result, it is more critical to quickly identify weaknesses in software so people can’t exploit it.”

Funded by a $23.4 million Department of Homeland Security (DHS) grant and led by Morgridge Chief Technology Officer Miron Livny, the SWAMP draws together computer experts from the University of Wisconsin–Madison, Indiana University and the University of Illinois Urbana-Champaign.

By the numbers, the marketplace is off to an active start:

  • 50,000 software assessments were run in its first year by outside users and public sector partners;
  • 20 full-time assessment experts have been hired in Madison alone to support the project;
  • 12 outside assurance tools have been added to the SWAMP, including four from commercial vendors that are giving the open-source community free access to their products, an unprecedented step in the industry;
  • 700 computing cores at the Morgridge Institute are powering the assessment tools;
  • Four coding languages are supported: C/C++, Java source code, Java bytecode and Python;
  • and 400 software applications with known vulnerabilities are used to test the capabilities of assessment tools.

Beyer says partnerships with a range of private vendors such as Parasoft, Veracode, GrammaTech, Red Lizard and Secure Decisions are critical to the SWAMP’s long-term sustainability. Single tools, by and large, can catch only about one-third of the bugs in a given set of software. By offering scores of tools that assess code simultaneously, the SWAMP increases the odds of finding everything of importance, Beyer says.

That point underscores why individual software developers need a more comprehensive resource. “We want to simplify and automate all of these tasks,” he says. “Our message to programmers is, ‘Do it early and do it often.’”
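
Beyer’s one-third figure is easier to picture with a small example. The C fragment below is purely hypothetical, not drawn from the MRI package or from anything assessed in the SWAMP; it packs two different classes of weakness into a few lines. Analyzers that specialize in bounds checking tend to flag the first, while data-flow analyzers are more likely to catch the second, so a single tool typically reports only part of the picture.

    /* Hypothetical illustration only -- not code from any package assessed
     * in the SWAMP. Two different classes of weakness in one file. */
    #include <stdio.h>
    #include <string.h>

    void copy_patient_id(const char *input)
    {
        char id[8];
        /* Weakness 1: strcpy never checks that "input" fits in the 8-byte
         * buffer, so a long string overflows it. Bounds-checking analyzers
         * tend to catch this class of bug. */
        strcpy(id, input);
        printf("id: %s\n", id);
    }

    void load_scan(const char *path)
    {
        FILE *f = fopen(path, "r");
        char header[16];
        /* Weakness 2: fopen's return value is never checked (possible NULL
         * dereference), and "f" is never closed on the early return below
         * (resource leak). Data-flow analyzers are more likely to spot
         * these. */
        if (fread(header, 1, sizeof header, f) != sizeof header)
            return;
        fclose(f);
    }

    int main(void)
    {
        copy_patient_id("OK123");   /* harmless input, but the flaw remains */
        load_scan("scan.dat");
        return 0;
    }

Run through several analyzers at once, as the SWAMP does, the overlapping reports cover far more of a file like this than any single tool would on its own.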

Beyer’s team is also reaching out to the academic community through programs that incorporate the SWAMP into university courses. The group is working with undergraduate computer science programs at Bowie State University in Maryland and the Rochester Institute of Technology to get future programmers versed in secure coding.

Barton Miller, a UW–Madison computer science professor and partner in the SWAMP, has developed tutorials on security assessment and secure programming techniques that have been taught at academic, government, and company sites in the U.S., Europe, Asia, and South America, including at local medical records leader Epic Systems.

Miller is currently teaming up with a professor at the University of Texas at Dallas to create an undergraduate curriculum in secure coding, a subject that remains underdeveloped in many computer science programs.

Year two of the SWAMP features several new priorities. The team plans to add a new class of assessment tools that perform dynamic code analysis. All of today’s tests analyze static lines of code; the new tools will assess the software’s behavior while it is booted up and running.
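
The distinction is easiest to see with a small sketch. The program below is a hypothetical illustration, not a description of the SWAMP’s forthcoming tools: a static scan reads the source without running it and may not be able to prove that the array index is out of range, because the index arrives from user input at runtime. A dynamic assessment runs the compiled program, for example built with an address checker such as GCC or Clang’s -fsanitize=address option, and flags the bad memory access the moment a real input triggers it.

    /* Hypothetical sketch of a bug that is hard to prove statically but
     * easy to catch dynamically. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int samples[16] = {0};

        /* The index comes from the command line at runtime, so a purely
         * static scan of this file cannot know what value it will hold. */
        int i = (argc > 1) ? atoi(argv[1]) : 0;

        /* Out-of-bounds read whenever i < 0 or i >= 16. A dynamic tool
         * watching the running program catches the bad access as soon as
         * such an input is supplied. */
        printf("sample %d = %d\n", i, samples[i]);
        return 0;
    }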

The team also will offer the ability to create local installations of the SWAMP for large companies with sensitive competitive concerns, allowing the SWAMP to be deployed within a company’s own walls.

And the SWAMP itself is a complex collection of software. Does it pass its own assessment test?

“Absolutely,” says Beyer. “We eat our own dog food, as we like to say.”