“They were very unsure what to do with us”: How Golden Goose Award-winning researchers at UW and UCSD put the brakes on automobile cybersecurity threats and transformed an industry

UW and UCSD Golden Goose Award recipients (clockwise from top left): Stephen Checkoway, Karl Koscher, Stefan Savage and Tadayoshi Kohno

In 2010 and 2011, a team of researchers led by Allen School professor Tadayoshi Kohno and Allen School alumnus and University of California San Diego professor Stefan Savage (Ph.D., ‘02) published a pair of papers detailing how they were able to hack into a couple of Chevrolet Impalas and take command of a range of functions, from operating the windshield wipers to applying — or even denying — the brakes. A decade later, Kohno, Savage, and University of Washington alumni Karl Koscher (Ph.D., ‘14), now a research scientist in the Allen School’s Security & Privacy Research Lab, and Stephen Checkoway (B.S., ‘05), a UCSD Ph.D. student at the time who is now a faculty member at Oberlin College, have received the Golden Goose Award from the American Association for the Advancement of Science for demonstrating “how scientific advances resulting from foundational research can help respond to national and global challenges, often in unforeseen ways.”

“More than 10 years ago, we saw that devices in our world were becoming incredibly computerized, and we wanted to understand what the risks might be if they continued to evolve without thought toward security and privacy,” explained Kohno in a UW News release.

Achieving that understanding would go on to have significant real-world impact, influencing “how products are built and how policies are written,” noted Savage. It would also transform not just the automobile manufacturing landscape, but the computer security research landscape as well. 

“The entire automotive security industry grew from this effort,” recalled Kohno. “And I imagine that neighboring industries saw what happened here and didn’t want something similar happening to them.”

“What happened here” was that Kohno and his colleagues demonstrated how a motor vehicle’s computerized systems could be vulnerable to attackers, theoretically endangering the car’s occupants and those who share the road with them. The quartet was aided and abetted by collaborators that included, on the Allen School side, then-student and current professor Franziska Roesner (Ph.D., ‘14), fellow student Alexei Czeskis (Ph.D., ‘13), and professor Shwetak Patel; on the UCSD side, they were joined by postdoc Damon McCoy, master’s student Danny Anderson, professor Hovav Shacham, and the late researcher Brian Kantor.

This “dream team,” as Kohno describes it, set out to reverse-engineer the various vehicle components. The goal was to figure out how they communicated with one another so that the researchers could leverage those communications to gain access to the systems controlling the vehicle’s functions. The researchers published two papers in rapid succession detailing their findings; the first established how a car’s internal systems were vulnerable to compromise, while the follow-up explored the external attack surface of the vehicle by demonstrating how an attacker could infiltrate and control those systems remotely. The team presented the former at the 2010 IEEE Symposium on Security & Privacy and the latter at the 2011 USENIX Security Symposium.
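The components in question communicate over a shared broadcast bus (a CAN bus), where each message carries a numeric identifier and a short payload; mapping which identifiers and payloads drive which functions is the heart of the reverse-engineering task described above. As a rough illustration of the message format involved, here is a minimal sketch of packing and unpacking a simplified CAN-style frame. The identifier and payload values are invented for this example, and real CAN framing also involves CRCs, acknowledgment bits, and bit stuffing not shown here.

```python
import struct

def encode_can_frame(can_id: int, data: bytes) -> bytes:
    """Pack a simplified CAN-style frame: 11-bit identifier,
    payload length, then up to 8 data bytes."""
    if not 0 <= can_id < 0x800:
        raise ValueError("standard CAN identifiers are 11 bits")
    if len(data) > 8:
        raise ValueError("CAN payloads carry at most 8 data bytes")
    return struct.pack(">HB", can_id, len(data)) + data

def decode_can_frame(frame: bytes):
    """Inverse of encode_can_frame: recover (identifier, payload)."""
    can_id, length = struct.unpack(">HB", frame[:3])
    return can_id, frame[3:3 + length]

# Hypothetical frame some body-control module might broadcast:
frame = encode_can_frame(0x244, bytes([0x01, 0xFF]))
assert decode_can_frame(frame) == (0x244, bytes([0x01, 0xFF]))
```

Reverse engineering in this setting means observing frames like these on a live bus and correlating identifiers and payload bytes with physical effects such as wipers, locks, or brakes.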

In a way, Savage recalled, the researchers’ ignorance about how the vehicle’s systems were actually designed to work ended up working to the team’s advantage; it enabled them to approach their task without any preconceptions of what should happen. An example is the brake controller, which they broke into via a technique known as black-box testing or “fuzzing.” As the label suggests, these efforts involved less precision and more “throwing stuff at it,” according to Savage, to see what would stick. The results were enough to stop anyone in their tracks — including the technical experts at GM.
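The “throwing stuff at it” approach Savage describes can be sketched as a loop: generate random inputs, feed them to the opaque system, and log any input that produces an unexpected response. The toy example below illustrates the idea against a stand-in controller function; the controller, its “test mode,” and the undocumented command bytes are all invented for this illustration and have nothing to do with the actual Impala hardware.

```python
import random

def opaque_controller(packet: bytes) -> str:
    """Stand-in for a black-box ECU: it enters a special test mode
    only for one undocumented two-byte command (invented here)."""
    if len(packet) == 2 and packet[0] == 0x3E and packet[1] & 0xF0 == 0x80:
        return "test-mode"
    return "normal"

def fuzz(trials: int = 100_000, seed: int = 0):
    """Throw random short packets at the controller and record
    every input that triggers a non-normal response."""
    rng = random.Random(seed)
    hits = []
    for _ in range(trials):
        packet = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 4)))
        if opaque_controller(packet) != "normal":
            hits.append(packet)
    return hits

# Every logged hit is, by construction, an instance of the hidden command.
hits = fuzz()
assert all(len(p) == 2 and p[0] == 0x3E for p in hits)
```

With enough trials, a fuzzer like this is very likely to stumble onto the hidden command without ever knowing the controller’s internals, which is the sense in which ignorance of the intended design was no obstacle.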

“We figured out ways to put the brake controller into this test mode,” Koscher explained to UW News. “And in the test mode, we found we could either leak the brake system pressure to prevent the brakes from working or keep the system fully pressurized so that it slams on the brakes.”

As the senior Ph.D. students on the project, Koscher and Checkoway spearheaded that discovery, which involved calling into the car’s OnStar unit and instructing it to download and install remote command-and-control software that they had written. With that software in place, they could control the brakes remotely from a laptop — as demonstrated later in a famous “60 Minutes” segment in which the team surprised correspondent Leslie Stahl by bringing the car to a complete stop while she was behind the wheel.

While that made for good television, what is most gratifying for the researchers are the industry and regulatory frameworks that grew out of their discovery. 

For example, GM — along with other manufacturers — hired an entire security team as a direct result of the UW and UCSD research; likewise, the National Highway Traffic Safety Administration (NHTSA) — which previously had no one on staff with computer security expertise and “were very unsure what to do with us,” according to Savage — wound up creating an entire unit devoted to cybersecurity, complete with its own testing lab. In other positive changes, the Society of Automotive Engineers — later renamed SAE International — established a set of security standards that all automobile manufacturers adhere to, and the industry created the Auto-ISAC, a national Information Sharing and Analysis Center, to enhance vehicle cybersecurity and address emerging threats.

The team’s work also paved the way for new research outside of the automotive industry. For example, its results inspired the U.S. Defense Advanced Research Projects Agency (DARPA) to create its HACMS project, short for High-Assurance Cyber Military Systems, to examine the security of cyber-physical systems. And that was just the start.

“My gut tells me that the attention directed at this project helped to build up expertise in this embedded systems realm,” observed Koscher. “What was initially focused on automotive security was then applied to other industries, such as medical devices.”

The project also served to highlight the advantages of working as part of a larger team, as Checkoway discovered to his delight. While various members of the group may have approached a problem from different angles, they would often meet in the middle to come up with a solution.

“This was an extremely collaborative effort,” Checkoway explained last year. “No task was performed by an individual researcher alone. I believe our close collaboration was the key to our success.”

At the time the researchers quietly revealed their results to GM, they couldn’t be sure of such a happy outcome. At first, the company representatives didn’t believe the team could have done some of the things they had done, nor could they understand how the team could possibly have done them. But the team’s non-adversarial approach, in which they opted to walk company representatives through their process and findings while refraining from naming the manufacturer publicly, went a long way toward steering the conversations in a positive direction.

“As academics, we have the opportunity to approach the dialogue around vulnerabilities without really having a stake in the game,” explained Kohno. “We’re not selling vulnerabilities, we’re not selling a product to patch vulnerabilities, and we aren’t a competing manufacturer. So we discovered something, and once we had the results, we wanted to figure out, how can we use this knowledge to make the world a better place?”

The team is quick to credit the federal government for driving investment in a project for which they didn’t have a precise destination in mind when they started. According to Savage, the National Science Foundation’s willingness to back a project that was not guaranteed to pan out was key to enabling them to identify these latent security risks. “We’re extremely grateful to NSF for having flexibility to fund this work that was so speculative and off the beaten path,” Savage said.

Checkoway (left) and Koscher reunite with Emma the Impala in the UW’s Central Garage (photo: Mark Stone/University of Washington)

It is just the kind of work that the Golden Goose Award was created to recognize. In answer to the late U.S. Senator William Proxmire’s “Golden Fleece Award” ridiculing federal investment that he deemed wasteful, U.S. Representative Jim Cooper conceived of the Golden Goose Award to honor “the tremendous human and economic benefits of federally funded research by highlighting examples of seemingly obscure studies that have led to major breakthroughs and resulted in significant societal impact.”

For Kohno, that impact and this most recent recognition — the team previously earned a Test of Time Award from the IEEE Computer Society Technical Committee on Security and Privacy — are motivation enough to explore where the next security risk may come from. 

“The question that I have now is, as security researchers, what should we be investigating today, such that we have the same impact in the next 10 years?”

The team was formally honored in a virtual ceremony hosted by the AAAS last week. Read the feature story on the UW and UCSD team here, the UW News release here and a related story by Oberlin College here. Learn about all of the 2021 Golden Goose honorees here.