AWARDEES: Stephen Checkoway, Tadayoshi Kohno, Karl Koscher, and Stefan Savage
FEDERAL FUNDING AGENCIES: National Science Foundation
Once upon a time, on a sunny day at the University of Washington’s (UW) campus, a graduate student typed a few lines of code into a laptop. Nearby, a car started. Its doors unlocked. And just like that, with no need for a key to unlock the car or start the ignition, another graduate student was able to get in and drive away. It could have been a scene from a heist movie. But those graduate students weren’t trying to steal the car they were hacking into, and neither were their partners at the University of California, San Diego (UCSD), 1,200 miles away. They were trying to make it—and millions of cars just like it—safer.
Led by Tadayoshi “Yoshi” Kohno and Stefan Savage, along with lead PhD students Stephen Checkoway and Karl Koscher, a team of researchers from UCSD and UW proved that internet-connected vehicles could have their critical functions (including the engine, lights, and brakes) overridden by a remote attacker via a range of digital pathways. Their trailblazing work, published in a pair of landmark papers in 2010 and 2011, led to a revolution in how automakers, the federal government, and other stakeholders approached automotive security.
UCSD North and UW South
All heists need a mastermind, and this one began with two: Tadayoshi Kohno and Stefan Savage.
A seasoned karate practitioner and instructor who grew up in Boulder, CO, Kohno sees similarities between the martial arts discipline and computer security, both of which are about pitting two sides against each other. “For almost as long as I can remember, I’ve been interested in computer security and cryptography and the cat-and-mouse game between the adversary and the defender,” he says.
Savage’s road to computer security took quite a few more twists and turns. Born in Paris and raised in Manhattan, Savage studied history in college before becoming involved in computer science. He credits his humanities background, particularly its emphasis on writing persuasively and defending ideas, for giving him the tools that later helped him to succeed in the sciences. He tells his students that knowing how to tell a story and frame a problem is just as important to building a career as doing good research.
Before graduating and taking a position at the University of Washington, Kohno was a PhD student at the University of California, San Diego. Savage did the opposite, attending graduate school at UW before taking a faculty position at UCSD. “UW and UCSD have this great tradition of exchanging students and faculty,” says Savage. The two computer science departments are so enmeshed that UCSD students sometimes jokingly refer to the University of Washington as “UCSD North,” while UW students refer to the school in San Diego as “UW South.” In fact, many other members of the team have various connections with both schools.
Kohno wasn’t Savage’s student at UCSD, but they knew each other, and they would occasionally talk about computer security-related topics. One area of interest to both of them was the increasing connectivity of cars. At the time, OnStar was prioritizing direct-to-consumer marketing, and Savage and Kohno discussed how it might be fun to look into the telematics system. But they were both busy with other projects and pushed the idea into a future “someday.”
Motion’s Eleven: Assembling the Team
A few years went by. Kohno accepted a faculty position at UW and new students joined his lab, including Karl Koscher. Now a research scientist with UW’s Security and Privacy Research Lab, Koscher brought a wealth of experience with embedded systems (computers that don’t necessarily announce themselves as computers, like those in a car) to the team. According to his teammates, he also has a time-saving superpower: “When you’re looking for bugs and vulnerabilities, there are all kinds of techniques, but they take time,” says Savage. “And there is a thing that Karl has that very few people have, which is this incredible intuition about where to start hunting for bugs.”
Another key member of the team at UCSD, Stephen Checkoway, now an assistant professor at Oberlin College, had just finished a grueling project with Hovav Shacham, another future member of the car hacking team, investigating the vulnerability of voting machines to hacking. Checkoway was exhausted, but when Savage approached him about the project, he ultimately couldn’t say no. “It’s pretty easy to talk me into research,” he laughs.
In a coincidence guaranteed to please fans of heist movies, the ultimate team was composed of eleven individuals representing a vast range of experience. On the UCSD side, there was Stephen Checkoway, then-postdoc Damon McCoy, research staff member Brian Kantor, then-master’s student Danny Anderson, professor Hovav Shacham, and Stefan Savage. On the UW side, there was Karl Koscher, Alexei Czeskis, and Franziska Roesner, all PhD students at the time, professor Shwetak Patel, and Tadayoshi Kohno. Although the two groups were working in separate states, everyone worked closely together, sharing discoveries and ideas through frequent conference calls and over the group chat.
As the project’s leads, Checkoway, Kohno, Koscher, and Savage worked together to set the direction of the research. However, all four stress that the project’s success was only possible because of the combined efforts of every member of the team. “We had a shared vision and shared belief in the potential of this project,” says Kohno, “and everyone believed in each other and knew that this was going to be a lot of work.”
Another key member of the plan, the funder, was a little more hesitant to get on board. The team applied for a National Science Foundation (NSF) Cyber-Physical Systems (CPS) grant. Back then, according to Koscher, the CPS program was focused more on the power grid and related systems than on connected devices. The reviewers didn’t have a firm grasp on what the researchers were planning to do—“we didn’t either,” jokes Savage in a nod to the project’s curiosity-driven, experimental quality—and there was an erroneous belief that the industry must be taking care of the problem. Ultimately, though, the proposal was approved under a different funding stream, NSF’s Trustworthy Computing Systems (TWC) program, with one catch: NSF wouldn’t buy the cars.
Using private funds, the researchers were able to buy two identical cars, one for each campus. Although the team declined to identify the cars when the research was published, it has since been revealed that they bought two 2009 Chevy Impalas, a car manufactured by the American automotive corporation General Motors (GM). All the members of the team, however, emphasize that cybersecurity and privacy for automobiles was not strictly a GM problem. In fact, the Impala was chosen in part because it was representative of many cars, made by many different manufacturers, on the market at the time. The UW team named their car Emma, while the UCSD team named theirs Vlad. Vlad the Impala.
Once all the players were assembled, the funding was in place, and cars were acquired, the team had to figure out what to do with them. And to do that, they needed to understand what a modern car is… and what it isn’t.
Cracking the Code
When we think of a car, many of us still think of a mechanical device—a gas-powered engine, controlled by a steering wheel, gearshift, and brakes, on four wheels. And for most of their history, that’s what cars were. But no longer, according to Savage. “A car is a big distributed system that has wheels connected to it.” In fact, Savage continues, “It’s probably the most complex distributed system that you personally own.” By “complex distributed system,” Savage means a network of computers, and not just two or three. In their 2010 paper, the team cites research suggesting that the average luxury sedan is controlled by 50-70 independent computers, called Electronic Control Units (ECUs), all of which communicate through one or two wires, an internal “party line” called the Controller Area Network (CAN) bus. The first thing the researchers had to figure out was the party line’s language.
Using a tool built by Koscher, the researchers were able to tap into the CAN bus using the car’s Onboard Diagnostics II (OBD-II) port, a standard connector used for emissions and other diagnostic tests. Kohno likens the process that came next to learning a truly alien language. “There’s lots of different ways to learn a foreign language,” he says, “but let’s say no one has ever learned this foreign language before. And you’re suddenly plopped into a foreign planet and you’re trying to figure out what people are speaking.” What would you do next? You’d want to be able to observe your new alien neighbors interacting with each other—Kohno imagines a café.
So you sit in the café (i.e., tap into the CAN bus) and observe, Kohno says. “You see someone say something and you see they’re brought coffee and a cake. And you watch someone else, and you see they’re brought tea and some other type of pastry.” This teaches you something about the alien’s grammar. The next step is to try speaking.
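To make the café step concrete, here is a minimal sketch of what passively listening to a CAN bus can look like in code. It assumes a Linux SocketCAN interface named can0 and the third-party python-can library; the team’s own tooling, built by Koscher and attached through the OBD-II port, is not public, so this is purely illustrative.

```python
# A minimal sniffing sketch, assuming a SocketCAN interface ("can0") and the
# python-can library. Each CAN frame pairs an arbitration ID (roughly, who is
# talking about what) with up to 8 bytes of payload to be deciphered.
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")
try:
    while True:
        msg = bus.recv(timeout=1.0)  # wait up to 1 s for the next frame
        if msg is None:
            continue
        print(f"ID=0x{msg.arbitration_id:03X}  data={msg.data.hex(' ')}")
finally:
    bus.shutdown()
```

Matching the logged IDs and payloads to visible behavior, such as a door lock clicking or a dashboard light changing, is the “coffee and cake” step Kohno describes.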
Kohno continues the metaphor: “We saw someone do something and they got a coffee and a cake, so we might just repeat half of that sentence and see which we got back. Did we get the coffee or the cake? And that gives us a little more understanding of the language that’s being used within the vehicle.” As they gained knowledge of the car’s grammar, the researchers were then able to expand their vocabulary through random trial and error, a process called “fuzzing.” “You know the sentence structure begins ‘may I please have a [something],’” Kohno continues, “so we would just say ‘may I please have a—’ and then generate random syllables and observe what happens as a result.”
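As a rough sketch of the replay-and-fuzz step, the example below resends a frame observed earlier and then keeps the “sentence structure” (the arbitration ID and a fixed prefix) while randomizing the rest of the payload. The ID 0x244 and byte values are invented for illustration, and the same python-can/SocketCAN setup is assumed; probing like this belongs on a bench or a closed course, never on a public road.

```python
# A replay-then-fuzz sketch under the same assumptions as above. The
# arbitration ID and payload bytes are hypothetical; the real ones had to be
# worked out by observation.
import random
import time

import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

# "Repeat half of that sentence": replay a frame captured while sniffing and
# watch what the car does in response.
observed = can.Message(arbitration_id=0x244,
                       data=bytes([0x02, 0x10, 0x03, 0x00, 0x00, 0x00, 0x00, 0x00]),
                       is_extended_id=False)
bus.send(observed)

# "Generate random syllables": hold a two-byte prefix fixed and randomize the
# remaining payload bytes, pausing between frames to observe the response.
for _ in range(100):
    tail = bytes(random.randrange(256) for _ in range(6))
    probe = can.Message(arbitration_id=0x244,
                        data=bytes([0x02, 0x10]) + tail,
                        is_extended_id=False)
    bus.send(probe)
    time.sleep(0.05)

bus.shutdown()
```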
As the researchers learned more about the members of the CAN bus—the ECUs that controlled nearly everything in the car, from the door locks to the steering to the braking system—they were able to take individual components, reverse-engineer their software, and replace it with their own. Along the way, they discovered that the interconnection of all the ECUs on the CAN bus meant that if they could take over one ECU, they could functionally take over any of them.
“Once you find a flaw, it’s game over,” says Savage. “You control everything.”
A Bug in the System
Tapping into the OBD-II port allowed the team to learn what an attacker could do once inside the car’s network (in brief: nearly everything). Hacking the car that way, however, required direct physical access. The next step was to figure out all the other ways they could attack the car.
Using the knowledge of the car’s internal workings they had acquired through the long, slow process of learning its language, the team discovered a multitude of ways they could exploit the car without ever having to touch it. Some methods required indirect physical access—that is, someone (not necessarily the attacker) had to access the car. Standard diagnostic tools used by dealerships and mechanics could be remotely compromised so that, when they were used on a car, they could then covertly take it over. The team also discovered a vulnerability in the CD player that allowed them to encode a seemingly normal CD so that playing it would infiltrate the car (the song they used for the test? Beethoven’s Ninth).
Other attack vectors didn’t require anyone to touch the car at all. For example, the team was able to hack the car’s telematics system. Initially intended to provide assistance in case of an emergency, a telematics system, such as OnStar, requires that a car be equipped with its own phone number. Once the team had an individual car’s number, they could call it and play a carefully coded sound that would allow them to take over key functions.
How could this happen? How could millions of cars be so vulnerable to attack? Savage explains that one of the most startling things they discovered in their research is that, because of the way supply chains are organized, no car manufacturer has all the code in its cars, because no manufacturer makes all the parts itself. On the contrary, most of a car’s ECUs are outsourced to third-party companies, which protect each component’s code as their intellectual property. Furthermore, these vendors don’t sell to only one company or customer. Consequently, their products may have more functionality than any individual car manufacturer is aware of. And that’s where you tend to get vulnerabilities.
“It was at this interface between bodies of code written by different entities, where one assumed there was a more restrictive use, and the other offered more capability, where we would always gain purchase,” Savage says. “Some interface allowed you to do more than GM had any conception of.”
Adventures in Car Hacking
Listening to the team talk, you get the sense that, after they cracked the car’s code, working on this project felt like careening from one adventure to another. There was the time when the team was conducting experiments with the car’s horn and a campus police officer came over to tell them, in so many words, to knock it off (they replaced the horn with a buzzer during experiments).
There was the time when Checkoway accidentally hacked into the UW car’s audio and caught snippets of the UW team’s private conversation. “I was trying to figure out how I could turn on the microphone and then stream the audio from the car’s cabin back to my computer,” he says. He tried it with the UCSD car, got it to work, and then decided to test it on the UW car. “I turned on the microphone, and then realized I could hear them.” Because the attack was covert, however, the UW team didn’t know that Checkoway could hear them. He quickly disconnected and told them about the accidental hack later. He recalls the UW team’s shock: “They were very disturbed, and rightly so!”
And then there was the time when the UW team requisitioned a defunct airport.
For the bulk of the project, the team worked on individual parts of the car, spread out on lab tables, or with the full cars on jacks. It was important to perform a road test to ensure that all of their hacks would work while the car was driving under normal conditions, but it wasn’t like they could just take the car on the highway. They needed to consider the safety of both their driver and other cars on the road.
Ultimately, it was a creative Program Director named Melody Kadenko who saved the day. She identified a state reciprocity provision in Washington law that allowed the University of Washington to requisition access to a decommissioned airport runway in Blaine, WA. Alexei Czeskis, a member of the team and a motorcycle rider who had both the appropriate safety gear and a certain tolerance for automotive risk, enthusiastically volunteered to be the driver.
After some wrangling with UW’s legal team and the institution of a host of backup safety measures, the test was on. Czeskis rode in the test car with a laptop hooked up through the OBD-II port, while Koscher controlled the laptop from another car driving beside him. Communicating through two-way radios, Koscher sent the commands that disabled Czeskis’ brakes. It worked. They had real-life control.
Progress, Not Panic: A Restrained Approach Leads to a Big Impact
The team knew they had uncovered something important, but they weren’t initially sure what to do with it. “In the beginning, we had no idea how to disclose,” says Savage. It might seem strange to think about now, but before the team disclosed their findings in 2010, none of the major car manufacturers had given much thought to the security risks of increasingly networked cars. None of the major automakers had product security groups, there were no industry standards for the cybersecurity of vehicles, and the US regulator of record (the National Highway Traffic Safety Administration [NHTSA]) had no cybersecurity guidelines or evaluation capability.
The team eventually was able to connect with the right people at General Motors and the federal government. Meanwhile, they also faced another, more philosophical question: whether to “name and shame” their object of research, the Chevy Impala, or to be more muted in their approach. Ultimately, the team declined to name the cars in their research, opting instead to approach General Motors privately.
“The decision to be more subdued in our approach was because we felt that was the most responsible way to share the results with the public and the various stakeholders,” says Kohno. “Because ultimately we wanted to see an improvement in the cybersecurity of future vehicles across all manufacturers, and work by the government to secure vehicles, and we didn’t want to see panic.”
That worry about causing a panic also led the team to push their research showing that cars could be hacked remotely to the second of the two papers. “The [first] paper could have said—but did not say—‘And we can currently remotely take over three million cars that are on the road today,’ which would have been an accurate statement, but not really a scientific statement,” Savage says. “It would be correct, but it wouldn’t advance people’s understanding of the problem.”
Thanks to the tireless, meticulous work performed by Checkoway, Kohno, Koscher, and Savage, along with the UW and UCSD teams, understanding of the problem has advanced. “You can’t talk to someone inside the automotive security industry who does not know these papers intimately,” says Savage. Together, the team’s two papers unlocked (if you will) a new sense of urgency within the automotive industry, prompting manufacturers to rethink car safety concerns and to adopt a range of new security practices as standard procedures. Following the team’s disclosure of their work, GM appointed a Vice President of product security to lead a new division. Other car companies followed suit, as did the federal government. In 2012, DARPA launched a massive program, the High Assurance Cyber Military Systems (HACMS) project, with the goal of creating hacking-resistant cyber-physical systems.
“This research is certainly the most impactful work that I have done,” says Checkoway in a conversation with Koscher, who immediately replies: “Pun not intended.”
From door locks and refrigerators to baby monitors and thermostats, the devices with which we surround ourselves are becoming increasingly connected. In the future, the team hopes that researchers will continue looking into other technologies that are integrated into our lives but that may not have received the same level of security analysis they gave automobiles. “We will never encounter a world, in my mind, where people stop finding vulnerabilities,” says Kohno. “It is better to be in a world where people are finding vulnerabilities in an ethical and responsible way and are fixing them.”
By Haylie Swenson