Sean Smith, the research director of the Institute for Security, Technology, and Society at Dartmouth College, likes to hold office hours at Umpleby’s Bakery in downtown Hanover.
With unruly reddish-brown hair, a beard and glasses, Smith sits at a table, back to the wall, coffee mug in hand. He works on a slim Mac laptop plastered with stickers proclaiming his interest in mountain biking, skiing and The Devil Makes Three, a band originally from Brattleboro, Vt., with a jangly, not-easily-categorized blues and bluegrass sound and some old-timey, 1920s vaudeville thrown in.
Smith, 50, who also is a professor of computer science at the college, spends much of his time at work thinking about the intricate, evolving relationship between humans and computers, in particular how to design networks that are relatively secure. The stakes are high: Technology has the capacity, he said, to “affect the security balance between nation states.”
National cybersecurity isn’t the only issue. The larger questions are these: Because we rely on computer technology in nearly every aspect of our lives, what sort of society are we creating? How do we safeguard privacy? In our rush to embrace technology, are we thinking through the implications of what is being called Big Data?
“We’re transitioning into a knowledge-based economy,” said Bill Nisen, associate director of the Institute for Security, Technology, and Society at the college.
By 2020, Smith said, it’s estimated that there will be as many as 25 billion connected network devices in homes and businesses worldwide. This has been called the “Internet of Things,” a term for the potential of such everyday objects as cars, phones, household appliances and medical devices to be connected to the Internet — and to each other. Wireless information technology embedded in objects could enable machines to communicate, producing, its proponents claim, greater efficiencies in time, energy and cost.
For anyone concerned about cybersecurity, this degree of interconnection poses growing challenges. Although Smith said he rarely wakes up in the night hyperventilating about a computer security issue, he does say that if new networks are built “the way we built the current system, we’re going to be in a lot of trouble.”
“It’s like 9/11: No one expected the towers to fall,” Smith said. “A lot of us are worried about the potential for something like that to happen with cybersecurity.”
Smith points to the American power grid, one of his areas of expertise, as an example. The system has been designed to withstand lightning strikes, but not someone “who could strike or bring down the electric grid in five different places,” Smith said.
Electronic medical records are another area of concern. In February, Anthem, one of the largest health insurance companies in the U.S., was hit by a massive data breach in which names, birthdays, medical IDs, Social Security numbers, street addresses, email addresses and employment information, including income data, were stolen. It’s not yet known who the hackers were.
The more integrated computer networks are, the more vulnerable they are to cyberattack. The more we rely on computers to store fundamental information about ourselves, the more vulnerable we are to having that information hacked and used in ways we don’t want. And generally speaking, the security systems of computers that store data are so porous that infiltrating them is not difficult for a smart hacker, or team of hackers.
“As we move more and more information into the digital world, we create a bigger attack surface,” Nisen said.
“It should be understood that large amounts of centralized data is a liability. It’s a big target,” said Sergey Bratus, a colleague of Smith’s in Dartmouth’s computer science department.
A recent case in point was this May’s breach of data banks at the federal Office of Personnel Management. It is estimated to have affected some 21.5 million federal employees and contractors whose personal information, including Social Security numbers, was stolen.
The working theory, in the media at least, is that it was carried out by the Chinese government, although there is no proof of that.
And, Smith said, what was the hackers’ objective? To obtain the data? To show that they could? Or both?
“That it was breached is very, very bad,” Bratus said. “A breach of that scale is not just a failure, but a big failure of policy.”
People frequently use such military terms as “attack” or “security perimeter” when talking about cybersecurity, casting the problem as one of offense and defense. One might imagine a castle surrounded by a moat, with a drawbridge and battlements: an attacker must vault over or work around these successive impediments to get to the heart of the castle.
But Smith looks at it differently — less war games, more architecture. “I think of it more in terms of Lego,” he said. “I can take these building blocks and turn them into something the designer never intended.”
To guard against the potential for cyberattack, he said, you have to think about the myriad ways someone could get around a security system.
If you think of a security system as a residence, how do you secure doors and windows? What are the ways someone could push against a door, jimmy a lock, or get around a locked window? What if there was another way to get in that you hadn’t thought of, like going down a chimney, digging up through a basement, or going through an adjoining wall?
“You really need to think about the ways somebody could misuse the system,” he said.
Smith and his colleagues at the Institute want to upend our assumptions about the way computer security works, which means challenging security system designers to think about the way people really use computers. Smith calls it “finding the holes.”
Instead of trying to eliminate hackers, which would be next to impossible, maybe the answer is to take a page from their book.
“For a lot of us, hacker isn’t a bad word,” Smith said. “What are they learning that standard engineers don’t? Are their brains different? How do we learn these skills? We want people to learn these skills. We need to have more people thinking like hackers, not servers. … The best way to protect systems is to learn how to attack them.”
Smith, who lives in Hanover with his wife and has two daughters from a previous marriage, grew up in Holland, Pa., and graduated from Princeton University with a bachelor’s degree in mathematics. He went on to study computer science at Carnegie Mellon University in Pittsburgh, where he received his doctorate in 1994.
He did a post-doc at Los Alamos National Laboratory in New Mexico, and worked from 1996 to 2000 for IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y., before deciding to move to Dartmouth, which had, as he says on his website, the added advantage of being the only Ivy League institution on the Appalachian Trail.
He has been granted, with numerous collaborators, more than a dozen U.S. patents that include Authentication for Secure Devices with Limited Cryptography, Authenticated Electronic Coupon Issuing and Redemption, and Establishing and Employing the Provable Untampered State of a Device, among others.
He also is the author of Trusted Computing Platforms: Design and Applications (Springer) and a co-author of The Craft of System Security (Addison-Wesley), as well as numerous journal and magazine articles.
For an Ivy League academic, his website is rather unorthodox: there’s the “Serious Bio” (his academic resume) and the “Not-So-Serious Bio,” which outlines his numerous interests, including rugby (he no longer plays), mountain biking, running ultramarathons, disc jockeying, trail running, orienteering, snowshoeing and cross-country skiing. To name a few.
He claims to be one of the few individuals who has lived in all four U.S. states that begin with “New”: New Hampshire, New York, New Mexico and New Jersey.
He also issues a warning to doctoral candidates: “Don’t do what I did in graduate school, which was to spend more time doing long training rides … than I did in the office.”
“I often give in to the temptation to be irreverent,” Smith said cheerfully.
‘Asking the Right Questions’
He came to Dartmouth in 2000 because he judged it to be “a better way of changing the world. … Standing in front of a room of smart students takes the rust off pretty quickly.”
While some American universities have computer science departments that employ hundreds of faculty, the advantage of a liberal arts college like Dartmouth, Smith said, is that there is greater cross-disciplinary research that brings in such fields as sociology and psychology.
This yields benefits because cybersecurity is as much about the human element as it is the technological one. Humans have unconscious blind spots and biases that play into the questions they ask and how they design systems.
Smith’s ability to come up with the questions that other people don’t think of is one of the reasons he’s good at what he does, Nisen said. “He’s smarter than the average bear. He thinks differently than most people.”
“The science of security is nascent,” Bratus said. “Sean’s great contribution is asking the right questions.”
Smith seems to be the kind of person for whom small talk is a waste of time. Assumptions are there to be challenged, and you can’t know the answer if you don’t know which question to ask.
He speaks rapidly, and even when sitting still seems to be in motion: he drums his fingers on the tabletop, bounces his knee up and down, taps his feet on the floor, and sometimes waves his arms while he talks. When asked a question, he doesn’t always answer immediately, but stares into space at length before coming up with an explanation or answer.
Working in the field of cybersecurity seems to require thinking not just one or two steps ahead, but five.
So is it like chess?
No, Smith said. “Chess has rules and at every step you know what the rules are. There are no rules in hacking.”
There is, however, the law of unintended consequences. He pointed to the derailment in May of a New York-bound Amtrak train outside Philadelphia, which was recorded traveling at more than 100 mph around a sharp curve in a 50-mph zone. Eight people died in the crash.
In the aftermath, there were suggestions that Amtrak install “positive train control” at that junction. The system, which is designed to slow a speeding train or prevent a collision between two trains, is in use in other areas of the Northeast Corridor but hadn’t been installed there.
That sounds good, Smith said, until you start looking at it the other way around.
“If you put in devices to slow the train, it can go the other way. Someone could fiddle with them,” Smith said.
“What can go wrong when you feed the wrong information to people?” he asked.
Balancing the Trade-Offs
Smith, who is of Irish and French-Canadian descent, was raised Catholic. He admires the church’s venerable history of vigorous intellectual discourse, most notably in the works of St. Thomas Aquinas. However, he “defected” to the Episcopal Church for reasons he doesn’t go into, although he said “the Episcopalian Church has a better balance of faith and reason.” He attends St. Thomas Episcopal Church in Hanover.
But while science goes a long way toward explaining how the world works, it cannot explain human awareness or consciousness, Smith said. Where does that come from?
Computers do not have a consciousness in the human sense, he said, but it’s possible that one day they might. The field of computer science evolves at lightning speed, which is one reason Smith chose to pursue the field. “I wanted to keep learning,” he said.
One of the most fascinating — or vexing — elements of cybersecurity is that no matter how many security safeguards software and network designers dream up, humans will circumvent them routinely.
We’re counseled on how to create passwords for the websites we use, and how to make them stronger and less “hackable.” We’re advised not to use the same password for different websites, and not to use the name of our pet or our birthday. We’re given protocols on how to minimize our vulnerability to online identity theft and how to safeguard the integrity of electronic medical records.
But, Smith said, “real users break rules all the time. Unless we understand why and how, how do you design a safe system? When something goes wrong, the tendency is to blame the humans. But what can we do to make (technology) better for humans to use?”
It is possible, Nisen said, to make a network that would be “perfectly secure, but it would be unusable. It comes down to accessibility and convenience on one side and security on the other.”
Smith has a waggish analogy for the trade-off between ease of use of technology on one side and security on the other.
“I presume you came here in a car, not a tank,” he said. “We might all be safer driving in tanks, but it costs more than making a car. We as a collective community have made a decision about where the trade-off is.”
But society is still reckoning where the trade-off is when it comes to cybersecurity. “We need a national conversation on how we want things to be digitized,” Bratus said.
“Is this new universe we’re building going to be safe and secure and wonderful?” Smith said.
Or, he asked, is it going to be like Love Canal, the landfill in a neighborhood of Niagara Falls that was revealed, during the 1970s, to be the site of a toxic waste dump that sickened hundreds of families living there — and now is regarded as a massive failure of civic and corporate planning and government oversight?