Seattle — For PGS, an oil-imaging company based in Oslo, Norway, finding pockets of oil and natural gas underground starts with what amounts to a giant ultrasound image of the Earth.
“It involves huge amounts of data,” said Guillaume Cambois, PGS’ executive vice president of imaging and engineering. “And, of course, time is of the essence.”
PGS, short for Petroleum Geo-Services, this year tried to speed up that work by buying a supercomputer built by Seattle-based Cray. The computer, housed in several pantry-sized cabinets, runs algorithms over PGS’ massive library of seismic images and data to produce sharper pictures of the subsurface, speeding the complex task of finding oil.
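PGS’ actual imaging codes are proprietary, but one classic step in seismic processing gives a feel for why the data volumes are so punishing: “stacking,” or averaging many repeated recordings of the same spot so that random noise cancels and the faint echo of a rock layer stands out. The sketch below is a minimal illustration of that idea using made-up numbers; it is not a description of PGS’ or Cray’s software.

```python
import numpy as np

# A minimal, illustrative sketch (not PGS' proprietary pipeline):
# "stacking" repeated seismic traces. Each trace records the same
# subsurface reflection plus independent random noise; averaging N
# traces cuts the noise by roughly sqrt(N), one reason raw surveys
# involve "huge amounts of data."

rng = np.random.default_rng(0)
n_traces, n_samples = 50, 1000

t = np.linspace(0.0, 1.0, n_samples)
# Synthetic "reflection": a narrow wavelet arriving at t = 0.4 s.
signal = np.exp(-((t - 0.4) ** 2) / (2 * 0.005 ** 2))

# Each recorded trace is the same signal buried in heavy noise.
traces = signal + rng.normal(scale=2.0, size=(n_traces, n_samples))

stacked = traces.mean(axis=0)  # the "stack": average across traces

# Noise level measured away from the arrival (first 0.3 s):
print(f"single-trace noise: {traces[0, :300].std():.2f}")   # ~2.0
print(f"stacked noise:      {stacked[:300].std():.2f}")     # ~2.0/sqrt(50), ~0.28
```

Averaging 50 noisy traces cuts the noise level by about a factor of seven, the square root of 50, which is why surveys record the same spot over and over and why the raw data piles up so quickly.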
PGS is one of several businesses that have been buying Cray’s supercomputers, a shift for a company that has traditionally sold to government agencies and academic institutions. Cray says 15 percent of its revenue last year, expected to fall between $720 million and $725 million, came from sales to businesses. That’s double the percentage from 2014.
Cray’s shift comes at a time when demand for cloud computing — which allows businesses access to greatly expanded computing power — is rising as corporate big-data needs increase. But demand for supercomputers is also strong; the massive pieces of technology do some things that the cloud just can’t.
Cray isn’t alone in this. At IBM, sales grew last year for just one of its more than a dozen lines of business, according to estimates by investment bank UBS. That would be mainframes, the giant computers with tons of processing power that Big Blue has been selling for decades. Sales by IBM’s System Z unit soared 30 percent, to $2.8 billion, UBS estimates.
Market-research firm IDC says sales of high-performance computers reached $10.22 billion in 2014 and estimates the market will grow 8.6 percent a year in the following five years, topping $15 billion in 2019.
The uptick in sales of giant computers by Cray, IBM and others bucks a decades-long trend of losing ground to smaller machines and, more recently, the cloud. It’s also a reminder that established technologies sometimes show surprising staying power in the face of rapid change.
The history of technology is largely a story of the new elbowing out the old. Personal computers were the death knell for the typewriter. The iPhone started a wave of change that would dethrone cellular-phone giants Nokia and BlackBerry.
The Seattle area, home to Amazon Web Services and Microsoft’s Azure platform, is the epicenter of what technology analysts say is a once-in-a-generation shift in how people and businesses deal with their digital goods.
But that move toward cloud computing, or using giant data centers to store data and run software programs, hasn’t spelled the end of the line for the business of selling refrigerator-sized computers.
As Jefferies analysts noted after IBM reported its fourth-quarter financial results recently, “the entire world is not moving to the cloud all at once.”
For some companies with heavy-duty computing needs, “the economics don’t make sense” to move to the cloud, said Donna Dillenberger, a technical fellow with IBM who specializes in business-focused computer systems. “It would be cheaper to have their own on-premise data center.”
Many buyers of mainframes or other high-performance computers belong to industries like insurance or finance. Because of regulatory or other restrictions on how they use data, they tend to remain plugged in to powerful computers they own and operate themselves.
In other cases, complicated software developed over decades would be tough to rework for the cloud. That includes things like airline-reservation systems or complex logistics and scheduling software for railroads or utilities.
“There are a lot of applications running on mainframes that have been there for a long time and are hard to move,” said Mark Russinovich, chief technology officer of Microsoft’s Azure unit.
Microsoft and rival Amazon.com are introducing increasingly powerful computers that customers can rent, but analysts say high-performance computers can clear technical hurdles that most “public clouds” of pooled servers can’t.
Steve Conway, an analyst with IDC, said the cloud is great at simpler technical problems. But supercomputers are often needed for tightly coupled problems, he said, where one small design change can ripple outward, altering 50 other inputs, so that everything must be calibrated as one.
“Calculation that takes everything into consideration at the same time takes a hell of a lot of computing power,” Conway said.
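Conway’s point can be made concrete with a toy contrast. In an uncoupled problem, each answer depends only on its own input, so the work splits neatly across a pool of rented cloud servers. In a coupled one, nudging a single input moves every output, so the whole system has to be solved at once. The following sketch is purely illustrative, not drawn from IDC or Cray, and shows the difference with a small dense linear system.

```python
import numpy as np

# Toy contrast between uncoupled and coupled problems (illustrative
# only; not IDC's or Cray's example).

rng = np.random.default_rng(1)
n = 50

# Uncoupled: 50 independent equations a_i * x_i = b_i.
# Each x_i depends on its own input alone, so the work splits
# perfectly across machines -- the kind of job clouds handle well.
a = rng.uniform(1.0, 2.0, n)
b = rng.normal(size=n)
x_uncoupled = b / a
# Changing b[0] changes only x_uncoupled[0].

# Coupled: one dense system A x = b, where every equation involves
# every unknown -- a stand-in for Conway's "one small design change
# changes 50 other inputs."
A = rng.normal(size=(n, n)) + n * np.eye(n)  # kept well-conditioned
x = np.linalg.solve(A, b)

b2 = b.copy()
b2[0] += 1.0                  # perturb a single input...
x2 = np.linalg.solve(A, b2)

changed = np.sum(~np.isclose(x, x2))
print(f"components that moved after one perturbation: {changed} of {n}")
# Typically all 50 move: the system must be solved as a whole.
```

In the uncoupled case, changing one input changes exactly one answer. In the coupled case, a single perturbation typically moves all 50, the kind of all-at-once calculation that favors one very large machine over many loosely connected ones.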