A crowd milled around Sun Microsystems’ booth Tuesday hoping to score one of the most popular freebies at the SC|05 conference: a T-shirt with the phrase, “No, I will not fix your computer.”
The shirt’s popularity could mean two things: Either the swag was sadly lacking or this conference was one big mob of tech-heads.
A tour around the conference hall proved both to be true.
In its 18th year, SC|05 has become the most prominent annual event of the supercomputing industry. About 8,700 people are at the show, running through Friday at the Washington State Convention and Trade Center.
The conference was once the domain of academics and government researchers who worked with enormous computer systems known as supercomputers.
It has expanded and become more mainstream in recent years, as has the definition of supercomputing.
But that doesn’t mean the conversations are dumbing down.
“We’re hitting over a teraflop of sustained global bandwidth!” exclaimed John Levesque, a senior technologist at Seattle-based Cray, to a crowd of about 45 people.
As Levesque spoke at the supercomputer company’s booth, Eric Rudder wandered by, alone and unnoticed. That Microsoft Chairman Bill Gates’ top company adviser could attract so little attention at the show says something about the challenge Microsoft faces in an industry that embraces open-source computing.
Gates did appear at the show Tuesday, outlining in a keynote address Microsoft’s vision of “personal supercomputing.”
That sounds like an oxymoron, but Microsoft believes businesses will increasingly use ultra-high-performance clustered systems, as prices fall and new software makes it easier to manage and run the systems.
Gates said technologists on the high and low end of the computing spectrum can work together on common challenges, such as parallel computing.
He also suggested that products such as Microsoft’s Visual Studio 2005 programming toolkit can help the supercomputing crowd simplify software development.
Showing his chops, Gates dove into the technical details of a University of Washington research project monitoring undersea activity along fault lines. The researchers are using XML to collect and analyze data.
He also described how the UW, Fred Hutchinson Cancer Research Center and Microsoft Research are working together on methods for developing an AIDS vaccine.

Away from the keynote, the conference exhibits lacked the glitz of better-known tech gatherings, such as the annual Consumer Electronics Show in Las Vegas. Exhibitors mostly used poster boards with diagrams on them or PowerPoint presentations of grids and helixes.
At the Sun booth, a magician performed tricks for a large crowd. The company spares no expense in reaching these attendees, said Rich Brueckner, Sun’s marketing manager for high-performance computing.
“A lot of them, this is the only show they go to and they are shopping for multimillion-dollar computer systems,” he said. “This is the only [conference] that’s targeted at this market.”
SC|05 is unusual in that the exhibitors are also the customers. Some of the most sought-after buyers at the show were the research facilities and supercomputing centers that exhibited their work for the crowd.
Companies also checked out what rivals were up to.
“It’s a lot of competitors talking to competitors about the state of the industry,” said Alex Lesser, a vice president at PSSC Labs, a Lake Forest, Calif., company focusing on high-performance computing. Lesser displayed new water-cooling technology designed to absorb the internal heat created by powerful computing machines.
The San Diego Supercomputer Center offered drinks from an espresso cart and showed off its research. The center conducted an earthquake simulation last year that depicted the potential impact from a major quake originating near Palm Springs, Calif.
The simulation took four days to compute and created 43 terabytes of data, said Lynn Ten Eyck, associate director for scientific research and development at the center. A terabyte is 1,000 gigabytes.
The center is an exhibitor but also does careful comparison shopping at the conference, he said.
“We are high-profile customers,” he said. “Our user base is a very large scientific community, and these people talk not only to one another but also to policymakers.”
Numerous research institutions discussed their supercomputing projects. Indiana University showed how it is simulating physical systems to understand how geologic oil deposits are formed.
Researchers are also studying the way air flows over airplane wings.
At the Maui High Performance Computing Center, part of the Air Force Research Laboratory, scientists are studying combat models in battlefield simulations.
The University of Iowa is creating a digital human that performs virtual missions, such as maneuvering around obstacles.
The Ohio Supercomputer Center was touting an initiative called “blue-collar computing,” which aims to make supercomputing more accessible to companies and researchers. The project is pushing the industry to review how supercomputing software is written and how hardware is designed.
Down the road, the center said, the large jobs of today could become routine tasks.
Following the blue-collar theme, the center’s researchers wore blue shirts with their names sewn on patches.
“A lot of people come around here looking for an oil change,” joked Paul Buerger, a systems and operations leader with the center.
Seattle Times technology reporter Brier Dudley contributed to this story. Kim Peterson: 206-464-2360 or email@example.com