The Register has an interesting article on the International Supercomputing Conference keynote speech by Burton Smith, who used to be Chief Scientist at Cray and now works at Microsoft. Smith believes that parallel computing, now mostly the domain of High Performance Computing (HPC) and Grid folks, will reinvent computing. Since I didn't get the nod to travel out to the conference this year (and, frankly, with everything that's going on, I'm not disappointed), it was good to see some coverage.
“We are now at the point where we are breaking the Von Neumann Assumption that there is only one program counter that allows the proper ordering and scheduling of variables,” he said. “Parallel programming makes this hazardous, but we are also now at the point where serial programs are becoming slow programs.”
The implications of his speech are disturbing if you follow them out. And I work in grid technologies, part of the whole massively parallel world. (Depending on how you use the term.) One of the biggest problems in our grid (really a coalition of local grids more than a grid itself) is that scientists do not know how to think in parallel. It turns out that although High Energy Physics (HEP) is extremely data intensive and sees major improvements from the parallelism of the grid, its computing requirements really aren't all that complex. It's more of a brute force problem than one of finesse. Other sciences have more complex computing needs, or have to address a much more interconnected problem. Those scientists have to totally rethink their computing solutions and rebuild their applications from scratch. It's these problems, rather than the MPP ones we normally deal with here, that Smith is talking about.
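To make the "brute force" point concrete, here's a toy sketch (in Python, purely illustrative, nothing like actual grid or HEP code) of why that kind of workload parallelizes so easily: each event is processed independently, so no rethinking of the algorithm is required; you just fan the work out to more workers.

```python
from multiprocessing import Pool

def process_event(event):
    # Stand-in for per-event analysis: each call touches only its own
    # data, with no communication between workers -- the hallmark of
    # "embarrassingly parallel" work.
    return sum(x * x for x in event)

if __name__ == "__main__":
    # Independent "events" -- any worker can take any event.
    events = [[1, 2, 3], [4, 5], [6]]
    with Pool(processes=2) as pool:
        results = pool.map(process_event, events)
    print(results)  # [14, 41, 36]
```

The interconnected problems Smith is talking about are the opposite case: intermediate results feed into each other, so you can't just `map` over the inputs, and the application itself has to be redesigned.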
So this reinvention of computing is something Smith has spent a while talking about.
Worth a read.