In late September, I’ll be giving a presentation on grid computing for the Kansas City Computer Measurement Group (KCCMG). And since I am responsible for lining up speakers for this year’s event, I thought I’d do the intro presentation myself. I’ll do this for several reasons. First, I’ll get to define terms and set the tone for the conference. Second, as the kickoff speaker, I won’t have to be an undeniable subject matter expert.
So I’ve been putting together some material on the various forms of grid computing, along with some history of peer-based computing and how it has developed over the years. One of the key historical points I wanted to highlight was the development of the SETI@Home project. I spent many thousands of computer hours performing the fast Fourier transforms required to find E.T. And after a few years of donating lots of idle cycles, I finally stopped keeping up with the project. BTW, I’m not a wacko nut-job. I truly loved the notion of taking “leftover” computer cycles and putting them to good use. And as a sci-fi junkie, this was a diverting application of a fascinating principle.
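For anyone wondering what all those idle cycles were actually doing, here’s a toy sketch in Python. This is nothing like the real SETI@Home client – the sample rate, tone frequency, and detection rule below are all invented for illustration – but it shows the basic idea: FFT a chunk of noisy “radio” data and flag any narrowband spike that pokes above the noise floor.

import numpy as np

# Toy work unit: 1 second of simulated "radio" samples at 4096 Hz,
# with a faint tone buried at 1420 Hz (a wink at the 1420 MHz
# hydrogen line the real search cared about). All values invented.
rate = 4096.0
n = 4096
t = np.arange(n) / rate

rng = np.random.default_rng(42)
data = rng.normal(0.0, 1.0, n) + 0.2 * np.sin(2 * np.pi * 1420.0 * t)

# The FFT turns the time series into a power spectrum...
spectrum = np.abs(np.fft.rfft(data)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / rate)

# ...and a crude detection rule flags any narrowband spike that
# stands well above the average noise power.
threshold = 10.0 * spectrum.mean()
for f, p in zip(freqs, spectrum):
    if p > threshold:
        print(f"candidate narrowband signal at {f:.0f} Hz")

Multiply that by many FFT lengths and frequency drift corrections per work unit, and you can see why all those donated cycles mattered.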
Since SETI@Home was an early (and successful) form of a scavenging grid, I thought it deserved a slide or two. At the same time, I figured I could always get a laugh or two out of the crowd. So I went to the Berkeley site where SETI@Home used to reside. And I found the new Berkeley Open Infrastructure for Network Computing (BOINC). Not only does BOINC support the ongoing work of the Planetary Society (the SETI folks), but the Berkeley team has also developed an open framework that can support any number of distributed computing projects.
And what a cool selection of projects they support. I won’t go through all of them. But three really caught my fancy – and the cycles of a number of machines I administer. These projects include climateprediction.net (a global climate prediction tool), Einstein@Home (an application that detects pulsars through the gravitational waves that Einstein postulated), and LHC@Home (a quantitative tool that performs offline calculations for the Large Hadron Collider at CERN).
While in college, I became infatuated with particle physics – or at least as infatuated as an Economics student can get. Consequently, the use of “spare” computer cycles to support the collision of hadrons (e.g., protons) really sparked my long-dormant physics curiosity. And while I am sure that I will speak further of the LHC (and BOINC), I was struck by the realization that CERN has always been at the center of computing innovation.
In 1980, Tim Berners-Lee worked as a consultant at CERN. Then in 1989, he presented his seminal paper on information management to his colleagues at CERN. This paper helped to launch TBL’s (and CERN’s) development of HTTP (and httpd). For those unfamiliar with the history, the W3C has a great page with the historical highlights.
Well, CERN is at it again. CERN’s current support of an open grid infrastructure is testimony to its continuing commitment to the progress of the computing sciences. If you want to see some of CERN’s recent efforts concerning grid computing, head on over to the GridCafe (http://gridcafe.web.cern.ch/gridcafe/). It’s a nice site about how The Grid is being developed – and how it will impact our lives.
-CyclingRoo-