Monday 7 December 2009

Big Data on Grids or on Clouds?

http://cloudcomputing.sys-con.com/node/1199664

Now that we have a new computing paradigm, Cloud Computing, how can Clouds help with our data? Will they replace our internal data vaults, as we hoped Grids would? Are Grids dead now that we have Clouds? Despite all the promising developments in the Grid and Cloud computing space, and the avalanche of publications and talks on the subject, many people still seem confused about internal data and compute resources versus Grids versus Clouds, and they hesitate to take the next step. I think a number of issues are driving this uncertainty.

Grids didn't keep all their promises
Grids did not evolve (as some of us originally thought) into the next fundamental IT infrastructure for everything and everybody. Because of the diversity of computing and data environments, we had to develop different middleware (department, enterprise, global, compute, data, sensors, scientific instruments, etc.) and face different usage models with different benefits. Enterprise Grids were (and are) providing better resource utilization and business flexibility, while global Grids are best suited to complex R&D collaboration with resource sharing. For enterprise usage, setting up and operating Grids was often complicated and did not remove all the (data) bottlenecks. For researchers, this was seen as a necessary evil: implementing complex applications on supercomputers has never been easy. So what.

Grid: the way station to the Cloud
After 40 years of dealing with data processing, Grid computing was indeed the next big thing for the grand-challenge R&D expert, while for the enterprise CIO the Grid was a way station on the road to the Cloud model. For the enterprise today, Clouds provide the missing pieces: ease of use, economies of scale, business elasticity up and down, and pay-as-you-go pricing (thus avoiding some capital expenditure). And where security matters, there is the private Cloud, within the enterprise's firewall. In more complex enterprise environments, with applications running under different policies, private Clouds can easily connect via the Internet to (external) public Clouds -- and vice versa -- forming a hybrid Cloud infrastructure that balances security with efficiency.

Different policies, what does that mean?
No two data processing jobs are alike. Jobs differ by priority, strategic importance, deadline, budget, IP, and licenses. In addition, the nature of the code often dictates a specific computer architecture, operating system, memory, storage, and other resources. These differentiating factors strongly influence where and when a data processing job is run. For any job, its specific requirements determine the specific policies that have to be defined and programmed, so that the job runs exactly according to those policies. Ideally, this is guaranteed by a dynamic resource broker that controls submission to Grid or Cloud resources, be they local or global, private or public.
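To make the idea of policy-driven brokering concrete, here is a minimal sketch in Python of how such a resource broker might decide where a job runs. All class and field names here are hypothetical illustrations, not any particular broker's API; the policies shown (keep sensitive work off public Clouds, meet the deadline, stay within budget) stand in for the kinds of rules described above.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    sensitive: bool          # IP/license constraints keep it inside the firewall
    deadline_hours: float    # how soon results are needed
    budget: float            # maximum spend, in arbitrary currency units

@dataclass
class Resource:
    name: str
    public: bool             # True for a public Cloud, False for private/internal
    cost_per_hour: float
    est_runtime_hours: float # estimated runtime of the job on this resource

def select_resource(job, resources):
    """Return the cheapest resource that satisfies every policy, or None."""
    candidates = []
    for r in resources:
        if job.sensitive and r.public:
            continue  # policy: sensitive jobs never leave the private Cloud
        if r.est_runtime_hours > job.deadline_hours:
            continue  # policy: the job must finish before its deadline
        cost = r.cost_per_hour * r.est_runtime_hours
        if cost > job.budget:
            continue  # policy: stay within the job's budget
        candidates.append((cost, r))
    if not candidates:
        return None   # no resource satisfies all policies
    return min(candidates, key=lambda c: c[0])[1]
```

For example, a sensitive job would be routed to an internal enterprise Grid even when a public Cloud is cheaper, while a non-sensitive job with the same deadline and budget would go to the cheaper public Cloud. A real broker would of course weigh many more factors (priority, licenses, data locality), but the decision structure is the same: requirements in, policy checks, placement out.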
