Cloud computing and grid computing are two terms that end up confusing many people because they are similar in concept. Both involve a massive computer network infrastructure.
Both are also newer concepts compared to other large-scale computing solutions. Each was developed for distributed computing, that is, computing a workload over a large area, on computers that may be connected by nothing more than a network.
There are many reasons people prefer distributed computing over single-processor computing. Here they are:
- The main reason to opt for distributed computing is to offer parallel, concurrent computational resources to users. The concept of the queue is largely done away with: requests don't have to wait in line to be serviced one after another.
- Distributed computers make use of processor cycles that would otherwise sit idle.
- Distributed computing systems are made up of many machines, so if one crashes the others are unaffected.
- The distributed model scales very well. Need more compute resources? Just plug them in by installing a client on additional desktops or servers.
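The first point in the list above can be sketched in a few lines of standard-library Python: instead of one processor servicing requests from a queue one at a time, a pool of workers handles them concurrently. The `square` task is a hypothetical stand-in for real work, not part of any particular system.

```python
from multiprocessing import Pool

def square(n):
    # Stand-in for an expensive computation.
    return n * n

if __name__ == "__main__":
    requests = list(range(8))
    # Four workers service the requests in parallel,
    # rather than one processor draining a queue.
    with Pool(processes=4) as pool:
        results = pool.map(square, requests)
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

`Pool.map` preserves the order of the inputs, so the caller sees the same results it would get serially, just sooner.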
Cloud computing vs Grid computing
To understand the basic and the more subtle differences between cloud and grid computing, we first need to define both technologies. Here's how they are defined.
Cloud is basically an extension of the object-oriented programming concept of abstraction; here, the cloud means the Internet. For end users, it is simply a matter of supplying inputs and getting outputs: the complete process that leads to those outputs is invisible. Computing is based on virtualized resources spread over multiple servers running in clusters.
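That abstraction idea can be illustrated with a toy function: the caller only sees the input/output contract, never where or how the work happens. `word_count` and its in-process body are hypothetical stand-ins for a remote cloud service.

```python
def word_count(text):
    # From the caller's point of view this could just as well run on
    # remote, virtualized servers; only the inputs and outputs are
    # visible, exactly as with a cloud service.
    return len(text.split())

print(word_count("the cloud hides the machinery"))  # 5
```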
Also within the "cloud computing" family is what's known as the SPI model: SaaS, PaaS, and IaaS. These are the services available on the cloud, and they do all the heavy lifting using someone else's infrastructure. Cloud computing eliminates the cost and complexity of buying, configuring, and managing the hardware and software needed to build and deploy applications; the applications are instead delivered as a service over the Internet (the cloud).
Grid systems are designed for collaborative sharing of resources; a grid can also be thought of as distributed, large-scale cluster computing. A grid uses the processing capabilities of many different computing units to process a single task. The task is broken into multiple sub-tasks, and each machine on the grid is assigned one. As the sub-tasks are completed, their results are sent back to the primary machine, which combines them into a single output.
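The split/distribute/combine cycle described above can be sketched in plain Python. A single task (summing a large list) is broken into chunks, each chunk is handled by a separate worker standing in for a grid node, and the primary process combines the partial results. `chunk_sum`, `grid_sum`, and the node count are illustrative names, not part of any real grid middleware.

```python
from concurrent.futures import ProcessPoolExecutor

def chunk_sum(chunk):
    # Sub-task executed on one "grid node".
    return sum(chunk)

def grid_sum(numbers, nodes=4):
    # Primary machine: break the task into sub-tasks ...
    size = len(numbers) // nodes or 1
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    # ... distribute them across the nodes ...
    with ProcessPoolExecutor(max_workers=nodes) as pool:
        partials = pool.map(chunk_sum, chunks)
    # ... and combine the partial results into one output.
    return sum(partials)

if __name__ == "__main__":
    print(grid_sum(list(range(1, 101))))  # 5050
```

A real grid adds scheduling, fault tolerance, and machines spread across organizations, but the divide-and-combine pattern is the same.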
- Server computers are still needed to distribute the pieces of data and collect the results from participating clients on the grid.
- Cloud offers more services than grid computing. In fact, almost all the services on the Internet can be obtained from the cloud, e.g. web hosting, multiple operating systems, database support, and much more.
- Grids tend to be more loosely coupled, heterogeneous, and geographically dispersed compared to conventional cluster computing systems.
Let me know if you have any questions.
UPDATE: In the interest of clarity, based on the comments received, certain lines/sections of the post have been suitably edited.