Which of the following is the most correct definition of Grid computing?
A. Grid computing refers to the ability to run computers off a power grid.
B. Grid computing refers to the aggregation of multiple, distributed computing resources, making them function as a single computing resource with respect to a particular computational task.
C. Grid computing refers to the vertical scaling of resources to add more capacity to the infrastructure.
D. Grid computing allows computing resources to be operated and managed independently, creating a distributed architecture.
Answer: B
Explanation:
Grid computing is a technology architecture that virtualizes and pools IT resources,
such as compute power, storage, and network capacity, into a set of shared services
that can be distributed and redistributed as needed. Grid computing involves server
virtualization, clustering, and dynamic provisioning.
Note: With Grid computing, groups of independent, modular hardware and software
components can be pooled and provisioned on demand to meet the changing needs of
businesses. Grid computing is a form of distributed computing that aims to deliver
flexible and dynamic infrastructures through tiered optimization, using virtualization
at various levels of the middleware and database layers to achieve this, as sketched below.
Reference: Oracle Reference Architecture, Application Infrastructure Foundation, Release 3.0
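To make option B concrete, here is a minimal Python sketch of the aggregation idea: several independent nodes are pooled and presented to a task as one logical computing resource. This is only an illustration under assumed names (GridNode, GridPool); it is not part of the Oracle reference architecture.

```python
# Minimal sketch of grid-style aggregation: independent nodes are pooled and a
# task runs against the pool as if it were a single computing resource.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass
from typing import Callable, Iterable, List


@dataclass
class GridNode:
    """One independent, modular compute resource contributed to the grid."""
    name: str
    workers: int  # capacity this node contributes to the pool


class GridPool:
    """Aggregates distributed nodes and exposes them as a single resource."""

    def __init__(self, nodes: Iterable[GridNode]) -> None:
        self.nodes = list(nodes)
        # Total capacity is the sum of what each node provisions to the pool.
        self.capacity = sum(node.workers for node in self.nodes)

    def run(self, task: Callable[[int], int], inputs: List[int]) -> List[int]:
        # The caller submits one task to the pool; the pool spreads the work
        # across the aggregated capacity and gathers the results.
        with ThreadPoolExecutor(max_workers=self.capacity) as executor:
            return list(executor.map(task, inputs))


if __name__ == "__main__":
    pool = GridPool([GridNode("node-a", 4), GridNode("node-b", 8)])
    # From the task's point of view there is one resource with capacity 12,
    # not two separate machines: the aggregation described in option B.
    squares = pool.run(lambda x: x * x, list(range(10)))
    print(f"capacity={pool.capacity}, results={squares}")
```

The threads here stand in for remote nodes purely for illustration; a real grid would dispatch work over the network and provision nodes dynamically, as the explanation above describes.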