Grid Computing

  • Grid computing is a form of distributed computing in which a collection of computers works together to perform large tasks. It distributes the workload across multiple systems, allowing each computer to contribute its individual resources toward a common goal.
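The idea of splitting one workload across several contributing systems can be sketched on a single machine. The example below is only an analogy: Python worker threads stand in for grid nodes, and the chunking scheme is an assumption for illustration, not part of any real grid middleware.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Each simulated "node" computes a partial result
    # for its own share of the workload.
    return sum(x * x for x in chunk)

data = list(range(1_000))

# Split the workload into four chunks, one per simulated node.
chunks = [data[i::4] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(process_chunk, chunks))

# Combine the partial results into the final answer,
# just as a grid aggregates results from its nodes.
total = sum(partials)
```

In a real grid, each chunk would be dispatched over the network to a separate machine, but the divide-compute-combine pattern is the same.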

    A computing grid is similar to a cluster, but each system (or node) on a grid has its own resource manager, whereas a cluster's resources are centrally managed, typically by a single system. Additionally, clusters are usually confined to a single physical location and connected over a LAN, whereas a grid often spans systems in several different locations connected over a WAN.

    In order for systems in a computing grid to work together, they must be connected (over a network or the Internet) and run software that allows them to communicate. This software is known as middleware because it sits between the operating system and the grid applications, translating information passed from one system into a format other systems can recognize. This allows data computed by one node within the grid to be stored or processed by another system on the grid.
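The translation role described above can be illustrated with a minimal sketch. JSON is used here as a stand-in for a middleware wire format, and the node functions and message fields are invented for the example; real grid middleware uses far richer protocols.

```python
import json

def node_a_produce():
    # Node A computes a result in its own native representation
    # (a Python dictionary, in this sketch).
    return {"task_id": 7, "values": [1.5, 2.5, 3.0]}

def to_wire(message):
    # The "middleware" layer serializes the result into a common
    # format (JSON here) that any node on the grid can parse.
    return json.dumps(message).encode("utf-8")

def from_wire(payload):
    # The receiving side's middleware decodes the common format
    # back into a native representation.
    return json.loads(payload.decode("utf-8"))

def node_b_consume(message):
    # Node B continues processing the data produced by node A.
    return sum(message["values"])

payload = to_wire(node_a_produce())
result = node_b_consume(from_wire(payload))
```

Because both nodes agree only on the wire format, neither needs to know how the other represents data internally, which is the essential job the middleware performs.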

    Grid computing has many different scientific applications. For example, it is used to model changes in molecular structures, analyze brain behavior, and compute complex physics models. It is also used to perform weather and economic simulations. Some companies also use grid computing to process internal data and provide services over the Internet. Cloud computing, for instance, is often regarded as an outgrowth of grid computing.



© Define Dictionary Meaning. All rights reserved