What Is Parallel Computing?

Parallel computing is the counterpart of serial computing. It means executing multiple instructions at the same time, with the aims of increasing computation speed and of solving large, complicated problems by scaling up the size of problem that can be tackled. Parallel computing can be divided into parallelism in time and parallelism in space: temporal parallelism refers to pipeline technology, while spatial parallelism refers to using multiple processors to perform calculations concurrently.

Parallel Computing

Parallel computing refers to using multiple computing resources simultaneously to solve a computational problem, and it is an effective means of improving the computing speed and processing capacity of computer systems. Its basic idea is to have multiple processors solve the same problem cooperatively: the problem is decomposed into several parts, and each part is computed in parallel by an independent processor. A parallel computing system can be either a specially designed supercomputer containing many processors or a cluster of independent computers interconnected in some way. The data processing is carried out by the parallel cluster, and the results are returned to the user.
Parallel computing can be divided into parallel in time and parallel in space.
Parallelism in time refers to pipeline technology. For example, when a factory produces food, the work is divided into four steps:
1. Wash: rinse the food thoroughly.
2. Disinfect: disinfect the food.
3. Cut: cut the food into small pieces.
4. Package: pack the food into bags.
Without a pipeline, the next item is processed only after the previous item has completed all four steps, which wastes time and hurts efficiency. With pipeline technology, four items can be in different steps at the same time. This is temporal parallelism in a parallel algorithm: two or more operations are started at overlapping times, which greatly improves computing performance.
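The timing benefit of the food pipeline above can be sketched with a little arithmetic. This is a simplified model, assuming every stage takes exactly one time unit; the function names are illustrative, not from any particular library.

```python
# A minimal sketch of pipeline (temporal) parallelism: with S equal-length
# stages and N items, a non-pipelined line takes S * N time units, while a
# pipeline finishes in S + (N - 1) units once every stage is kept busy.

def sequential_time(n_items: int, n_stages: int) -> int:
    """Each item passes through all stages before the next item starts."""
    return n_items * n_stages

def pipelined_time(n_items: int, n_stages: int) -> int:
    """After the first item fills the pipeline, one item completes per unit."""
    return n_stages + (n_items - 1)

# The four-stage food line (wash, disinfect, cut, package) with 4 items:
print(sequential_time(4, 4))  # 16 time units without pipelining
print(pipelined_time(4, 4))   # 7 time units with pipelining
```

With many items the pipelined time approaches one item per time unit, which is why deep pipelines pay off on long streams of work.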
Spatial parallelism refers to the concurrent execution of calculations by multiple processors, that is, connecting two or more processors through a network so that different parts of the same task are computed simultaneously, or so that large problems beyond the capacity of a single processor can be solved.
For example, Xiao Li plans to plant three trees. If one person needs 6 hours to complete the task, then on Arbor Day he can call on his good friends Xiao Hong and Xiao Wang; each person plants one tree, and after two hours everyone has completed their planting task. This is spatial parallelism in parallel algorithms: a large task is divided into multiple identical subtasks to speed up problem solving.
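The tree-planting idea, one large task split into equal subtasks for independent workers, can be sketched as follows. This is a toy example: threads stand in for independent processors here, and the function names (`parallel_sum`, `partial_sum`) are illustrative; a real spatially parallel system would spread the subtasks across multiple CPUs or machines.

```python
# A minimal sketch of spatial parallelism: one large task (summing a big
# range of numbers) is split into contiguous subtasks, and each subtask
# is handled by its own worker.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds: tuple) -> int:
    """Sum one sub-range [lo, hi) -- a single worker's share of the task."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n: int, workers: int = 3) -> int:
    """Split [0, n) into `workers` chunks and combine the partial results."""
    chunk = n // workers
    # The last worker also takes any remainder.
    bounds = [(i * chunk, n if i == workers - 1 else (i + 1) * chunk)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, bounds))

print(parallel_sum(1_000_000))  # same answer as sum(range(1_000_000))
```

The decompose, compute in parallel, combine pattern shown here is exactly the "basic idea" described earlier: the problem is split, each part is solved independently, and the partial results are merged.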
To benefit from parallel computing, a computational problem usually has the following characteristics:
(1) the work can be separated into discrete parts that can be solved simultaneously;
(2) multiple program instructions can be executed at any moment in time;
(3) the problem can be solved in less time with multiple computing resources than with a single resource.
The main research focus in parallel computing is spatial parallelism. From the perspective of program and algorithm designers, parallel computing can be divided into data parallelism and task parallelism.
Parallel computers rely on interconnection networks to connect their processors. Generally speaking, there are two kinds. A static network has fixed connections between processing units; these point-to-point links remain unchanged during program execution. Typical static topologies include the one-dimensional linear array, the two-dimensional mesh, tree connections, hypercube networks, cube-connected cycles, shuffle-exchange networks, and butterfly networks. A dynamic network, by contrast, is constructed from switches whose connections can be reconfigured while the program runs.
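One of the static topologies named above, the hypercube, has a particularly clean structure that a few lines of code can illustrate. This is a sketch of the topology only, not of any routing protocol: in a d-dimensional hypercube, node labels are d-bit numbers, and two nodes are linked exactly when their labels differ in a single bit.

```python
# Neighbors in a d-dimensional hypercube network: flipping each of the
# d bits of a node's label yields the labels of its d directly-linked
# neighbors.

def hypercube_neighbors(node: int, d: int) -> list[int]:
    """Return the d neighbors of `node` in a d-dimensional hypercube."""
    return [node ^ (1 << k) for k in range(d)]

# In a 3-dimensional hypercube (8 nodes), node 0 links to nodes 1, 2, 4:
print(hypercube_neighbors(0, 3))  # [1, 2, 4]
```

Because every node has exactly d links and any two of the 2^d nodes are at most d hops apart, the hypercube balances low node degree against short network diameter, which is why it appears in the list of classic static networks.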

Basic indicators for parallel computing

Execution time
Workload
Storage performance

Parallel computing speedup evaluation

Amdahl's theorem
Gustafson's theorem
Sun-Ni theorem
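The first two speedup laws listed above have simple closed forms that are easy to compare in code. The formulas below are the standard statements (with parallel fraction p and n processors); the function names are illustrative.

```python
# Hedged sketch of the two classic speedup laws:
#   Amdahl:    S = 1 / ((1 - p) + p / n)   (fixed problem size)
#   Gustafson: S = (1 - p) + p * n         (problem size scales with n)
# where p is the fraction of the work that can be parallelized.

def amdahl_speedup(p: float, n: int) -> float:
    """Speedup on a fixed-size problem; capped at 1 / (1 - p) as n grows."""
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p: float, n: int) -> float:
    """Scaled speedup when the problem grows with the processor count."""
    return (1.0 - p) + p * n

# With 95% of the work parallelizable on 1024 processors, Amdahl's law
# caps the speedup near 1 / (1 - 0.95) = 20, while Gustafson's scaled
# speedup grows almost linearly with n.
print(amdahl_speedup(0.95, 1024))
print(gustafson_speedup(0.95, 1024))
```

The contrast explains why the two laws coexist: Amdahl's theorem bounds how much faster a fixed problem can get, while Gustafson's theorem describes the larger problems that become solvable as processors are added.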

Parallel computing scalability standards

Isoefficiency criterion
Isospeed criterion
Average latency criterion

Parallel computing and cloud computing

Cloud computing is a concept that emerged after parallel computing and developed out of it. The two are similar in many respects, and studying parallel computing is a great help in understanding cloud computing; parallel computing is a foundational subject for anyone learning cloud computing.
But parallel computing is not the same thing as cloud computing, and the two should not be conflated. Their differences are as follows.
(1) Cloud computing sprouts from parallel computing
The budding of cloud computing begins with the parallelization of computers. Parallel machines appeared because people were not satisfied with the Moore's-law growth rate of single-CPU speed and hoped that connecting multiple computers in parallel would yield faster computation. This was a simple and naive approach to high-speed calculation, but it has since proven quite successful.
(2) Parallel computing and grid computing served only specific scientific fields and professional users
Parallel computing and grid computing were proposed mainly to meet professional needs in science and technology, and their applications have been largely confined to scientific fields. Using a traditional parallel computer is a fairly specialized job requiring considerable expertise, with most work done on the command line. This is a nightmare for many professionals, let alone ordinary users.
(3) High performance pursued by parallel computing
In the era of parallel computing, people strove for high-speed computation using expensive servers, and countries spared no cost to surpass one another in computing speed. The high-performance cluster of that era was therefore a "fast-moving consumer product": the TOP500 ranking of the world's high-performance computers is constantly being refreshed, and if a mainframe cluster cannot be used effectively within about three years, it falls far behind and its huge investment cannot be recovered.
(4) Cloud computing places low demands on the computing power of a single node
In the era of cloud computing, there is no pursuit of expensive servers and no need to chase the TOP500 ranking. The computing and storage capacity of a cloud center can grow incrementally as needed, and cloud computing infrastructure supports this dynamic growth. High-performance computing becomes a "durable consumer product" in the cloud computing era.
