What Are Mainframes?

A mainframe is a large computer that uses a dedicated processor instruction set, operating system, and application software. The term originally referred to a large computer system housed in a very large cabinet, to distinguish it from smaller minicomputers and microcomputers. Most of the time it refers to the series of IBM computers beginning with System/360. The term can also refer to compatible systems made by other vendors, such as Amdahl and Hitachi Data Systems (HDS). Some people use the term for IBM's AS/400 or iSeries systems, but this usage is inappropriate: even IBM itself treats these series only as midrange servers, not mainframes. [1]

Since the 1980s, networking and miniaturization have become increasingly prominent, and the traditional model of centralized processing with a host and dumb terminals has become increasingly unable to meet users' needs. For the enterprises that produce mainframes, reducing mainframe CPU consumption has therefore been an important task.
The IBM 360: the First to Use Integrated Circuit Technology
In 1958, RCA released the RCA 501, one of the first computers built entirely with transistors. In 1959, IBM introduced large transistorized computers such as the 7070 and 7090, and small and medium-sized transistorized computers such as the 1401 and 1620. In 1964, IBM went on to develop the IBM System/360, the world's first computer series to use integrated circuit technology, as shown in Figure 1.
Figure 1: The IBM 360
Computers of this period were expensive and mainly provided data-processing services for government departments and pillar industries such as defense, finance, transportation, and energy. The development cost of the IBM 360, for example, was as high as US$5 billion, roughly 2.5 times the cost of the first atomic bomb.
To address this cost problem, scholars of the time proposed the concept of utility computing, an idea whose origin lies in the power industry. When incandescent lamps and generators first appeared, every household could in theory have electric lighting, but only if each household configured and maintained its own generator, which was neither economically nor technically feasible. Scientists and engineers therefore invented power plants, DC power supply systems, and, later, AC power supply systems. Eventually, power plants and long-distance AC transmission became the core technology of the power industry, and they remain so to this day.
Today, when we turn on a light or start the air conditioning, we do not care which power plant is behind it; as long as we pay the bill according to the monthly meter reading, we enjoy the service that electricity provides. Besides electricity, everyday public utilities such as piped gas, tap water, and fixed-line telephony have adopted the same approach. Could computing services, or information technology services more broadly, develop in the same way? That is the idea of utility computing.
To enable one mainframe to serve multiple customers at the same time, IBM adopted time-sharing and virtualization in its software, so that when multiple customers use the same mainframe simultaneously, the machine behaves as if it were split into multiple smaller virtual hosts. This was, in effect, the prototype of utility computing.
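The core of time-sharing described above can be illustrated with a minimal sketch (the names and numbers below are illustrative, not IBM's actual implementation): a scheduler hands out fixed time slices to several customers' jobs in round-robin order, so each customer experiences a slice of the one physical machine as if it were a smaller host of their own.

```python
from collections import deque

def round_robin(jobs, quantum):
    """Simulate time-sharing. Each job is a (name, work_units) pair.
    The scheduler gives every job at most `quantum` units per turn,
    cycling through the queue until all jobs finish.
    Returns the sequence of (name, units_run) turns."""
    queue = deque(jobs)
    timeline = []
    while queue:
        name, remaining = queue.popleft()
        slice_used = min(quantum, remaining)   # run for one time slice
        timeline.append((name, slice_used))
        remaining -= slice_used
        if remaining > 0:
            queue.append((name, remaining))    # unfinished job rejoins the queue
    return timeline

# Three hypothetical customers share one machine, with a quantum of 2 units.
schedule = round_robin([("bank", 3), ("rail", 2), ("power", 5)], quantum=2)
print(schedule)
```

Because turns are interleaved rather than run to completion, no single customer monopolizes the machine, which is exactly what made charging many customers for shares of one expensive mainframe practical.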
