A brief history of the development of computers: the main stages

The computer is one of the greatest inventions of its time. Billions of people worldwide use computers in their everyday lives.

Over the decades, the computer has evolved from a very expensive and slow device to today's highly intelligent machines with incredible processing power.

No one person is credited with inventing the computer; many believe that Konrad Zuse and his Z1 machine were the first in a long line of innovations that gave us the computer. Konrad Zuse was a German who gained fame for creating the first freely programmable mechanical computing device in 1936. Zuse's Z1 was built around three main elements that are still used in today's computing machines. Later, Konrad Zuse created the Z2 and Z3.

The first computers in the Mark series were built at Harvard. The Mark I, completed in 1944, was the size of a room: 55 feet long and 8 feet high. It could perform a wide range of calculations. It proved a successful invention and was used by the US Navy until 1959.

The ENIAC computer was one of the most important advances in computing. It was commissioned during World War II by the US military. This computer used vacuum tubes instead of electric motors and levers for fast calculations. Its speed was thousands of times faster than any other computing device of the time. This computer was huge and had a total cost of $500,000. ENIAC was in service until 1955.

RAM, or Random Access Memory, was introduced in 1964. The first RAM was a metal plate positioned next to a vacuum tube that detected differences in electric charge. It provided an easy way to store computer instructions.

The 1940s brought many innovations. At Manchester, building on work from the Telecommunications Research Establishment, engineers created the first computer to use a stored program; it became operational in 1948. The Manchester Mark I continued this work into 1951 and showed tremendous progress.

UNIVAC was built by the creators of ENIAC. It was the fastest and most innovative computer capable of handling many calculations. It was a masterpiece of its time and was highly acclaimed by the public.

IBM produced the first personal computer that was widely used and available to ordinary people. The IBM 701 was the first general-purpose computer developed by IBM. A new computer language called Fortran was used in the later 704 model. The IBM 7090 was also a great success, dominating business computing over the next 20 years. In the late 1970s and into the 1980s, IBM developed the personal computer known as the PC. IBM has had a huge impact on the computers used today.

With the growth of the personal computer market in the early and mid-1980s, many companies realized that a graphical interface was more user-friendly. This led to Microsoft's development of an operating system called Windows. The first version was Windows 1.0, followed by Windows 2.0 and 3.0. Microsoft continues to grow in popularity today.

Today computers are extremely powerful and more affordable than ever. They have penetrated almost every aspect of our lives. They are used as a powerful communication and trading tool. The future of computers is huge.

Today's personal computers are very different from the massive, ungainly devices that emerged during World War II, and the difference is not only in their size. The "fathers" and "grandfathers" of modern desktops and laptops could not do much of what modern machines handle effortlessly. Nevertheless, the very first computer in the world was a breakthrough in science and technology. Sit back in front of your monitor and we'll tell you how the PC era was born.

Who created the very first computer in the world?

In the 40s of the last century, there were several devices at once that could claim the title of the first computer.

Z3

Konrad Zuse

An early computer created by the German engineer Konrad Zuse, who worked in complete isolation from the developments of other scientists. It had a separate memory block and a separate console for data entry. Data was carried on an eight-track punched tape that Zuse made from 35 mm film.

The machine contained 2,600 telephone relays and was freely programmable in binary floating-point code. The Z3 was used for aerodynamic calculations but was destroyed in the bombing of Berlin at the end of 1943. Zuse led the reconstruction of his brainchild in the 1960s, and today this programmable machine is on display in the Deutsches Museum in Munich.

Mark 1

The Mark 1, conceived by Professor Howard Aiken and released by IBM in 1944, was America's first programmable computer. The machine cost half a million dollars and was used to design equipment for the US Navy, such as torpedoes and underwater detection gear. The Mark 1 was also used in the development of implosion devices for the atomic bomb.

It is "Mark 1" that can be called the very first computer in the world. Its characteristics, unlike the German Z3, made it possible to perform calculations in automatic mode, without requiring human intervention in the work process.

Atanasoff-Berry Computer (ABC)

In 1939, Professor John Vincent Atanasoff received funds to build a machine called the Atanasoff-Berry Computer (ABC). It was designed and assembled by Atanasoff and graduate student Clifford Berry by 1942. However, the ABC remained little known until a patent dispute over the invention of the computer. The dispute was resolved only in 1973, when it was proven that ENIAC co-inventor John Mauchly had seen the ABC shortly after it became functional.

The legal outcome of the litigation was a landmark: Atanasoff was declared the originator of several major computer ideas, but the computer as a concept was ruled unpatentable and therefore freely open to all developers. A full-scale working replica of the ABC was completed in 1997, proving that the machine functioned just as Atanasoff claimed.

ENIAC

ENIAC

ENIAC was developed by two scientists from the University of Pennsylvania, John Eckert and John Mauchly. It could solve "a wide range of numerical problems" through reprogramming. Although the machine was presented to the public after the war, in 1946, it was important for calculations during subsequent conflicts such as the Cold War and the Korean War. It was used in calculations for the hydrogen bomb, in engineering calculations, and in preparing artillery firing tables. It also produced weather forecasts for the territory of the USSR, so that the Americans would know where fallout might land in the event of a nuclear war.

Unlike the Mark 1 with its electromechanical relays, ENIAC ran on vacuum tubes. It is believed that ENIAC carried out more calculations in its ten years of operation than all of humanity had performed before it.

EDSAC

EDSAC

The first computer with programs stored in memory was called EDSAC. It was assembled in 1949 at the University of Cambridge. The project was led by Maurice Wilkes, a Cambridge professor and director of the university's computing laboratory.

One of the major advances in programming was Wilkes' use of a library of short programs called "subroutines". They were stored on punched tape and used to perform common repetitive calculations within a larger program.

What did the first computer in the world look like?

The American "Mark 1" was huge, taking up over 17 meters in length and over 2.5 meters in height. The machine, encased in glass and stainless steel, weighed 4.5 tons, and the total length of its connecting wires barely reached 800 km. A fifteen-meter shaft was responsible for synchronizing the main computing modules, which drove a 4 kW electric motor.

Mark 1 at the IBM Museum

Even heavier than the Mark 1 was ENIAC. It weighed 27 tons and required 174 kW of electricity; when it was switched on, the city's lights dimmed. The machine had neither keyboard nor monitor, occupied an area of 135 square meters, and was entwined with kilometers of wire. To get an idea of ENIAC's appearance, imagine a long row of metal cabinets covered from top to bottom with indicator lights. Since the computer did not yet have effective cooling, the room housing it was very hot, and ENIAC frequently malfunctioned.

ENIAC

The USSR did not want to lag behind the West and carried out its own work on creating computers. The result of the Soviet scientists' efforts was the Small Electronic Computing Machine (MESM), first launched in 1950. The MESM used 6,000 tubes, occupied an area of 60 square meters, and required up to 25 kW of power to operate.

MESM

The device could perform up to 3,000 operations per second. MESM was used for complex scientific calculations, then served as a teaching aid; in 1959 the machine was dismantled.

In 1952, MESM gained an older sibling, the Large Electronic Computing Machine (BESM). The number of vacuum tubes grew to 5,000, and the speed grew as well, to 8-10 thousand operations per second.

BESM

The world's first commercial computer

The UNIVAC, introduced in the US in 1951, can be called the first computer designed for commercial use.

It became famous after it used data from a poll of just 1% of eligible voters to correctly predict that General Dwight Eisenhower would win the 1952 election. When people realized the possibilities of computerized data processing, many enterprises began purchasing the machine for their own needs.

The very first personal computer in the world

The term "personal computer" was first applied to a machine created by the Italian engineer Pier Giorgio Perotto: the Programma 101, produced by Olivetti.

Programma 101

The device cost $3,200, and about 44,000 units were sold. NASA bought ten of them for calculations for the Apollo 11 Moon landing in 1969. The ABC (American Broadcasting Company) network used the Programma 101 to predict the 1968 presidential election, and the US military used it to plan operations during the Vietnam War. It was also purchased for schools, hospitals, and government offices, and it marked the beginning of an era of rapid PC development and sales.

The first mass-produced home computer abroad

In 1975, an article about a new computer set, the Altair 8800, appeared in an issue of Popular Electronics magazine. Within weeks of the device's introduction, customers flooded its manufacturer, MITS, with orders. The machine was equipped with 256-byte memory (expandable to 64 KB) and a universal interface bus, which became the "S-100" standard widely used in hobby and personal computers of that era.

"Altair 8800" could be bought for $397. After the purchase, the owner-radio amateur had to solder and check the performance of the assembled nodes on his own. The difficulties did not end there, we still had to master writing programs using zeros and ones. The Altair 8800 did not have a keyboard or monitor, hard drive, or floppy drive. To enter desired program the user flipped the toggle switches on the front of the device. And the verification of the results was carried out by observing the lights flashing on the front panel.

But in 1976 the first Apple computer was born, designed and hand-built by Steve Wozniak and promoted by his friend Steve Jobs as the first product of the Apple Computer company. The Apple 1 is considered the first PC to ship as a ready-made product.

Apple 1

In fact, the device had neither a monitor nor a keyboard (though it was possible to connect them). What it did have was a fully assembled circuit board carrying 30 chips. The Altair 8800 and the other devices then entering the market did not offer even that; they had to be built from a kit. The Apple 1 originally carried an almost "hellish" price of $666.66, though it was dropped to $475 a year later. An add-on board that allowed data to be recorded to a cassette recorder was released later; it cost $75.

The first mass-produced home computer in the USSR

From the 1980s, a computer called the Pravets was produced in Bulgaria. It was a clone of the Apple II. Another clone in the Pravets line was a "Soviet-bloc" IBM PC based on the Intel 8088 and 8086 processors; it was produced from 1985 to 1992. Pravets computers were installed in many schools across the Soviet Union.

Those wishing to build their own home computer could follow the instructions published in Radio magazine in 1982-83 and reproduce a model called the Micro-80. It was based on the KR580VM80 microprocessor, analogous to the Intel i8080.

In 1984, the Agat computer appeared in the Soviet Union; it was quite powerful compared to Western models. It had 128 KB of RAM, twice as much as Apple models of the early 1980s. The computer was produced in several versions and had an external keyboard with 74 keys and a monochrome or color screen.

The production of Agats continued until 1993.

Modern computers

Today, computer technology is changing very quickly, and modern machines are billions of times superior to their ancestors. Every company wants to surprise already jaded users, and so far many succeed. Here are just a few of the main milestones of recent years:

  • The laptop that had an important impact on the industry: the Apple MacBook (2006).
  • The smartphone that had an important impact on the industry: the Apple iPhone (2007).
  • The tablet that had an important impact on the industry: the Apple iPad (2010).
  • The first "smart watch": the Pulsar Time Computer (1972). It can be seen on James Bond's wrist in the 1973 action movie Live and Let Die.

And, of course, various game consoles: Playstation, Xbox, Nintendo, etc.

We live in interesting times (even if that sounds like the Chinese curse). And who knows what awaits us in the near future. Neural computers? Quantum computers? Wait and see.

Back in the first half of the 19th century, the English mathematician Charles Babbage tried to build a universal computing device - that is, a computer (Babbage called it the Analytical Engine). It was Babbage who first proposed that a computer should contain memory and be controlled by a program. He wanted to build his computer as a mechanical device and planned to set the programs using punched cards - cards of thick paper with information encoded as holes.

However, Babbage could not complete this work - it turned out to be too complicated for the technology of that time.

The German engineer Konrad Zuse built a small computer in 1941 based on several electromechanical relays. But because of the war, Zuse's work was not published.

And in the USA in 1943, at one of IBM's plants, the American Howard Aiken created a more powerful computer called the Mark-1. It allowed calculations to be carried out hundreds of times faster than by hand and was actually used for military computations.

Starting in 1943 in the USA, a group of specialists led by John Mauchly and Presper Eckert began to design the ENIAC computer based on vacuum tubes. The computer they created worked a thousand times faster than the Mark-1. To simplify and speed up the process of setting programs, Eckert and Mauchly began to design a new computer that could store the program in its memory.

In 1945, the famous mathematician John von Neumann joined the work and prepared a report on this computer. The report was sent to many scientists and became widely known, because in it von Neumann clearly and simply formulated the general principles of the functioning of computers, i.e. universal computing devices. To this day, the vast majority of computers are built in accordance with the principles John von Neumann laid out in his 1945 report. The first computer embodying von Neumann's principles was built in 1949 by the English researcher Maurice Wilkes.

In the 1940s and 1950s, computers were built using vacuum tubes. Therefore, computers were very large (they occupied huge halls), expensive and unreliable - after all, vacuum tubes, like ordinary light bulbs, often burn out.

But in 1948, transistors were invented - miniature and inexpensive electronic devices that could replace vacuum tubes. This led to a reduction in the size of computers by hundreds of times and an increase in their reliability.

The first transistor-based computers appeared in the late 1950s, and by the mid-1960s they had largely displaced vacuum-tube machines. After the advent of transistors, the most labor-intensive operation in computer manufacturing was connecting and soldering transistors to form electronic circuits. But in 1959, Robert Noyce (the future founder of Intel) invented a method that made it possible to create transistors and all the necessary connections between them on a single silicon wafer. The resulting electronic circuits became known as integrated circuits, or chips.

In 1968, Burroughs produced the first integrated-circuit computer, and in 1970 Intel began selling memory chips. In the years since, the number of transistors that can be placed per unit area of an integrated circuit has roughly doubled every year - the observation behind Moore's law - which is what has kept driving the cost of computers down and their speed up.
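
A quick back-of-the-envelope run shows what annual doubling implies (the 1,000-transistor starting point is hypothetical, purely for illustration):

```python
# Annual doubling compounds quickly: roughly a factor of 1,000 per decade.
# (The 1,000-transistor starting point is hypothetical, for illustration.)
transistors = 1_000
for year in range(11):
    print(f"year {year:2}: {transistors:>9,} transistors")
    transistors *= 2  # doubles each year under this simplified model
```

Ten doublings multiply the count by 1,024, which is why even a modest-sounding doubling period transforms the industry within a decade.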

At the beginning of 1975, the first commercially distributed personal computer, the Altair-8800, based on the Intel-8080 microprocessor, appeared. And although its capabilities were very limited (the RAM was only 256 bytes, and there was no keyboard or screen), its appearance was met with great enthusiasm: several thousand sets of the machine were sold in the first months.

At the end of 1975, Paul Allen and Bill Gates (the future founders of Microsoft) created an interpreter for the Basic language for the Altair computer, which let users communicate with the machine simply and write programs for it easily. This, too, contributed to the popularity of personal computers. Personal computers began to be sold as complete sets, with keyboard and monitor, and demand reached tens, then hundreds of thousands of units a year.

In 1976, the new Apple Computer company entered the market with the $666 Apple I. Its motherboard was screwed to a piece of plywood; there was no case or power supply at all.

But the Apple II computer, which appeared in 1977, became the prototype for most subsequent models, including the IBM PC.

In the late 70s, the spread of personal computers even led to some decline in demand for large computers and minicomputers. This became a matter of serious concern for IBM (International Business Machines Corporation), the leading maker of large computers, and in 1979 IBM decided to try its hand at the personal computer market.

First of all, the then-newest 16-bit microprocessor, the Intel 8088, was chosen as the computer's main processor. Its use significantly raised the computer's potential, since the new microprocessor allowed working with 1 MB of memory, while all computers available at the time were limited to 64 KB.

Users gained the opportunity to upgrade their computers themselves and to equip them with additional devices from hundreds of different manufacturers.

All this drove down the cost of IBM PC-compatible computers and rapidly improved their characteristics, and hence their popularity.

IBM did not make its computer a single sealed unit and did not protect its design with patents. Instead, it assembled the computer from independently manufactured parts and kept neither the specifications of those parts nor the way they were connected secret: the design principles of the IBM PC were available to everyone. This approach, called the open architecture principle, made the IBM PC a terrific success, although it deprived IBM of the sole benefit of that success.


The need to count arose in people along with the advent of civilization. They needed to carry out trade transactions, survey land, manage crop stocks, and track astronomical cycles. For this, various tools were invented from ancient times onward - from counting sticks and the abacus - which, in the course of the development of science and technology, evolved into calculators and other computing devices, including personal computers.



Human life in the twenty-first century is directly tied to artificial intelligence. Knowing the main milestones in the creation of computers is a mark of an educated person. The development of computers is usually divided into five stages: it is customary to speak of five generations.

1946-1954 - first generation computers

The first generation of computers (electronic computing machines) was built on vacuum tubes. Scientists at the University of Pennsylvania (USA) developed ENIAC, the world's first such computer, officially put into operation on February 15, 1946. Its assembly used 18,000 vacuum tubes. By today's standards the computer was colossal: an area of 135 square meters and a weight of 30 tons. Its demand for electricity was also high, at 150 kW.

It is a well-known fact that this electronic machine was created directly to help solve the most difficult problems in building the atomic bomb. The USSR was rapidly catching up: in December 1951, under the guidance and with the direct participation of Academician S. A. Lebedev, the first Soviet computer was presented to the world. It bore the abbreviation MESM (Small Electronic Computing Machine) and could perform from 8 to 10 thousand operations per second.

1954-1964 - second generation computers

The next step was the development of computers running on transistors. Transistors are devices made from semiconductor materials that make it possible to control the current flowing in a circuit. The first reliably working transistor was created in America in 1948 by a team of research physicists including Shockley and Bardeen.

In terms of speed, these machines differed significantly from their predecessors: it reached hundreds of thousands of operations per second. Size shrank and power consumption fell, while the scope of application grew considerably, thanks to the rapid development of software. The best Soviet computer of the era, the BESM-6, achieved a record speed of 1,000,000 operations per second. It was developed in 1965 under chief designer S. A. Lebedev.

1964-1971 - third generation computers

The main difference of this period is the adoption of integrated circuits with a low degree of integration. Using sophisticated technologies, scientists were able to place complex electronic circuits on a small semiconductor wafer with an area of less than 1 square centimeter. The integrated circuit was invented in 1958 by Jack Kilby. This revolutionary invention improved every parameter: dimensions shrank to roughly the size of a refrigerator, while speed and reliability increased.

This stage in the development of computers is characterized by the use of a new storage device - a magnetic disk. The PDP-8 minicomputer was first introduced in 1965.

In the USSR, such machines appeared considerably later, in 1972, and were analogues of the models offered on the American market.

1971-present - fourth generation computers

The innovation of fourth-generation computers is the use of microprocessors. A microprocessor is an ALU (arithmetic logic unit) placed on a single chip with a high degree of integration, which means circuits came to occupy even less space. Put another way, a microprocessor is a little brain that performs millions of operations per second according to the program embedded in it. Dimensions, weight, and power consumption fell drastically, while performance reached record heights. And that's when Intel entered the game.

The first microprocessor, the Intel 4004, was assembled in 1971. It had a bit depth of only 4 bits, but at the time it was a giant technological breakthrough. Then Intel introduced the world to the eight-bit Intel 8008 (1972), and in 1975 the Altair-8800 was born - the first personal computer, built on the Intel 8080.
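
To make "a bit depth of 4 bits" concrete, here is a minimal sketch of 4-bit arithmetic (the helper name add4 is ours, for illustration; it is not a real 4004 instruction):

```python
# What a 4-bit word size means: a register holds values 0..15, and
# arithmetic wraps around modulo 16, with the overflow appearing as a carry.
MASK = 0xF  # the low 4 bits

def add4(a: int, b: int) -> tuple[int, int]:
    """Add two 4-bit values; return (result, carry) as 4-bit hardware would."""
    total = (a & MASK) + (b & MASK)
    return total & MASK, total >> 4

print(add4(7, 5))   # (12, 0): 7 + 5 fits in 4 bits
print(add4(9, 9))   # (2, 1): 18 overflows, leaving 2 plus a carry out
```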

This was the beginning of a whole era of personal computers. Such machines came to be used everywhere, for the most varied purposes. A year later, Apple entered the game. The project was a great success, and Steve Jobs became one of the most famous and richest people on Earth.

The undisputed standard among computers became the IBM PC. It was released in 1981 and could address 1 megabyte of RAM.

It is noteworthy that today IBM-compatible machines account for about ninety percent of the computers produced! It is also impossible not to mention the Pentium line: development of Intel's first processor with an integrated coprocessor, the i486, was completed in 1989, and the Pentium trademark that followed became an undisputed authority in the development and application of microprocessors in the computer market.

As for the prospects, they lie, of course, in the development and introduction of the latest technologies: very-large-scale integrated circuits, magneto-optical elements, even elements of artificial intelligence.

Self-teaching electronic systems are the foreseeable future - what is called the fifth generation in the development of computers.

People seek to erase the barrier in communicating with computers. Japan worked on this for a very long time and, unfortunately, without success, but that is a topic for a different article. At the moment such projects exist only in development, but at the current pace of progress that future is not far off. The present is the time when history is being made!


One of the first devices (5th-4th centuries BC) from which the history of computing can be traced was a special board, later called the abacus. Calculations on it were carried out by moving pebbles or beads in the grooves of boards made of bronze, stone, ivory, and the like. In Greece the abacus already existed in the 5th century BC; the Japanese called theirs the "soroban", the Chinese the "suanpan". In ancient Russia a similar device, the "counting board", was used; in the 17th century it took the form of the familiar Russian schoty (abacus).

Abacus (5th-4th centuries BC)

In 1642 the French mathematician and philosopher Blaise Pascal created the first calculating machine, named the Pascaline in honor of its creator. The mechanical device - a box with many gears - performed subtraction as well as addition. Data was entered by turning dials corresponding to the digits 0 to 9, and the answer appeared at the top of the metal case.


Pascalina

In 1673, Gottfried Wilhelm Leibniz created a mechanical calculating device (the Leibniz stepped calculator), which for the first time could not only add and subtract but also multiply, divide, and extract square roots. The Leibniz wheel later became the prototype for mass-produced calculating devices - adding machines.


Leibniz step calculator model

The English mathematician Charles Babbage developed a device that not only performed arithmetic operations but also printed the results immediately. In 1832, a one-tenth-scale model was built from two thousand brass parts; it weighed three tons but could perform arithmetic to an accuracy of six decimal places and evaluate second-order differences. This machine, known as the Difference Engine, became the prototype of real computers.

Difference Engine

An adding machine with continuous carrying of tens was created by the Russian mathematician and mechanic Pafnuty Lvovich Chebyshev. The device automated all arithmetic operations. In 1881, an attachment for multiplication and division was added to it. The principle of continuous carrying of tens was later widely used in various counters and computers.


Chebyshev adding machine

Automated data processing appeared at the end of the 19th century in the United States. Herman Hollerith created the tabulator, a device in which information punched onto cards was read by electric current.

Hollerith tabulator

In 1936, a young Cambridge scientist, Alan Turing, conceived a calculating machine that existed only on paper. His "smart machine" operated according to a predetermined algorithm, and depending on the algorithm the imaginary machine could be applied to a wide variety of purposes. At the time these were purely theoretical considerations, but they served as the prototype of the programmable computer: a computing device that processes data according to a defined sequence of commands.

Information revolutions in history

In the history of civilization there have been several information revolutions - transformations of social relations caused by changes in the processing, storage, and transmission of information.

The first revolution is associated with the invention of writing, which produced a gigantic qualitative and quantitative leap for civilization: it became possible to pass knowledge from generation to generation.

The second revolution (mid-16th century) was caused by the invention of printing, which radically changed industrial society, culture, and the organization of activity.

The third revolution (end of the 19th century) came with discoveries in the field of electricity, thanks to which the telegraph, telephone, and radio appeared - devices that made it possible to quickly transmit and accumulate information in any volume.

The fourth revolution (since the 1970s) is associated with the invention of microprocessor technology and the advent of the personal computer. Computers and data transmission systems (information communications) came to be built on microprocessors and integrated circuits.

This period is characterized by three fundamental innovations:

  • the transition from mechanical and electrical means of information processing to electronic ones;
  • the miniaturization of all components, instruments, and machines;
  • the creation of software-controlled devices and processes.

History of the development of computer technology

The human need to store, convert, and transmit information appeared long before the telegraph, the first telephone exchange, and the electronic computer were created. In fact, all the experience and knowledge accumulated by mankind contributed, one way or another, to the emergence of computer technology. The history of computers - the common name for electronic machines that perform calculations - begins far in the past and is tied to the development of almost all aspects of human life and activity. For as long as human civilization has existed, some form of automated calculation has been in use.

The history of computer technology spans several decades. During this time, several generations of computers have succeeded one another, each distinguished by new elements (vacuum tubes, transistors, integrated circuits) whose manufacturing technology differed fundamentally. Today there is a generally accepted classification of computer generations:

  • First generation (1946 - early 50s). Element base: vacuum tubes. Computers were distinguished by large dimensions, high energy consumption, low speed, low reliability, and programming in machine code.
  • Second generation (late 50s - early 60s). Element base: semiconductors. Almost all specifications improved over the previous generation of computers. Algorithmic languages came into use for programming.
  • Third generation (late 60s - late 70s). Element base: integrated circuits and multilayer printed circuit boards. A sharp reduction in computer size, higher reliability, greater productivity, and access from remote terminals.
  • Fourth generation (from the mid-70s to the end of the 80s). Element base: microprocessors and large integrated circuits. Specifications improved again, and personal computers went into mass production. Directions of development: powerful multiprocessor computing systems with high performance, and the creation of cheap microcomputers.
  • Fifth generation (since the mid-80s). The development of intelligent computers began, so far without success. Computer networks spread into all fields and were interconnected; distributed data processing and computer information technologies came into widespread use.

Along with the change of computer generations, the nature of their use changed too. At first they were created and used mainly to solve computational problems; later the scope of their application expanded to include information processing, the automation of production, technological and scientific processes, and much more.

Konrad Zuse's principles of how computers work

The idea that an automated calculating machine could be built occurred to the German engineer Konrad Zuse, and in 1934 he formulated the basic principles on which future computers should work:

  • binary number system;
  • the use of devices operating on the principle of "yes / no" (logical 1 / 0);
  • fully automated operation of the calculator;
  • software control of the computing process;
  • support for floating point arithmetic;
  • use of large-capacity memory.

Zuse was the first in the world to determine that data processing begins with the bit (he called the bit a "yes/no status", and the formulas of binary algebra "conditional propositions"); the first to introduce the term "machine word" (Word); and the first to combine arithmetic and logical operations in a single calculator, noting that "the elementary operation of a computer is to check two binary numbers for equality. The result will also be a binary number with two values (equal, not equal)."
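
That elementary operation is easy to restate in modern terms. A minimal sketch (the function names are ours, for illustration):

```python
# A bit is a yes/no status; Zuse's elementary operation compares two bits
# (or two machine words, bit by bit) and yields another yes/no bit.
def bits_equal(a: int, b: int) -> int:
    """1 if the two bits are equal, 0 otherwise (the XNOR of a and b)."""
    return 1 - (a ^ b)  # a ^ b is 1 exactly when the bits differ

def words_equal(word_a, word_b) -> int:
    """Equality of two equal-length machine words, built from the bit test."""
    return min(bits_equal(a, b) for a, b in zip(word_a, word_b))

print(bits_equal(1, 1))                         # 1: equal
print(words_equal([1, 0, 1, 1], [1, 0, 0, 1]))  # 0: the third bit differs
```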

First generation - computers with vacuum tubes

Colossus I, the first vacuum-tube computer, was created by the British in 1943 to break German military ciphers; it consisted of 1,800 vacuum tubes and was one of the first programmable electronic digital computers.

ENIAC was created to calculate artillery ballistics tables. The computer weighed 30 tons and consumed up to 174 kW of electricity. It contained 17,468 vacuum tubes of sixteen types, 7,200 crystal diodes, and 4,100 magnetic elements, housed in cabinets with a total volume of about 100 m³; it occupied some 300 m² of floor space. ENIAC's performance was 5,000 operations per second, and the total cost of the machine was $750,000.


ENIAC - a device for calculating artillery ballistics tables

Another representative of the first generation worth noting is EDVAC (Electronic Discrete Variable Automatic Computer). EDVAC is interesting in that it attempted to store programs electronically in so-called "ultrasonic delay lines" using mercury tubes. In 126 such lines it could store 1,024 44-bit binary numbers; this was its "fast" memory. As "slow" memory, it was intended to record numbers and commands on magnetic wire, but this method proved unreliable, and the designers had to fall back on teletype tape. EDVAC was faster than its predecessor, performing addition in 1 µs and division in 3 µs. It contained only 3,500 vacuum tubes and occupied 13 m² of floor space.

UNIVAC (Universal Automatic Computer) was an electronic device with programs stored in memory, entered no longer from punched cards but from magnetic tape; this provided high-speed reading and writing of information and, consequently, higher overall performance. One tape could hold a million characters written in binary form, and tapes could store both programs and intermediate data.


Representatives of the 1st generation of computers: 1) EDVAC (Electronic Discrete Variable Automatic Computer); 2) UNIVAC (Universal Automatic Computer)

Second generation - computers on transistors

Transistors replaced vacuum tubes in the early 1960s. Transistors (which act like electrical switches) consume less electricity, generate less heat, and take up less space. Combining several transistor circuits on one plate yields an integrated circuit (a "chip", literally a flake or small plate). Transistors operate as binary elements: they register two states - current present and current absent - and thereby process information presented in binary form.

In 1953, William Shockley invented the p-n junction transistor, which replaced the vacuum tube while operating at higher speed, generating very little heat, and consuming almost no electricity. In parallel with the replacement of vacuum tubes by transistors, information storage methods improved: magnetic cores and magnetic drums came into use as memory devices, and by the 1960s storage of information on disks had become widespread.

One of the first transistorized computers, the Atlas Guidance Computer, was launched in 1957 and was used to control the launch of the Atlas rocket.

Created in 1957, the RAMAC was an inexpensive computer with modular external disk memory combined with magnetic-core and drum random-access memory. Although not yet fully transistorized, the machine was highly serviceable and easy to maintain, and it was in great demand in the office automation market. A "large" RAMAC (the IBM 305) was therefore quickly released for corporate customers; to hold 5 MB of data, the RAMAC system needed fifty 24-inch disks. An information system based on this model reliably processed arrays of queries in 10 languages.

In 1959, IBM created its first all-transistorized large mainframe, the 7090, capable of 229,000 operations per second - a true transistorized mainframe. In 1964, using two 7090 mainframes, American Airlines launched SABRE, the first automated system for selling and booking airline tickets, covering 65 cities.

In 1960, DEC introduced the world's first minicomputer, the PDP-1 (Programmed Data Processor), a computer with its own monitor and keyboard that became one of the most notable products on the market. It could perform 100,000 operations per second, and the machine itself occupied only 1.5 m² of floor space. The PDP-1 became, in effect, the world's first gaming platform thanks to MIT student Steve Russell, who wrote the computer game Spacewar! for it.


Representatives of the second generation of computers: 1) RAMAC; 2) PDP-1

In 1968, Digital began the first mass production of minicomputers with the PDP-8: it cost about $10,000 and was the size of a refrigerator. It was this PDP-8 model that laboratories, universities, and small businesses could finally afford to buy.

Domestic Soviet computers of the time can be characterized as follows: in their architectural, circuit, and functional solutions they matched their era, but their capabilities were limited by the imperfect production and component base. The most popular machines were those of the BESM series. Serial production, on a rather modest scale, began with the release of the Ural-2 computer (1958), then the BESM-2, Minsk-1, and Ural-3 (all in 1959). In 1960, the M-20 and Ural-4 went into series. At the end of 1960, the M-20 (4,500 tubes, 35,000 semiconductor diodes, memory of 4,096 cells) held the performance record: 20,000 operations per second. The first computers based on semiconductor elements (Razdan-2, Minsk-2, M-220, and Dnepr) were still in development.

Third generation - small-sized computers on integrated circuits

In the 1950s and 60s, the assembly of electronic equipment was a labor-intensive process, slowed by the growing complexity of electronic circuits. The CD1604 computer (1960, Control Data Corp.), for example, contained about 100,000 diodes and 25,000 transistors.

In 1959, the Americans Jack St. Clair Kilby (Texas Instruments) and Robert N. Noyce (Fairchild Semiconductor) independently invented the integrated circuit (IC): a collection of transistors and their interconnections placed on a single silicon chip inside one package.

Producing computers on ICs (only later were they called microcircuits, or chips) was much cheaper than producing them on discrete transistors. Thanks to this, many organizations could acquire and master such machines, which in turn increased demand for general-purpose mainframe computers. In these years, computer production took on an industrial scale.

At the same time, semiconductor memory appeared, which is still used in personal computers to this day.


Representative of the third generation of computers - ES-1022

Fourth generation - personal computers on microprocessors

The forerunners of the IBM PC were the Apple II, Radio Shack TRS-80, Atari 400 and 800, Commodore 64 and Commodore PET.

The birth of the personal computer (PC) is rightfully associated with Intel processors. The corporation was founded in mid-June 1968; since then, Intel has become the world's largest manufacturer of microprocessors, with over 64,000 employees. Intel's goal was to create semiconductor memory, and, in order to survive, the company also took on third-party orders for the development of semiconductor devices.

In 1971, Intel received an order to develop a set of 12 chips for programmable calculators, but creating 12 specialized chips struck Intel's engineers as cumbersome and inefficient. They solved the problem of reducing the chip count by pairing semiconductor memory with an execution unit able to run on commands stored in that memory. It was a breakthrough in computing philosophy: a universal logic device in the form of the 4-bit central processing unit i4004, later called the first microprocessor. It was a set of 4 chips, including one chip controlled by commands stored in internal semiconductor memory.

As a commercial product, the microcomputer (as the chip was then called) appeared on the market on November 11, 1971, under the name 4004: 4-bit, containing 2,300 transistors, clocked at 60 kHz, priced at $200. In 1972, Intel released the eight-bit 8008 microprocessor, and in 1974 its improved version, the Intel 8080, which by the end of the 70s had become the standard for the microcomputer industry. As early as 1973, the first computer based on the 8008 processor, the Micral, appeared in France; for various reasons it had no success in America. (The 8080, meanwhile, was copied in the Soviet Union and produced for many years as the 580VM80.) Around the same time, a group of engineers left Intel and founded Zilog. Its most celebrated product was the Z80, which extended the 8080's instruction set and ran from a single 5 V supply, making it a commercial success in consumer devices. On its basis, in particular, the ZX Spectrum computer was created (sometimes called simply the Sinclair, after its creator), which practically became the prototype of the home PC of the mid-80s. In 1978, Intel released the 16-bit 8086 processor, followed by the 8088, an analogue of the 8086 except for its external 8-bit data bus (all peripherals were still 8-bit at the time).

A competitor to the Intel-based machines, the Apple II, differed in that it was not a completely closed device: the user had some freedom to extend it, installing additional interface boards, memory boards, and so on. It was this feature, later called "open architecture", that became its main advantage. Two more innovations, both developed in 1978, contributed to the Apple II's success: an inexpensive floppy disk drive and the first commercial spreadsheet program, VisiCalc.

The Altair-8800 computer, built around the Intel 8080 processor, was very popular in the 70s. Although its capabilities were quite limited - the RAM was only 4 KB, and there was no keyboard or screen - its appearance was greeted with great enthusiasm. It reached the market in 1975, and several thousand kits were sold in the first months.


Representatives of the 4th generation of computers: a) Micral; b) Apple II

This computer, designed by MITS, was sold by mail order as a DIY kit. The entire kit cost $397, at a time when the Intel processor alone sold for $360.

The spread of the PC by the end of the 70s led to a slight decline in demand for mainframes and minicomputers, and in 1979 IBM resolved to answer with a PC of its own based on the 8088 processor. The software of the early 80s was focused on word processing and simple spreadsheets, and the very idea that a "microcomputer" could become a familiar and necessary device at work and at home still seemed incredible.

On August 12, 1981, IBM introduced the Personal Computer (PC), which, combined with software from Microsoft, became the standard for the entire PC fleet of the modern world. The IBM PC model with a monochrome display cost about $3,000; with a color display, $6,000. The configuration: an Intel 8088 processor at 4.77 MHz with 29,000 transistors, 64 KB of RAM, one 160 KB floppy drive, and a simple built-in speaker. Launching and working with applications was a real pain at the time: with no hard drive, you had to swap floppy disks constantly; there was no mouse, no graphical windowed user interface, and no exact correspondence between the image on screen and the final result (WYSIWYG). Color graphics were extremely primitive, and three-dimensional animation or photo processing was out of the question - but the history of personal computers truly began with this model.

In 1984, IBM introduced two more novelties. First came a model for home users, the 8088-based PCjr, equipped with what was probably the first wireless keyboard; this model, however, did not succeed in the market.

The second novelty was the IBM PC AT. Its most important feature was the move to a higher-end microprocessor - the 80286, with the 80287 math coprocessor - while maintaining compatibility with previous models. This computer set the standard for years to come in a number of respects: it introduced the 16-bit expansion bus (which long remained the standard) and EGA graphics adapters with a resolution of 640x350 and 16 colors.

1984 also saw the release of the first Macintosh computers, with a graphical interface, a mouse, and many other user interface attributes without which modern desktop computers are inconceivable. The new interface left users far from indifferent, but the revolutionary computer was compatible with neither earlier programs nor existing hardware components. And in the corporations of that time, WordPerfect and Lotus 1-2-3 had already become everyday working tools; users had grown accustomed and adapted to the character-based DOS interface. From their point of view, the Macintosh even looked somehow frivolous.

Fifth generation of computers (from 1985 to our time)

Distinctive features of the 5th generation:

  1. New production technologies.
  2. The rejection of traditional programming languages such as Cobol and Fortran in favor of languages with enhanced symbol manipulation and elements of logic programming (Prolog and Lisp).
  3. An emphasis on new architectures (for example, dataflow architecture).
  4. New user-friendly input/output methods (e.g. speech and image recognition, speech synthesis, natural language message processing).
  5. Artificial intelligence (that is, automating the solution of problems, the drawing of conclusions, and the manipulation of knowledge).

It was at the turn of the 80s and 90s that the Windows-Intel alliance formed. When Intel released the 486 microprocessor in early 1989, computer makers did not wait for an example from IBM or Compaq; a race began that dozens of firms entered. But all the new computers were extremely similar to one another: they were united by compatibility with Windows and by processors from Intel.

In 1989, the i486 processor was released. It had a built-in math coprocessor, a pipeline, and a built-in first-level cache.

Directions for the development of computers

Neurocomputers can be assigned to the sixth generation of computers. Although the practical use of neural networks began relatively recently, neurocomputing as a scientific field is in its seventh decade: the first neurocomputer was built in 1958. Its developer was Frank Rosenblatt, who gave his brainchild the name Mark I Perceptron.

The theory of neural networks was first set out in the 1943 work of McCulloch and Pitts, who showed that any arithmetic or logical function can be implemented by a simple neural network. Interest in neurocomputing flared up again in the early 80s, fueled by new work on multilayer perceptrons and parallel computing.
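
What such a "simple neural network" means is easy to show. Below is a minimal sketch of a McCulloch-Pitts-style threshold neuron implementing logical functions; the weights and thresholds are chosen by hand for illustration, not learned:

```python
# A McCulloch-Pitts neuron: output 1 if the weighted sum of its binary
# inputs reaches the threshold, 0 otherwise.
def neuron(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# Logical functions built from single neurons, in the spirit of the 1943 paper:
AND = lambda x, y: neuron([x, y], [1, 1], threshold=2)
OR  = lambda x, y: neuron([x, y], [1, 1], threshold=1)
NOT = lambda x:    neuron([x],    [-1],   threshold=0)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", AND(x, y), "OR:", OR(x, y))
print("NOT 0:", NOT(0), "NOT 1:", NOT(1))
```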

Neurocomputers are computers built from many simple processing elements working in parallel, called neurons, which form so-called neural networks. The high speed of neurocomputers comes precisely from this huge number of neurons. Neurocomputers follow a biological principle: the human nervous system consists of individual cells - neurons - whose number in the brain reaches 10¹², even though a neuron's response time is about 3 ms. Each neuron performs fairly simple functions, but since each is connected on average to 1-10 thousand other neurons, the ensemble successfully supports the working of the human brain.

Representative of the sixth generation of computers - the Mark I Perceptron

In optoelectronic computers, the information carrier is the luminous flux. Electrical signals are converted to optical and vice versa. Optical radiation as an information carrier has a number of potential advantages over electrical signals:

  • Light fluxes, unlike electrical signals, can intersect one another;
  • Light fluxes can be localized in the transverse direction to nanometer dimensions and transmitted through free space;
  • The interaction of light fluxes with nonlinear media is distributed throughout the medium, giving new degrees of freedom in organizing communication and creating parallel architectures.

Developments are currently under way to create computers consisting entirely of optical information processing devices. Today this direction is among the most interesting.

An optical computer would have unprecedented performance and a completely different architecture from an electronic computer: in one cycle lasting less than 1 nanosecond (corresponding to a clock frequency above 1 GHz), an optical computer could process a data array of about 1 megabyte or more. To date, individual components of optical computers have been created and optimized.

An optical computer the size of a laptop could let the user store in it almost all the information about the world, while the computer itself could solve problems of any complexity.

Biological computers are ordinary PCs in function, only based on DNA computing. There are still so few truly demonstrative works in this field that it is too early to speak of significant results.

Molecular computers are PCs whose principle of operation is based on changes in the properties of molecules during photosynthesis. In the process of photosynthesis a molecule passes through different states, so scientists need only assign a logical value - "0" or "1" - to each state. Using certain molecules, scientists have found that their photocycle consists of just two states, which can be "switched" by changing the acid-base balance of the medium - something very easy to do with an electrical signal. Modern technologies already make it possible to create entire chains of molecules organized this way. It is thus quite possible that molecular computers await us "just around the corner".

The history of the development of computers is not over yet; alongside improvements to old technologies, entirely new ones are being developed. An example is quantum computers - devices based on quantum mechanics. A full-scale quantum computer remains a hypothetical device; the possibility of building one depends on serious advances in many-particle quantum theory and on complex experiments, work at the very forefront of modern physics. Experimental quantum computers already exist, and elements of quantum computing can be used to increase the efficiency of calculations on existing hardware.