Software and hardware: concept, purpose, levels, characteristics and settings

A computer is a complex device, a synthesis of software and hardware. It is a machine that solves problems by executing commands such as: add two numbers, check whether a number differs from zero, copy data from one memory cell to another, and so on.

These simple commands make up a language called machine language, in which a person can tell the computer what to do. Each computer, depending on its purpose, is supplied with a certain set of commands. The commands are kept primitive to make computers easier to manufacture.

However, machine language creates big problems for humans, because writing in it is tedious and extremely difficult. Engineers therefore invented several levels of abstraction, each built on a lower one, going down to machine language and the computer's logic circuits, with user interaction at the very top. This principle is called the multilevel structure of the computer, and both the hardware and the software of computer systems obey it.

Multilevel structure of computers

As mentioned earlier, software and hardware are built on the principle of abstraction layers, each resting on the previous one. Simply put, to make it easier for a person to write programs, a new language is built on top of the machine language, one that is more understandable to a person but that the computer cannot execute directly. How, then, does the computer execute programs written in the new language?

There are two main approaches: translation and interpretation. In the first case, each command of the new language corresponds to a set of machine language commands, so the program in the new language is converted entirely into a program in machine language. In the second case, a program written in machine language takes the commands of the new language as input, recognizes them, translates them into machine language and executes them.
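
To make the difference concrete, here is a small sketch in C++ (our own illustration, not from the original text): a toy one-command language, "ADD a b", is either interpreted on the fly or first translated in full into a list of primitive "machine" operations that are executed afterwards. All names in it are invented for the example.

    // Interpretation vs. translation of a toy command language (illustrative sketch).
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    // Hypothetical "machine level": the only primitive operation is adding two numbers.
    struct MachineOp { int a, b; };

    // Interpretation: read a command of the new language and execute it immediately.
    void interpret(const std::string& line) {
        std::istringstream in(line);
        std::string cmd; int a, b;
        in >> cmd >> a >> b;                      // e.g. "ADD 2 3"
        if (cmd == "ADD") std::cout << a + b << '\n';
    }

    // Translation: convert the whole program into machine-level operations first,
    // then run the translated program as a separate step.
    std::vector<MachineOp> translate(const std::vector<std::string>& program) {
        std::vector<MachineOp> out;
        for (const auto& line : program) {
            std::istringstream in(line);
            std::string cmd; int a, b;
            in >> cmd >> a >> b;
            if (cmd == "ADD") out.push_back({a, b});
        }
        return out;
    }

    int main() {
        interpret("ADD 2 3");                               // executed on the fly
        for (MachineOp op : translate({"ADD 1 1", "ADD 4 5"}))
            std::cout << op.a + op.b << '\n';               // translated, then executed
    }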

Multilevel organization of computers

Computer hardware and software can contain many levels, from the very first, or basic, level up to one that a human can understand. The concept of a virtual machine is an excellent way to illustrate this. When a computer executes a program in some language (C++, for example), one can imagine that a virtual machine is running inside it that executes the commands of that language. Below the C++ virtual machine is another one with a more primitive language, say assembler; an assembler virtual machine works at that level. Between the two, either translation or interpretation of the program code takes place. In this way many levels are joined into a single chain down to the very first, the machine level. The virtual machine is just a concept that makes it easier to picture this layering.
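
As a minimal sketch of this idea (ours, with invented instruction names), here is a tiny level-1 "virtual machine" in C++ with its own small instruction set, implemented entirely out of the primitive operations of the level below it:

    // A toy stack-based virtual machine: a higher level built on primitive host operations.
    #include <iostream>
    #include <vector>

    enum class Op { Push, Add, Mul, Print };   // the level-1 instruction set

    struct Instr { Op op; int arg = 0; };

    void run(const std::vector<Instr>& program) {
        std::vector<int> stack;
        for (const Instr& i : program) {
            switch (i.op) {
                case Op::Push:  stack.push_back(i.arg); break;
                case Op::Add: { int b = stack.back(); stack.pop_back();
                                stack.back() += b; break; }
                case Op::Mul: { int b = stack.back(); stack.pop_back();
                                stack.back() *= b; break; }
                case Op::Print: std::cout << stack.back() << '\n'; break;
            }
        }
    }

    int main() {
        // Computes (2 + 3) * 4 on the virtual machine and prints 20.
        run({{Op::Push, 2}, {Op::Push, 3}, {Op::Add},
             {Op::Push, 4}, {Op::Mul}, {Op::Print}});
    }

Every command of this toy instruction set is ultimately carried out by far more primitive operations of the machine below it, which is exactly the layering described above.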

This raises an obvious question: why not build a computer that works directly with a language such as C++?

The answer is that creating such a machine would require enormous investment in both its hardware and its software. It is most likely possible, but it would be so expensive that it would no longer be practical.

Modern computers

Today most computers consist of two to six levels. Level zero is the base, that is, the machine or hardware level; only machine code runs on it, executed by the computer's electronic circuits. On top of it a first-level language is built, and so on. It should also be made clear that things do not end at level zero: below it lies the physical level, the transistors and resistors themselves, that is, solid-state physics. Level zero is called the base level because it is where hardware and software meet.

Finally, let's list the hierarchical chain of levels that are contained in the average computer, starting from zero:

  • Lv. 0 - digital logic, or hardware - gates and registers work here; they can store the value 0 or 1 and perform simple functions such as "and" and "or" (a small sketch after this list models such gates in code).
  • Lv. 1 - microarchitecture - the arithmetic logic unit (ALU) of the computer works at this level. This is where data, hardware and software begin to work together.
  • Lv. 2 - instruction set architecture (ISA).
  • Lv. 3 - hybrid, or operating system - this level is more flexible, although it is very similar to level 2. For example, programs can be executed in parallel here.
  • Lv. 4 - assembler - the level at which digital machine languages begin to give way to human-readable ones.
  • Lv. 5 - high-level languages (C++, Pascal, PHP, etc.).
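
As mentioned in the first item above, here is a minimal sketch of level 0 (our illustration): logic gates modelled with bitwise operations and wired into a half adder, the circuit that adds two single bits.

    // Level 0 in miniature: gates built from bitwise operations, combined into a half adder.
    #include <cassert>

    int AND(int a, int b) { return a & b; }
    int XOR(int a, int b) { return a ^ b; }

    // A half adder: the sum bit and the carry bit of a + b for single-bit inputs.
    void half_adder(int a, int b, int& sum, int& carry) {
        sum   = XOR(a, b);
        carry = AND(a, b);
    }

    int main() {
        int sum, carry;
        half_adder(1, 1, sum, carry);        // 1 + 1 = binary 10
        assert(sum == 0 && carry == 1);
    }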

So each level is a superstructure over the previous one, connected to it by translation or interpretation, and has its own abstract objects and operations. To work at one level you do not, in principle, need to know what is happening at the levels below. It is thanks to this approach that computer technology has become easier to understand.

After all, each brand of computer has its own architecture. Architecture refers to the data types, operations and characteristics of each level; the technology by which memory cells are manufactured, for example, is not part of the architecture.

Development of computers

As technology developed, new levels appeared and some disappeared. The first computers of the 1940s had only two levels: the digital logic level, where the program was executed, and the instruction set level, in which the code was written. The boundary between hardware and software was therefore obvious, but as the number of levels grew it began to blur.

Today hardware and software can be considered logically equivalent, because any operation implemented in software can also be executed directly in hardware, and vice versa. There are no ironclad rules that say why one operation must be done in hardware and another in software. The division is based on factors such as production cost, speed and reliability. Today's software may become part of tomorrow's hardware, or, conversely, something done in hardware may become a program.
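
A small sketch of this equivalence (our example, not the article's): multiplication, which on most processors is a single hardware instruction, reproduced purely in software out of shifts and additions.

    // Multiplication emulated in software using only shifts and additions.
    #include <cassert>
    #include <cstdint>

    uint32_t soft_multiply(uint32_t a, uint32_t b) {
        uint32_t result = 0;
        while (b != 0) {
            if (b & 1) result += a;   // add the shifted multiplicand when the bit is set
            a <<= 1;                  // shift the multiplicand left
            b >>= 1;                  // move on to the next bit of the multiplier
        }
        return result;
    }

    int main() {
        assert(soft_multiply(6, 7) == 42);    // same result as the hardware '*' operator
    }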

Computer Generations

Mechanical computers represent the zero generation. In the 1640s Pascal built a hand-operated calculating machine that could add and subtract. In the 1670s Leibniz created a machine that could also multiply and divide. In the 1830s Babbage, having spent all his savings, designed the Analytical Engine, which resembled a modern computer and consisted of an input device, a memory, a computing unit and an output mechanism. The machine was so advanced that it could store up to 1000 words of 50 decimal digits and execute different algorithms. Since the Analytical Engine was programmed in a kind of assembly language, Babbage hired Ada Lovelace to write the first programs. However, he lacked both the money and the technology to get his brainchild working.

A little later, in America, Atanasoff built a remarkably powerful machine that used binary arithmetic and had a memory of periodically refreshed capacitors, the same principle that dynamic RAM uses to this day. Like Babbage, Atanasoff could not get his creation fully working. Finally, in 1944, Aiken completed the first general-purpose computer, the Mark I, which could store 72 words of 23 decimal digits each. By the time the Mark II was designed, relay computers were already obsolete, replaced by electronic ones.

Pascal's machine

The world's first computer

The Second World War stimulated the development of computers, which led to the first generation (1945-1955). The first vacuum-tube computer was COLOSSUS, built at Bletchley Park, where Turing worked, to break German ciphers. Because it arrived late in the war and was kept secret, it had no influence on the wider world of computers, but it was nevertheless the first.

Then, for the US Army, the scientist Mauchly began development of ENIAC. The machine weighed about thirty tons, contained 18,000 vacuum tubes and 1,500 relays, was programmed with 6,000 switches, and consumed an enormous amount of power. Setting up the software and hardware of such a monster was extremely difficult.

The ENIAC machine

Like COLOSSUS, ENIAC was not ready by its deadline and was no longer needed by the army. However, Mauchly was allowed to organize a school and, building on the ENIAC work, to pass the knowledge on, which gave rise to many different computers (EDSAC, ILLIAC, WEIZAC, EDVAC, etc.).

Among this whole range of machines the IAS, or von Neumann computer, stood out; it still influences computer design to this day. It consisted of a memory, a control unit and an input/output module, and could store 4096 words of 40 bits each.

Although the IAS did not become a market leader, it had a powerful influence on the development of computers. For example, the Whirlwind I, a computer for serious scientific calculations, was created in its wake. Ultimately all this research led to IBM, then a small manufacturer of punched-card equipment, releasing the 701 computer in 1953 and beginning to displace Mauchly and his UNIVAC from market leadership.

Transistors and the first computer game

Bell Labs researchers received the 1956 Nobel Prize for the invention of the transistor, which quickly changed all of computer technology and gave rise to the second generation (1955-1965) of computers. The first transistorized computer was the TX-0, followed by the TX-2. The machine itself did not carry much weight, but one of its creators, Olsen, founded DEC, which launched the PDP-1 computer in 1961.

Although it was seriously inferior to IBM's models in raw specifications, it was much cheaper: the PDP-1 hardware and software cost $120,000, not millions like the IBM 7090.

The PDP-1 was a commercially successful product and is considered to have laid the foundation of the minicomputer industry. It was also used to create the first computer game, Spacewar. Later DEC released the PDP-8 with a breakthrough feature, the single Omnibus data bus. In 1964, CDC and its engineer Cray released the 6600, a machine an order of magnitude faster than any other thanks to parallel computing inside the CPU.

The PDP-1 machine

IBM First Steps

The invention of the silicon integrated circuit, which made it possible to place dozens of transistors on a single chip, ushered in the third generation (1965-1980) of computers. They were smaller and faster. IBM deserves special mention here: it was the first to address the compatibility of different computers and began producing a whole series, the 360. The hardware and software of the 360 models differed in their parameters, but the machines were supplied with the same set of commands, so they were compatible with one another. The 360 machines could also emulate other computers, which was a big breakthrough, since it allowed programs written for other machines to be run. Meanwhile, DEC remained the leader in the small computer market.

IBM 360 Model 65

The era of PC creation

The fourth generation (1980 to the present day) is the era of VLSI, very large scale integration. Integrated circuit technology took a sharp leap, and processes appeared that made it possible to place not tens but thousands of transistors on a silicon chip. The time of the personal computer had come.

The first CP/M operating systems appeared, Apple entered the market, and Intel created the ancestor of the Pentium line, the 386 processor.

Here IBM again made a market breakthrough by starting to build personal computers from components made by different companies instead of producing everything itself. This is how the IBM PC, the best-selling computer in history, was born.

The IBM PC's new approach launched the era of the personal computer, but at the same time it hurt the computer industry as a whole. Intel, for example, became the undisputed leader in CPU production, and no one could compete with it; only narrowly specialized companies managed to survive. The Apple Lisa appeared, the first computer with a graphical operating system. Compaq created the first portable computers, carved out a niche market and eventually bought out DEC, a former leader of the industry.

If Intel dealt IBM the first blow, the second came from the small company Microsoft, which produced operating systems for IBM. The first was MS-DOS; later Microsoft developed the OS/2 system for IBM while quietly building Windows alongside it. OS/2 failed in the market.

Thus Intel and Microsoft dethroned IBM. The latter tried to survive and produced another revolutionary idea, creating a processor with two cores. PC hardware and software kept improving through various optimizations.

Fifth generation

But development does not stand still. A paradigm shift took place, and the prerequisites for the fifth generation of computers emerged. It all started with the Japanese government, which in the 1980s allocated huge funds to national companies and instructed them to invent the next generation of computers. The idea, of course, failed.

But the impact of this effort was great. Japanese technology began to spread around the world and took a leading position in many segments of the market: cameras, audio equipment and so on. The West was not about to simply give up and joined the fight for the fifth generation as well.

Grid Systems released the first tablet computer, and Apple created the pocket-sized Newton. This is how PDAs, also known as personal digital assistants or handheld computers, appeared.

Then IBM specialists made another breakthrough and presented a new idea: they combined the increasingly popular mobile phone with the PDA that users already loved. Thus, in 1993, the first smartphone, called Simon, was born.

In part, the fifth generation can be seen as the shrinking of hardware and software in size, and in the fact that miniature computers are now built into all kinds of equipment, from smartphones and electric kettles to cars and railways, extending their functionality. Also worth noting are covert devices with hardware and software protection: inconspicuous machines designed to perform their own specific functions.

The IBM Simon smartphone

Computer types

Hardware and software are not limited to personal computers. Today there are many types of computer:

  • disposable computers: greeting cards, RFID;
  • microcontrollers: watches, toys, medical equipment and other appliances;
  • mobile phones and laptops;
  • personal computers;
  • servers;
  • clusters (several servers combined into one);
  • mainframes - computers for batch processing of large amounts of data;
  • "cloud technology" - second-tier mainframes;
  • supercomputers (although this class is being replaced by clusters that can also perform serious calculations).

Given this information, hardware and software can be customized to suit a variety of needs.

Computer families

The hardware and software of a personal computer (and not only of PCs) differ by family. The most popular families are x86, ARM and AVR. A family is defined by its instruction set architecture. The first family, x86, includes almost all personal computers and servers (running Windows, Linux and even Mac).

The second, ARM, covers mobile systems. Finally, the third, AVR, includes many of the microcontrollers, those inconspicuous computers that are embedded everywhere: in cars, electrical appliances, TVs and so on.
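
As a small compile-time illustration (ours, not the article's), GCC-style compilers define macros that identify the target family; the exact macro names depend on the toolchain, so this sketch is only an assumption about a typical setup.

    // Detecting the target instruction set family with common predefined compiler macros.
    #include <cstdio>

    const char* isa_family() {
    #if defined(__x86_64__) || defined(__i386__)
        return "x86";
    #elif defined(__aarch64__) || defined(__arm__)
        return "ARM";
    #elif defined(__AVR__)
        return "AVR";
    #else
        return "another architecture";
    #endif
    }

    int main() {
        std::printf("This binary targets the %s family\n", isa_family());
    }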

x86 is developed by Intel. Its processors, from the 8080 (1974) to the Pentium 4 (2000), are backward compatible, meaning that a new processor can run programs written for an older one.

This hardware and software heritage, carried through generations of processors, is what made Intel so successful.

Acorn Computer was at the origin of the ARM project, which later spun off and became independent. The ARM architecture has long been a success in the low power market.

Atmel hired two students who had an interesting idea. They continued their development and created the AVR processor, notable for being well suited to systems that do not require high performance. AVR processors fit into the toughest environments, with tight limits on size and power consumption.
