The element base of a computer is its main electronic component, and it varies from one generation of computers to the next. The generations of the element base describe the history of computer development in terms of evolving technologies. With each new generation of circuitry, computers have become smaller, faster at processing information, larger in memory, and more convenient and reliable. The time frame assigned to each generation helps in understanding what the element base of a computer is, but it is not strictly defined and should be treated as conditional: the generations are actually based on evolving chip technology, not on any particular dates.

The first generation of computers
Five generations of computers can be characterized by electric current flowing:
- in vacuum tubes;
- in transistors;
- in integrated circuits;
- in microprocessor chips;
- in smart devices capable of artificial intelligence.
The first generation of computers appeared in the 1940s-1950s. These were the first true general-purpose digital computers. They replaced electromechanical systems that were too slow for the tasks assigned to them. The first computers used vacuum tubes for switching: the sealed glass envelope allowed current to flow through the vacuum from the heated filament to the metal plate, with no wire between them.
How the first computers worked
The tubes that made up the computer's element base were hermetically sealed glass containers about the size of a light bulb. There were no moving parts in the system. The element base of the first generation consisted of tubes known as diodes and triodes. Input and output were carried out using punched cards, magnetic drums, typewriters and punched-card readers. The system interface was built from plugboards and machine language.

The element base of the first generation of computers was difficult to use. Technicians connected electrical circuits by plugging numerous cables into sockets. They then used special punched cards and waited several hours for the result of a calculation. The first computers were so large that they occupied entire rooms. Assembly language and operating system software were not yet available, and the systems could solve only one problem at a time. These machines were designed for low-level operations, and programming was done using only the binary digits 0 and 1.
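To make the idea of programming in raw binary concrete, here is a minimal Python sketch of a toy instruction word; the 4-bit opcode and operand layout and the mnemonics are invented for illustration and do not correspond to any real first-generation machine.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def encode(mnemonic, operand):
    # Pack a 4-bit opcode and a 4-bit operand into one 8-bit binary word.
    word = (OPCODES[mnemonic] << 4) | (operand & 0b1111)
    return format(word, "08b")

# A programmer of that era effectively wrote the right-hand column by hand:
for instr in [("LOAD", 5), ("ADD", 3), ("STORE", 7)]:
    print(instr, "->", encode(*instr))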
ENIAC - the most powerful of the early computers
One of the most prominent computers of this era was ENIAC (Electronic Numerical Integrator and Computer), designed and built by John Mauchly and John Presper Eckert of the University of Pennsylvania. Its assembly was carried out by a team of fifty people. ENIAC was 1,000 times faster than the electromechanical computers that preceded it, but reprogramming it was a much slower process.
Among other things, ENIAC was used to study the capabilities of thermonuclear weapons, to compute ballistic artillery firing tables, and occasionally for weather forecasting. These systems were huge, occupied entire rooms, and consumed so much electricity that they generated unbearable heat.

Universal Automatic Computer
UNIVAC (UNIVersal Automatic Computer) was created by the same engineers, John Mauchly and John Presper Eckert. It was the first computer of that era developed for commercial rather than military use. Thanks to its element base, it handled both alphabetic and numeric data well and was used by the US Census Bureau to tabulate the total population.
Later it was used to report company sales and even to predict the results of the 1952 presidential election. In contrast to ENIAC's more than 17,000 vacuum tubes, UNIVAC I used just over 5,000. It was also about half the size of its predecessor. More than 46 of these computers were sold.
Second generation computers: 1950s-1960s
Second-generation computers used transistors instead of vacuum tubes; transistors were the element base of the second generation. The new computers surpassed their predecessors largely because of their relatively small size, higher speed, and lower cost. Transistors are the building blocks of almost any microchip, and they are more reliable, more energy efficient, and able to switch electrical signals faster and better than vacuum tubes.
Like tubes, the transistors at the heart of second-generation hardware acted as switches or electronic gates, used to amplify or control current and to turn electrical signals on or off. Transistors are called semiconductor devices because they are made of materials whose properties sit between those of conductors and insulators.
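As a rough illustration of the switch idea, the following Python sketch treats a transistor as a simple on/off switch and composes such switches into logic gates; it is a conceptual analogy, not an electrical model.

def switch(control):
    # A transistor used as a switch: the control signal lets current through or not.
    return bool(control)

def and_gate(a, b):
    # Two switches in series: current flows only if both are on.
    return switch(a) and switch(b)

def or_gate(a, b):
    # Two switches in parallel: current flows if either is on.
    return switch(a) or switch(b)

for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", and_gate(a, b), "OR:", or_gate(a, b))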

Invention of the semiconductor transistor
Semiconductor transistors were invented at Bell Laboratories in 1947 by scientists William Shockley, John Bardeen, and Walter Brattain, but did not come into widespread use until the mid-1950s. Engineers and creators of the new element base saw the future of second-generation computers in improving data input and output procedures.
Initially, these procedures were similar to those of the last first-generation computers. The work was laborious and tedious, since it required several employees to carry punched cards from room to room.
Batch processing system
To speed up the process, a batch system was created and put into use. Multiple jobs were collected on punched cards and transferred to magnetic tape using a relatively small and inexpensive computer such as the IBM 1401; the tapes were then processed on the larger IBM 7094 running the Fortran Monitor System.
When processing was complete, the results were transferred back to tape, and a smaller system such as the IBM 1401 printed the output or punched it onto cards. These arrangements were the forerunners of operating system software.
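The following Python sketch gives a rough picture of that batch workflow, with jobs gathered from cards onto an input tape, processed in a single run, and written to an output tape; the job names and format are invented for illustration.

card_deck = ["JOB payroll", "JOB inventory", "JOB census"]  # punched cards with jobs

input_tape = list(card_deck)   # the small computer copies the cards onto tape

output_tape = []               # the main computer processes the whole batch in one run
for job in input_tape:
    output_tape.append(job + " -> DONE")

for line in output_tape:       # the small computer prints or punches the results
    print(line)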
Specifications of second generation computers
Then began the move away from restrictive binary machine code toward languages that fully supported symbolic and alphanumeric notation. Programmers could now write in assembly language and in high-level languages such as FORTRAN, COBOL, SNOBOL and BASIC.

Early supercomputers were among the machines that used transistors. Examples of these systems were Sperry Rand's UNIVAC LARC (1960), the IBM 7030 Stretch supercomputer (1961) and the CDC 6600 mainframe (1963).
Third generation of computers: 1960s-1970s
The element base of the third generation of computers is the integrated circuit, together with multiprogramming. Third-generation computers used integrated circuit (IC) chips instead of discrete transistors. The development of these computers also followed Moore's Law, which observed that transistors were shrinking so quickly that the number fitting on a chip doubled roughly every two years.
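As a back-of-the-envelope illustration of that doubling rule, the short Python sketch below computes how a transistor count grows when it doubles every two years; the starting count and time spans are arbitrary example values, not historical data.

def transistors(n0, years, doubling_period=2.0):
    # Doubling every two years: N(t) = N0 * 2 ** (t / doubling_period).
    return n0 * 2 ** (years / doubling_period)

n0 = 1000  # hypothetical starting transistor count, not a historical figure
for years in (2, 10, 20):
    print(years, "years:", round(transistors(n0, years)))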
Advantages of integrated circuits
A semiconductor IC packed a huge number of transistors, capacitors and diodes onto a single chip; previously these had been mounted on the board as separate parts. Manually wiring transistors, capacitors and diodes together was laborious and not entirely reliable. Jack Kilby of Texas Instruments and Robert Noyce of Fairchild independently demonstrated the advantages of integrated circuits in 1958 and 1959, respectively. Kilby built his IC on germanium, while Noyce built his on silicon.
Among the first systems built around ICs was the IBM 360, used for both commercial and scientific workloads. Placing many transistors on a single chip not only lowered the cost but also significantly increased the speed and performance of each computer. Since the invention of the IC, its speed has doubled roughly every two years, further reducing the size and cost of computers.
The use of integrated circuits in modern computers
Today, almost all electronic devices use some form of integrated circuit mounted on a printed circuit board. Along with the IC itself, interaction with computers has also improved. Instead of punched cards for input and output, information is shown on visual displays, and keyboards and other improved input peripherals are used.
Computers now use operating system software to manage hardware and resources, allowing them to run several applications at the same time thanks to a central program that controls memory allocation. Computers became accessible to a wide audience because of their smaller size and reasonable price.
This generation also introduced the concept of the “computer family”, which prompted manufacturers to design components compatible with other systems in the line. Examples of these systems were the Scientific Data Systems Sigma 7 (1966), the IBM 360 (1964) and the CDC 8600 (1969).
The fourth generation of computers: from the 1970s to the present
The microprocessor, the operating system and the graphical interface form the element base of modern computers. The birth of the microprocessor was also the birth of the microcomputer. It, too, followed Moore's Law, which in 1965 predicted the exponential growth of transistor counts on microchips. Intel and its engineers Ted Hoff, Federico Faggin, and Stan Mazor introduced the world's first single-chip microprocessor, the Intel 4004, in November 1971.
What filled an entire room in the first generation could now fit in the palm of your hand. Remarkably, the new microchip was as powerful as the ENIAC computer of 1946. The fourth generation and its element base play an important role in the creation of a wide variety of devices.
Intel 4004 processor
Soon, manufacturers began integrating these microchips into their new computers. In 1973, Xerox PARC released the Alto, a true personal computer that included an Ethernet port, a mouse, and a bit-mapped GUI, the first of its kind. In 1974, Intel introduced an 8-bit general-purpose microprocessor, the 8080. Programmer Gary Arlen Kildall then set about creating the disk-based operating system known as CP/M (Control Program for Microcomputers). It became a prototype for the software element base of the modern PC.
First home personal computer
In 1981, International Business Machines (IBM) introduced its first home computer, known as the IBM PC, which ran on the Intel 8088 processor. The company partnered with Bill Gates, who bought a disk operating system from Seattle Computer Products and supplied it with the new IBM computer. The IBM PC architecture became the standard market model.

Creating the Windows operating system
Apple, led by Steve Jobs, changed the game in 1984 when it released the Apple Macintosh with an improved graphical user interface (GUI), using interface ideas that originated at Xerox PARC. Both CP/M and the disk operating system (DOS) were command-line systems in which the user had to interact with the computer through the keyboard.
Following the success of Apple's GUI, Microsoft shipped a shell version of Windows on top of DOS in 1985. Windows was used in this form for the next ten years, until it was reworked as Windows 95, a true operating system with all the necessary utilities.
The advent of Linux
While software was becoming commonplace and corporations began charging money for it, a new programming movement emerged. In 1991, led by Linus Torvalds, it launched a free and open-source operating system project called Linux. Alongside Linux, other open-source operating systems and free software spread to office, network, and home computers.

Mobile proliferation
From the 1980s to the 2000s, personal and desktop computers became commonplace. They were installed in offices, schools and homes; their cost became affordable and their size compact. The software running on these computers also became more accessible. Microprocessors soon escaped the desktop computer's monopoly and moved to other platforms.
First came laptops, then tablets, smartphones, consoles, embedded systems and smart cards, which became popular because of the need to use the Internet on the go. According to recent studies, mobile phones account for 60% of all digital devices worldwide.
The fifth generation of computers: present and future
Fifth-generation computers are built on the technological progress of the previous generations. Today's engineers hope to improve the interaction between humans and machines by harnessing human intelligence and the big data accumulated since the dawn of the digital age. Their work rests on the concepts and implementation of artificial intelligence (AI) and machine learning (ML).
AI is the core of fifth-generation computers, made possible by parallel processing and superconductors. Artificial-intelligence computing devices are still in development, but some of these technologies, such as voice recognition, are already emerging and in use. AI and ML are not the same thing, but the terms are often used interchangeably to describe devices and programs that are intelligent enough to interact with humans, other computers, the environment, and other programs.
The essence of the fifth generation will be to use these technologies to eventually create machines that can process and respond to natural language and that can learn and organize themselves.
The proliferation of computing devices able to learn, respond and interact in various ways based on experience and environment has also given impetus to the concept of the IoT (Internet of Things). At their peak, and with the right algorithms, computers are likely to reach high levels of learning, surpassing human intelligence. Many AI projects are already being implemented, while others are still in development.
Pioneers in this area include Google, Amazon, Microsoft, Apple, Facebook and Tesla. Initial implementations have appeared in smart home devices designed to automate and integrate household activities, in audio and video devices, and in self-driving cars.