Monday 9 November 2015

COMPUTER HISTORY
Since time immemorial, data processing has been performed by humans. People have also devised mechanical and electronic equipment to help with calculation and data processing and to obtain results faster. The computers we see today are the product of a long evolution of human inventions, from ancient times onward, in the form of mechanical and electronic devices.
    Nowadays computers and their supporting tools are part of every aspect of life and work. Computers are now capable of far more than ordinary mathematical calculations. Examples include systems that read the barcodes on supermarket goods, telephone exchanges that handle millions of calls, and the computer networks and Internet that connect the various parts of the world.

By period, computer history divides into:
Traditional Calculating Tools and Mechanical Calculators
First Generation Computers
Second Generation Computers
Third Generation Computers
Fourth Generation Computers
Fifth Generation Computers

TRADITIONAL CALCULATING TOOLS AND MECHANICAL CALCULATORS

The abacus, which emerged about 5,000 years ago in Asia Minor and is still used in some places today, can be regarded as the first computing machine.
It allows users to perform calculations by sliding beads arranged on a rack. Traders in the past used the abacus to calculate trade transactions. With the spread of pencil and paper, particularly in Europe, the abacus lost its popularity.


In 1642, Blaise Pascal (1623-1662) invented a mechanical calculator that came to be called the Pascaline. The rectangular brass box used eight toothed wheels to add numbers of up to eight digits. The device was a base-ten calculator, and its weakness was that it was limited to addition.
    In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved on the Pascaline by creating a machine that could also multiply. Like its predecessor, this mechanical device worked with toothed wheels. By studying the notes and drawings made by Pascal, Leibniz was able to refine the device.
    Then, in 1820, mechanical calculators became popular. Charles Xavier Thomas de Colmar invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, offered a more practical approach to calculation because it could add, subtract, multiply, and divide. Thanks to this capability, the arithmometer was widely used up until World War I. Together with Pascal and Leibniz, Colmar helped usher in the era of mechanical computing.
    The computer as we actually know it began to take shape with a British mathematics professor, Charles Babbage (1791-1871). In 1812, Babbage noticed a natural fit between mechanical machines and mathematics: mechanical machines excel at performing the same task repeatedly without mistakes, while mathematics often requires the simple repetition of certain steps. The problem then became one of building a mechanical machine to serve the needs of mechanics. Babbage's first attempt to address this problem came in 1822, when he proposed a machine for calculating differential equations. The machine, called the Difference Engine, was powered by steam, could store a program, and could perform calculations and print the results automatically.
    After working with the Difference Engine for ten years, Babbage was inspired to begin building the first general-purpose computer, called the Analytical Engine. Babbage's assistant, Augusta Ada King (1815-1852), played an important role in the machine's construction. She helped revise the plans, sought funding from the British government, and communicated the specifications of the Analytical Engine to the public. In addition, Augusta's thorough understanding of the machine allowed her to write instructions to be fed into it, which also makes her the first female programmer. In 1980, the US Defense Department named the programming language Ada in her honor.
    Babbage's steam-powered engine, although never completed, seems very primitive by today's standards. Nevertheless, it embodied the basic elements of a modern computer and revealed an important concept. Consisting of roughly 50,000 components, the basic design of the Analytical Engine used perforated (punched) cards containing the machine's operating instructions.
    In 1889, Herman Hollerith (1860-1929) also applied the principle of punched cards to calculation. His first task was to find a faster way to tabulate data for the US Census Bureau. The previous census, conducted in 1880, had taken seven years to tabulate. With a growing population, the Bureau estimated it would take ten years to complete the next census count.


Hollerith used punched cards to enter census data, which were then processed mechanically by his device. A single card could store up to 80 variables. Using these machines, the census count was completed in six weeks. Besides the gain in speed, the cards served as a data storage medium, and the rate of calculation errors dropped drastically. Hollerith went on to develop the machines and sell them commercially. He founded the Tabulating Machine Company in 1896, which after a series of mergers became International Business Machines (IBM) in 1924. Other companies, such as Remington Rand and Burroughs, also produced punched-card readers for business. Punched cards were used by business and government for data processing until 1960.
    In the following period, several engineers made further advances. Vannevar Bush (1890-1974) built a calculator for solving differential equations in 1931. The machine could solve complex differential equations that academics had long considered intractable. It was very large and heavy, since hundreds of gears and shafts were needed to perform the calculations. Later, John V. Atanasoff and Clifford Berry set out to build a computer that applied Boolean algebra to electrical circuits. Their approach was based on the work of George Boole (1815-1864), whose binary algebra held that any mathematical statement can be expressed as true or false. By mapping true and false onto electrical circuits as connected and disconnected, Atanasoff and Berry built the first electronic computer in 1940. The project was halted, however, when they lost their source of funding.
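Boole's idea that arithmetic reduces to true/false operations can be illustrated with a small sketch (purely illustrative; this is not Atanasoff and Berry's actual circuit design). A half-adder built from AND and XOR adds two one-bit values, and chaining such adders performs ordinary addition using nothing but Boolean logic:

```python
# A half-adder built purely from Boolean operations: it adds two
# one-bit numbers, producing a sum bit and a carry bit.
def half_adder(a: bool, b: bool) -> tuple:
    sum_bit = a != b          # XOR: true when exactly one input is true
    carry = a and b           # AND: true when both inputs are true
    return sum_bit, carry

# A full adder chains two half-adders to also accept a carry-in bit.
def full_adder(a: bool, b: bool, carry_in: bool) -> tuple:
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 or c2

def add_bits(x: list, y: list) -> list:
    """Add two little-endian bit lists of equal length."""
    result, carry = [], False
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 3 ([True, True]) + 1 ([True, False]) -> 4 ([False, False, True]),
# computed entirely from true/false values.
```

The same logic, realized as connected/disconnected electrical circuits rather than Python booleans, is what made electronic computation possible.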


FIRST GENERATION COMPUTERS
With the onset of the Second World War, the countries involved sought to develop computers and exploit their strategic potential. This increased funding for computer development and accelerated progress in computer engineering. In 1941, Konrad Zuse, a German engineer, built a computer, the Z3, to design airplanes and missiles.
     The Allies also made advances of their own in computer development. In 1943, the British completed a secret code-breaking computer called Colossus to decrypt secret German messages. Colossus had little influence on the development of the computer industry, for two reasons. First, Colossus was not a general-purpose (versatile) computer; it was designed only to break coded messages. Second, the machine's existence was kept secret until decades after the war ended.


The American effort of that time produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an electronic calculator for the US Navy. The calculator was half the length of a football field and contained some 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, was an electromechanical relay computer: it used electromagnetic signals to move mechanical parts. The machine operated slowly (each calculation took 3-5 seconds) and was inflexible (the sequence of calculations could not be changed), but it could perform basic arithmetic as well as more complex equations.


Another computer development of the period was the Electronic Numerical Integrator and Computer (ENIAC), created through a collaboration between the US government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, the computer was an enormous machine that consumed 160 kW of power. Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a general-purpose computer that worked 1,000 times faster than the Mark I.


In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team and contributed a computer design concept that would still be in use in computer engineering 40 years later.
In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory that holds both programs and data. This technique allows a computer to stop at some point and later resume its work. The key element of the von Neumann architecture is the central processing unit (CPU), which allows all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), built by Remington Rand, became the first commercial computer to use the von Neumann architecture. Both the US Census Bureau and General Electric owned UNIVACs. One of UNIVAC's most impressive feats was its successful prediction of Dwight D. Eisenhower's victory in the 1952 presidential election.
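The stored-program idea can be sketched in a few lines (a hypothetical instruction set for illustration, not EDVAC's actual one): instructions and data occupy a single memory, and one CPU loop fetches, decodes, and executes each instruction in turn.

```python
# Minimal stored-program machine: one memory array holds both the
# program and its data; a single CPU loop coordinates everything.
def run(memory: list) -> list:
    pc = 0  # program counter: index of the next instruction in memory
    while True:
        op, a, b, dest = memory[pc]      # fetch and decode
        if op == "HALT":
            return memory
        if op == "ADD":                  # memory[dest] = memory[a] + memory[b]
            memory[dest] = memory[a] + memory[b]
        elif op == "SUB":
            memory[dest] = memory[a] - memory[b]
        pc += 1                          # advance to the next instruction

# The program occupies cells 0-2; its data lives in cells 3-5
# of the very same memory.
memory = [
    ("ADD", 3, 4, 5),    # cell 5 = cell 3 + cell 4
    ("SUB", 5, 3, 5),    # cell 5 = cell 5 - cell 3
    ("HALT", 0, 0, 0),
    10, 32, 0,           # data
]
```

Because the program is itself data in memory, the machine can be given a new task simply by loading new instructions, rather than being rewired, which is the essence of the von Neumann model.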


First-generation computers were characterized by operating instructions written specifically for a single task. Each computer had its own binary-coded program, called a "machine language". This made computers difficult to program and limited their speed. Other hallmarks of first-generation computers were the use of vacuum tubes (which made the machines enormous) and magnetic drums for data storage.


SECOND GENERATION COMPUTERS
    In 1948, the invention of the transistor greatly influenced the development of computers. The transistor replaced the vacuum tube in televisions, radios, and computers, and as a result the size of electrical machinery shrank drastically. Transistors came into use in computers from 1956. Another development, magnetic-core memory, helped make second-generation computers smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to exploit this new technology were supercomputers. IBM built a supercomputer named Stretch, and Sperry-Rand built one named LARC. These computers, developed for atomic-energy laboratories, could handle large amounts of data, a capability much in demand among atomic scientists. The machines were extremely expensive, however, and tended to be too complex for business computing needs, which limited their popularity. Only two LARCs were ever installed and used: one at the Lawrence Radiation Labs in Livermore, California, and the other at the US Navy Research and Development Center in Washington, D.C. Second-generation computers replaced machine language with assembly language, which uses abbreviations in place of binary code.
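The step from binary machine code to assembly abbreviations can be sketched as a toy translation table (the mnemonics and opcodes here are invented for illustration, not those of any real machine):

```python
# Toy assembler: each mnemonic abbreviates a fixed binary opcode,
# sparing the programmer from writing raw bit patterns by hand.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011", "HALT": "1111"}

def assemble(lines: list) -> list:
    """Translate 'MNEMONIC operand' lines into 8-bit machine words."""
    words = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        # 4-bit opcode followed by a 4-bit operand address
        words.append(OPCODES[mnemonic] + format(operand, "04b"))
    return words

program = ["LOAD 5", "ADD 6", "STORE 7", "HALT"]
# assemble(program) -> ['00010101', '00100110', '00110111', '11110000']
```

Writing `ADD 6` instead of `00100110` is exactly the convenience assembly language introduced; higher-level languages such as COBOL and FORTRAN later pushed the same idea much further.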
In the early 1960s, second-generation computers that were successful in business, universities, and government began to appear. These machines were fully transistorized. They also contained the components we associate with computers today: printers, tape storage, disks, memory, operating systems, and stored programs. One important example was the IBM 1401, which was widely adopted across industry. By 1965, almost all large businesses were using second-generation computers to process financial information.

    The stored program and the programming language it was written in gave computers flexibility, and that flexibility boosted performance at a price reasonable for business use. Under this concept, a computer could print customer invoices one minute and design products or calculate payrolls the next. Several programming languages began to appear at the time; Common Business-Oriented Language (COBOL) and Formula Translator (FORTRAN) came into common use. These languages replaced intricate machine code with words, sentences, and mathematical formulas far easier for humans to understand, making it practical for a person to program a computer. New types of careers emerged (programmer, analyst, and computer systems expert), and the software industry also began to appear and grow during this second generation of computers.


THIRD GENERATION COMPUTERS
     Although transistors surpassed the vacuum tube in many respects, they generated considerable heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc made from quartz sand. Scientists later managed to fit more components onto a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation development was the use of the operating system, which allowed machines to run many different programs at once, with a central program monitoring and coordinating the computer's memory.

FOURTH GENERATION COMPUTERS
     After the IC, the direction of development became clear: shrink the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components onto one chip. By the 1980s, Very Large Scale Integration (VLSI) packed thousands of components onto a single chip, and Ultra-Large Scale Integration (ULSI) raised that number into the millions. The ability to fit so many components onto a chip half the size of a dime drove down the size and price of computers while increasing their power, efficiency, and reliability.
The Intel 4004 chip, made in 1971, took the IC a step further by placing all of a computer's components (central processing unit, memory, and input/output control) on a single, very small chip. Whereas ICs had previously been built to perform one specific task, a microprocessor could now be manufactured and then programmed to meet any requirement. Before long, everyday household items such as microwave ovens, televisions, and automobiles with electronic fuel injection were equipped with microprocessors.


These developments made it possible for ordinary people to use computers, which were no longer the preserve of large corporations or government agencies. By the mid-1970s, computer assemblers were offering their machines to the general public. These computers, called microcomputers, were sold with software packages easy enough for lay users; the most popular software of the time was word processors and spreadsheets. In the early 1980s, video games such as the Atari 2600 sparked consumer interest in more sophisticated, programmable home computers.


In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million in 1982; ten years later, 65 million PCs were in use. Computers continued their trend toward smaller sizes, from the desktop computer to the laptop, which fits in a bag, and even the palmtop, which fits in the hand.
    The IBM PC competed with the Apple Macintosh for the computer market. The Macintosh became famous for popularizing graphical computing at a time when its rival was still text-based, and it also popularized the use of the mouse.
    Today we trace the IBM-compatible line through its CPUs: the IBM PC/486, Pentium, Pentium II, Pentium III, and Pentium 4 (a series of CPUs made by Intel), along with the AMD K6, Athlon, and others. All of these belong to the class of fourth-generation computers. As computers proliferated in the workplace, new ways of exploiting their potential were developed. As small computers grew more powerful, they could be linked together in networks to share memory, software, and information, and to communicate with one another. Networked computers can cooperate electronically to complete a processing task. Using direct cabling (in a local area network, LAN) or telephone lines, such networks can grow very large.




FIFTH GENERATION COMPUTERS
    Defining the fifth generation of computers is quite difficult, because this stage is still very young. The most imaginative example of a fifth-generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL displays all of the functions desired of a fifth-generation computer: with artificial intelligence, it can reason well enough to hold conversations with its human operators, use visual input, and learn from its own experience.
    Although realizing HAL 9000 may still be far from reality, many of its functions already exist. Some computers can accept spoken instructions and imitate human reasoning. Translating between languages has also become possible. Such a facility sounds simple, yet it turned out to be far more complicated than expected once programmers realized that human understanding depends heavily on context and meaning, not merely on translating words directly.
    Many advances in computer design and technology are making fifth-generation computers increasingly feasible. Two of the main engineering advances are parallel processing, which would replace the von Neumann model with a system able to coordinate many CPUs working in unison, and superconducting technology, which allows electricity to flow without resistance and would thereby accelerate the flow of information.
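The coordination pattern behind parallel processing, splitting work across many processors and combining their results, can be sketched with Python's standard thread pool (a simplification: CPython threads illustrate the divide-and-combine coordination rather than delivering true multi-CPU speedup):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker sums its own slice of the data independently.
    return sum(chunk)

def parallel_sum(numbers, workers=4):
    """Divide the data into chunks, hand each chunk to a worker in the
    pool, then combine the partial results into a single answer."""
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

# parallel_sum(list(range(1000))) -> 499500, the same answer a single
# processor would compute, arrived at by coordinated workers.
```

The final `sum` over the partial results is the coordination step: many workers compute in unison, and one program gathers their answers, the role the von Neumann model assigns to a single CPU.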
    Japan is the country best known for popularizing the fifth-generation concept, through its fifth generation computer project; the ICOT institute (Institute for New Generation Computer Technology) was formed to realize it. Many reports state that the project failed, while others suggest that its success would bring a new paradigm shift to the world of computing. More definitive information is still awaited.

Source:
Sudirman, Ivan, Computer History, IlmuKomputer.com






