Timeline Of Computer History
Computers are among the most widely used devices in the modern world. They are a convenient and efficient means of communication, education, and entertainment. The development of computer technology was a gradual process that dates back more than 2,000 years.
The Babylonians are often credited with inventing the first “computer” in antiquity: the abacus, an instrument used extensively across ancient cultures for fast arithmetic. Astronomical calculation eventually led the ancient Greeks and medieval Muslim astronomers to build analog astronomical computers. Al-Jazari’s astronomical castle clock of 1206 is sometimes described as the first programmable analog computer.
1600 to 1900
For many centuries there were no significant developments in the world of computing. In 1623, Wilhelm Schickard invented a mechanical calculator, and in 1642 Blaise Pascal built a gear-driven calculating machine of his own. Pascal’s invention was not a commercial success: the gears tended to jam, and the calculator was very costly.
In 1820, Charles Xavier Thomas created the first mass-produced calculator, the arithmometer, based on earlier work by Gottfried Wilhelm Leibniz. It could add, subtract, multiply, and divide. Around the same time, Charles Babbage was developing the “difference engine”, a massive steam-powered mechanical calculator designed to compute and print astronomical tables. After the British government cancelled the project, Babbage turned to another design, the “analytical engine”, a general-purpose mechanical computer intended to solve any mathematical problem.
In 1890, Herman Hollerith developed the “tabulating machine”, a device that automatically read census information punched into cards. It was so successful that Hollerith founded his own firm, the Tabulating Machine Company, to market the device; through later mergers that firm became IBM.
1900 to 1945
IBM’s tabulating machines were so advanced by the 1920s that newspapers such as the New York World described them as “super computers”. In 1941, Konrad Zuse completed the Z3, the first working programmable, fully automatic digital computer. Unlike earlier calculators, it used the binary system rather than the decimal system, a choice that proved pivotal in the design of later computers. At almost the same time, the American Howard Aiken and a team of IBM engineers constructed a large automatic digital computer called the Harvard Mark I.
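The binary idea mentioned above is simply positional notation in base 2: any number is a sum of powers of two, so a machine only needs two-state components (on/off) to represent it. A minimal sketch in Python (purely illustrative; the function name is our own) shows the repeated-division-by-2 method of finding those digits:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its base-2 digit string
    by repeatedly dividing by 2 and collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next lowest-order bit
        n //= 2
    return "".join(reversed(bits))

# Decimal 13 = 8 + 4 + 1, so its binary form is 1101.
print(to_binary(13))                  # → 1101
print(to_binary(13) == bin(13)[2:])   # matches Python's built-in → True
```

Because each binary digit has only two values, circuits (relays in the Z3, later transistors) can represent them far more reliably than ten-valued decimal elements.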
1945 to 1960
After the success of the Harvard Mark I, the US made its next major contribution to the computer world in 1946: the ENIAC, a large machine that occupied 167 square meters of floor space and calculated with ten-digit decimal numbers. It was followed by stored-program machines such as the EDVAC and, in 1951, the UNIVAC I, the first commercially available computer. These machines stored program instructions and data together in the same memory, which made programming and computing faster, more efficient, and more flexible.
A breakthrough came with the introduction of transistors, semiconductor devices that can act as switches and amplifiers, in the late 1940s, followed by integrated circuits in the 1950s. This proved to be the beginning of the computer revolution. Until this time, computers and calculating machines relied on vacuum tubes, which were highly inefficient, occupied a great deal of space, and generated so much heat that they required constant cooling and maintenance. The advent of transistors and integrated circuits, or chips, greatly reduced the space the machines occupied and vastly improved their speed.
Microcomputers are compact digital computers powered by a microprocessor. Until the invention of the microprocessor in 1971, computers were largely limited to the military, universities, and very large companies; the microprocessor made microcomputers and minicomputers possible, putting a computer within almost anyone’s reach.
The first commercially successful microcomputer was the Altair 8800, made by MITS, but it shipped with no software: owners had to write their own. This prompted two young programmers, William (Bill) Gates and Paul Allen, to offer a BASIC interpreter for the machine. Gates and Allen went on to found Microsoft in 1975 and began writing software, and later operating systems, for a variety of machines.
Following the introduction of the Altair, other microcomputers were developed, including the Apple computers of Steve Jobs and Steve Wozniak, the TRS-80 from Tandy/Radio Shack, and the IBM PC, IBM’s entry into the home market. Eventually it was Apple and IBM that raced against each other to provide the home market with ever more attractive and user-friendly personal computers.
Computers have come a long way from the simple abacus of ancient times and the large calculating machines of the World War II era. They have also become more efficient and easier to use than their predecessors, enabling ordinary people, and even young children, to use computers for all sorts of purposes. Today there are many different types of computers on the market, including mainframes, desktops, laptops, tablet PCs, media centers, and personal digital assistants.
The future of computers seems to lie in cheap laptops, small netbooks, and handheld smartphones. Many market analysts predict that personal desktop computers will soon be as obsolete as the room-sized machines first built in the 1940s.
Copyright © 2004-2015 Web Exordium, LLC. All Rights Reserved.