
A computer is a programmable machine designed to carry out a sequence of arithmetic or logical operations.

The particular sequence of operations can be changed readily, allowing the computer to solve more than one kind of problem. Conventionally, a computer consists of some form of memory for data storage, at least one element that carries out arithmetic and logic operations, and a sequencing and control element that can change the order of operations based on the information that is stored. Peripheral devices allow information to be entered from an external source, and allow the results of operations to be sent out.

A computer's processing unit executes a series of instructions that make it read, manipulate and then store data. Conditional instructions change the sequence of instructions as a function of the current state of the machine or its environment.

The first electronic computers were developed in the mid-20th century (1940–1945). Originally, they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs).[1] Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space.[2] Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries.

Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from mp3 players to fighter aircraft and from toys to industrial robots are the most numerous.

Contents
1 History of computing
  1.1 Limited-function early computers
  1.2 First general-purpose computers
  1.3 Stored-program architecture
  1.4 Semiconductors and microprocessors
2 Programs
  2.1 Stored program architecture
  2.2 Bugs
  2.3 Machine code
  2.4 Higher-level languages and program design
3 Function
  3.1 Control unit
  3.2 Arithmetic/logic unit (ALU)
  3.3 Memory
  3.4 Input/output (I/O)
  3.5 Multitasking
  3.6 Multiprocessing
  3.7 Networking and the Internet
4 Misconceptions
  4.1 Required technology
  4.2 Computer architecture paradigms
  4.3 Limited-function computers
  4.4 Virtual computers
5 Further topics
  5.1 Artificial intelligence
  5.2 Hardware
  5.3 Software
  5.4 Programming languages
  5.5 Professions and organizations
6 See also
7 Notes
8 References
9 External links

History of computing
Main article: History of computing hardware

The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued to be used in that sense until the middle of the 20th century. From the end of the 19th century onwards, the word began to take on its more familiar meaning, describing a machine that carries out computations.[3]

Limited-function early computers

The Jacquard loom, on display at the Museum of Science and Industry in Manchester, England, was one of the first programmable devices.

The history of the modern computer begins with two separate technologies, automated calculation and programmability, but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. A few devices are worth mentioning, though, such as some mechanical aids to computing that were very successful and survived for centuries until the advent of the electronic calculator: the Sumerian abacus, designed around 2500 BC,[4] whose descendant won a speed competition against a modern desk calculating machine in Japan in 1946;[5] the slide rule, invented in the 1620s, which was carried on five Apollo space missions, including to the moon;[6] and arguably the astrolabe and the Antikythera mechanism, an ancient astronomical computer built by the Greeks around 80 BC.[7] The Greek mathematician Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered to be a means of deciding which parts of the mechanism performed which actions and when.[8] This is the essence of programmability.

Around the end of the tenth century, the French monk Gerbert d'Aurillac brought back from Spain the drawings of a machine invented by the Moors that answered Yes or No to the questions it was asked (binary arithmetic).[9] Again in the thirteenth century, the monks Albertus Magnus and Roger Bacon built talking androids without any further development (Albertus Magnus complained that he had wasted forty years of his life when Thomas Aquinas, terrified by his machine, destroyed it).[10]

In 1642, the Renaissance saw the invention of the mechanical calculator,[11] a device that could perform all four arithmetic operations without relying on human intelligence.[12] The mechanical calculator was at the root of the development of computers in two separate ways. Initially, it was in trying to develop more powerful and more flexible calculators[13] that the computer was first theorized by Charles Babbage[14][15] and then developed,[16] leading to the development of mainframe computers in the 1960s. In addition, the microprocessor, which started the personal computer revolution, and which is now at the heart of all computer systems regardless of size or purpose,[17] was invented serendipitously by Intel[18] during the development of an electronic calculator, a direct descendant of the mechanical calculator.[19]

First general-purpose computers

In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template, which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.
The Most Famous Image in the Early History of Computing:[20] this portrait of Jacquard was woven in silk on a Jacquard loom and required 24,000 punched cards to create (1839). It was only produced to order. Charles Babbage owned one of these portraits; it inspired him to use perforated cards in his analytical engine.[21]

It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, his analytical engine.[22] Limited finances and Babbage's inability to resist tinkering with the design meant that the device was never completed; nevertheless his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. This machine was given to the Science Museum in South Kensington in 1910.

In the late 1880s, Herman Hollerith invented the recording of data on a machine-readable medium. Prior uses of machine-readable media, above, had been for control, not data. "After some initial trials with paper tape, he settled on punched cards ..."[23] To process these punched cards he invented the tabulator and the keypunch machine. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM. By the end of the 19th century a number of ideas and technologies that would later prove useful in the realization of practical computers had begun to appear: Boolean algebra, the vacuum tube (thermionic valve), punched cards and tape, and the teleprinter.
