
Central processing unit

The central processing unit (CPU) is the hardware in a computer system which carries out the instructions of a computer program by performing the basic arithmetic, logical, and input/output operations of the system. The term has been in use in the computer industry at least since the early 1960s. The form, design, and implementation of CPUs have changed over the course of their history, but their fundamental operation remains much the same. On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single silicon chip called a microprocessor. Since the 1970s the microprocessor class of CPUs has almost completely overtaken all other CPU implementations. Modern CPUs are large-scale integrated circuits in packages typically less than four centimeters square, with hundreds of connecting pins. The two typical components of a CPU are the arithmetic logic unit (ALU), which performs arithmetic and logical operations, and the control unit (CU), which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary. http://en.wikipedia.org/wiki/Central_processing_unit

History:

In 1969 the Four-Phase Systems AL1, designed by Lee Boysel, was an 8-bit bit-slice chip containing eight registers and an ALU. At the time it formed part of a nine-chip, 24-bit CPU built from three AL1s, but it was later called a microprocessor. In 1971 Pico Electronics and General Instrument introduced their first collaboration in ICs, a complete single-chip calculator IC for the Monroe/Litton Royal Digital III calculator. Also in 1971, the Intel 4004 became Intel's first microprocessor. This breakthrough invention powered the Busicom calculator and paved the way for embedding intelligence in inanimate objects as well as the personal computer. In 1972 the Intel 8008 arrived, twice as powerful as the 4004, and in 1974 the Intel 8080 became the brains of the first personal computer, the Altair, allegedly named for a destination of the Starship Enterprise from the Star Trek television show. Other processors followed in quick succession: the Texas Instruments TMS1000 in 1974, the 16-bit National Semiconductor PACE in 1974, the MOS Technology 6502 in 1975, and the Zilog Z80 in 1976. In 1978 came the Intel 8086, followed by the 8088; a pivotal sale to IBM's new personal computer division made the 8088 the brains of IBM's new hit product. These were followed by the Motorola 68000 in 1979, HP's 32-bit processor in 1981, and the Intel 80286 in 1982. The 80286 was the first Intel processor that could run all the software written for its predecessor, and this software compatibility remains a hallmark of Intel's family of microprocessors. Within six years of its release, an estimated 15 million 80286-based personal computers were installed around the world. In 1985 the Intel386 microprocessor featured 275,000 transistors, more than 100 times as many as the original 4004. It was a 32-bit chip and was "multitasking," meaning it could run multiple programs at the same time.

In 1989 the Intel 80486 processor generation marked the move from command-level computing into point-and-click computing. The 80486 was the first to offer a built-in math coprocessor, which speeds up computing by offloading complex math functions from the central processor. AMD later brought its own 486DX to market. In 1993 the Intel Pentium processor allowed computers to more easily incorporate "real world" data such as speech, sound, handwriting and photographic images. After this great change, in 1995 Intel released the Pentium Pro processor, designed to fuel 32-bit server and workstation applications, enabling fast computer-aided design, mechanical engineering and scientific computation. Each Pentium Pro processor was packaged together with a second speed-enhancing cache memory chip. In 1997 the Intel Pentium II processor incorporated Intel MMX technology, designed specifically to process video, audio and graphics data efficiently. It was introduced in the innovative Single Edge Contact (S.E.C.) cartridge, which also incorporated a high-speed cache memory chip. With this chip, PC users could capture, edit and share digital photos with friends and family via the Internet.

In 1998 the Intel Pentium II Xeon processors were designed to meet the performance requirements of mid-range and higher servers and workstations. Consistent with Intel's strategy of delivering unique processor products targeted at specific market segments, the Pentium II Xeon processors featured technical innovations specifically designed for workstations and servers running demanding business applications such as Internet services, corporate data warehousing, digital content creation, and electronic and mechanical design automation. In 1999, continuing Intel's strategy of developing processors for specific market segments, the Intel Celeron processor was designed for the value PC market segment, providing consumers great performance at an exceptional price and delivering excellent performance for uses such as gaming and educational software. Also in 1999, just before the turn of the century, the Intel Pentium III processor introduced 70 new instructions, the Internet Streaming SIMD Extensions, which dramatically enhanced the performance of advanced imaging, 3-D, streaming audio, video and speech recognition applications. It was designed to significantly enhance Internet experiences, allowing users to do such things as browse through realistic online museums and stores and download high-quality video. In the same year, the Pentium III Xeon processor extended Intel's offerings to the workstation and server market segments, providing additional performance for e-commerce applications and advanced business computing; it was designed for systems with multiprocessor configurations.

In 2000, Intel Pentium 4 processor-based PCs could create professional-quality movies; deliver TV-like video via the Internet; communicate with real-time video and voice; render 3D graphics in real time; quickly encode music for MP3 players; and simultaneously run several multimedia applications while connected to the Internet. In 2001 the Intel Xeon processor targeted high-performance and mid-range dual-processor workstations, along with the dual- and multi-processor server configurations to come, offering customers a choice of operating systems and applications and high performance at affordable prices. Also in 2001, the Intel Itanium processor arrived as the first in a family of 64-bit products from Intel, designed for high-end, enterprise-class servers and workstations.
In 2002 the Intel Itanium 2 processor became the second member of the Itanium processor family, a line of enterprise-class processors that brings outstanding performance and the volume economics of the Intel Architecture to the most data-intensive, business-critical and technical computing applications, providing leading performance for databases and computer-aided engineering. In 2003 the Intel Pentium M processor, the Intel 855 chipset family, and the PRO/Wireless 2100 network connection together formed Intel Centrino mobile technology, designed specifically for portable computing, with built-in wireless LAN capability and breakthrough mobile performance. It enabled extended battery life and thinner, lighter mobile computers.

http://www.tayloredge.com/museum/processor/processorhistory.html

Role of a Processor:

The central processing unit, better known as the CPU, is the brain of a PC: the part capable of performing the complex calculations that allow the computer to execute a variety of functions and run a number of applications. The abbreviation stands for Central Processing Unit, and it is physically present as a single controlling chip attached to the main board of the PC. A CPU can be thought of as the seat of intellect within a PC, its central intelligence. Every computer function is carried out in the form of coded instructions, and it is the CPU that processes these instructions and enables the computer to execute them. The CPU does this by following a four-step operational process. The first step is Fetch, where the CPU reads an instruction from the PC's main memory; sometimes fetching also includes reading data from an I/O module. This is followed by Decode, the interpretation phase, where the instruction is decoded to decide what sort of action should be carried out. Next comes Execute, wherein the actual desired operation is carried out. The last step is Write back, where the result of the executed operation is written to memory.

Every CPU also has its own cache memory: a limited amount of fast memory offered by the CPU itself, which is usually very expensive. The cache acts as storage for repeated functions and data that the CPU requires regularly, so the CPU does not have to depend on retrieving that data from the PC's slower main memory. Inside the CPU there are also several registers, which sit above both the PC's main memory and the CPU's own cache in the memory hierarchy. These registers perform some very important functions. The most significant type is the user-visible register, which helps reduce the memory load on the PC by allowing assembly language programmers to minimize the number of main memory references that have to be made.

Computer manufacturers have realized that a PC's performance gets a major boost if a computer uses more than one processing core, which is why dual-core and multi-core processing units have become the norm. Industry giants Intel and AMD lead the way in multiple-core processor technology, and this configuration has significantly improved the performance level of computers. As a result of multi-core CPUs, larger and more sophisticated applications can be written that utilize a much larger code base and offer more features than ever before. New applications and games are utilizing the available horsepower of the new processors, and will be the driving force behind the next generation of CPUs. http://www.brighthub.com/computing/hardware/articles/31296.aspx
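To make the four-step process above concrete, here is a minimal sketch in Python of fetch, decode, execute and write back for a hypothetical toy instruction set. The opcodes, register names and memory layout are illustrative assumptions, not any real architecture.

    # Minimal sketch of the fetch-decode-execute-write-back cycle for a
    # hypothetical toy CPU. Opcodes, registers and memory layout are
    # illustrative assumptions, not any real architecture.

    memory = [None] * 256                # toy main memory, one cell per instruction or value
    registers = {"A": 0, "B": 0}         # tiny register file
    pc = 0                               # program counter

    # Toy program: A = 10; B = 32; A = A + B; memory[100] = A
    memory[0:4] = [("LOADI", "A", 10),
                   ("LOADI", "B", 32),
                   ("ADD",   "A", "B"),
                   ("STORE", "A", 100)]

    while memory[pc] is not None:
        instruction = memory[pc]         # 1. Fetch: read the next instruction from memory
        pc += 1
        opcode, dst, src = instruction   # 2. Decode: work out what action is required
        if opcode == "LOADI":            # 3. Execute: carry out the desired operation
            registers[dst] = src
        elif opcode == "ADD":
            registers[dst] += registers[src]
        elif opcode == "STORE":
            memory[src] = registers[dst] # 4. Write back: the result is written to memory

    print(registers["A"], memory[100])   # 42 42

Note that the registers here play exactly the role described above: the operands of ADD never touch main memory while the work is being done.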

Different Types of Processors:


1. Central processing unit (CPU), an electronic circuit which executes computer programs, containing a processing unit and a control unit
2. Processing unit, in Von Neumann computer architecture, containing an arithmetic logic unit (ALU) and processor registers
3. Microprocessor, a CPU on one silicon chip as part of a microcomputer
4. Graphics processing unit (GPU / VPU), a dedicated graphics rendering device for a personal computer or game console
5. Physics processing unit (PPU), a dedicated microprocessor designed to handle the calculations of physics
6. Digital signal processor, a specialized microprocessor designed specifically for digital signal processing
7. Network processor, a microprocessor specifically targeted at the networking application domain
8. Front end processor, a helper processor for communication between a host computer and other devices
9. Coprocessor
10. Floating point unit
11. Data processor, a system that translates or converts between different data formats
12. Word processor, a computer application used for the production of printable material
13. Audio processor, used in studios and radio stations

http://en.wikipedia.org/wiki/Processor

How the Processor works:

The processor or CPU is a chip designed around an instruction set. Not all chips are created equal: graphics chips, for example, are designed around a completely different set of instructions, and the processor in your mobile phone is designed around yet another. x86 processors are designed as a sort of jack-of-all-trades: the CPU is a generalized piece of hardware, not specialized toward any given task. In theory, any type of processor can execute just about any type of code, so your CPU can execute the code necessary to produce the graphics of your favorite computer game. But the CPU isn't designed and optimized for that task, so while your Nvidia GeForce 8400M can make Unreal Tournament 3 run pretty smoothly at about thirty frames per second, your CPU will choke trying to hit even five frames per second, and it really doesn't matter how fast your CPU is (unless somehow you've violated the laws of physics and gotten it running at 30GHz instead of 2GHz). Modern processors have several things in common: they generally have some number of cores, an on-die cache, and support for either 32-bit or 64-bit code. They require a chipset (remember the motherboard article?) to properly communicate with the rest of the system. And they're one of the most power-hungry components of a laptop. http://www.notebookreview.com/default.asp?newsID=4521

Microprocessor

A microprocessor incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit (IC), or at most a few integrated circuits. It is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Microprocessors operate on numbers and symbols represented in the binary numeral system. The advent of low-cost computers on integrated circuits has transformed modern society. General-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet. Many more microprocessors are part of embedded systems, providing digital control of a myriad of objects from appliances to automobiles to cellular phones and industrial process control. http://en.wikipedia.org/wiki/Micro_processor#Origins

History and Its Evolution

The invention of the transistor in 1947 was a significant development in the world of technology. It could perform the function of the large vacuum-tube components used in the computers of the early years. Shockley, Brattain and Bardeen are credited with this invention and were awarded the Nobel Prize for it. Soon it was found that the function of this large component was easily performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a very crucial achievement and brought along a revolution in the use of computers. Jack Kilby of Texas Instruments was honored with the Nobel Prize for the invention of the IC, which laid the foundation on which microprocessors were developed. At the same time, Robert Noyce of Fairchild made a parallel development in IC technology, for which he was awarded the patent.

ICs proved beyond doubt that complex functions could be integrated on a single chip with highly developed speed and storage capacity. Both Fairchild and Texas Instruments began the manufacture of commercial ICs in 1961. Later, further developments in the IC led to the addition of more complex functions on a single chip. The stage was set for a single controlling circuit for all the computer functions. Finally, Intel Corporation's Ted Hoff and Federico Faggin were credited with the design of the first microprocessor. The work on this project began with an order to Intel from a Japanese calculator company, Busicom, to build some chips for it. Hoff felt that the design could integrate a number of functions on a single chip, making it feasible to provide the required functionality. This led to the design of the Intel 4004, the world's first microprocessor. The next in line was the 8-bit 8008 microprocessor, developed by Intel in 1972 to perform complex functions in harmony with the 4004.

This was the beginning of a new era in computer applications. The use of mainframes and huge computers was scaled down to a much smaller device that was affordable to many. Earlier, their use was limited to large organizations and universities; with the advent of microprocessors, the use of computers trickled down to the common man. The next processor in line was Intel's 8080, with an 8-bit data bus and a 16-bit address bus. This was among the most popular microprocessors of all time. Very soon, the Motorola Corporation developed its own 6800 in competition with Intel's 8080. Faggin left Intel and formed his own firm, Zilog, which launched the Z80 microprocessor in 1976, far superior to the previous two versions. Similarly, a break-off from Motorola prompted the design of the 6502, a derivative of the 6800. Such attempts continued with some modifications in the base structure.

The use of microprocessors was at first limited to task-based operations specifically required for company projects, such as in the automobile sector. The concept of a 'personal computer' was still a distant dream for the world, and microprocessors were yet to come into personal use. The 16-bit microprocessors started becoming commercially successful in the late 1970s and 1980s, with the first popular one being the TMS9900 from Texas Instruments. Intel developed the 8086, which still serves as the base model for all later advancements in the microprocessor family; it was largely a complete processor, integrating all the required features. The 68000 by Motorola was one of the first microprocessors to use microcoding in its instruction set. These designs were further developed into 32-bit architectures. Similarly, many players like Zilog, IBM and Apple were successful in getting their own products into the market. However, Intel held a commanding position in the market right through the microprocessor era.

The 1990s saw large-scale application of microprocessors in personal computers, driven by Apple, IBM and Microsoft. This period witnessed a revolution in the use of computers, which by then had become a household entity, complemented by highly sophisticated development in the commercial use of microprocessors. In 1993, Intel brought out its Pentium processor, one of the most popular processor lines to date. It was followed by a series of excellent processors of the Pentium family, leading into the 21st century. The latest in commercial use are the Pentium Dual-Core technology and the Xeon processor, which have opened up a whole new world of diverse applications. Supercomputers have become common, owing to this amazing development in microprocessors. Certainly, these little chips will go down in history, but they will continue to reign in the future as an ingenious creation of the human mind.

How a Microprocessor Works:

It is the central processing unit which coordinates all the functions of a computer. It generates timing signals and sends and receives data to and from every peripheral used inside or outside the computer. The commands required to do this are fed into the device in the form of current variations, which are converted into meaningful instructions by the use of Boolean logic. The processor divides its work into two categories, logical functions and processing functions, handled by the arithmetic and logic unit and the control unit respectively. Information is communicated over bunches of wires called buses. The address bus carries the 'address' of the location with which communication is desired, while the data bus carries the data that is being exchanged. http://www.buzzle.com/articles/history-of-microprocessor.html
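As a rough illustration of the split just described, here is a small Python sketch in which one value selects the location (the address bus) and another carries the contents (the data bus). The 16-bit address and 8-bit data widths are borrowed from the 8080 mentioned earlier; the Memory class itself is a hypothetical stand-in for a real memory chip.

    # Toy model of a bus transaction: the address bus selects a memory
    # location, the data bus carries the value being exchanged.

    class Memory:
        def __init__(self):
            self.cells = [0] * (2 ** 16)       # a 16-bit address bus can select 65,536 locations

        def read(self, address_bus):
            return self.cells[address_bus]     # the data bus carries the stored value back

        def write(self, address_bus, data_bus):
            assert 0 <= data_bus < 2 ** 8      # an 8-bit data bus carries values 0..255
            self.cells[address_bus] = data_bus

    ram = Memory()
    ram.write(0x1234, 42)    # address bus selects location 0x1234, data bus carries 42
    print(ram.read(0x1234))  # 42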

Types of Microprocessor:

There are different ways in which microprocessors are categorized. They are:

CISC (Complex Instruction Set Computers)

RISC (Reduced Instruction Set Computers)
VLIW (Very Long Instruction Word Computers)
Superscalar processors

Other types of specialized processors are:


General Purpose Processor (GPP)
Special Purpose Processor (SPP)
Application-Specific Integrated Circuit (ASIC)
Digital Signal Processor (DSP)

http://www.buzzle.com/articles/history-of-microprocessor.html

Memory

Memory refers to internal storage areas in the computer. The term memory identifies data storage that comes in the form of chips, while the word storage is used for memory that exists on tapes or disks. Moreover, the term memory is usually used as shorthand for physical memory, which refers to the actual chips capable of holding data. Some computers also use virtual memory, which expands physical memory onto a hard disk. Every computer comes with a certain amount of physical memory, usually referred to as main memory or RAM. You can think of main memory as an array of boxes, each of which can hold a single byte of information. A computer that has 1 megabyte of memory, therefore, can hold about 1 million bytes (or characters) of information. http://www.webopedia.com/TERM/M/memory.html

History:

In the early 1940s, memory technology mostly permitted a capacity of only a few bytes. The first electronic programmable digital computer, the ENIAC, using thousands of octal-base radio vacuum tubes, could perform simple calculations involving 20 numbers of ten decimal digits, which were held in the vacuum-tube accumulators. The next significant advance in computer memory came with acoustic delay line memory, developed by J. Presper Eckert in the early 1940s. Through the construction of a glass tube filled with mercury and plugged at each end with a quartz crystal, delay lines could store bits of information as sound waves propagating through the mercury, with the quartz crystals acting as transducers at each end. Delay line memory was limited to a capacity of up to a few hundred thousand bits to remain efficient.
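Setting the history aside for a moment, the "array of boxes" picture in the definition above maps directly onto code. A minimal Python sketch, with an arbitrary 1 KB size:

    # Main memory as an array of boxes, each box holding a single byte (0..255).
    # The size and contents here are arbitrary examples.

    main_memory = bytearray(1024)       # 1 KB: 1,024 boxes, all initially zero

    main_memory[0] = ord("A")           # store the character 'A' (byte value 65) in box 0
    main_memory[1] = 255                # the largest value one byte-sized box can hold

    print(len(main_memory))             # 1024
    print(chr(main_memory[0]))          # A

A machine with 1 megabyte of memory corresponds to bytearray(2 ** 20) in this picture: roughly a million boxes of one character each.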

Two alternatives to the delay line, the Williams tube and the Selectron tube, originated in 1946, both using electron beams in glass tubes as a means of storage. Using cathode ray tubes, Fred Williams invented the Williams tube, which was the first random-access computer memory. The Williams tube proved more capacious than the Selectron tube (the Selectron was limited to 256 bits, while the Williams tube could store thousands) and less expensive, but it was frustratingly sensitive to environmental disturbances. Efforts began in the late 1940s to find non-volatile memory. Jay Forrester, Jan A. Rajchman and An Wang developed magnetic core memory, which allowed for recall of memory after power loss. Magnetic core memory became the dominant form of memory until the development of transistor-based memory in the late 1960s. Developments in technology and economies of scale have since made possible so-called Very Large Memory (VLM) computers. The term "memory," when used with reference to computers, generally refers to Random Access Memory, or RAM.

Volatile memory:

Volatile memory is memory or storage whose contents are erased when the system's power is turned off or interrupted. For example, RAM is volatile: if the computer were turned off, anything currently stored in it would be erased. Because of RAM's volatile nature, users must frequently save their work to a permanent, non-volatile medium such as a hard drive to avoid losing data if the system's power is interrupted. http://www.computerhope.com/jargon/v/volamemo.htm

RAM (Random Access Memory):

RAM (random access memory) is the place in a computer where the operating system, application programs, and data in current use are kept so that they can be quickly reached by the computer's processor. RAM is much faster to read from and write to than the other kinds of storage in a computer, such as the hard disk, floppy disk, and CD-ROM. However, the data in RAM stays there only as long as your computer is running. When you turn the computer off, RAM loses its data. When you turn your computer on again, your operating system and other files are once again loaded into RAM, usually from your hard disk. http://searchmobilecomputing.techtarget.com/definition/RAM

Nonvolatile memory:
Nonvolatile memory is a general term for all forms of solid-state (no moving parts) memory that do not need to have their contents periodically refreshed. This includes all forms of read-only memory (ROM) such as programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and flash memory. It also includes random access memory (RAM) that is powered with a battery.

http://searchstorage.techtarget.com/definition/nonvolatile-memory

ROM (Read Only Memory):

ROM is an acronym for Read-Only Memory. It refers to computer memory chips containing permanent or semi-permanent data. Unlike RAM, ROM is non-volatile; even after you turn off your computer, the contents of ROM will remain.

Almost every computer comes with a small amount of ROM containing the boot firmware. This consists of a few kilobytes of code that tell the computer what to do when it starts up, e.g., running hardware diagnostics and loading the operating system into RAM. On a PC, the boot firmware is called the BIOS.

Originally, ROM was actually read-only. To update the programs in ROM, you had to remove and physically replace your ROM chips. Contemporary versions of ROM allow some limited rewriting, so you can usually upgrade firmware such as the BIOS by using installation software. Rewritable ROM chips include PROMs (programmable read-only memory), EPROMs (erasable programmable read-only memory), EEPROMs (electrically erasable programmable read-only memory), and a common variation of EEPROMs called flash memory.

How Computer Memory Works:
The microcomputer has two types of memory. The manufacturer installs machine-readable instructions or programs in special chips on the computer's motherboard. Users cannot change the programs on these chips. This type of memory is Read Only Memory, or ROM. These instructions help your computer start up, find drives and interpret commands from your keyboard or other input devices. They also help the computer display information for users in plain English or whatever language the user chooses. The memory that computer users work with when creating, saving and retrieving personal information is Random Access Memory, or RAM. RAM chips must also be installed in a microcomputer so that users have a space in which they can load software programs and files.
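The ROM/RAM division described above can be modeled in a few lines of Python. In this sketch the sizes and contents are assumptions; the point is only that the ROM region is fixed at "manufacture" time while the RAM region is freely writable.

    # Toy model of a microcomputer's two memory types: a fixed ROM region
    # and a writable RAM region sharing one address space.

    class MemoryMap:
        def __init__(self, rom_contents, ram_size):
            self.rom = bytes(rom_contents)    # immutable, like factory-installed firmware
            self.ram = bytearray(ram_size)    # mutable space for programs and files

        def read(self, address):
            if address < len(self.rom):       # low addresses map to ROM
                return self.rom[address]
            return self.ram[address - len(self.rom)]

        def write(self, address, value):
            if address < len(self.rom):
                raise PermissionError("cannot write to ROM")
            self.ram[address - len(self.rom)] = value

    mm = MemoryMap(rom_contents=[0xEA, 0x4C], ram_size=1024)
    mm.write(2, 42)          # the first RAM cell, just past the 2-byte ROM
    print(mm.read(2))        # 42
    try:
        mm.write(0, 99)      # writing into the ROM region is rejected
    except PermissionError as err:
        print(err)           # cannot write to ROM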
