The Computer Since Its Birth




Group Members: Gunasinghe U.L.D.N     100162X

               Sashika W.A.D          100487X

               Siriwardena M.P        100512X

               Udara Y.B.N            100544V

               Wijayarathna D.G.C.D   100596F
Contents

1. What is a computer?
2. Abacus
3. Pascaline
4. Punched Card
5. Difference Engine
6. Analytical Engine
7. Ada Lovelace
8. Transistors
9. Hewlett-Packard Company
10. ENIAC
11. EDVAC
12. Alan Turing
13. UNIVAC
14. The First Transistor Computer
15. Unix
16. Floppy Disk
17. Intel 4004
18. Microsoft Windows
19. Apple Computers
20. Pentium Processors
21. Multi-Core Processors
22. Super Computers
23. Artificial Intelligence
24. Optical Computer
25. DNA Computing
26. Pen Computers
27. Quantum Computers
What is a Computer?
       “A computer is an electronic device which is capable of receiving information (data)
and performing a sequence of logical operations in accordance with a predetermined but
variable set of procedural instructions (program) to produce a result in the form of
information or signals.” This is the definition of a computer according to the Oxford English
Dictionary.

        The word "computer" derives from "compute", which basically means "calculate". In
the early days the major use of computers was calculation, but at present that has completely
changed: rather than just calculating, we use computers for far more advanced applications,
e.g. video editing, 3D animation, military purposes, biomedical applications, scientific
research, etc. Due to the rapid change of technology, the computer has become very complex.

Characteristics of a Computer

   1. It is a machine.
   2. It can be programmed.
   3. It processes data according to instructions.
   4. It stores data, etc.
Abacus (3000 B.C.)
        The abacus is considered the first computing device invented in the world. It was
first invented by the Babylonians in 3000 B.C.

        Around 1300 the more familiar wire-and-bead abacus replaced the Chinese calculating
rods. The Lee Kai-chen abacus was developed and manufactured in Taiwan, China. Few of
these remarkable instruments remain, as production only lasted a few years. On the beam of
the suan pan in the lower part of the frame there is an adjustable "place setting vernier".

        With a movable indicator that runs along the bottom rail and adjustable unit rod
markers on the beam of the soroban above, Lee called his abacus "A Revolution of Chinese
Calculators". This abacus was made circa 1959 and came with a 58-page instruction manual.
It measures 13 inches long by 8 inches wide.



                              Pascaline (1642-1643)
        In 1642-1643 Blaise Pascal created a gear-driven adding machine called the
"Pascaline", the first mechanical adding machine. Blaise Pascal, the French scientist, was
one of the most reputed mathematicians and physicists of his time. He is credited with
inventing an early calculator, amazingly advanced for its time. A genius from a young age,
Pascal composed a treatise on the communication of sounds at the age of twelve, and at the
age of sixteen he composed a treatise on conic sections.

        The idea of using machines to solve mathematical problems can be traced at least as
far as the early 17th century. Mathematicians who designed and implemented calculators that
were capable of addition, subtraction, multiplication, and division included Wilhelm
Schickard, Blaise Pascal, and Gottfried Leibniz. In 1642, at the age of eighteen, Blaise
Pascal invented his numerical wheel calculator, called the Pascaline, to help his father, a
French tax collector, count taxes. The Pascaline had eight movable dials that added up
eight-figure-long sums and used base ten. When the first dial (one's column) moved ten
notches, the second dial moved one notch to represent the ten's column reading of 10; and
when the ten's dial moved ten notches, the third dial (hundred's column) moved one notch to
represent one hundred, and so on.
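The ripple-carry behaviour described above is ordinary base-ten positional addition, and can be sketched as a toy model (a list of digits standing in for the eight dials; this illustrates the arithmetic only, not Pascal's actual mechanical linkage):

```python
def pascaline_add(dials, column, notches):
    """Advance one dial by some notches, rippling carries leftward.

    `dials` is a list of digits, index 0 = one's column, as on the
    eight-dial Pascaline. An illustrative model of the carry rule,
    not of the machine's hardware.
    """
    dials[column] += notches
    while column < len(dials) - 1 and dials[column] >= 10:
        dials[column] -= 10          # the dial completes a revolution...
        dials[column + 1] += 1       # ...and nudges the next dial one notch
        column += 1
    dials[column] %= 10              # overflow past the last dial is lost
    return dials

# Adding 7 notches when the one's dial already shows 5 carries into the ten's dial:
print(pascaline_add([5, 0, 0, 0, 0, 0, 0, 0], 0, 7))  # [2, 1, 0, 0, 0, 0, 0, 0]
```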
Punched Card (1725)
       Punched cards were first used around 1725 by Basile Bouchon and Jean-Baptiste
Falcon as a more robust form of the perforated paper rolls then in use for controlling textile
looms in France. This technique was greatly improved by Joseph Marie Jacquard in his
Jacquard loom in 1801.

       Semen Korsakov was reputedly the first to use punched cards in informatics, for
information storage and search. Korsakov announced his new method and machines in
September 1832, and rather than seeking patents offered the machines for public use.

        From the invention of computer programming languages up to the mid-1980s, many if
not most computer programmers created, edited and stored their programs on punched cards.
The practice was nearly universal with IBM computers in the era. A punched card is a
flexible write-once medium that encodes, most commonly, 80 characters of data. Groups or
"decks" of cards form programs and collections of data. Users could create cards using a
desk-sized keypunch with a typewriter-like keyboard. A typing error generally necessitated
re-punching an entire card. A single-character typo could be corrected by duplicating the
card up to the error column, typing the correct character and then duplicating the rest of
the card. In some companies programmers wrote information on special forms called coding
sheets, taking care to distinguish the digit zero from the letter O, the digit one from the
letter I, 8's from B's, 2's from Z's, and so on. These forms were then converted to cards by
keypunch operators and, in some cases, checked by verifiers.
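The single-character correction procedure amounts to copying an 80-column record with one column substituted. A toy model (cards are plain strings here rather than real Hollerith punch codes; the function name is illustrative):

```python
CARD_COLUMNS = 80

def duplicate_with_fix(bad_card, error_col, correct_char):
    """Mimic a keypunch correction: duplicate the card up to the error
    column, punch the correct character, then duplicate the remainder."""
    assert len(bad_card) == CARD_COLUMNS
    return bad_card[:error_col] + correct_char + bad_card[error_col + 1:]

# A digit 0 was punched where the letter O belonged (column 12, index 11):
bad = "PRINT 'HELL0 WORLD'".ljust(CARD_COLUMNS)
good = duplicate_with_fix(bad, 11, "O")
print(good.rstrip())  # PRINT 'HELLO WORLD'
```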




                                Difference Engine (1822)
        Charles Babbage (1791-1871), computer pioneer, designed the first automatic
computing engines. He invented computers but failed to build them. The Difference Engine
was a special-purpose calculator designed to tabulate logarithms and trigonometric functions
by evaluating finite differences to create approximating polynomials. Construction of this
machine was never completed: Babbage had conflicts with his chief engineer, Joseph Clement,
and ultimately the British government withdrew its funding for the project. The first
complete Babbage engine was finished in London in 2002, 153 years after it was designed.
Difference Engine No. 2, built faithfully to the original drawings, consists of 8,000 parts,
weighs five tons, and measures 11 feet long. It was a very large step in computer history.
Analytical Engine (1834-1835)
        During the Difference Engine project Babbage realized that a much more general
design, the Analytical Engine, was possible. The input (programs and data) was to be
provided to the machine via punched cards, a method being used at the time to direct
mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a
curve plotter and a bell. The machine would also be able to punch numbers onto cards to be
read in later. It employed ordinary base-10 fixed-point arithmetic.

        There was to be a store (that is, a memory) capable of holding 1,000 numbers of 50
decimal digits each (ca. 20.7 kB). An arithmetical unit (the "mill") would be able to
perform all four arithmetic operations, plus comparisons and optionally square roots.
Initially it was conceived as a difference engine curved back upon itself, in a generally
circular layout, with the long store exiting off to one side. (Later drawings depict a
regularized grid layout.) Like the central processing unit (CPU) in a modern computer, the
mill would rely upon its own internal procedures, to be stored in the form of pegs inserted
into rotating drums called "barrels", to carry out some of the more complex instructions the
user's program might specify. (See microcode for the modern equivalent.)
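The "ca. 20.7 kB" figure for the store can be sanity-checked: 1,000 numbers of 50 decimal digits each, at log2(10) ≈ 3.32 bits of information per digit:

```python
import math

numbers, digits_per_number = 1_000, 50
bits = numbers * digits_per_number * math.log2(10)  # information content of the store
kilobytes = bits / 8 / 1000
print(f"{kilobytes:.2f} kB")  # 20.76 kB, i.e. the "ca. 20.7 kB" quoted above
```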

        The programming language to be employed by users was akin to modern day
assembly languages. Loops and conditional branching were possible, and so
the language as conceived would have been Turing-complete long before
Alan Turing's concept. Three different types of punch cards were used: one
for arithmetical operations, one for numerical constants, and one for load and
store operations, transferring numbers from the store to the arithmetical unit
or back. There were three separate readers for the three types of cards.



                            Ada Lovelace (1842-1843)
       Ada Lovelace met and corresponded with Charles Babbage on many occasions,
including socially and in relation to Babbage's Difference Engine and Analytical Engine.
Babbage was impressed by Lovelace's intellect and writing skills. He called her "The
Enchantress of Numbers". In 1843 he wrote of her:



Forget this world and all its troubles and if
possible its multitudinous Charlatans — every thing
in short but the Enchantress of Numbers.



      During a nine-month period in 1842–43, Lovelace translated Italian mathematician
Luigi Menabrea's memoir on Babbage's newest proposed machine, the Analytical Engine.
With the article, she appended a set of notes. The notes are longer than the memoir itself
and include, in complete detail (Note G), a method for calculating a sequence of Bernoulli
numbers with the Engine, which would have run correctly had the Analytical Engine been
built. Based on this work, Lovelace is now widely credited with being the first computer
programmer, and her method is recognized as the world's first computer program.



                                    Transistors (1934)
       The first patent for the field-effect transistor principle was filed in Canada
by Austrian-Hungarian physicist Julius Edgar Lilienfeld on October 22, 1925, but Lilienfeld
published no research articles about his devices, and they were ignored by industry. In 1934
German physicist Dr. Oskar Heil patented another field-effect transistor. There is no direct
evidence that these devices were built, but later work in the 1990s showed that one of
Lilienfeld's designs worked as described and gave substantial gain. Legal papers from
the Bell Labs patent show that William Shockley and a co-worker at Bell Labs, Gerald
Pearson, had built operational versions from Lilienfeld's patents, yet they never referenced
this work in any of their later research papers or historical articles.

       The Bell Labs work emerged from war-time efforts to produce extremely
pure germanium "crystal" mixer diodes, used in radar units as a frequency mixer element
in microwave radar receivers. A parallel project on germanium diodes at Purdue
University succeeded in producing the good-quality germanium semiconducting crystals that
were used at Bell Labs. Early tube-based technology did not switch fast enough for this role,
leading the Bell team to use solid state diodes instead. With this knowledge in hand they
turned to the design of a triode, but found this was not at all easy. Bardeen eventually
developed a new branch of quantum mechanics known as surface physics to account for the
"odd" behavior they saw, and Bardeen and Brattain eventually succeeded in building a
working device.

       After the war, William Shockley decided to attempt the building of a triode-like
semiconductor device. He secured funding and lab space, and went to work on the problem
with Brattain and John Bardeen.

       The key to the development of the transistor was the further understanding of the
process of the electron mobility in a semiconductor. It was realized that if there was some
way to control the flow of the electrons from the emitter to the collector of this newly
discovered diode, one could build an amplifier. For instance, if you placed contacts on either
side of a single type of crystal the current would not flow through it. However if a third
contact could then "inject" electrons or holes into the material, the current would flow.
Hewlett-Packard Company (1939)
       Bill Hewlett and Dave Packard graduated in electrical engineering from Stanford
University in 1935. The company originated in a garage in nearby Palo Alto during a
fellowship they had with a past professor, Frederick Terman, at Stanford during the Great
Depression. Terman was considered a mentor to them in forming Hewlett-Packard. In 1939,
Packard and Hewlett established Hewlett-Packard (HP) in Packard's garage with an initial
capital investment of US$538. Hewlett and Packard tossed a coin to decide whether the
company they founded would be called Hewlett-Packard or Packard-Hewlett. Packard won the
coin toss but named their electronics manufacturing enterprise the "Hewlett-Packard
Company". HP incorporated on August 18, 1947, and went public on November 6, 1957.

         Of the many projects they worked on, their very first financially successful product
was a precision audio oscillator, the Model HP200A. Their innovation was the use of a small
incandescent light bulb (known as a "pilot light") as a temperature dependent resistor in a
critical portion of the circuit, the negative feedback loop which stabilized the amplitude of the
output sinusoidal waveform. This allowed them to sell the Model 200A for $54.40 when
competitors were selling less stable oscillators for over $200. The Model 200 series of
generators continued until at least 1972 as the 200AB, still tube-based but improved in design
through the years.

       One of the company's earliest customers was The Walt Disney Company, which
bought eight Model 200B oscillators (at $71.50 each) for use in certifying the Fantasound
surround sound systems installed in theaters for the movie Fantasia.



    ENIAC (Electronic Numerical Integrator And Computer,
                           1943)
        The ENIAC's design and construction was financed by the United States Army during
World War II. The construction contract was signed on June 5, 1943, and work on the
computer began in secret by the University of Pennsylvania's Moore School of Electrical
Engineering starting the following month under the code name "Project PX". The completed
machine was announced to the public the evening of February 14, 1946 and formally
dedicated the next day at the University of Pennsylvania, having cost almost $500,000
(nearly $6 million in 2010, adjusted for inflation). It was formally accepted by the U.S. Army
Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946 for a
refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground,
Maryland in 1947. There, on July 29, 1947, it was turned on and was in continuous operation
until 11:45 p.m. on October 2, 1955.
The ENIAC was a modular computer, composed of individual panels to perform different
functions. Twenty of these modules were accumulators, which could not only add and subtract
but hold a ten-digit decimal number in memory. Numbers were passed between these units
across a number of general-purpose buses, or trays, as they were called. In order to achieve
its high speed, the panels had to send and receive numbers, compute, save the answer, and
trigger the next operation, all without any moving parts. Key to its versatility was the
ability to branch; it could trigger different operations that depended on the sign of a
computed result.

        Besides its speed, the most remarkable thing about ENIAC was its size and
complexity. ENIAC contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000
resistors, 10,000 capacitors and around 5 million hand-soldered joints. It weighed more than
30 short tons (27 t), was roughly 8 by 3 by 100 feet (2.4 m × 0.9 m × 30 m), took up 1800
square feet (167 m²), and consumed 150 kW of power. Input was possible from an IBM card
reader, and an IBM card punch was used for output. These cards could be used to produce
printed output offline using an IBM accounting machine, such as the IBM 405.

        The ENIAC could be programmed to perform complex sequences of operations,
which could include loops, branches, and subroutines. The task of taking a problem and
mapping it onto the machine was complex, and usually took weeks. After the program was
figured out on paper, the process of getting the program "into" the ENIAC by manipulating
its switches and cables took additional days. This was followed by a period of verification
and debugging, aided by the ability to "single step" the machine.



              EDVAC (Electronic Discrete Variable Automatic
                         Computer, 1945)
       ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's
construction in August 1944, and design work for the EDVAC
commenced before the ENIAC was fully operational. The design would
implement a number of important architectural and logical
improvements conceived during the ENIAC's construction and would
incorporate a high speed serial access memory. Like the ENIAC, the
EDVAC was built for the U.S. Army's Ballistics Research Laboratory at
the Aberdeen Proving Ground by the University of Pennsylvania's
Moore School of Electrical Engineering. Eckert and Mauchly and the
other ENIAC designers were joined by John von Neumann in a consulting role; von
Neumann summarized and elaborated upon logical design developments in his 1945 First
Draft of a Report on the EDVAC.

      A contract to build the new computer was signed in April 1946 with an initial budget
of US$100,000. The contract named the device the Electronic Discrete Variable Automatic
Calculator. The final cost of EDVAC, however, was similar to the ENIAC's, at just under
$500,000.

       The EDVAC was a binary serial computer with automatic addition, subtraction,
multiplication, programmed division and automatic checking with an ultrasonic serial
memory capacity of 1,000 44-bit words (later set to 1,024 words, thus giving a memory, in
modern terms, of 5.5 kilobytes).
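The 5.5-kilobyte figure follows directly from the stated word count and width:

```python
words, bits_per_word = 1_024, 44
total_bits = words * bits_per_word   # 45,056 bits
total_bytes = total_bits // 8        # 5,632 bytes
print(total_bytes / 1024, "KiB")     # 5.5 KiB
```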

        Physically, the computer comprised the following components:

           - a magnetic tape reader-recorder (Wilkes 1956:36 describes this as a wire
             recorder)
           - a control unit with an oscilloscope
           - a dispatcher unit to receive instructions from the control and memory and
             direct them to other units
           - a computational unit to perform arithmetic operations on a pair of numbers at
             a time and send the result to memory after checking on a duplicate unit
           - a timer
           - a dual memory unit consisting of two sets of 64 mercury acoustic delay lines
             of eight words capacity on each line
           - three temporary tanks each holding a single word

       EDVAC's addition time was 864 microseconds and its multiplication time was 2900
microseconds (2.9 milliseconds).

       The computer had almost 6,000 vacuum tubes and 12,000 diodes, and consumed 56
kW of power. It covered 490 ft² (45.5 m²) of floor space and weighed 17,300 lb (7,850 kg).
The full complement of operating personnel was thirty people for each eight-hour shift.



                               Alan Turing (1912-1954)
        From 1945 to 1947 Turing lived in Church Street, Hampton while he worked on the
design of the ACE (Automatic Computing Engine) at the National Physical Laboratory. He
presented a paper on 19 February 1946, which was the first detailed design of a stored-
program computer. Although ACE was a feasible design, the secrecy surrounding the
wartime work at Bletchley Park led to delays in starting the project and he became
disillusioned. In late 1947 he returned to Cambridge for a sabbatical year. While he was at
Cambridge, the Pilot ACE was built in his absence. It executed its first program on 10 May
1950.

        In 1948, he was appointed Reader in the Mathematics Department
at Manchester (now part of The University of Manchester). In 1949, he
became Deputy Director of the computing laboratory at the University of
Manchester, and worked on software for one of the earliest stored-
program computers—the Manchester Mark 1. During this time he
continued to do more abstract work, and in "Computing machinery and
intelligence" (Mind, October 1950), Turing addressed the problem of
artificial intelligence, and proposed an experiment which became known as the Turing test,
an attempt to define a standard for a machine to be called "intelligent". The idea was that a
computer could be said to "think" if a human interrogator could not tell it apart, through
conversation, from a human being. In the paper, Turing suggested that rather than building a
program to simulate the adult mind, it would be better rather to produce a simpler one to
simulate a child's mind and then to subject it to a course of education. A reversed form of the
Turing test is widely used on the Internet; the CAPTCHA test is intended to determine
whether the user is a human or a computer.

        In 1948, Turing, working with his former undergraduate colleague, D. G.
Champernowne, began writing a chess program for a computer that did not yet exist. In 1952,
lacking a computer powerful enough to execute the program, Turing played a game in which
he simulated the computer, taking about half an hour per move. The game was recorded. The
program lost to Turing's colleague Alick Glennie, although it is said that it won a game
against Champernowne's wife.

        His Turing Test was a significant and characteristically provocative and lasting
contribution to the debate regarding artificial intelligence, which continues after more than
half a century.

       In 1948 he also devised the LU decomposition method, used today for solving systems
of linear equations expressed as a matrix.
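LU decomposition factors a square matrix A into a unit lower-triangular L and an upper-triangular U with A = LU, so a linear system can then be solved by two cheap triangular substitutions. A minimal Doolittle-style sketch, assuming no zero pivots are encountered (real implementations pivot):

```python
def lu_decompose(A):
    """Doolittle LU factorization without pivoting: A = L @ U, with L
    unit lower triangular and U upper triangular. A minimal sketch,
    not production code."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):                      # row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):                  # column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

A = [[4.0, 3.0], [6.0, 3.0]]
L, U = lu_decompose(A)
# L = [[1.0, 0.0], [1.5, 1.0]], U = [[4.0, 3.0], [0.0, -1.5]]
```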

       Alan Turing is considered the father of computer science.



                               UNIVAC 1101 (1950)
         The UNIVAC 1101, or ERA 1101, was a computer system designed by Engineering
Research Associates (ERA) and built by the Remington Rand corporation in the 1950s. It was
the first stored program computer in the U.S. that was moved from its site of manufacture and
successfully installed at a distant site. Remington Rand used the 1101's architecture as the
basis for a series of machines into the 1960s.

        This computer was 38 ft (12 m) long, 20 ft (6.1 m) wide, and used 2,700 vacuum
tubes for its logic circuits. Its drum memory was 8.5 in (22 cm) in diameter, rotated at
3,500 rpm, had 200 read-write heads, and held 16,384 24-bit words (a memory size equivalent
to 48 kB) with access times between 32 microseconds and 17 milliseconds.

        Instructions were 24 bits long, with 6 bits for the opcode, 4 bits for the "skip"
value (telling how many memory locations to skip to get to the next instruction in program
sequence), and 14 bits for the memory address. Numbers were binary with negative values in
one's complement. The addition time was 96 microseconds and the multiplication time was 352
microseconds.

        The single 48-bit accumulator was fundamentally subtractive, addition being carried
out by subtracting the one's complement of the number to be added. This may appear rather
strange, but the subtractive adder reduces the chance of getting negative zero in normal
operations.

       The machine had 38 instructions.
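The 6/4/14-bit layout described above can be illustrated by unpacking a 24-bit word with shifts and masks (the exact placement of the fields within the word is an assumption for illustration):

```python
def decode_1101(word):
    """Split a 24-bit instruction word into the three fields described
    in the text: 6-bit opcode, 4-bit skip value, 14-bit address."""
    assert 0 <= word < 2**24
    opcode  = (word >> 18) & 0x3F    # top 6 bits
    skip    = (word >> 14) & 0x0F    # next 4 bits
    address = word & 0x3FFF          # low 14 bits
    return opcode, skip, address

word = (0b101010 << 18) | (0b0011 << 14) | 0x1ABC
print(decode_1101(word))  # (42, 3, 6844)
```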



                       The first transistor computer (1953)
        The University of Manchester's experimental Transistor Computer was first
operational in November 1953, and it is widely believed to be the first transistor computer
to come into operation anywhere in the world. There were two versions of the Transistor
Computer: the prototype, operational in 1953, and the full-size version, commissioned in
April 1955. The 1953 machine had 92 point-contact transistors and 550 diodes, manufactured
by STC. It had a 48-bit machine word. The 1955 machine had a total of 200 point-contact
transistors and 1,300 point diodes, which resulted in a power consumption of 150 watts.
There were considerable reliability problems with the early batches of transistors, and the
average error-free run in 1955 was only 1.5 hours. The computer also used a small number of
tubes in its clock generator, so it was not the first fully transistorized machine.

        The design of a full-size Transistor Computer was subsequently adopted by the
Manchester firm of Metropolitan-Vickers, who changed all the circuits to more reliable types
of junction transistors. The production version was known as the Metrovick 950; six or seven
machines were built from 1956, which were "used commercially within the company" or "mainly
for internal use".



                                           Unix (1969)
        Unix (officially trademarked as UNIX, sometimes also written as Unix) is a
multitasking, multi-user computer operating system originally developed in 1969 by a group
of AT&T employees at Bell Labs, including Ken Thompson, Dennis Ritchie, Brian Kernighan,
Douglas McIlroy, and Joe Ossanna. The Unix operating system was first developed in assembly
language, but by 1973 had been almost entirely recoded in C, greatly facilitating its
further development and porting to other hardware. Today's Unix systems are split into
various branches, developed over time by AT&T as well as various commercial vendors and
non-profit organizations. The second edition of Unix was released on December 6, 1972.
Floppy Disk (1970)
       The earliest floppy disks, invented at IBM, were 8 inches in diameter. They became
commercially available in 1971. Disks in this form factor were produced and improved upon
by IBM and other companies such as Memorex, Shugart Associates, and Burroughs Corporation.

       In 1976 Shugart Associates introduced the first 5¼-inch FDD and associated media.
By 1978 there were more than 10 manufacturers producing 5¼-inch FDDs, in competing disk
formats: hard or soft sectored with various encoding schemes such as FM, MFM and GCR.
The 5¼-inch formats quickly displaced the 8-inch for most applications, and the 5¼-inch
hard-sectored disk format eventually disappeared.

        In 1984, IBM introduced the 1.2 megabyte dual-sided floppy disk along with its AT
model. Although often used as backup storage, the high-density floppy was not often used by
software manufacturers for interchangeability. In 1986, IBM began to use the 720 kB
double-density 3.5" microfloppy disk on its Convertible laptop computer. It introduced the
so-called "1.44 MB" high-density version with the PS/2 line. These disk drives could be
added to existing older-model PCs. In 1988 IBM introduced a drive for 2.88 MB "DSED"
diskettes in its top-of-the-line PS/2 models; it was a commercial failure.
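The "so-called" in the 1.44 MB label is warranted: the high-density 3.5" format stores 2 sides × 80 tracks × 18 sectors × 512 bytes, i.e. 1,440 KiB, and the "1.44" comes from dividing 1,440 by 1,000, mixing binary and decimal units:

```python
sides, tracks, sectors_per_track, bytes_per_sector = 2, 80, 18, 512
total_bytes = sides * tracks * sectors_per_track * bytes_per_sector
print(total_bytes)         # 1474560 bytes
print(total_bytes / 1024)  # 1440.0 KiB; "1.44 MB" = 1440 / 1000
```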

        In 1991, Insite Peripherals introduced the "Floptical", which used an infra-red LED
to position the heads over marks in the disk surface. The original drive stored 21 MB, while
also reading and writing standard DD and HD floppies. In order to improve data transfer
speeds and make the high-capacity drive usefully quick as well, the drives were attached to
the system using a SCSI connector instead of the normal floppy controller. This made them
appear to the operating system as a hard drive instead of a floppy, meaning that most PCs
were unable to boot from them. This again adversely affected uptake.
Intel 4004 (1971)
        The Intel 4004 was a 4-bit central processing unit (CPU) released by Intel
Corporation in 1971. It was the first complete CPU on one chip, and also the first
commercially available microprocessor. Such a feat of integration was made possible by the
use of the then-new silicon gate technology, allowing a higher number of transistors and a
faster speed than was possible before. The 4004 employed a 10-μm silicon-gate
enhancement-load pMOS technology and could execute approximately 92,000 instructions per
second (that is, a single instruction cycle was 10.8 microseconds). The chief designers of
the chip were Federico Faggin and Ted Hoff of Intel, and Masatoshi Shima of Busicom (later
of ZiLOG, founded by Faggin).

         Faggin, the sole chip designer among the engineers on the MCS-4 project, was the
only one with experience in MOS random logic and circuit design. He also had the crucial
knowledge of the new silicon gate process technology with self-aligned gates, which he had
created at Fairchild in 1968. At Fairchild in 1968, Faggin also designed and manufactured
the world's first commercial IC using SGT, the Fairchild 3708. As soon as he joined the
Intel MOS Department he created a new random design methodology based on silicon gate, and
contributed many technology and circuit design inventions that enabled a single-chip
microprocessor to become a reality for the first time. His methodology set the design style
for all the early Intel microprocessors and later for Zilog's Z80. He also led the MCS-4
project and was responsible for its successful outcome (1970-1971). Ted Hoff, head of the
Application Research Department, contributed only the architectural proposal for Busicom,
working with Stanley Mazor in 1969; then he moved on to other projects. When asked where he
got the ideas for the architecture of the first microprocessor, Hoff related that Plessey,
"a British tractor company," had donated a minicomputer to Stanford, and he had "played with
it some" while he was there. Shima designed the Busicom calculator firmware and assisted
Faggin during the first six months of the implementation. The manager of Intel's MOS Design
Department was Leslie L. Vadász. At the time of the MCS-4 development, Vadász's attention
was completely focused on the mainstream business of semiconductor memories, and he left the
leadership and the management of the MCS-4 project to Faggin.

        A popular myth has it that Pioneer 10, the first spacecraft to leave the solar system,
used an Intel 4004 microprocessor. According to Dr. Larry Lasher of Ames Research Center,
the Pioneer team did evaluate the 4004, but decided it was too new at the time to include in
any of the Pioneer projects. The myth was repeated by Federico Faggin himself in a lecture
for the Computer History Museum in 2006.
Microsoft Windows (1975)
        Paul Allen and Bill Gates, childhood friends with a passion in computer
programming, were seeking to make a successful business utilizing their shared skills. The
January 1975 issue of Popular Electronics featured Micro Instrumentation and Telemetry
Systems's (MITS) Altair 8800 microcomputer. Allen noticed that they could program
a BASIC interpreter for the device; after a call from Gates claiming to have a working
interpreter, MITS requested a demonstration. Since they didn't actually have one, Allen
worked on a simulator for the Altair while Gates developed the interpreter. Although they
developed the interpreter on a simulator and not the actual device, the interpreter worked
flawlessly when they demonstrated the interpreter to MITS in Albuquerque, New Mexico in
March 1975; MITS agreed to distribute it, marketing it
as Altair BASIC. They officially established Microsoft on
April 4, 1975, with Gates as the CEO. Allen came up with
the original name of "Micro-Soft," as recounted in a 1995
Fortune magazine article. In August 1977 the company
formed an agreement with ASCII Magazine in Japan,
resulting in its first international office, "ASCII Microsoft". The company moved to a new
home in Bellevue, Washington in January 1979.

       Microsoft entered the OS business in 1980 with its own version of Unix, called Xenix.
However, it was DOS (Disk Operating System) that solidified the company's dominance.
After negotiations with Digital Research failed, IBM awarded a contract to Microsoft in
November 1980 to provide a version of the CP/M OS, which was set to be used in the
upcoming IBM Personal Computer (IBM PC). For this deal, Microsoft purchased a CP/M
clone called 86-DOS from Seattle Computer Products, branding it as MS-DOS, which IBM
rebranded to PC-DOS. Following the release of the IBM PC in August 1981, Microsoft
retained ownership of MS-DOS. Since IBM copyrighted the IBM PC BIOS, other companies
had to reverse engineer it in order for non-IBM hardware to run as IBM PC compatibles, but
no such restriction applied to the operating systems. Due to various factors, such as MS-
DOS's available software selection, Microsoft eventually became the leading PC OS vendor.
The company expanded into new markets with the release of the Microsoft Mouse in 1983, as
well as a publishing division named Microsoft Press. Paul Allen resigned from Microsoft in
February 1983 after developing Hodgkin's disease.
Apple Computers (1976)
        On April Fools’ Day, 1976, Steve Wozniak and Steve Jobs released the Apple I
computer and started Apple Computers. The Apple I was the first personal computer sold
as a fully assembled single circuit board. The first home computer with a GUI or
graphical user interface was the Apple Lisa. The very first
graphical user interface was developed by the Xerox Corporation
at their Palo Alto Research Center (PARC) in the 1970s. Steve
Jobs visited PARC in 1979 (after buying Xerox stock) and was
impressed and influenced by the Xerox Alto, the first computer ever with a graphical user
interface. Jobs designed the new Apple Lisa based on the technology he saw at Xerox.




                                    Pentium Processors
         Pentium is a registered trademark that is included in the brand names of many
of Intel's x86-compatible microprocessors, both single- and multi-core. The name Pentium
was derived from the Greek pente (πέντε), meaning 'five', and the Latin ending -ium, a name
selected after courts had disallowed trademarking of number-based names like "i586" or
"80586" (model numbers cannot always be trademarked). Following Intel's previous series
of 8086, 80186, 80286, 80386, and 80486 microprocessors, Intel's fifth-generation
microarchitecture, the P5, was first released under the Pentium brand on March 22,
1993. In 1995, Intel started to employ the registered Pentium trademark also for x86
microprocessors with radically different microarchitectures (e.g., Pentium
Pro, II, III, 4, D, M, etc.). In 2006, the Pentium brand briefly disappeared from
Intel's roadmaps, only to re-emerge in 2007.




                                  Multi-core processors
       A multi-core processor is a single computing component with two or more
independent actual processors (called "cores"), which are the units that
read and execute program instructions. The data in the instruction tells
the processor what to do. The instructions are very basic things like
reading data from memory or sending data to the user display, but they
are processed so rapidly that we experience the results as the smooth
operation of a program. Manufacturers typically integrate the cores onto a single integrated
circuit die (known as a chip multiprocessor or CMP), or onto multiple dies in a single chip
package.
Processors were originally developed with only one core. A many-core processor, by
contrast, is one in which the number of cores is large enough that traditional multi-processor
techniques are no longer efficient, largely due to congestion in supplying instructions and
data to the many processors. The many-core threshold is roughly in the range of several tens
of cores; above this threshold, network-on-chip technology becomes advantageous.

       A dual-core processor has two cores (e.g. AMD Phenom II X2, Intel Core Duo),
a quad-core processor contains four cores (e.g. AMD Phenom II X4, the Intel 2010 Core line
that includes three levels of quad-core processors, see i3, i5, and i7 at Intel Core), and
a hexa-core processor contains six cores (e.g. AMD Phenom II X6, Intel Core i7 Extreme
Edition 980X). A multi-core processor implements multiprocessing in a single
physical package. Designers may couple cores in a multi-core device tightly or loosely. For
example, cores may or may not share caches, and they may implement message
passing or shared-memory inter-core communication methods. Common network
topologies to interconnect cores include bus, ring, two-dimensional mesh,
and crossbar. Homogeneous multi-core systems include only identical cores;
heterogeneous multi-core systems have cores which are not identical. Just as with
single-processor systems, cores in multi-core systems may implement architectures such
as superscalar, VLIW, vector processing, SIMD, or multithreading.

       The improvement in performance gained by the use of a multi-core processor depends
very much on the software algorithms used and their implementation. In particular, possible
gains are limited by the fraction of the software that can be parallelized to run on multiple
cores simultaneously; this effect is described by Amdahl's law. In the best case, so-
called embarrassingly parallel problems may realize speedup factors near the number of
cores, or even more if the problem is split up enough to fit within each core's cache(s),
avoiding use of much slower main system memory. Most applications, however, are not
accelerated so much unless programmers invest a prohibitive amount of effort in re-factoring
the whole problem. The parallelization of software is a significant ongoing topic of research.
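Amdahl's law described above can be put into a short formula: if a fraction P of a program can be parallelized across N cores, the best possible speedup is 1 / ((1 − P) + P/N). The sketch below uses hypothetical numbers (a 95%-parallel program) to show how quickly the benefit of extra cores saturates:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when a fraction P of the work
    parallelizes perfectly across N cores (Amdahl's law)."""
    p, n = parallel_fraction, cores
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the program parallelized, 8 cores give well
# under an 8x speedup, and adding cores saturates toward 1/(1-P) = 20.
print(round(amdahl_speedup(0.95, 8), 2))     # 5.93
print(round(amdahl_speedup(0.95, 1000), 2))  # 19.63
```

This makes the text's point concrete: the serial 5% of the program, not the core count, quickly becomes the bottleneck.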
Super Computers
        A supercomputer is a computer that is at the frontline of current processing capacity,
particularly speed of calculation.

        Supercomputers are used for highly calculation-intensive tasks such as problems
involving quantum physics, weather forecasting, climate research, molecular modeling
(computing the structures and properties of chemical compounds, biological
macromolecules, polymers, and crystals), and physical simulations (such as the simulation
of airplanes in wind tunnels, simulation of the detonation of nuclear weapons, and research
into nuclear fusion).

      Supercomputers were introduced in the 1960s and were designed primarily
by Seymour Cray at Control Data Corporation (CDC), which led
the market into the 1970s until Cray left to form his own
company, Cray Research. He then took over the supercomputer
market with his new designs, holding the top spot in
supercomputing for five years (1985–1990). In the 1980s a large
number of smaller competitors entered the market, much as had happened when the
minicomputer market was created a decade earlier, but many of these disappeared in the mid-1990s
"supercomputer market crash".

        Today, supercomputers are typically one-of-a-kind custom designs produced by
traditional companies such as Cray, IBM and Hewlett-Packard, who had purchased many of
the 1980s companies to gain their experience. Since October 2010, the Tianhe-
1A supercomputer has been the fastest in the world; it is located in China.
       The term supercomputer itself is rather fluid, and the speed of today's supercomputers
tends to become typical of tomorrow's ordinary computers. CDC's early machines were
                         simply very fast scalar processors, some ten times the speed of
                         the fastest machines offered by other companies. In the 1970s
                         most supercomputers were dedicated vector processors, and many
                         of the newer players developed their own
                           such processors at a lower price to enter the market. The early
                           and mid-1980s saw machines with a modest number of vector
processors working in parallel to become the standard. Typical numbers of processors were in
the range of four to sixteen. In the later 1980s and 1990s, attention turned from vector
processors to massive parallel processing systems with thousands of "ordinary" CPUs, some
being off the shelf units and others being custom designs. Today, parallel designs are based
on "off the shelf" server-class microprocessors, such as the PowerPC, Opteron, or Xeon, and
coprocessors like NVIDIA Tesla GPGPUs, AMD GPUs, IBM Cell, FPGAs. Most modern
supercomputers are now highly-tuned computer clusters using commodity processors
combined with custom interconnects.
Artificial Intelligence
        Artificial intelligence (AI) is the intelligence of machines and the branch of computer
science that aims to create it. AI textbooks define the field as "the study and design of
intelligent agents" where an intelligent agent is a system that perceives its environment and
takes actions that maximize its chances of success. John McCarthy, who coined the term in
1956, defines it as "the science and engineering of making intelligent machines."

        The field was founded on the claim that a central property of humans, intelligence—
the sapience of Homo sapiens—can be so precisely
described that it can be simulated by a machine.
This raises philosophical issues about the nature of
the mind and the ethics of creating artificial beings,
issues     which      have       been       addressed
by myth, fiction and philosophy since       antiquity.
Artificial intelligence has been the subject of
optimism, but has also suffered setbacks and, today, has become an essential part of the
technology industry, providing the heavy lifting for many of the most difficult problems in
computer science.

       AI research is highly technical and specialized, and deeply divided into subfields that
often fail to communicate with each other. Subfields have grown up around particular
institutions, the work of individual researchers, the solution of specific problems,
longstanding differences of opinion about how AI should be done and the application of
widely differing tools. The central problems of AI include such traits as reasoning,
knowledge, planning, learning, communication, perception and the ability to move and
manipulate objects. General intelligence (or "strong AI") is still among the field's long term
goals.




                                      Optical Computer
      The computers we use today use transistors and semiconductors to control electricity.
Computers of the future may utilize crystals and metamaterials to control light. Optical
computers make use of light particles called photons.

       NASA scientists are working to solve the need for computer speed using light.
Light travels at 186,000 miles per second. That's 982,080,000 feet
per second -- or 11,784,960,000 inches. In a billionth of a second,
one nanosecond, photons of light travel just a bit less than a foot, not
considering resistance in air or of an optical fiber strand or thin film.
Just right for doing things very quickly in microminiaturized
computer chips.
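The figures above are simple unit conversions, and they can be checked in a few lines. This quick back-of-the-envelope script uses the rounded 186,000 miles-per-second value quoted in the text, not the exact speed of light:

```python
# Verify the light-travel figures quoted above.
MILES_PER_SECOND = 186_000   # rounded speed of light used in the text
FEET_PER_MILE = 5_280

feet_per_second = MILES_PER_SECOND * FEET_PER_MILE   # 982,080,000
inches_per_second = feet_per_second * 12             # 11,784,960,000
feet_per_nanosecond = feet_per_second * 1e-9         # "just a bit less than a foot"

print(feet_per_second)                 # 982080000
print(inches_per_second)               # 11784960000
print(round(feet_per_nanosecond, 3))   # 0.982
```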
"Entirely optical computers are still some time in the future," says Dr. Frazier, "but
                        electro-optical hybrids have been possible since 1978, when it was
                        learned that photons can respond to electrons through media such as
                        lithium niobate. Newer advances have produced a variety of thin
                        films and optical fibers that make optical interconnections and
                        devices practical. We are focusing on thin films made of organic
                        molecules, which are more light sensitive than inorganics.
       Organics can perform functions such as switching, signal processing and frequency
doubling using less power than inorganics. Inorganics such as silicon used with organic
materials let us use both photons and electrons in current hybrid systems, which will
eventually lead to all-optical computer systems."
       "What we are accomplishing in the lab today will result in development of super-fast,
super-miniaturized, super-lightweight and lower cost optical computing and optical
communication devices and systems," Frazier explained.



                                     DNA Computing

        DNA computing is a form of computing which
uses DNA, biochemistry and molecular biology, instead of the traditional silicon-
based computer technologies. DNA computing, or, more generally, biomolecular computing,
is a fast developing interdisciplinary area. Research and development in this area concerns
theory, experiments and applications of DNA computing.
       This field was initially developed by Leonard Adleman of
the University of Southern California, in 1994. Adleman demonstrated
a proof-of-concept use of DNA as a form of computation which solved
the seven-point Hamiltonian path problem. Since the initial Adleman
experiments, advances have been made and various Turing
machines have been proven to be constructible.
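To see why Adleman's experiment was notable, consider what a conventional computer must do to solve the Hamiltonian path problem: try orderings of the vertices until one follows the graph's edges. The brute-force sketch below (on a small made-up directed graph, not Adleman's actual seven-vertex instance) illustrates the problem; DNA computing attacks the same search with massive molecular parallelism instead:

```python
from itertools import permutations

def hamiltonian_path_exists(n, edges, start, end):
    """Brute-force check for a directed Hamiltonian path from
    `start` to `end` visiting all n vertices exactly once."""
    edge_set = set(edges)
    for middle in permutations(set(range(n)) - {start, end}):
        path = (start,) + middle + (end,)
        # Accept the path only if every consecutive pair is an edge.
        if all((a, b) in edge_set for a, b in zip(path, path[1:])):
            return True
    return False

# A hypothetical 7-vertex instance (illustrative only):
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6),
         (0, 3), (1, 4), (2, 6)]
print(hamiltonian_path_exists(7, edges, 0, 6))  # True: 0-1-2-3-4-5-6
```

The number of orderings grows factorially with the vertex count, which is exactly why a medium that explores candidate paths in parallel is attractive.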

       In 2002, researchers from the Weizmann Institute of Science in Rehovot, Israel,
unveiled a programmable molecular computing machine composed of enzymes and DNA
molecules instead of silicon microchips. On April 28, 2004, Ehud Shapiro, Yaakov
Benenson, Binyamin Gil, Uri Ben-Dor, and Rivka Adar at the Weizmann Institute announced
in the journal Nature that they had constructed a DNA computer coupled with an input and
output module which would theoretically be capable of diagnosing cancerous activity within
a cell, and releasing an anti-cancer drug upon diagnosis.
Pen Computers




        P-ISM is a gadget package including five functions: a pen-style cellular phone with a
handwriting data input function, virtual keyboard, a very small projector, camera scanner, and
personal ID key with cashless pass function. P-ISMs are connected with one another through
short-range wireless technology. The whole set is also connected to the Internet through the
cellular phone function. This personal gadget in a minimalistic pen style enables the ultimate
ubiquitous computing.




                              Quantum Computers
        A quantum computer is a device for computation that makes direct use of
quantum mechanical phenomena, such as superposition and entanglement, to perform
operations on data. Quantum computers are different from traditional computers based
on transistors. The basic principle behind quantum computation is that quantum properties
can be used to represent data and perform operations on these data. A theoretical model is
the quantum Turing machine, also known as the universal quantum computer.

                                Although quantum computing is still in its infancy,
experiments have been carried out in which quantum computational operations were executed
on a very small number of qubits (quantum bits). Both practical and theoretical research
continues, and many national government and military funding agencies support quantum
computing research to develop quantum computers for both civilian and national security
purposes, such as cryptanalysis.

        Large-scale quantum computers would be able to solve certain problems much faster than any
classical computer using the best currently known algorithms, like integer
factorization using Shor's algorithm or the simulation of quantum many-body systems.
There exist quantum algorithms that run faster than any possible probabilistic classical
algorithm. Given enough resources, however, a classical computer can simulate an arbitrary
quantum computer. Hence, ignoring time and space constraints, a quantum computer cannot
solve any problem that a classical computer cannot also solve.
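The superposition property mentioned above can be illustrated with a toy classical simulation. A single qubit is just a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, and the Hadamard gate turns a definite state into an equal superposition. This is only a pencil-and-paper sketch of the mathematics, not a quantum computer:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a one-qubit state (a, b),
    where measurement yields 0 with probability |a|^2."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)        # the |0> basis state
plus = hadamard(zero)    # equal superposition of |0> and |1>
probs = (abs(plus[0]) ** 2, abs(plus[1]) ** 2)
print([round(p, 3) for p in probs])  # [0.5, 0.5]

# Hadamard is its own inverse: applying it twice restores |0>.
back = hadamard(plus)
print([round(x, 3) for x in back])   # [1.0, 0.0]
```

Simulating n qubits this way needs 2ⁿ amplitudes, which is precisely why classical simulation of quantum computers becomes intractable as systems grow.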




                                     References

       Wikipedia, The Free Encyclopedia.
       www.computer.org/computer/timeline/timeline.pdf
       www.intel.com
       www.futureforall.org
       www.nvidea.com
Choosing the Right CBSE School A Comprehensive Guide for Parents
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
BIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptx
BIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptxBIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptx
BIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptx
 
How to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseHow to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 Database
 
IPCRF/RPMS 2024 Classroom Observation tool is your access to the new performa...
IPCRF/RPMS 2024 Classroom Observation tool is your access to the new performa...IPCRF/RPMS 2024 Classroom Observation tool is your access to the new performa...
IPCRF/RPMS 2024 Classroom Observation tool is your access to the new performa...
 
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITWQ-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
 
Grade Three -ELLNA-REVIEWER-ENGLISH.pptx
Grade Three -ELLNA-REVIEWER-ENGLISH.pptxGrade Three -ELLNA-REVIEWER-ENGLISH.pptx
Grade Three -ELLNA-REVIEWER-ENGLISH.pptx
 
week 1 cookery 8 fourth - quarter .pptx
week 1 cookery 8  fourth  -  quarter .pptxweek 1 cookery 8  fourth  -  quarter .pptx
week 1 cookery 8 fourth - quarter .pptx
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdf
 

History of Computer

  • 1. Computers Since Its Birth Group Members: Gunasinghe U.L.D.N 100162X Sashika W.A.D 100487X Siriwardena M.P 100512X Udara Y.B.N 100544V Wijayarathna D.G.C.D 100596F
  • 2. Contents 1. What is a computer? 2. Abacus 3. Pascaline 4. Punched Card 5. Difference Engine 6. Analytical Engine 7. Ada Lovelace 8. Transistors 9. Hewlett-Packard Company 10.ENIAC 11.EDVAC 12.Alan Turing 13.UNIVAC 14.The First Transistor Computer 15.Unix 16.Floppy Disk 17.Intel 4004 18.Microsoft Windows 19.Apple Computers 20.Pentium Processors 21.Multi-Core Processors 22.Super Computers 23.Artificial Intelligence 24.Optical Computer 25.DNA Computing 26.Pen Computers 27.Quantum Computers
  • 3. What is a Computer? “A computer is an electronic device which is capable of receiving information (data) and performing a sequence of logical operations in accordance with a predetermined but variable set of procedural instructions (program) to produce a result in the form of information or signals.” This is the definition of a computer according to the Oxford English dictionary. The word computer derives from "compute", which basically means "calculate". In the early days the major use of computers was calculation, but at present this has completely changed: rather than just calculating, we use computers for other advanced applications, e.g. video editing, 3D animation, military purposes, biomedical applications, scientific research, etc. Due to the rapid change of technology, the computer has become very complex. Characteristics of a Computer: 1. A machine. 2. Can be programmed. 3. Processes data according to instructions. 4. Stores data, etc.
  • 4. Abacus (3000 B.C.) The abacus is considered the first computing device invented in the world; it was first invented by the Babylonians in 3000 B.C. About 1300 the more familiar wire-and-bead abacus replaced the Chinese calculating rods. The Lee Kai-chen abacus was developed and manufactured in Taiwan, China. Few of these remarkable instruments remain, as production only lasted a few years. On the beam of the suan pan in the lower part of the frame there is an adjustable 'place setting vernier', with a moveable indicator that runs along the bottom rail and adjustable unit-rod markers on the beam of the soroban above. Lee called his abacus "A Revolution of Chinese Calculators." It was made circa 1959, comes with a 58-page instruction manual, and measures 13 inches long by 8 inches wide. Pascaline (1642-1643) In 1642-1643 Blaise Pascal created a gear-driven adding machine called the "Pascaline", the first mechanical adding machine. Blaise Pascal, the French scientist, was one of the most reputed mathematicians and physicists of his time. He is credited with inventing an early calculator, amazingly advanced for its time. A genius from a young age, Pascal composed a treatise on the communication of sounds at the age of twelve, and at the age of sixteen a treatise on conic sections. The idea of using machines to solve mathematical problems can be traced at least as far back as the early 17th century. Mathematicians who designed and implemented calculators capable of addition, subtraction, multiplication, and division included Wilhelm Schickard, Blaise Pascal, and Gottfried Leibniz. In 1642, at the age of eighteen, Pascal invented his numerical wheel calculator, the Pascaline, to help his father, a French tax collector, count taxes. The Pascaline had eight movable dials that could add sums up to eight figures long and used base ten. When the first dial (ones column) moved ten notches, the second dial moved one notch to represent the tens column reading of 10; and when the tens dial moved ten notches, the third dial (hundreds column) moved one notch to represent one hundred, and so on.
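The dial-and-carry behaviour described above can be sketched in a few lines of modern code. This is only an illustration of the odometer-style carry, not a model of Pascal's actual gearing; the class and method names are our own.

```python
# A sketch of the Pascaline's odometer-style carry: eight base-10 dials,
# where a dial completing a full turn advances the next dial by one notch.
# Illustrative only -- not a reconstruction of Pascal's mechanism.

class Pascaline:
    def __init__(self, dials=8):
        self.dials = [0] * dials  # dials[0] is the ones column

    def add(self, value):
        """Add a non-negative integer by turning dials, propagating carries."""
        position = 0
        while value:
            self._turn(position, value % 10)
            value //= 10
            position += 1

    def _turn(self, position, notches):
        total = self.dials[position] + notches
        self.dials[position] = total % 10
        if total >= 10 and position + 1 < len(self.dials):
            self._turn(position + 1, 1)  # carry: the next dial moves one notch

    def reading(self):
        return int("".join(str(d) for d in reversed(self.dials)))

m = Pascaline()
m.add(9995)
m.add(7)
print(m.reading())  # 10002 -- the carry rippled through three dials
```

Like the historical machine, the sketch only ever adds; Pascal reportedly handled subtraction with nines' complements rather than by turning the dials backwards.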
  • 5. Punched Card (1725) Punched cards were first used around 1725 by Basile Bouchon and Jean-Baptiste Falcon as a more robust form of the perforated paper rolls then in use for controlling textile looms in France. This technique was greatly improved by Joseph Marie Jacquard in his Jacquard loom in 1801. Semen Korsakov was reputedly the first to use punched cards in informatics for information storage and search. Korsakov announced his new method and machines in September 1832, and rather than seeking patents offered the machines for public use. From the invention of computer programming languages up to the mid-1980s, many if not most computer programmers created, edited and stored their programs on punched cards. The practice was nearly universal with IBM computers in the era. A punched card is a flexible write-once medium that encodes, most commonly, 80 characters of data. Groups or "decks" of cards form programs and collections of data. Users could create cards using a desk-sized keypunch with a typewriter-like keyboard. A typing error generally necessitated re-punching an entire card. A single character typo could be corrected by duplicating the card up to the error column, typing the correct character and then duplicating the rest of the card. In some companies programmers wrote information on special forms called coding sheets, taking care to distinguish the digit zero from the letter O, the digit one from the letter I, 8's from Bs, 2's from Zs, and so on. These forms were then converted to cards by keypunch operators and, in some cases, checked by verifiers. Difference Engine (1822) Charles Babbage (1791-1871), computer pioneer, designed the first automatic computing engines. He invented computers but failed to build them. The difference engine was a special-purpose calculator designed to tabulate logarithms and trigonometric functions by evaluating finite differences to create approximating polynomials.
Construction of this machine was never completed; Babbage had conflicts with his chief engineer, Joseph Clement, and ultimately the British government withdrew its funding for the project. The first complete Babbage Engine was built in London in 2002, 153 years after it was designed. Difference Engine No. 2, built faithfully to the original drawings, consists of 8,000 parts, weighs five tons, and measures 11 feet long. It was nevertheless a very large step in computer history.
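The principle of finite differences that the engine mechanized is easy to demonstrate: after an initial set-up, every further value of a polynomial is produced by additions alone, with no multiplication. A small sketch (the function names are ours, not Babbage's terminology):

```python
# The finite-difference method the Difference Engine mechanized: once the
# initial differences of a polynomial are computed, each further tabulated
# value needs only additions -- exactly what the engine's wheels performed.

def difference_table(poly, start, degree):
    """Initial column of differences for poly at start (illustrative helper)."""
    values = [poly(start + i) for i in range(degree + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return diffs

def tabulate(poly, start, degree, count):
    """Tabulate count values of poly from start using additions only."""
    diffs = difference_table(poly, start, degree)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]  # pure addition, as on the engine
    return out

f = lambda x: 2 * x**2 + 3 * x + 1   # a degree-2 polynomial
print(tabulate(f, 0, 2, 5))          # [1, 6, 15, 28, 45]
```

After the first `degree + 1` values seed the difference column, the loop never evaluates the polynomial again; this is why a purely mechanical adding machine could tabulate logarithm and trigonometric approximations.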
  • 6. Analytical Engine (1834-1835) During the Difference Engine project Babbage realized that a much more general design, the Analytical Engine, was possible. The input (programs and data) was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic. There was to be a store (that is, a memory) capable of holding 1,000 numbers of 50 decimal digits each (ca. 20.7 kB). An arithmetical unit (the "mill") would be able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially it was conceived as a difference engine curved back upon itself, in a generally circular layout, with the long store exiting off to one side. (Later drawings depict a regularized grid layout.) Like the central processing unit (CPU) in a modern computer, the mill would rely upon its own internal procedures, to be stored in the form of pegs inserted into rotating drums called "barrels", to carry out some of the more complex instructions the user's program might specify. (See microcode for the modern equivalent.) The programming language to be employed by users was akin to modern day assembly languages. Loops and conditional branching were possible, and so the language as conceived would have been Turing-complete long before Alan Turing's concept. Three different types of punch cards were used: one for arithmetical operations, one for numerical constants, and one for load and store operations, transferring numbers from the store to the arithmetical unit or back. There were three separate readers for the three types of cards. 
Ada Lovelace (1842-1843) Ada Lovelace met and corresponded with Charles Babbage on many occasions, including socially and in relation to Babbage's Difference Engine and Analytical Engine. Babbage was impressed by Lovelace's intellect and writing skills. He called her "The Enchantress of Numbers". In 1843 he wrote of her: "Forget this world and all its troubles and if possible its multitudinous Charlatans — every thing in short but the Enchantress of Numbers." During a nine-month period in 1842–43, Lovelace translated Italian mathematician Luigi Menabrea's memoir on Babbage's newest proposed machine, the Analytical Engine.
  • 7. With the article, she appended a set of notes. The notes are longer than the memoir itself and include (Section G), in complete detail, a method for calculating a sequence of Bernoulli numbers with the Engine, which would have run correctly had the Analytical Engine been built. Based on this work, Lovelace is now widely credited with being the first computer programmer, and her method is recognized as the world's first computer program. Transistors (1934) The first patent for the field-effect transistor principle was filed in Canada by Austrian-Hungarian physicist Julius Edgar Lilienfeld on October 22, 1925, but Lilienfeld published no research articles about his devices, and they were ignored by industry. In 1934 German physicist Dr. Oskar Heil patented another field-effect transistor. There is no direct evidence that these devices were built, but later work in the 1990s showed that one of Lilienfeld's designs worked as described and gave substantial gain. Legal papers from the Bell Labs patent show that William Shockley and a co-worker at Bell Labs, Gerald Pearson, had built operational versions from Lilienfeld's patents, yet they never referenced this work in any of their later research papers or historical articles. The work emerged from their war-time efforts to produce extremely pure germanium "crystal" mixer diodes, used as the frequency mixer element in microwave radar receivers. A parallel project on germanium diodes at Purdue University succeeded in producing the good-quality germanium semiconducting crystals that were used at Bell Labs. Early tube-based technology did not switch fast enough for this role, leading the Bell team to use solid state diodes instead. With this knowledge in hand they turned to the design of a triode, but found this was not at all easy.
Bardeen eventually developed a new branch of quantum mechanics known as surface physics to account for the "odd" behavior they saw, and Bardeen and Brattain eventually succeeded in building a working device. After the war, William Shockley decided to attempt the building of a triode-like semiconductor device. He secured funding and lab space, and went to work on the problem with Brattain and John Bardeen. The key to the development of the transistor was the further understanding of the process of the electron mobility in a semiconductor. It was realized that if there was some way to control the flow of the electrons from the emitter to the collector of this newly discovered diode, one could build an amplifier. For instance, if you placed contacts on either side of a single type of crystal the current would not flow through it. However if a third contact could then "inject" electrons or holes into the material, the current would flow.
  • 8. Hewlett-Packard Company (1939) Bill Hewlett and Dave Packard graduated in electrical engineering from Stanford University in 1935. The company originated in a garage in nearby Palo Alto during a fellowship they had with a past professor, Frederick Terman at Stanford during the Great Depression. Terman was considered a mentor to them in forming Hewlett-Packard. In 1939, Packard and Hewlett established Hewlett-Packard (HP) in Packard's garage with an initial capital investment of US$538. Hewlett and Packard tossed a coin to decide whether the company they founded would be called Hewlett-Packard or Packard-Hewlett. Packard won the coin toss but named their electronics manufacturing enterprise the "Hewlett-Packard Company". HP incorporated on August 18, 1947, and went public on November 6, 1957. Of the many projects they worked on, their very first financially successful product was a precision audio oscillator, the Model HP200A. Their innovation was the use of a small incandescent light bulb (known as a "pilot light") as a temperature dependent resistor in a critical portion of the circuit, the negative feedback loop which stabilized the amplitude of the output sinusoidal waveform. This allowed them to sell the Model 200A for $54.40 when competitors were selling less stable oscillators for over $200. The Model 200 series of generators continued until at least 1972 as the 200AB, still tube-based but improved in design through the years. One of the company's earliest customers was The Walt Disney Company, which bought eight Model 200B oscillators (at $71.50 each) for use in certifying the Fantasound surround-sound systems installed in theaters for the movie Fantasia. ENIAC (Electronic Numerical Integrator And Computer, 1943) The ENIAC's design and construction was financed by the United States Army during World War II.
The construction contract was signed on June 5, 1943, and work on the computer began in secret by the University of Pennsylvania's Moore School of Electrical Engineering starting the following month under the code name "Project PX". The completed machine was announced to the public the evening of February 14, 1946 and formally dedicated the next day at the University of Pennsylvania, having cost almost $500,000 (nearly $6 million in 2010, adjusted for inflation). It was formally accepted by the U.S. Army Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946 for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland in 1947. There, on July 29, 1947, it was turned on and was in continuous operation until 11:45 p.m. on October 2, 1955.
  • 9. The ENIAC was a modular computer, composed of individual panels to perform different functions. Twenty of these modules were accumulators, which could not only add and subtract but hold a ten-digit decimal number in memory. Numbers were passed between these units across a number of general-purpose buses, or trays, as they were called. In order to achieve its high speed, the panels had to send and receive numbers, compute, save the answer, and trigger the next operation—all without any moving parts. Key to its versatility was the ability to branch; it could trigger different operations that depended on the sign of a computed result. Besides its speed, the most remarkable thing about ENIAC was its size and complexity. ENIAC contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints. It weighed more than 30 short tons (27 t), was roughly 8 by 3 by 100 feet (2.4 m × 0.9 m × 30 m), took up 1800 square feet (167 m2), and consumed 150 kW of power. Input was possible from an IBM card reader, and an IBM card punch was used for output. These cards could be used to produce printed output offline using an IBM accounting machine, such as the IBM 405. The ENIAC could be programmed to perform complex sequences of operations, which could include loops, branches, and subroutines. The task of taking a problem and mapping it onto the machine was complex, and usually took weeks. After the program was figured out on paper, the process of getting the program "into" the ENIAC by manipulating its switches and cables took additional days. This was followed by a period of verification and debugging, aided by the ability to "single step" the machine. EDVAC (Electronic Discrete Variable Automatic Computer, 1945) ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's construction in August 1944, and design work for the EDVAC commenced before the ENIAC was fully operational. 
The design would implement a number of important architectural and logical improvements conceived during the ENIAC's construction and would incorporate a high speed serial access memory. Like the ENIAC, the EDVAC was built for the U.S. Army's Ballistics Research Laboratory at the Aberdeen Proving Ground by the University of Pennsylvania's Moore School of Electrical Engineering. Eckert and Mauchly and the other ENIAC designers were joined by John von Neumann in a consulting role; von Neumann summarized and elaborated upon logical design developments in his 1945 First Draft of a Report on the EDVAC. A contract to build the new computer was signed in April 1946 with an initial budget of US$100,000. The contract named the device the Electronic Discrete Variable Automatic
  • 10. Calculator. The final cost of EDVAC, however, was similar to the ENIAC's, at just under $500,000. The EDVAC was a binary serial computer with automatic addition, subtraction, multiplication, programmed division and automatic checking, with an ultrasonic serial memory capacity of 1,000 44-bit words (later set to 1,024 words, thus giving a memory, in modern terms, of 5.5 kilobytes). Physically, the computer comprised the following components: a magnetic tape reader-recorder (Wilkes 1956:36 describes this as a wire recorder); a control unit with an oscilloscope; a dispatcher unit to receive instructions from the control and memory and direct them to other units; a computational unit to perform arithmetic operations on a pair of numbers at a time and send the result to memory after checking on a duplicate unit; a timer; a dual memory unit consisting of two sets of 64 mercury acoustic delay lines of eight words' capacity each; and three temporary tanks, each holding a single word. EDVAC's addition time was 864 microseconds and its multiplication time was 2900 microseconds (2.9 milliseconds). The computer had almost 6,000 vacuum tubes and 12,000 diodes, and consumed 56 kW of power. It covered 490 ft² (45.5 m²) of floor space and weighed 17,300 lb (7,850 kg). The full complement of operating personnel was thirty people for each eight-hour shift. Alan Turing (1912-1954) From 1945 to 1947 Turing lived in Church Street, Hampton while he worked on the design of the ACE (Automatic Computing Engine) at the National Physical Laboratory. He presented a paper on 19 February 1946, which was the first detailed design of a stored-program computer. Although ACE was a feasible design, the secrecy surrounding the wartime work at Bletchley Park led to delays in starting the project and he became disillusioned. In late 1947 he returned to Cambridge for a sabbatical year. While he was at Cambridge, the Pilot ACE was built in his absence. It executed its first program on 10 May 1950.
In 1948, he was appointed Reader in the Mathematics Department at Manchester (now part of The University of Manchester). In 1949, he became Deputy Director of the computing laboratory at the University of Manchester, and worked on software for one of the earliest stored-program computers, the Manchester Mark 1. During this time he continued to do more abstract work, and in "Computing machinery and intelligence" (Mind, October 1950), Turing addressed the problem of
  • 11. artificial intelligence, and proposed an experiment which became known as the Turing test, an attempt to define a standard for a machine to be called "intelligent". The idea was that a computer could be said to "think" if a human interrogator could not tell it apart, through conversation, from a human being. In the paper, Turing suggested that rather than building a program to simulate the adult mind, it would be better to produce a simpler one to simulate a child's mind and then to subject it to a course of education. A reversed form of the Turing test is widely used on the Internet; the CAPTCHA test is intended to determine whether the user is a human or a computer. In 1948, Turing, working with his former undergraduate colleague, D. G. Champernowne, began writing a chess program for a computer that did not yet exist. In 1952, lacking a computer powerful enough to execute the program, Turing played a game in which he simulated the computer, taking about half an hour per move. The game was recorded. The program lost to Turing's colleague Alick Glennie, although it is said that it won a game against Champernowne's wife. His Turing Test was a significant and characteristically provocative and lasting contribution to the debate regarding artificial intelligence, which continues after more than half a century. In 1948 he also introduced the LU decomposition method used today for solving systems of linear equations. Alan Turing is considered a father of computer science. UNIVAC 1101 (1950) The UNIVAC 1101, or ERA 1101, was a computer system designed by Engineering Research Associates (ERA) and built by the Remington Rand corporation in the 1950s. It was the first stored-program computer in the U.S. that was moved from its site of manufacture and successfully installed at a distant site. Remington Rand used the 1101's architecture as the basis for a series of machines into the 1960s.
This computer was 38 ft (12 m) long, 20 ft (6.1 m) wide, and used 2700 vacuum tubes for its logic circuits. Its drum memory was 8.5 in (22 cm) in diameter, rotated at 3500 rpm, had 200 read-write heads, and held 16,384 24-bit words (a memory size equivalent to 48 kB) with access time between 32 microseconds and 17 milliseconds. Instructions were 24 bits long, with 6 bits for the opcode, 4 bits for the "skip" value (telling how many memory locations to skip to get to the next instruction in program sequence), and 14 bits for the memory address. Numbers were binary with negative values in one's complement. The addition time was 96 microseconds and the multiplication time was 352 microseconds.
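The instruction layout described above (6-bit opcode, 4-bit skip value, 14-bit address, with negative numbers in one's complement) can be illustrated with a short sketch. The bit ordering and the sample word below are our assumptions for illustration, not documented 1101 encodings.

```python
# A sketch of the 1101's 24-bit instruction word as described in the text:
# a 6-bit opcode, a 4-bit "skip" value, and a 14-bit memory address.
# The field ordering (opcode in the high bits) is an assumption.

OPCODE_BITS, SKIP_BITS, ADDR_BITS = 6, 4, 14

def decode(word):
    """Split a 24-bit instruction word into (opcode, skip, address)."""
    assert 0 <= word < 1 << (OPCODE_BITS + SKIP_BITS + ADDR_BITS)
    addr = word & ((1 << ADDR_BITS) - 1)
    skip = (word >> ADDR_BITS) & ((1 << SKIP_BITS) - 1)
    opcode = word >> (ADDR_BITS + SKIP_BITS)
    return opcode, skip, addr

def ones_complement(value, bits=24):
    """Negate a value as the 1101 did: flip every bit (one's complement)."""
    return value ^ ((1 << bits) - 1)

word = (0b010101 << 18) | (0b0011 << 14) | 0x1ABC  # hypothetical instruction
print(decode(word))        # (21, 3, 6844)
print(ones_complement(5))  # 16777210, i.e. -5 in 24-bit one's complement
```

Note that in one's complement, negating twice returns the original value, and an all-ones word is "negative zero" (the pattern the 1101's subtractive accumulator, described on the next slide, was designed to avoid producing).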
  • 12. The single 48-bit accumulator was fundamentally subtractive, addition being carried out by subtracting the one's complement of the number to be added. This may appear rather strange, but the subtractive adder reduces the chance of getting negative zero in normal operations. The machine had 38 instructions. The first transistor computer (1953) The University of Manchester's experimental Transistor Computer was first operational in November 1953 and it is widely believed to be the first transistor computer to come into operation anywhere in the world. There were two versions of the Transistor Computer: the prototype, operational in 1953, and the full-size version, commissioned in April 1955. The 1953 machine had 92 point-contact transistors and 550 diodes, manufactured by STC. It had a 48-bit machine word. The 1955 machine had a total of 200 point-contact transistors and 1300 point diodes, which resulted in a power consumption of 150 watts. There were considerable reliability problems with the early batches of transistors, and the average error-free run in 1955 was only 1.5 hours. The computer also used a small number of tubes in its clock generator, so it was not the first fully transistorized machine. The design of a full-size Transistor Computer was subsequently adopted by the Manchester firm of Metropolitan-Vickers, who changed all the circuits to more reliable types of junction transistors. The production version was known as the Metrovick 950; six or seven machines were built from 1956, which were "used commercially within the company" or "mainly for internal use". Unix (1969) Unix (officially trademarked as UNIX) is a multitasking, multi-user computer operating system originally developed in 1969 by a group of AT&T employees at Bell Labs, including Ken Thompson, Dennis Ritchie, Brian Kernighan, Douglas McIlroy, and Joe Ossanna.
The Unix operating system was first developed in assembly language, but by 1973 had been almost entirely recoded in C, greatly facilitating its further development and porting to other hardware. Today's Unix systems are split into various branches, developed over time by AT&T as well as various commercial vendors and non-profit organizations. The second edition of Unix was released on December 6th, 1972.
  • 13. Floppy Disk (1970) The earliest floppy disks, invented at IBM, were 8 inches in diameter. They became commercially available in 1971. Disks in this form factor were produced and improved upon by IBM and other companies such as Memorex, Shugart Associates, and Burroughs Corporation. In 1976 Shugart Associates introduced the first 5¼-inch FDD and associated media. By 1978 there were more than 10 manufacturers producing 5¼-inch FDDs, in competing disk formats: hard or soft sectored with various encoding schemes such as FM, MFM and GCR. The 5¼-inch formats quickly displaced the 8-inch for most applications, and the 5¼-inch hard-sectored disk format eventually disappeared. In 1984, IBM introduced the 1.2 megabyte dual-sided floppy disk along with its AT model. Although often used as backup storage, the high-density floppy was not often used by software manufacturers for interchangeability. In 1986, IBM began to use the 720 kB double-density 3.5" microfloppy disk on its Convertible laptop computer. It introduced the so-called "1.44 MB" high-density version with the PS/2 line. These disk drives could be added to existing older model PCs. In 1988 IBM introduced a drive for 2.88 MB "DSED" diskettes in its top-of-the-line PS/2 models; it was a commercial failure. In 1991, Insite Peripherals introduced the "Floptical", which used an infra-red LED to position the heads over marks in the disk surface. The original drive stored 21 MB, while also reading and writing standard DD and HD floppies. In order to improve data transfer speeds and make the high-capacity drive usefully quick as well, the drives were attached to the system using a SCSI connector instead of the normal floppy controller. This made them appear to the operating system as a hard drive instead of a floppy, meaning that most PCs were unable to boot from them. This again adversely affected pickup rates.
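A side note on the quotation marks around "1.44 MB": using the commonly cited nominal geometry of the 3.5-inch high-density diskette (80 tracks × 2 sides × 18 sectors × 512 bytes per sector, figures not given in the text above), the capacity works out to a mixed-unit megabyte:

```python
# The "1.44 MB" label mixes decimal and binary prefixes. Geometry figures
# below are the commonly cited nominal values, assumed for illustration.

tracks, sides, sectors, sector_bytes = 80, 2, 18, 512
total = tracks * sides * sectors * sector_bytes
print(total)                   # 1474560 bytes
print(total / 1024)            # 1440.0 KiB
print(total / (1000 * 1024))   # 1.44 -- a "MB" of 1000 x 1024 bytes
```

So the familiar figure is neither 1.44 million bytes nor 1.44 MiB, which is why it is usually quoted in scare quotes.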
  • 14. Intel 4004 (1971) The Intel 4004 was a 4-bit central processing unit (CPU) released by Intel Corporation in 1971. It was the first complete CPU on one chip, and also the first commercially available microprocessor. Such a feat of integration was made possible by the use of then-new silicon-gate technology allowing a higher number of transistors and a faster speed than was possible before. The 4004 employed a 10-μm silicon-gate enhancement load pMOS technology and could execute approximately 92,000 instructions per second (that is, a single instruction cycle was 10.8 microseconds). The chief designers of the chip were Federico Faggin and Ted Hoff of Intel, and Masatoshi Shima of Busicom (later of ZiLOG, founded by Faggin). Faggin, the sole chip designer among the engineers on the MCS-4 project, was the only one with experience in MOS random logic and circuit design. He also had the crucial knowledge of the new silicon-gate process technology with self-aligned gates, which he had created at Fairchild in 1968. At Fairchild in 1968, Faggin also designed and manufactured the world's first commercial IC using SGT, the Fairchild 3708. As soon as he joined the Intel MOS Department he created a new random design methodology based on silicon gate, and contributed many technology and circuit design inventions that enabled a single-chip microprocessor to become a reality for the first time. His methodology set the design style for all the early Intel microprocessors and later for Zilog's Z80. He also led the MCS-4 project and was responsible for its successful outcome (1970–1971). Ted Hoff, head of the Application Research Department, contributed only the architectural proposal for Busicom, working with Stanley Mazor in 1969; then he moved on to other projects.
When asked where he got the ideas for the architecture of the first microprocessor, Hoff related that Plessey, "a British tractor company," had donated a minicomputer to Stanford, and he had "played with it some" while he was there. Shima designed the Busicom calculator firmware and assisted Faggin during the first six months of the implementation. The manager of Intel's MOS Design Department was Leslie L. Vadász. At the time of the MCS-4 development, Vadász's attention was completely focused on the mainstream business of semiconductor memories, and he left the leadership and management of the MCS-4 project to Faggin. A popular myth has it that Pioneer 10, the first spacecraft to leave the solar system, used an Intel 4004 microprocessor. According to Dr. Larry Lasher of Ames Research Center, the Pioneer team did evaluate the 4004 but decided it was too new at the time to include in any of the Pioneer projects. The myth was repeated by Federico Faggin himself in a lecture for the Computer History Museum in 2006.
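The 4004's performance figures above fit together arithmetically: the chip's documented maximum clock of 740 kHz, with 8 clock periods per instruction cycle, yields the 10.8-microsecond cycle and the "approximately 92,000 instructions per second" quoted earlier. A quick sanity check:

```python
# Deriving the 4004's quoted timing from its clock specification:
# 740 kHz clock, 8 clock periods per instruction cycle.
clock_hz = 740_000
clocks_per_instruction = 8

cycle_s = clocks_per_instruction / clock_hz
print(cycle_s * 1e6)        # ~10.8 microseconds per instruction cycle
print(round(1 / cycle_s))   # 92,500, i.e. "approximately 92,000" per second
```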
  • 15. Microsoft Windows (1975) Paul Allen and Bill Gates, childhood friends with a passion for computer programming, were seeking to build a successful business from their shared skills. The January 1975 issue of Popular Electronics featured Micro Instrumentation and Telemetry Systems' (MITS) Altair 8800 microcomputer. Allen noticed that they could program a BASIC interpreter for the device; after a call from Gates claiming to have a working interpreter, MITS requested a demonstration. Since they didn't actually have one, Allen worked on a simulator for the Altair while Gates developed the interpreter. Although they developed the interpreter on a simulator and not the actual device, it worked flawlessly when they demonstrated it to MITS in Albuquerque, New Mexico in March 1975; MITS agreed to distribute it, marketing it as Altair BASIC. They officially established Microsoft on April 4, 1975, with Gates as CEO. Allen came up with the original name of "Micro-Soft," as recounted in a 1995 Fortune magazine article. In August 1977 the company formed an agreement with ASCII Magazine in Japan, resulting in its first international office, "ASCII Microsoft". The company moved to a new home in Bellevue, Washington in January 1979. Microsoft entered the OS business in 1980 with its own version of Unix, called Xenix. However, it was DOS (Disk Operating System) that solidified the company's dominance. After negotiations with Digital Research failed, IBM awarded a contract to Microsoft in November 1980 to provide a version of the CP/M OS, which was set to be used in the upcoming IBM Personal Computer (IBM PC). For this deal, Microsoft purchased a CP/M clone called 86-DOS from Seattle Computer Products, branding it as MS-DOS, which IBM rebranded as PC-DOS. Following the release of the IBM PC in August 1981, Microsoft retained ownership of MS-DOS.
Since IBM copyrighted the IBM PC BIOS, other companies had to reverse engineer it for non-IBM hardware to run as IBM PC compatibles, but no such restriction applied to the operating systems. Due to various factors, such as MS-DOS's available software selection, Microsoft eventually became the leading PC OS vendor. The company expanded into new markets with the release of the Microsoft Mouse in 1983, as well as a publishing division named Microsoft Press. Paul Allen resigned from Microsoft in February 1983 after developing Hodgkin's disease.
  • 16. Apple Computers (1976) On April Fools' Day, 1976, Steve Wozniak and Steve Jobs released the Apple I computer and started Apple Computers. The Apple I was the first computer built on a single circuit board. The first home computer with a GUI, or graphical user interface, was the Apple Lisa. The very first graphical user interface was developed by the Xerox Corporation at their Palo Alto Research Center (PARC) in the 1970s. Steve Jobs visited PARC in 1979 (after buying Xerox stock) and was impressed and influenced by the Xerox Alto, the first computer ever with a graphical user interface. Jobs designed the new Apple Lisa based on the technology he saw at Xerox. Pentium Processors Pentium is a registered trademark that is included in the brand names of many of Intel's x86-compatible microprocessors, both single- and multi-core. The name Pentium was derived from the Greek pente (πέντε), meaning 'five', and the Latin ending -ium, a name selected after courts had disallowed trademarking of number-based names like "i586" or "80586" (model numbers cannot always be trademarked). Following Intel's previous series of 8086, 80186, 80286, 80386, and 80486 microprocessors, Intel's fifth-generation microarchitecture, the P5, was first released under the Pentium brand on March 22, 1993. In 1995, Intel started to employ the registered Pentium trademark also for x86 microprocessors with radically different microarchitectures (e.g., Pentium Pro, II, III, 4, D, M, etc.). In 2006, the Pentium brand briefly disappeared from Intel's roadmaps, only to re-emerge in 2007. Multi-core Processors A multi-core processor is a single computing component with two or more independent actual processors (called "cores"), which are the units that read and execute program instructions. The data in the instruction tells the processor what to do.
The instructions are very basic things like reading data from memory or sending data to the user display, but they are processed so rapidly that we experience the results as the smooth operation of a program. Manufacturers typically integrate the cores onto a single integrated circuit die (known as a chip multiprocessor or CMP), or onto multiple dies in a single chip package.
  • 17. Processors were originally developed with only one core. A many-core processor is one in which the number of cores is large enough that traditional multi-processor techniques are no longer efficient, largely due to issues with congestion in supplying instructions and data to the many processors. The many-core threshold is roughly in the range of several tens of cores; above this threshold, network-on-chip technology is advantageous. A dual-core processor has two cores (e.g. AMD Phenom II X2, Intel Core Duo), a quad-core processor contains four cores (e.g. AMD Phenom II X4, the Intel 2010 core line that includes three levels of quad-core processors; see i3, i5, and i7 at Intel Core), and a hexa-core processor contains six cores (e.g. AMD Phenom II X6, Intel Core i7 Extreme Edition 980X). A multi-core processor implements multiprocessing in a single physical package. Designers may couple cores in a multi-core device tightly or loosely. For example, cores may or may not share caches, and they may implement message-passing or shared-memory inter-core communication methods. Common network topologies to interconnect cores include bus, ring, two-dimensional mesh, and crossbar. Homogeneous multi-core systems include only identical cores; heterogeneous multi-core systems have cores that are not identical. Just as with single-processor systems, cores in multi-core systems may implement architectures such as superscalar, VLIW, vector processing, SIMD, or multithreading. The improvement in performance gained by the use of a multi-core processor depends very much on the software algorithms used and their implementation. In particular, possible gains are limited by the fraction of the software that can be parallelized to run on multiple cores simultaneously; this effect is described by Amdahl's law.
In the best case, so-called embarrassingly parallel problems may realize speedup factors near the number of cores, or even more if the problem is split up enough to fit within each core's cache(s), avoiding use of the much slower main system memory. Most applications, however, are not accelerated as much unless programmers invest a prohibitive amount of effort in refactoring the whole problem. The parallelization of software is a significant ongoing topic of research.
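Amdahl's law, mentioned above, can be stated in a few lines: if a fraction p of a program's work can be parallelized and the rest stays serial, the speedup on n cores is 1 / ((1 - p) + p/n). A minimal sketch:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: speedup on n cores when a fraction p of the
    work is parallelizable and the remaining (1 - p) stays serial."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, 16 cores give well under 16x:
print(amdahl_speedup(0.95, 16))       # ~9.14x
# and no core count can beat the serial-fraction cap of 1/(1 - p) = 20x:
print(amdahl_speedup(0.95, 10**6))    # approaches 20x
```

This is why the text stresses that gains depend on the software: the serial fraction, not the core count, dominates once n is large.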
  • 18. Super Computers A supercomputer is a computer at the frontline of current processing capacity, particularly speed of calculation. Supercomputers are used for highly calculation-intensive tasks such as problems involving quantum physics, weather forecasting, climate research, molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), and physical simulations (such as simulating airplanes in wind tunnels, simulating the detonation of nuclear weapons, and research into nuclear fusion). Supercomputers were introduced in the 1960s and were designed primarily by Seymour Cray at Control Data Corporation (CDC), which led the market into the 1970s until Cray left to form his own company, Cray Research. He then took over the supercomputer market with his new designs, holding the top spot in supercomputing for five years (1985–1990). In the 1980s a large number of smaller competitors entered the market, paralleling the creation of the minicomputer market a decade earlier, but many of these disappeared in the mid-1990s "supercomputer market crash". Today, supercomputers are typically one-of-a-kind custom designs produced by traditional companies such as Cray, IBM and Hewlett-Packard, who purchased many of the 1980s companies to gain their experience. Since October 2010, the Tianhe-1A supercomputer, located in China, has been the fastest in the world. The term supercomputer itself is rather fluid, and the speed of today's supercomputers tends to become typical of tomorrow's ordinary computers. CDC's early machines were simply very fast scalar processors, some ten times the speed of the fastest machines offered by other companies. In the 1970s most supercomputers were dedicated to running a vector processor, and many of the newer players developed their own such processors at a lower price to enter the market.
The early and mid-1980s saw machines with a modest number of vector processors working in parallel become the standard. Typical numbers of processors were in the range of four to sixteen. In the later 1980s and 1990s, attention turned from vector processors to massive parallel processing systems with thousands of "ordinary" CPUs, some being off-the-shelf units and others being custom designs. Today, parallel designs are based on "off-the-shelf" server-class microprocessors, such as the PowerPC, Opteron, or Xeon, and coprocessors like NVIDIA Tesla GPGPUs, AMD GPUs, IBM Cell, and FPGAs. Most modern supercomputers are now highly tuned computer clusters using commodity processors combined with custom interconnects.
  • 19. Artificial Intelligence Artificial intelligence (AI) is the intelligence of machines and the branch of computer science that aims to create it. AI textbooks define the field as "the study and design of intelligent agents", where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success. John McCarthy, who coined the term in 1956, defines it as "the science and engineering of making intelligent machines." The field was founded on the claim that a central property of humans, intelligence (the sapience of Homo sapiens), can be so precisely described that it can be simulated by a machine. This raises philosophical issues about the nature of the mind and the ethics of creating artificial beings, issues which have been addressed by myth, fiction and philosophy since antiquity. Artificial intelligence has been the subject of optimism, but has also suffered setbacks and, today, has become an essential part of the technology industry, providing the heavy lifting for many of the most difficult problems in computer science. AI research is highly technical and specialized, and deeply divided into subfields that often fail to communicate with each other. Subfields have grown up around particular institutions, the work of individual researchers, the solution of specific problems, long-standing differences of opinion about how AI should be done, and the application of widely differing tools. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception, and the ability to move and manipulate objects. General intelligence (or "strong AI") is still among the field's long-term goals. Optical Computer The computers we use today use transistors and semiconductors to control electricity. Computers of the future may instead use crystals and metamaterials to control light. Optical computers make use of light particles called photons.
NASA scientists are working to solve the need for computer speed using light. Light travels at 186,000 miles per second. That's 982,080,000 feet per second, or 11,784,960,000 inches. In a billionth of a second, one nanosecond, photons of light travel just a bit less than a foot, not considering resistance in air or in an optical fiber strand or thin film. Just right for doing things very quickly in microminiaturized computer chips.
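The unit conversions above check out. Reproducing the arithmetic:

```python
# Light-speed arithmetic: from miles per second to inches per nanosecond.
miles_per_s = 186_000
feet_per_s = miles_per_s * 5280        # 982,080,000 ft/s
inches_per_s = feet_per_s * 12         # 11,784,960,000 in/s
inches_per_ns = inches_per_s * 1e-9    # distance covered in one nanosecond
print(inches_per_ns)                   # ~11.78 inches: just under a foot
```

So in one nanosecond a photon covers about 11.8 inches, which is the "just a bit less than a foot" quoted in the text.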
  • 20. "Entirely optical computers are still some time in the future," says Dr. Frazier, "but electro-optical hybrids have been possible since 1978, when it was learned that photons can respond to electrons through media such as lithium niobate. Newer advances have produced a variety of thin films and optical fibers that make optical interconnections and devices practical. We are focusing on thin films made of organic molecules, which are more light sensitive than inorganics. Organics can perform functions such as switching, signal processing and frequency doubling using less power than inorganics. Inorganics such as silicon used with organic materials let us use both photons and electrons in current hybrid systems, which will eventually lead to all-optical computer systems." "What we are accomplishing in the lab today will result in development of super-fast, super-miniaturized, super-lightweight and lower cost optical computing and optical communication devices and systems," Frazier explained. DNA Computing DNA computing is a form of computing which uses DNA, biochemistry and molecular biology, instead of the traditional silicon- based computer technologies. DNA computing, or, more generally, biomolecular computing, is a fast developing interdisciplinary area. Research and development in this area concerns theory, experiments and applications of DNA computing. This field was initially developed by Leonard Adleman of the University of Southern California, in 1994. Adleman demonstrated a proof-of-concept use of DNA as a form of computation which solved the seven-point Hamiltonian path problem. Since the initial Adleman experiments, advances have been made and various Turing machines have been proven to be constructible. In 2002, researchers from the Weizmann Institute of Science in Rehovot, Israel, unveiled a programmable molecular computing machine composed of enzymes and DNA molecules instead of silicon microchips. 
On April 28, 2004, Ehud Shapiro, Yaakov Benenson, Binyamin Gil, Uri Ben-Dor, and Rivka Adar at the Weizmann Institute announced in the journal Nature that they had constructed a DNA computer coupled with an input and output module which would theoretically be capable of diagnosing cancerous activity within a cell, and releasing an anti-cancer drug upon diagnosis.
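For context on the problem Adleman attacked, the Hamiltonian path problem asks for a route through a directed graph that visits every vertex exactly once. A conventional brute-force sketch (the example graph below is made up for illustration) shows the exponential search that DNA computing tackles with massive molecular parallelism:

```python
from itertools import permutations

def hamiltonian_path(vertices, edges):
    """Return an ordering of vertices visiting each exactly once along
    edges of the graph, or None. Brute force: try every ordering -- the
    exponential blow-up Adleman's DNA experiment exploited parallelism on."""
    edge_set = set(edges)
    for order in permutations(vertices):
        if all((a, b) in edge_set for a, b in zip(order, order[1:])):
            return order
    return None

# A small directed example graph (hypothetical, for illustration only):
v = [0, 1, 2, 3]
e = [(0, 1), (1, 2), (2, 3), (0, 2)]
print(hamiltonian_path(v, e))   # (0, 1, 2, 3)
```

With n vertices there are n! orderings to test, which is why even Adleman's seven-vertex instance was a meaningful demonstration of an alternative computing substrate.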
  • 21. Pen Computers P-ISM is a gadget package including five functions: a pen-style cellular phone with a handwriting data input function, a virtual keyboard, a very small projector, a camera scanner, and a personal ID key with cashless-pass function. P-ISMs are connected with one another through short-range wireless technology, and the whole set is connected to the Internet through the cellular phone function. This personal gadget in a minimalist pen style enables the ultimate in ubiquitous computing. Quantum Computers A quantum computer is a device for computation that makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from traditional computers based on transistors. The basic principle behind quantum computation is that quantum properties can be used to represent data and perform operations on it. A theoretical model is the quantum Turing machine, also known as the universal quantum computer. Although quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits (quantum bits). Both practical and theoretical research continues, and many national governments and military funding agencies support quantum computing research to develop quantum computers for civilian and national security purposes, such as cryptanalysis.
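The superposition mentioned above can be illustrated with a tiny state-vector calculation on a classical machine (an illustration only, not a real quantum device): a qubit is a pair of amplitudes (alpha, beta) with |alpha|² + |beta|² = 1, and measurement yields 0 or 1 with those probabilities.

```python
import math

zero = (1.0, 0.0)   # the basis state |0>

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into an
    equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)                      # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = [abs(amp) ** 2 for amp in plus]
print(probs)                               # ~[0.5, 0.5]: either outcome equally likely
```

Simulating n qubits this way needs 2^n amplitudes, which is precisely why classical simulation of quantum systems becomes intractable and why quantum hardware is of interest.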
  • 22. Large-scale quantum computers could solve certain problems much faster than any classical computer using the best currently known algorithms, such as integer factorization using Shor's algorithm or the simulation of quantum many-body systems. There exist quantum algorithms that run faster than any possible probabilistic classical algorithm. Given enough resources, however, a classical computer can simulate an arbitrary quantum computer; hence, ignoring time and space constraints, a quantum computer cannot solve any problem that a classical computer cannot. References Wikipedia, The Free Encyclopedia www.computer.org/computer/timeline/timeline.pdf www.intel.com www.futureforall.org www.nvidia.com