IS 139 Lecture 1 - UDSM - 2014

  1. Overview - 1
     • How does a computer work?
     • How components fit together
     • Control signals, signaling methods, memory types
     • How do I design a computer?
     • Structure and behavior of computer systems
     • ISA – instruction sets and formats, op codes, data types, number and type of registers, addressing modes, memory access methods, I/O mechanisms
  2. Overview - 2
     • It’s all about function & structure
     • Structure => the way in which components are interrelated
     • Function => the operation of each individual component
  3. Overview - 3
     • Architecture vs organization
     • Architecture = attributes visible to programmers, e.g. instruction set, I/O mechanisms, addressing modes
     • Organization = the units & their interconnections that realize the architectural specs, e.g. control signals, I/O interfaces, memory technology used
     • Manufacturers offer a family of models => same architecture but different organizations
  4. Why study computer architecture?
     • Program optimization – understanding why a program/algorithm runs slowly
     • Understanding floating-point arithmetic is crucial for large, real-world systems
     • Design of peripheral systems, e.g. device drivers
     • Design tradeoffs of embedded systems
  5. Why study computer architecture?
     • Benchmarking
     • Building better compilers, OS’s
     • Writing software that takes advantage of hardware features - parallelism
  6. You should be able to
     • Understand how programs written in high-level languages get translated & executed by the H/W
     • Determine the performance of programs and what affects them
     • Understand techniques used by hardware designers to improve performance
     • Evaluate and compare the performance of different computing systems
  7. General purpose computers
     • Software and hardware are interrelated
     • “Anything that can be done with software can also be done with hardware, and anything that can be done with hardware can be done with software” – the Principle of Equivalence of Hardware and Software
     • This observation allows us to construct general purpose computing systems with simple instructions
  8. Functions of a general purpose computer
     • Data processing
       • Data can take various forms
     • Data storage
       • Temporarily or long term
     • Data movement
       • Between itself & the outside world
     • Control
       • Orchestrates the different parts
  9. [Figure: Computer Organization & Architecture - Stallings]
  10. What makes a computer?
     • Processor – data path & control
     • Memory
     • Mechanism to communicate with the outside world
     • Input
     • Output
  11. Organization of a computer [Figure: Computer Organization & Design - Patterson]
  12. Classes of computers
     • Desktop computers
       • Familiar to most people
       • Features: good performance, single user, execution of third-party software
     • Servers
       • Hidden from most users – accessible via a network – cloud computing – in data centers
       • Features: handling large workloads, dependability, expandability
       • From cheap low-end machines to extreme supercomputers with thousands of processors, used for forecasting, oil exploration
     • Embedded computers
       • The largest class – wide range of applications & performance
       • In cars, cell phones, video game consoles, planes
       • Focus on limiting power and cost
  13. Growth of embedded computers in cell phones [Figure]
  14. Tons of features [Figure: The Essentials of Computer Organization & Architecture - Null]
  15. A look inside [Figure: The Essentials of Computer Organization & Architecture - Null]
  16. How did we get here?
     • A lot has happened in the 60+ year life span
       • E.g. if the transportation industry had developed at the same pace – here to London in 1 sec for a few cents
     • Different generations in the evolution of computers
       • Each generation defined by a distinct technology used to build computers at that time
     • Why study the history? – it gives perspective & context into design decisions and helps us understand why things are as they are
  17. Generation 0: Mechanical Calculating Machines - 1
     • Defining characteristic - mechanical
     • 1500s
       • There was a need to make decimal calculations faster
     • Mechanical calculator (Pascaline) – Blaise Pascal
       • No memory, not programmable
       • Used well into the 20th century
     • Difference Engine – Charles Babbage, “Father of Computing”
       • Used the method of differences to evaluate polynomial functions (see the sketch below)
       • Was still a calculator
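The Difference Engine exploited the fact that the d-th finite differences of a degree-d polynomial are constant, so after a few exact values are set up, every further table entry needs only additions. A minimal Python sketch of that idea follows; the polynomial p(x) = 2x² + 3x + 5, the step size and the helper names are illustrative assumptions, not details from the lecture.

```python
# Method of finite differences, as used by the Difference Engine:
# seed the difference columns from a few exact values, then produce the
# rest of the table using additions only (no multiplication needed).

def poly(x):
    # Illustrative polynomial; any degree-2 polynomial works the same way.
    return 2 * x ** 2 + 3 * x + 5

def tabulate_by_differences(start, step, count, degree=2):
    # Seed: the first degree+1 exact values give the initial difference columns.
    values = [poly(start + i * step) for i in range(degree + 1)]
    cols = []                      # cols[k] holds the current k-th forward difference
    row = values[:]
    for _ in range(degree + 1):
        cols.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # Generate the table: each new entry costs only `degree` additions.
    table = []
    for _ in range(count):
        table.append(cols[0])
        for k in range(degree):
            cols[k] += cols[k + 1]
    return table

print(tabulate_by_differences(start=0, step=1, count=6))
# [5, 10, 19, 32, 49, 70] – matches direct evaluation of p(0)..p(5)
```

The engine itself worked in decimal with mechanical wheels, but the arithmetic pattern is the same: nothing beyond repeated addition with carries.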
  18. Generation 0: Mechanical Calculating Machines - 2
     • Analytical Engine – an improvement over the Difference Engine – a significant development
       • More versatile – capable of performing any math operation
       • Similar to modern computers – mill (processor), store (memory) & input/output devices
       • Conditional branch operation – the next instruction depends on the previous result
     • Ada, Countess of Lovelace suggested a plan for how the machine should calculate numbers – the first programmer
     • Used punch cards for input & programming – this method survived for a long time
  19. Generation 0
     • Drawbacks
       • Slow – limited by the inertia of moving parts (gears & pulleys)
       • Cumbersome, unreliable & expensive
  20. 1st Generation: Vacuum Tube Computers (1945 – 1953)
     • Defining characteristic: use of vacuum tubes as switching technology
     • Previous generations were mechanical, not electrical
     • Konrad Zuse – in the 1930s added electrical technology & other improvements to Babbage’s design
       • Z1 used electromechanical relays – was programmable, had memory, an arithmetic unit and a control unit
       • Used discarded film for input
  21. 1st Generation
     • John Atanasoff, John Mauchly, J. Presper Eckert – credited with the invention of digital computers; however, many others contributed
     • Their work resulted in ENIAC (Electronic Numerical Integrator and Computer) in 1946 – the first all-electronic, general purpose digital computer
     • Was intended to facilitate weather prediction but ended up being financed by the US Army for ballistic trajectory calculations
  22. 2nd Generation: Transistorized Computers (1954 – 1965)
     • Defining characteristic: use of the transistor as switching technology
     • Vacuum tube technology was not very dependable – the tubes tended to burn out
     • In 1948 at Bell Laboratories, John Bardeen, Walter Brattain & William Shockley invented the TRANSISTOR
     • It was a revolution – transistors are smaller, more reliable and consume less power
     • Circuitry became smaller & more reliable
     • Emergence of companies such as IBM, DEC & Unisys
  23. 3rd Generation: Integrated Circuit Computers (1965 – 1980)
     • Defining characteristic: integration of dozens of transistors on a single silicon/germanium piece – the “microchip” or “chip”
     • Jack Kilby started with germanium; Robert Noyce eventually used silicon
       • Led to the silicon chip => “Silicon Valley”
     • Allowed dozens of transistors to exist on a chip smaller than a single discrete transistor
     • Effect: computers became faster, smaller & cheaper
     • E.g. IBM System/360, DEC’s PDP-8 and PDP-11
  24. 4th Generation: VLSI (1980 - )
     • Defining characteristic: integration of very large numbers of transistors on a single chip
       • 3rd generation chips held only dozens of transistors
       • The number increased as manufacturing techniques improved: SSI (<100) => MSI (<1,000) => LSI (<10,000) => VLSI (>10,000)
     • Led to the development of the first microprocessor (the Intel 4004) in 1971
     • Effect: allowed computers to be cheaper, smaller & faster – led to the development of microprocessors
     • Computers for consumers: Altair 8800 => Apple I & II => IBM’s PC
  25. 5th Generation?
     • Quantum computing
     • Artificial Intelligence
     • Massively parallel machines
     • Non-von Neumann architectures
  26. Common themes in the evolution of computers
     • The same underlying fundamental concepts
     • Obsession with
       • Increase in performance
       • Decrease in size
       • Decrease in cost
  27. Moore’s Law
     • How small can we make transistors? How densely can we pack them?
     • In 1965, Intel founder Gordon Moore stated “the density of transistors in an integrated circuit will double every year” – Moore’s Law
     • In practice it turned out to be roughly every 18 months (see the rough calculation below)
     • Has held up for almost 40 years
     • However it cannot hold forever – physical & financial limits
     • Rock’s Law: “The cost of capital equipment to build semiconductors will double every four years”
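To get a feel for how quickly the claimed doubling compounds, here is a rough back-of-the-envelope sketch in Python. The Intel 4004’s count of roughly 2,300 transistors is the only historical figure used; the 40-year horizon and the candidate doubling periods are illustrative assumptions.

```python
# Compounding implied by "transistor count doubles every N months",
# starting from the Intel 4004's roughly 2,300 transistors (1971).

def projected_transistors(start_count, years, doubling_months):
    doublings = years * 12 / doubling_months
    return start_count * 2 ** doublings

for months in (12, 18, 24):
    count = projected_transistors(2_300, years=40, doubling_months=months)
    print(f"doubling every {months} months -> about {count:,.0f} transistors")

# The projection is hugely sensitive to the assumed period, which is one
# reason different statements of Moore's Law (12, 18 or 24 months) lead to
# such different numbers.
```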
  28. Moore’s Law [Figure]
  29. Moore’s Law
     • Implications
       • Cost of a chip remained almost unchanged
       • Cost of components decreased
       • Higher packing density, shorter electrical paths
       • Higher speeds
       • Smaller size
       • Increased flexibility
       • Reduced power & cooling requirements
       • Fewer interconnections
       • Increased reliability
  30. Uniprocessor to Multiprocessor
     • Because of physical limits – power limits
     • Clock rates cannot increase forever
       • Power = capacitive load × V² × frequency (see the sketch below)
     • Multiple processors per chip – “cores”
     • Programmers have to take advantage of multiple processors
       • Parallelism – similar to instruction-level parallelism
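The dynamic power relation above is why raising clock rates stopped being the easy path to performance. A small Python sketch of the trade-off; the capacitance, voltage and frequency values are made-up illustrative numbers, not figures from the lecture.

```python
# Dynamic power: P = C * V^2 * f  (capacitive load, supply voltage, clock rate).
# All numbers below are invented purely to show the scaling behaviour.

def dynamic_power(capacitive_load, voltage, frequency):
    return capacitive_load * voltage ** 2 * frequency

base = dynamic_power(1e-9, 1.2, 2e9)            # one core, 2 GHz at 1.2 V

# Doubling the frequency doubles power, and historically also required a
# higher supply voltage, which enters as the square:
one_fast_core = dynamic_power(1e-9, 1.4, 4e9)

# Two cores at the original frequency and voltage give roughly twice the
# throughput for roughly twice the base power, with no voltage penalty:
two_cores = 2 * base

print(f"1 core,  2 GHz @ 1.2 V: {base:.2f} W")
print(f"1 core,  4 GHz @ 1.4 V: {one_fast_core:.2f} W")
print(f"2 cores, 2 GHz @ 1.2 V: {two_cores:.2f} W")
```

This is the pressure behind “multiple processors per chip”: extra performance has to come from parallel cores, which in turn pushes the burden of exploiting parallelism onto programmers, as the last bullet notes.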