IS 139 Lecture 1 - 2015
2. Overview - 1
How does a computer work?
How components fit together
Control signals, signaling methods, memory types
How do I design a computer?
Structure and behavior of computer systems
ISA – instruction sets and formats, op codes, data
types, no. and type of registers, addressing modes,
memory access methods, I/O mechanisms
3. Overview - 2
It’s all about function & structure
Structure => the way in which components are
interrelated
Function => the operation of each individual
component
4. What is Computer Architecture
“Computer architecture is the science and art of
selecting and interconnecting hardware components
to create computers that meet functional,
performance and cost goals.” – WWW Computer
Architecture Page
5. What is Architecture
The role of a building architect:
Materials: Steel, Concrete, Brick, Wood, Glass
Goals: Function, Cost, Safety, Ease of Construction,
Energy Efficiency, Fast Build Time, Aesthetics
Buildings: Houses, Offices, Apartments, Stadiums,
Museums
6. What is Computer Architecture
The role of a computer architect:
Technology: Logic Gates, SRAM, DRAM, Circuit
Techniques, Packaging, Magnetic Storage, Flash
Memory
Goals: Function, Performance, Reliability,
Cost/Manufacturability, Energy Efficiency, Time to
Market
Computers: Desktops, Servers, Mobile Phones,
Supercomputers, Game Consoles, Embedded
7. Architecture vs
Organization
Architecture = attributes visible to programmers e.g.
instruction set, I/O mechanisms, addressing modes
Organization = units & their interconnections that
realize architectural specs e.g. control signals, I/O
interfaces, memory technology used
Manufacturers offer a family of models => same
architecture but different organizations
8. Why study computer
architecture?
Program optimization – understanding why a
program/algorithm runs slow
for (int i = 0; i < n; i++) {
for (int j = 0; j < n; j++) {
/* operate on A[i][j] ... */
}
}
for (int j = 0; j < n; j++) {
for (int i = 0; i < n; i++) {
/* operate on A[i][j] ... */
}
}
9. Why study computer
architecture?
Understanding Floating Point arithmetic crucial for
large, real world systems
JavaScript: 0.1 + 0.2 === 0.3 (true or false)?
1000 + 0.5 + 0.5 => (1000 + 0.5) + 0.5 vs 1000 + (0.5
+ 0.5)
10. Why study computer
architecture?
Design of peripheral systems i.e. device drivers
How I/O is performed
Interrupt mechanism
Device interfaces
13. Why study computer
architecture?
Building better compilers, OS’s
Take advantage of new instructions
Avoid costly instructions
Better utilize memory/caches/registers
15. You should be able to
Understand how programs written in high level
languages get translated & executed by the H/W
Determine the performance of programs and what
affects them
Understand techniques used by hardware designers
to improve performance
Evaluate and compare the performance of different
computing systems
16. General purpose
computers
Software and hardware are interrelated
“Anything that can be done with software can also
be done with hardware, and anything that can be
done with hardware can be done with software” –
Principle of Equivalence of H/W and S/W
This observation allows us to construct general
purpose computing systems with simple instructions
17. Functions of general purpose
computer
Data processing
Data can take various forms
Data storage
Temporarily or long term
Data movement
Between itself & outside world
Control
Orchestrates the different parts
21. Design Goals
Functional
Needs to be correct
Reliable
Does it continue to perform correctly?
Hard fault vs transient fault
Space satellites vs desktop vs server reliability
High performance
“Fast” is only meaningful in the context of a set of important
tasks
Not just “Gigahertz” – truck vs sports car analogy
Impossible goal: fastest possible design for all programs
22. Design Goals
Low cost
Per unit manufacturing cost (wafer cost)
Cost of making first chip after design (mask cost)
Low power/energy
Energy in (battery life, cost of electricity)
Energy out (cooling and related costs)
Cyclic problem, very much a problem today
Challenge: balancing the relative importance of these goals
And the balance is constantly changing
No goal is absolutely important at expense of all others
Our focus: performance, only touch on cost, power, reliability
23. Classes of computers
Desktop computers
Familiar to most people e.g. Intel Core 2 Duo, Core i7, AMD Athlon
Features: good performance, single user, execution of third party software
Need: integer, memory bandwidth, integrated graphics/network?
Servers
Hidden from most users – accessible via a network – Cloud computing – in data
centers
Features: handling large workloads, dependability, expandability
From cheap low-end machines to extreme supercomputers with thousands of processors
used for forecasting, oil exploration
Embedded computers
The largest class – wide range of applications & performance
In cars, cell phones, video game consoles, planes
Need: low power, low cost
26. How did we get here?
A lot has happened in the 60+ year life span
E.g. if transportation industry developed at same
pace – here to London in 1 sec for a few cents
Different generations in the evolution of computers
Each generation defined by a distinct technology
used to build a computer at that time
Why? – gives perspective & context into design
decisions, understand why things are as they are
27. Generation 0: Mechanical
Calculating Machines - 1
Defining characteristic - mechanical
1500s
There was a need to make decimal calculations faster
Mechanical calculator (Pascaline) – Blaise Pascal
No memory, not programmable
Used well into 20th century
Difference Engine – Charles Babbage, “Father of
Computing”
Used the method of differences to evaluate polynomial functions
Was still a calculator
28. Generation 0: Mechanical
Calculating Machines - 2
Analytical Engine – a significant advance over the
Difference Engine
More versatile – capable of performing any math operation
Similar to modern computers – mill (processor), store
(memory) & input/output devices
Conditional branch operation – next instruction depends on
a previous result
Ada, Countess of Lovelace suggested a plan for how the
machine should calculate numbers – The first programmer
Used punch cards for input & programming – this method
survived for a long time
30. Generation 0
Drawbacks
Slow – limited by the inertia of moving parts (gears &
pulleys)
Cumbersome, unreliable & expensive
31. 1st Generation: Vacuum Tube
Computers (1945 -1953)
Defining characteristic: use of vacuum tubes as switching
technology
Previous generations were mechanical but not electrical
Konrad Zuse – in 1930s added electrical tech & other
improvements to Babbage’s design
Z1 used electromechanical relays – was programmable,
had memory, arithmetic unit, control unit
Used discarded film for input
32. 1st Generation
John Atanasoff, John Mauchly, J. Presper Eckert –
credited with the invention of digital computers;
However many others contributed
Their work resulted in ENIAC (Electronic Numerical
Integrator and Computer) in 1946 – the first all-electronic,
general purpose digital computer
Was built to facilitate weather prediction but ended
up being financed by the US Army for ballistic
trajectory calculations
34. 2nd Generation: Transistorized
Computers (1954 – 1965)
Defining characteristic: Use of transistor as switching
technology
Vacuum tube tech was not very dependable – they tended to
burn out
In 1948 at Bell Laboratories – John Bardeen, Walter Brattain &
William Shockley invented the TRANSISTOR
Was a revolution – Transistors are smaller, more reliable,
consume less power
Caused circuitry to become smaller & more reliable
Emergence of companies such as IBM, DEC & Unisys
35. 3rd Generation: Integrated Circuit
Computers (1965 – 1980)
Defining characteristic: Integration of dozens of transistors on a
single silicon/germanium piece – “microchip” or “chip”
Kilby started with germanium, Robert Noyce eventually used
silicon
Led to the silicon chip => “Silicon Valley”
Allowed dozens of transistors to exist on a single chip smaller
than a single discrete transistor
Effect: Computers became faster, smaller & cheaper
E.g. IBM System/360, DEC’s PDP-8 and PDP-11
37. 4th Generation: VLSI (1980 - )
Defining characteristic: Integration of very large numbers of transistors on
a single chip
3rd generation had only dozens of transistors on a chip
No. increased as manufacturing techniques improved: SSI (<100) => MSI
(<1000) => LSI (<10,000) => VLSI (>10,000)
Led to the development of first microprocessor (4004) by Intel in 1971
Effect: allowed computers to be cheaper, smaller & faster – led to
development of microprocessors
Computers for consumers: Altair 8800 => Apple I & II => IBM’s PC
38. 5th Generation?
Quantum computing
Artificial Intelligence
Massively parallel machines
Non Von Neumann architectures
39. Common themes in evolution of
computers
The same underlying fundamental concepts
Obsession with
Increase in performance
Decrease in size
Decrease in cost
40. Moore’s Law
How small can we make transistors? How densely can we pack
them?
In 1965, Intel founder Gordon Moore stated “the density of
transistors in an integrated circuit will double every year” –
Moore’s Law
In practice it turned out to be about every 18 months
Has held up for almost 40 years
However cannot hold forever – physical & financial limits
Rock’s Law: “The cost of capital equipment to build
semiconductors will double every four years”
42. Moore’s Law
Implications
Cost of a chip remained almost unchanged
Cost of components decreased
Higher packing density, shorter electrical paths
Higher speeds
Smaller size
Increased flexibility
Reduced power & cooling requirements
Fewer interconnections
Increased reliability
43. Uniprocessor to Multiprocessor
Because of physical limits – power limits
Clock rates cannot increase forever
Power = capacitive load x V^2 x frequency
Multiple processors per chip – “cores”
Programmers have to take advantage of multiple
processors
Parallelism – similar to instruction level parallelism