The document discusses the five generations of computers based on the underlying technologies used. The first generation used vacuum tube technology, while the second generation introduced transistor technology. The third generation was based on integrated circuit (IC) technology, and the fourth generation used microchip technology. The fifth generation aims to develop computers with human-like thinking capabilities using technologies like ultra-large-scale integrated circuits and artificial intelligence. The document also covers other classification methods of computers like by purpose, size, and operating principles.
This document provides an overview of the history and present state of the Indian information technology sector. It discusses four periods in the history of IT: premechanical, mechanical, electromechanical, and electronic. It then outlines the major services provided by the IT sector today, including custom application development and infrastructure management. Finally, it notes that the IT sector in India has grown significantly in recent years and is expected to become an $80 billion industry by 2011, representing one of the fastest growing sectors in the country.
The document discusses several key components of a computer system. It describes the central processing unit (CPU) as the brain of the computer and explains that CPU speed is measured in GHz, with higher speeds allowing more data to be processed. It also discusses different types of memory like ROM, EEPROM, flash memory, and RAM (including SDRAM and DDR). The document outlines internal storage devices like hard disk drives, describing technologies like SATA, PATA, and SSD. It also covers adapter cards, motherboards, and other essential computer parts.
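As a side note to the GHz figure mentioned in the summary above: a clock speed in gigahertz is simply billions of cycles per second. A back-of-envelope sketch in Python (the instructions-per-cycle figure here is an illustrative assumption, not a fact about any particular CPU):

```python
# A 3.0 GHz CPU completes 3 billion clock cycles per second.
clock_hz = 3.0e9

# Assumed instructions completed per cycle; real values vary widely
# by CPU design and workload, so treat this as a placeholder.
ipc = 4

throughput = clock_hz * ipc
print(f"~{throughput:.1e} instructions per second")
```

The point of the sketch is only that a higher clock rate means more cycles, and therefore more work, per second, all else being equal.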
The document discusses various aspects of network forensics and investigating logs. It covers analyzing log files as evidence, maintaining accurate timekeeping across systems, configuring extended logging in IIS servers, and the importance of log file accuracy and authenticity when using logs as evidence in an investigation.
Computer Fundamentals on Discovering Computers (Alok683622)
Early computers used batch processing and operating systems that were inefficient. Timesharing systems in the 1960s allowed users to access computers through terminals, making programs appear to run simultaneously. Personal computing became popular in the 1970s with the Apple and IBM PC using off-the-shelf components. Today's top PCs are as powerful as million dollar machines from ten years ago. Networks connect computers and devices to share resources and access the worldwide Internet to communicate, obtain information, shop, and entertain.
The document provides an overview of a course on computer hardware and networking. The course objectives are to understand computer components, peripherals, and networking systems. It will cover selecting, installing, and maintaining hardware and networking equipment, as well as identifying hardware versus software problems. The document outlines concepts like motherboards, chipsets, CPUs, memory, and input/output devices. It also discusses computer types, operations, and basic components.
It is clear that information security technology has advanced much faster than the pool of people knowledgeable enough to apply it. It is even clearer that these advancements bring greater difficulty in keeping networks secure from intruders, viruses, and other threats.
Course Code: CS-301
Book: Introduction to Computing.
Chapter Number 1: Introduction to Computer Systems.
Degree: BS (SE, CS, BIO)
Contents:
This chapter will cover the following topics:
1. Computer Hardware and Information Technology Infrastructure
2. The Computer System
3. How Computers Represent Data
4. The CPU and Primary Storage
5. Microprocessors and Processing Power
6. Multiple Processors and Parallel Processing
7. Storage, Input, and Output Technology
8. Secondary Storage Technology
9. Input and Output Devices
10. Categories of Computers and Computer Systems
11. Computer Software
The document discusses various types of intruders including masqueraders, misfeasors, and clandestine users. It also covers intrusion techniques like password cracking, intrusion detection methods using statistical anomaly detection and rule-based approaches, and the importance of audit records and covering tracks to hide evidence of intrusion. Distributed intrusion detection systems are also mentioned as a more effective defense approach.
Mainframe computers are extremely large and powerful machines that can process large amounts of data quickly. They contain multiple fast processors that can either work together on shared tasks or separately on individual tasks. Mainframe computers have large memory capacities of several terabytes and use hard disk packs and tape backups for data storage. Users connect to mainframes through dumb terminals with no local processing or memory.
Data backup involves copying files and data to external or online storage so they are preserved if the original files are lost or damaged. Reasons for data loss include hardware failures, viruses, file corruption, and disasters. The main purpose of data backup is to avoid data loss of important financial, customer, and company information that would be difficult to replace. Backup options include external drives, internal drives, department servers, online backup sites, and cloud storage services.
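The backup workflow described above (copying files to a separate location so originals can be recovered) can be sketched in a few lines of Python. This is a minimal illustration, not a production backup tool; any real backup strategy would add versioning, verification, and scheduling.

```python
import shutil
from pathlib import Path

def back_up(source_dir: str, backup_dir: str) -> int:
    """Copy every file under source_dir into backup_dir, preserving
    relative paths and timestamps. Returns the number of files copied."""
    src, dst = Path(source_dir), Path(backup_dir)
    copied = 0
    for file in src.rglob("*"):
        if file.is_file():
            target = dst / file.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(file, target)  # copy2 also preserves modification times
            copied += 1
    return copied
```

The same pattern applies whether the destination is an external drive, a department server, or a mounted cloud folder.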
This document discusses different types of computer memory and storage devices. It describes volatile memory like RAM that loses data when power is removed, and non-volatile memory like ROM that retains data without power. RAM is divided into SRAM and DRAM. Storage devices include hard disks with platters that store data magnetically, floppy disks, CDs, DVDs, and magnetic tapes. Each storage type has advantages for different use cases in terms of capacity, portability, write capabilities, and more.
A key future challenge is the development of intelligent network forensic tools that focus on specific types of network traffic analysis.
Such tools would reduce time delays, lower computational resource requirements, minimize attacks, provide reliable and secure evidence, and enable efficient investigation with minimal effort.
The document discusses the need for information security and the threats organizations face. It describes how security performs four important functions: protecting the organization's ability to function, enabling safe application operation, protecting data, and safeguarding assets. It then outlines various threats such as viruses, worms, hacking, human error, natural disasters, and more. It emphasizes that security is a management responsibility and missing or inadequate policies and controls can increase organizations' vulnerability to threats.
The document discusses various components of computer systems. It describes hardware components like the system unit, motherboard, processor, RAM, ROM, video cards, sound cards, and internal storage drives. It also discusses software types like system software and application software. Emerging technologies discussed include artificial intelligence, vision enhancement technologies, robotics, and quantum cryptography.
This document provides information about operating systems and their functions. It discusses that an operating system is software that manages computer hardware and software resources and provides common services for computer programs. It describes the main functions of an operating system including processor management, device management, memory management, and file management. It also discusses different types of operating systems such as single-user OS, multi-user OS, real-time OS, and distributed OS. Finally, it lists some commonly used operating systems like Windows, Linux, Android, iOS, and Symbian.
USER AUTHENTICATION
MEANS OF USER AUTHENTICATION
PASSWORD AUTHENTICATION
PASSWORD VULNERABILITIES
USE OF HASHED PASSWORDS – IN UNIX
PASSWORD CRACKING TECHNIQUES
USING BETTER PASSWORDS
TOKEN AUTHENTICATION
BIOMETRIC AUTHENTICATION
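The "hashed passwords" topic listed above can be illustrated with a short Python sketch using PBKDF2 from the standard library. Note this is a simplified stand-in: modern Unix systems store salted hashes in /etc/shadow using schemes such as sha512-crypt or yescrypt, not this exact format.

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 100_000) -> tuple[bytes, bytes]:
    """Return (salt, digest). A random salt ensures that identical
    passwords produce different digests, defeating precomputed-table
    (rainbow table) attacks."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 100_000) -> bool:
    """Re-derive the digest from the candidate password and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

The system never stores the password itself, only the salt and digest; at login it repeats the derivation and compares results.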
This document provides an overview of computers and their components. It discusses the importance of computer literacy and defines a computer. It describes the basic components of a computer including hardware such as the system unit, storage devices, input/output devices, and software. It explains different types of computers including personal computers, handheld computers, internet appliances, mid-range servers, and mainframes. It also provides an introduction to computer networks and the internet.
This presentation discusses different types of storage devices. It begins by introducing storage capacity and properties of storage units like access time and cost. The main types covered are optical storage devices like CDs, DVDs, and Blu-Ray discs which can store large amounts of data but are fragile. Magnetic storage devices discussed are floppy disks with small capacity and hard disks which are the primary computer storage. Solid state flash memory and memory sticks are also covered as portable options.
The document summarizes the seven layers of the OSI model and security threats that can occur at each layer. It describes the functions of each layer and common attacks such as IP spoofing at the network layer, ARP spoofing at the data link layer, and viruses/worms at the application layer. The document provides examples of security measures that can be implemented to mitigate threats at different OSI layers.
A computer is a programmable machine that can execute a prerecorded list of instructions. It has four basic functions: accepting input, processing data, producing output, and storing results. A computer system includes the computer hardware, peripheral devices, and software. Software provides instructions that tell the computer what tasks to perform. There are two main types of software: system software which includes operating systems and utilities, and applications software for tasks like word processing.
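The four basic functions named above (accepting input, processing data, producing output, storing results) can be mirrored in a toy Python sketch. The "processing" step here is deliberately trivial; it stands in for whatever computation a real program performs.

```python
def run_cycle(raw_input: str, storage: list) -> str:
    """Toy model of the input -> process -> output -> store cycle."""
    data = raw_input.strip()          # accept input
    result = data.upper()             # process data
    output = f"RESULT: {result}"      # produce output
    storage.append(output)            # store the result
    return output
```

Every program, however complex, is ultimately a composition of these four steps.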
This document discusses security and protection mechanisms in operating systems. It begins by defining what security and protection mean in the context of an OS. Protection mechanisms ensure that processes only access authorized objects, while security deals with issues like authentication, threats, and policies. The document then covers topics like authentication, authorization, threats from inside and outside the system, and protection models like the monitor model and multilevel security model. It discusses techniques used by viruses, trojans, and worms to compromise systems. Finally, it defines the components of a protection system and Lampson's protection model.
This document summarizes the evolution of operating systems over 5 phases:
Phase 0 (1940-1955) had no operating systems and programs were manually loaded via card decks. Phase 1 (1955-1970) introduced batch processing with batch monitors. Phase 2 (1970-1980) enabled timesharing with systems like CTSS allowing multiple interactive users. Phase 3 (1980-1990) saw the rise of personal computers running single-user operating systems like MS-DOS. Phase 4 (1990-2000) focused on networking and client-server models. Phase 5 (2000-present) includes modern mobile and GUI-based operating systems on computers and phones.
FellowBuddy.com is an innovative platform that brings students together to share notes, exam papers, study guides, project reports, and presentations for upcoming exams.
We connect Students who have an understanding of course material with Students who need help.
Benefits:-
# Students can catch up on notes they missed because of an absence.
# Underachievers can find peer-developed notes that break down lecture and study material in a way that they can understand.
# Students can earn better grades, save time and study effectively
Our Vision & Mission – Simplifying Students' Lives
Our Belief – “The great breakthrough in your life comes when you realize that you can learn anything you need to learn to accomplish any goal that you have set for yourself. This means there are no limits on what you can be, have, or do.”
Like Us - https://www.facebook.com/FellowBuddycom
The document provides a history of graphics technology, beginning with Intel's first video graphics controller board in 1983. It discusses several important graphics cards throughout the years, including the Commodore Amiga (1985), Nvidia GeForce 3 (programmable shading), and ATI Radeon 9700. Modern graphics cards consist of a printed circuit board containing a GPU, video memory, RAMDAC, and output ports. Integrated graphics use system memory while dedicated cards have independent video memory for improved performance.
This document provides an overview of the history and components of computers. It discusses the evolution of computer hardware from early mechanical devices like the abacus and Babbage's Difference Engine to modern integrated circuits and microprocessors. It describes the key components of modern computer systems including the CPU, memory, storage, buses, and input/output devices. It also explains the functioning of the CPU and memory in more detail.
Operating System - Types of Operating System, Unit 1 (abhinav baba)
These slides cover operating systems and their types:
Batch Operating System
Network Operating System
Time Sharing Operating System
Real Time Operating System
Distributed Operating System
This document provides an introduction and overview of basic computer concepts for a computer essentials course. It defines what a computer is, explaining that computers follow user instructions quickly, much like calculators. It also defines the components of a computer system and the differences between hardware and software. Key concepts covered include:
Computers consist of physical hardware that executes software instructions to perform tasks. Hardware includes input devices like keyboards and mice and output devices like monitors and printers. Memory and storage devices are also explained as important components for running programs and saving files.
The document summarizes the five generations of computers based on the underlying technologies used. The first generation used vacuum tube technology, while the second used transistors. The third generation was based on integrated circuits, and the fourth used microchips and microprocessors. The fifth generation aims to develop true artificial intelligence capabilities such as thinking and learning. Each generation brought improvements in size, cost, speed, reliability and capabilities.
The document discusses the five generations of computers from the first generation that used vacuum tubes in 1942-1955 to the present fifth generation that uses artificial intelligence. It describes the defining technologies of each generation including vacuum tubes, transistors, integrated circuits, microprocessors, and artificial intelligence. It also outlines some of the advantages and disadvantages of the computers from each generation in terms of size, speed, reliability, cost and other factors.
The document provides information about a group project on computer generations. It lists group members and describes the contents and introduction of computers. It then summarizes the five generations of computers from vacuum tubes to modern devices incorporating artificial intelligence. For each generation it highlights the technology used and examples, as well as advantages and disadvantages.
Computer Fundamentals
1. GENERATIONS OF COMPUTERS:
The term generation indicates the type of technology used in the
construction of a computer. As new technology emerged, it was
used in the making of computers. Each new technology improved
the speed, accuracy and storage capacity of computers.
Different technologies have been used for computers at
different times.
Therefore, computers can be divided into five generations
depending upon the technologies used. These are:
First Generation (1942 – 1955)
Second Generation (1955 – 1964)
Third Generation (1964 – 1975)
Fourth Generation (Since 1975)
Fifth Generation (Since 1980)
1. First Generation Computers (1942 – 1955):
Vacuum tube technology was used in first-generation computers.
Machines such as Mark-1, ENIAC, EDSAC, EDVAC and UNIVAC-1
belong to the first generation of computers. Only machine
language was used to program first-generation computers.
Principle:-
A vacuum tube contains a filament which, on heating, emits
electrons. These electrons were used for the amplification and
switching of electrical signals.
Input:-
Punched cards
Output:-
Print-outs
Advantages:
*These computers were the fastest of their time.
*They could be programmed, using machine language.
*Vacuum tube technology made electronic digital computers
possible.
Disadvantages:
*Very large in size
*Not reliable
*Consumed a large amount of energy
*Required constant maintenance
*Generated much heat, so air-conditioning was required
*More costly
*Very slow in speed (data processing)
*Difficult to program, because only machine language was used
*Non-portable
*Limited commercial use
2. Second Generation Computers (1955 – 1964):
Transistor technology was used in second-generation computers.
The transistor was invented in 1948 at Bell Laboratories. A
transistor is smaller in size and more reliable than a vacuum
tube, so transistor technology replaced vacuum tube technology
in computers. The assembly programming language was also
introduced with second-generation computers.
A transistor consists of a BASE, a COLLECTOR and an EMITTER.
Their functions include:-
BASE: input gate for the transistor.
COLLECTOR: collects the amplified signals.
EMITTER: output gate for emitting the amplified signals to the
external environment.
Advantages:
*Low in cost
*Smaller in size
*Fast in speed
*Generated less heat; more reliable and accurate in
calculations
*Consumed low power
*Used for commercial purposes
*Portable
*Assembly language was introduced; programs are easier to write
in this language than in machine language
Disadvantages:
*Air-conditioning required
*Commercial production was difficult and these were very costly
*Constant (or frequent) maintenance required
*Only used for special purposes
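The advantage of assembly language over machine language noted
above can be sketched with a toy machine in Python. The opcodes,
mnemonics and the little program here are invented for
illustration only, not any real instruction set:

```python
# A toy two-instruction machine, illustrating why assembly mnemonics
# are easier to write than raw machine-language opcodes. The
# instruction set here is invented for illustration only.

OPCODES = {"LOAD": 0x01, "ADD": 0x02}   # mnemonic -> numeric opcode

def assemble(source):
    """Translate assembly lines like 'LOAD 5' into machine code."""
    program = []
    for line in source:
        mnemonic, operand = line.split()
        program.append((OPCODES[mnemonic], int(operand)))
    return program

def run(program):
    """Execute machine code on a single-accumulator machine."""
    acc = 0
    for opcode, operand in program:
        if opcode == 0x01:      # LOAD: put operand in the accumulator
            acc = operand
        elif opcode == 0x02:    # ADD: add operand to the accumulator
            acc += operand
    return acc

# The programmer writes readable mnemonics...
source = ["LOAD 5", "ADD 3"]
# ...instead of remembering raw opcodes like [(0x01, 5), (0x02, 3)].
print(run(assemble(source)))   # 8
```

The assembler does the mechanical mnemonic-to-opcode translation,
which is exactly the burden second-generation programmers no
longer had to carry by hand.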
3. Third Generation Computers (1964 – 1975):
IC (Integrated Circuit) technology was used in third-generation
computers. In a small IC chip (about 5 mm square) a circuit is
designed containing a large number of electronic components
such as transistors, capacitors, diodes and resistors.
Initially, an IC contained only about ten to twenty components,
so the technology was named Small Scale Integration (SSI).
Third-generation computers were designed using this IC
technology.
Advantages:
*Smaller in size
*Production cost was low
*Very fast in computational power
*More reliable
*Low power consumption
*Maintenance cost was low because the failure rate of hardware
was very low
*Magnetic disks were used for external storage
*More storage capacity
*Easily portable
*Easy to operate
*Upgraded easily
*Widely used for various commercial applications all over the
world
*Lower heat generated
*High-level languages were commonly used
*Many input/output devices were introduced, such as the mouse
and keyboard
Disadvantages:
*Air-conditioning required
*Highly sophisticated technology required to manufacture IC
chips
4. Fourth Generation Computers (1975 onwards):
Microchip technology was introduced in this generation of
computers. With the advancement of IC technology, LSI (Large
Scale Integration) chips were developed, making it possible to
integrate over 30,000 components onto a single chip. After LSI,
VLSI (Very Large Scale Integration) was developed, which made
the microprocessor possible: more than one million components
can be integrated on a single VLSI chip, and the entire CPU is
designed on a single silicon chip. The use of the
microprocessor as CPU introduced another class of computers
called microcomputers, so the fourth generation may be called
the microcomputer generation. IBM introduced its personal
computer in 1981.
Advantages:
*Smaller in size
*Production cost is very low
*Very reliable
*Hardware failure is negligible
*Easily portable because of their small size
*Totally general purpose
*Air conditioning is not compulsory
*Very high processing speed
*Very large internal and external storage capacity
*Used advanced input & output devices such as optical readers,
laser printers, CD-ROM/DVD-ROM drives etc.
Disadvantages:
*Highly sophisticated technology required for the manufacture
of microprocessor chips
5. Fifth Generation Computers (In process):
The main drawback of first to fourth generation computers is
that they have no thinking power of their own; they depend
entirely upon the instructions given by the users.
Fifth generation computers are supposed to be the ideal
computers, but they do not yet exist. Scientists are working to
design computers that will have the following features:
*Having their own thinking power
*Making decisions themselves
*Having capabilities of learning
*Having capabilities of reasoning
*Having large capacity of internal storage
*Having extra high processing speed
*Having capabilities of parallel processing
Technologies used in fifth generation computers:
*ULSI (Ultra Large Scale Integration) circuit technology
*Artificial Intelligence (AI) technology, also called the
knowledge processor. AI means programs that let machines think
and decide for themselves. The programming languages LISP (List
Processor) and PROLOG (Programming in Logic) are used for
artificial intelligence. Scientists at ICOT in Japan used
PROLOG to develop artificial intelligence software.
Advantages:
*Laptops, pocket computers and PDAs were developed
*Development of parallel processors
*Development of centralized servers
*Development of optical disc technology
*Invention of the internet and its advantages
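The PROLOG-style reasoning described above — deriving new
conclusions from facts and rules — can be mimicked with a tiny
forward-chaining sketch in Python. The facts and rules here are
invented examples, not from any real AI system:

```python
# A toy forward-chaining inference engine: rules are
# (premises, conclusion) pairs, and we keep applying them until
# no new fact can be derived. This only illustrates the idea of
# rule-based "reasoning"; real PROLOG uses backward chaining
# over a much richer logic.

def forward_chain(facts, rules):
    """Derive all facts reachable from the starting facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)   # a new fact was derived
                changed = True
    return facts

# Rules: "if all premises hold, conclude the conclusion".
rules = [
    ({"has_cpu", "has_memory"}, "is_computer"),
    ({"is_computer", "fits_on_desk"}, "is_microcomputer"),
]

derived = forward_chain({"has_cpu", "has_memory", "fits_on_desk"}, rules)
print("is_microcomputer" in derived)   # True
```

The machine "decides" that the device is a microcomputer without
that fact ever being stated directly, which is the kind of
behaviour fifth-generation research aims at on a much larger
scale.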
2. CLASSIFICATION OF COMPUTERS:
Computers can be classified based on the following criteria:
According to Technology:
$Analog Computers
$Digital Computers
$Hybrid Computers
According to Purpose:
$General purpose Computers
$Special purpose Computers
According to size:
$Supercomputers
$Mainframe Computers
$Minicomputers
$Microcomputers, or Personal Computers
Based on operating principles and technology:
Analog computers:
$These computers represent data in the form of continuous
electrical signals.
$They are fast and capable of multi-tasking.
$Results displayed by these computers are less accurate.
$They are powerful in solving differential equations.
$These computers use the OP-AMP (Operational Amplifier).
$The features of an ideal OP-AMP include:
*High voltage gain. The voltage gain is defined as the ratio of
output voltage to input voltage.
*Infinite input resistance. The input resistance is defined as
the ratio of the change in input voltage to the change in input
current.
*Zero output resistance. The output resistance is the internal
resistance seen at the output, measured with no load connected.
$The basic OP-AMP circuit (figure omitted) consists of an input
resistance Rin, a feedback resistance RF, and an amplifier A
which inverts the incoming signal, taking the input voltage Vin
to the output voltage Vout.
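For the inverting OP-AMP circuit just described, the standard
ideal-case relationship is Vout = -(RF / Rin) * Vin: the two
resistances set the closed-loop gain. A minimal sketch, with
made-up component values:

```python
# Output voltage of an ideal inverting OP-AMP stage. The
# closed-loop voltage gain is set by the feedback resistance RF
# and the input resistance Rin:
#     Vout = -(RF / Rin) * Vin
# Standard textbook result for the ideal case; the component
# values below are invented examples.

def inverting_opamp_vout(vin, r_in, r_f):
    """Return the output voltage of an ideal inverting amplifier."""
    gain = -(r_f / r_in)      # closed-loop voltage gain
    return gain * vin

# With RF = 10 kOhm and Rin = 1 kOhm, the gain is -10:
print(inverting_opamp_vout(0.5, 1_000, 10_000))   # -5.0
```

The negative sign is the inversion the text mentions: a rising
input voltage produces a falling output voltage.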
Digital computers:
$These are also called digital information processing systems.
$These systems store and process data in digital form (strings
of 0s and 1s).
$They are capable of processing analog signals, but the analog
signals have to be converted into digital signals using an ADC
(analog-to-digital converter) before being fed into the digital
computer.
Hardware components:
$Arithmetic Logic Unit (ALU)
$Control unit
$Memory unit
$Input unit
$Output unit
Hybrid computers:
$These are a combination of both analog and digital computers,
encompassing the best features of both.
$Fast, efficient and reliable computer systems.
$Data is measured and processed in the form of electrical
signals and stored with digital components.
$The input is accepted in the form of varying electrical
signals and is converted into discrete values for performing
operations.
$They are used in hospitals to measure heartbeat, and have
engineering and scientific applications.
Based on Applications:
General purpose computers:
$Work in all environments
$Versatile computers
$Store a number of programs to perform distinct operations
$More expensive
$Less efficient; take more time to generate results
Special purpose computers:
$Work on specific tasks
$Non-versatile
$Speed and memory of these computers depend on the task
performed
$More efficient; take less time to process results
$Less expensive
Based on size and capability:
Micro-computers:
$These are small, cheap digital computers for individuals.
Hardware components:
Microprocessor, storage unit, I/O channels, power supply,
connecting cables.
Software components:
Operating System (OS), utility software, device drivers.
Available in the forms of:
PCs, workstations, notebook computers.
The various components are:
Microprocessor:
$This incorporates all the functions of a CPU into a single
unit. The various units of a microprocessor are:
$ALU: performs arithmetic and logical operations.
$Registers: store data and instructions temporarily needed by
the ALU. They include several types, such as the Accumulator
(ACC) and Program Counter (PC).
$CU: Control Unit, used to manage and control the functions of
the microprocessor and I/O devices.
$Memory:
Used to store data and instructions. It is of two types:
*Primary memory: temporarily stores the data and instructions
needed by the microprocessor.
*Secondary memory: stores data permanently. Examples include
magnetic tapes, floppy disks, CDs, USB drives, etc.
Peripheral devices:
$Input devices: used to transfer data into the computer.
Examples: keyboard, mouse, etc.
$Output devices: used to display the results processed by the
computer. Examples: monitor, printer, etc.
System bus:
$It is also called the FRONTSIDE BUS, MEMORY BUS, LOCAL or
HOST BUS.
$It is used to connect microprocessor, memory and peripheral
devices into a unit.
SYSTEM BUS= ADDRESS BUS+DATA BUS+CONTROL BUS
$Address bus: Unidirectional bus to identify the peripheral devices
and memory.
$Data bus: Bi-directional bus used to transfer data among the
microprocessor, peripheral devices and memory.
$Control bus: Used by the microprocessor to send control
signals to various devices.
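The three-bus structure described above can be sketched as a toy simulation. This is purely illustrative (the class, the memory layout and the signal names are invented for this example; real buses are hardware signal lines, not Python objects):

```python
# Toy model of SYSTEM BUS = ADDRESS BUS + DATA BUS + CONTROL BUS.
# The control bus says WHAT to do, the address bus says WHERE,
# and the (bi-directional) data bus carries the value itself.
class SystemBus:
    def __init__(self, memory_size=256):
        self.memory = [0] * memory_size  # a device sitting on the bus

    def transfer(self, control, address, data=None):
        if control == "WRITE":           # data bus: CPU -> memory
            self.memory[address] = data
            return None
        if control == "READ":            # data bus: memory -> CPU
            return self.memory[address]
        raise ValueError("unknown control signal")

bus = SystemBus()
bus.transfer("WRITE", address=0x10, data=42)
print(bus.transfer("READ", address=0x10))  # prints 42
```

Note how the address bus is used in one direction only (the CPU selects a location), while the data bus carries values both ways, matching the unidirectional/bi-directional distinction above.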
Depending on size, Microcomputers are of three types:
*Desktop computers:
They are used in a single location, are cheap, and have good storage.
Examples include: Apple, IBM.
*Laptop computers:
Portable computers, also called notebook computers or mobile
computers. They are smaller in size, more expensive and are
rechargeable. Examples include Apple, Acer, Hewlett-Packard (HP)
*Hand-held computers:
Also called Personal Digital Assistants (PDAs), palmtops or
mobile devices. They are smaller in size, have a smaller display and
the input device is generally an electronic stylus. Their storage
capacity is small. Examples include Apple Newton, Franklin
eBookMan.
Mini computers:
$These were introduced by Digital Equipment Corporation (DEC) in
the 1960s.
$They can process more data and support more I/O devices than
microcomputers
$They are less powerful than mainframe computers but more
powerful than microcomputers. Hence they are called MID-RANGE
COMPUTERS.
$They cater to the needs of 4-200 users at a time
$They are used in business as a centralized computer or as an
internet server
$They are less expensive than mainframe computers. Examples
include PDP 11, IBM 8000 series, etc.
Mainframe computers:
$These are capable of handling millions of records a day.
$These are bigger and more expensive than mini-computers
$They require a large space and closely monitored humidity and
temperature.
Characteristics:
*A typical mainframe computer consists of 16 or more
microprocessors
*RAM capacity ranges from 128MB to 8GB
*They can run multiple operating systems. Hence they are called
VIRTUAL MACHINES
*They handle a large number of I/O devices, which are arranged in
separate CABINETS or FRAMES, and hence the name.
Applications:
*They are used in large financial transactions
*Enterprise Resource Planning (ERP)
*Industry and consumer statistics
*Census
Supercomputers:
$These are the fastest and most complex computers, with very high
speed
$The first supercomputer was built by SEYMOUR CRAY at
Control Data Corporation (CDC) in the 1960s
$Used exclusively in applications where large complex calculations
have to be performed to get the output
$These are very expensive and designed to run only a small
number of programs at a time
$The manufacturers of supercomputers include: IBM, SILICON
GRAPHICS, FUJITSU, INTEL, etc.
$These are the fastest as they employ thousands of processors,
hundreds of GB of RAM, and thousands of GB of secondary storage
$The principles used in these computers are:
*Pipelining: This enables the processor to begin executing the second
instruction even before the first is completed, provided it has the
required resources
*Parallelism: Enables the processor to execute several instructions
at a time
$Examples include: CRAY 3, Cyber-205, PARAM, etc.
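The speed-up from pipelining can be illustrated with simple cycle counting. This is a back-of-the-envelope model (it ignores pipeline stalls, hazards and memory delays), not a description of any particular machine:

```python
# With k pipeline stages and no overlap, n instructions take n * k
# cycles. With pipelining, the first instruction takes k cycles to
# fill the pipeline; after that, one instruction completes per cycle.
def sequential_cycles(n_instructions, n_stages):
    return n_instructions * n_stages          # no overlap at all

def pipelined_cycles(n_instructions, n_stages):
    return n_stages + (n_instructions - 1)    # overlapped execution

print(sequential_cycles(100, 5))  # prints 500
print(pipelined_cycles(100, 5))   # prints 104
```

For long instruction streams the pipelined count approaches one cycle per instruction, which is why supercomputers rely so heavily on this principle.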
3. Characteristics of computers:
Computers have distinguishing characteristics which make them ideal
machines, though they lack certain characteristics which humans possess.
Some of the important characteristics of computers are:
(I)Automatic:
#Computers are automatic machines which, once started on a job,
carry it on until it is complete, provided they are given the required
instructions by the users.
(II)Speed:
#Computers are capable of taking logical decisions,
performing arithmetic and non-arithmetic operations on alphabets,
and copying at an unbelievable speed
#The units of speed for a computer are microseconds (10^-6) or
even nanoseconds and picoseconds (10^-9 or 10^-12)
#A powerful computer can perform 3 to 4 million arithmetic
operations per second
#The speed of the computers is attributed to the fact that THE
COMPUTERS ARE ELECTRONIC DEVICES WORKING ON
ELECTRICAL PULSES WHICH TRAVEL AT HIGH SPEED.
(III)Accuracy:
#The computer produces highly accurate and reliable results
#Errors in calculations are due to errors in the logic supplied
by the human, not due to the computer.
#Computers perform accurate calculations any number of times
(IV)Versatility:
#A computer is capable of performing a wide variety of functions
#It can accept data and produce results.
#It can perform the basic arithmetic and logic operations
#It can transfer data internally
#Several applications can be run at a time. For example, MS Paint,
Adobe Photoshop and VLC media player can run simultaneously.
(V)Diligence:
#A computer is capable of performing the same task over and over
again with the same degree of accuracy and reliability as the first
time
#This is because, unlike human beings, computers are free from
monotony, tiredness, lack of concentration, etc., and can work for
hours at a stretch without introducing errors.
(VI)Large and perfect memory:
#A computer can store and recall any amount of information
because of its secondary storage capability with perfect accuracy
unlike human beings.
#The storage capacity of the computer is enormous and is perfect
#A computer recalls data with perfect accuracy any number of
times and does not lose any information unless it is
prompted to do so.
(VII)No I.Q and feelings:
#A computer is not intelligent on its own and cannot think on its
own
#It can only perform the tasks specified by the human but the
difference is that it does this with greater accuracy and speed.
#It cannot decide on its own and only the user can determine what
the computer must do
#A computer has no feelings but a human does.
4. Basic computer organization:
The organization of computers involves the interfacing of various
components of the computer and the co-ordination of the
operations performed by them. The various functional units of a
computer include:
(I)Input unit
(II)Memory unit
(III)Central processing unit
(IV)Output unit
BASIC COMPUTER ORGANISATION
Input devices:
@These devices are used to feed the data inside the computer.
@The most commonly used input devices are:
*keyboard
*mouse
*light pen
*digitizer
*trackball
*joystick
*OCR (Optical Character Recognizer)
*MICR (Magnetic Ink Character Recognizer)
*OMR (Optical Mark recognizer)
Keyboard:
@It is used to enter alphanumeric data into the computer and to
perform special functions:
@Alphanumeric keys: Used to enter alphabets and numbers
@Function keys: Used to perform special functions. These include
F1 to F12. For example, F5 is used for refreshing a page or desktop
@Modifier keys: SHIFT and CONTROL keys are called modifier
keys and they perform special functions. For example CTRL+X for
cut
@Spacebar key: Used to move by one space in a document or
worksheet or DBMS.
@Enter key: Used to open something like a file or a web-page or
to move to the succeeding line in a document.
Mouse:
@It is an electronic device used for selecting and pointing
purposes. Hence it is called the POINTING DEVICE.
@Left mouse button: Used for selection purpose
@Right mouse button: Used to perform special functions like OPEN,
EXPLORE, COMPRESS, etc.
@Ball at the bottom: The ball at the bottom of the mouse moves
and the cursor moves on the screen in whichever direction the ball
rotates
@Wheel at the top: To scroll a web-page or a document.
Scanner:
@Used to scan images and documents
@The scanned images are converted into the DIGITISED IMAGES
understandable by the computers.
@Color images can also be scanned using the scanner depending
upon the RED GREEN BLUE (RGB) PROPORTIONS
@The principle used in the BARCODE READER is similar to that of
a scanner.
Memory unit:
MEMORY PYRAMID OF A COMPUTER
The memory unit is used to store data on a temporary or
permanent basis. The various types of memory can be depicted as:
[Embedded diagram of the types of memory - image data not recoverable]
Registers include:
*IR (Instruction Register)
*MAR (Memory Address Register)
*MBR (Memory Buffer Register)
*MDR (Memory Data Register)
Primary memory:
@This memory is also called MAIN MEMORY.
@The information stored in this memory, needed by the
microprocessor during processing, is temporary, while the
information in the ROM is permanent
Read Only Memory (ROM):
@It is a non-volatile memory
@The contents of this memory are permanent
@These are cost-effective
@They are available in high storage capacity
@Processing speed is very low
@Generally, the OS supporting programs and the Basic Input
Output System (BIOS) programs are stored in this
@Trigonometric and logarithmic function tables are also stored here
@In PROM (Programmable Read Only Memory), the flexibility of
writing the contents once is also provided.
Random Access Memory (RAM):
@This is the computer's temporary storage where ALL
THE DATA AND INSTRUCTIONS NEEDED BY THE MICROPROCESSOR
AND THE RESULTS EXECUTED BY IT are stored.
@It is a volatile memory
@The contents are temporary
@Cost is very high
@They are available in small storage capacity
@processing speed is high
@User-defined programs can be stored at any time
Principle:- Each bit in a RAM stores information by means of
electric charge, where the presence of an electrical charge
indicates '1' and its absence indicates '0'.
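This charge-based principle can be illustrated with a small sketch. It is purely conceptual (physical RAM cells are capacitors or flip-flops, not Python booleans):

```python
# Each "cell" either holds a charge (True -> 1) or not (False -> 0);
# a byte is just eight such cells read in order.
cells = [True, False, True, False, True, False, True, False]
bits = "".join("1" if charged else "0" for charged in cells)
value = int(bits, 2)  # interpret the eight bits as one number
print(bits, "->", value)  # 10101010 -> 170
```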
Types of RAM:
DYNAMIC RAM:
@The electric charges tend to leak away in a few milliseconds, and
the information is then lost
@Hence, the data stored in dynamic RAM has to be
refreshed periodically before all the charges have leaked.
STATIC RAM:
@No special refresh circuitry is needed
@The storage cells are not leaky and hence do not require any refreshing
Cache memory:
A small memory between the CPU and the main memory is called
the cache memory.
@It is faster than main memory and the access time of this
memory is close to the processing speed of the CPU
@It acts as a HIGH-SPEED BUFFER between the CPU and the
main memory
@It is used to store temporary and active (most frequently used)
data during processing
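The buffer role of the cache can be sketched as a toy model. The names and the dict-based cache here are invented for illustration (real caches are fixed-size hardware with eviction policies, which this sketch omits):

```python
# A cache sits between the CPU and main memory: repeated reads of
# the same address are served from the small fast store.
class CachedMemory:
    def __init__(self, main_memory):
        self.main = main_memory      # slow main memory
        self.cache = {}              # small, fast cache
        self.hits = self.misses = 0

    def read(self, address):
        if address in self.cache:    # cache hit: fast path
            self.hits += 1
            return self.cache[address]
        self.misses += 1             # cache miss: go to main memory
        value = self.main[address]
        self.cache[address] = value  # keep it for next time
        return value

mem = CachedMemory({0: 7, 1: 9})
mem.read(0); mem.read(0); mem.read(1)
print(mem.hits, mem.misses)  # 1 hit, 2 misses
```

The second read of address 0 never touches main memory, which is exactly the "high-speed buffer" behaviour described above.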
The other components of primary memory include:
@PROM (Programmable Read Only Memory)
@EPROM (Erasable PROM)
@EEPROM (Electrically Erasable PROM)
Secondary memory:
@The secondary memory is also called the AUXILIARY
MEMORY
@It may be in-built or may be introduced into the computer, and is
used to store the data on a PERMANENT basis
@The various types of secondary memory include:
Magnetic storage devices:
@Use the property of magnetism to store data
@The data stored in these devices can be stored, erased and
rewritten any number of times
@Examples include: magnetic tapes, magnetic disks, floppy disks,
etc.
Optical storage devices:
@Use LASER BEAMS to store data
@The data stored in these devices can be erased and rewritten
any number of times
@Examples include: CD-ROM, CD-RW, DVD, etc.
Magneto-optical storage devices:
@These use both MAGNETISM and LASER BEAMS to store data
@These devices are generally used for BACKUP, DATA
RECOVERY, etc.
@The data stored in them can be erased and rewritten many times
[Diagram: CPU showing the CU (CONTROL UNIT) and ALU (ARITHMETIC LOGICAL UNIT)]
Universal Serial Bus (USB) drives:
@These are commonly called pen drives
@These are compact and store more data than many other portable
storage devices
@These are connected to the USB PORT on the system unit
Central processing unit:
The CPU is the main functional unit of the computer. The basic
functions of the CPU include:
@Fetch the instruction and data from memory
@Decode the instruction into a computer-understandable form
@Process the data or execute the operation
@Store the result
This sequence of functions is called the INSTRUCTION CYCLE and is
represented as:
[Diagram: the instruction cycle - FETCH and EXECUTE phases alternating via the MAIN MEMORY]
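The fetch-execute loop can be sketched as a minimal simulator. The three-instruction "machine language" here is invented purely for illustration:

```python
# Each pass of the loop is one INSTRUCTION CYCLE: fetch the
# instruction at the Program Counter (PC), decode the opcode,
# execute it, and store the result in the accumulator (ACC).
program = [("LOAD", 5), ("ADD", 3), ("HALT", None)]

pc, acc = 0, 0                      # Program Counter, Accumulator
while True:
    opcode, operand = program[pc]   # FETCH
    pc += 1                         # advance to the next instruction
    if opcode == "LOAD":            # DECODE + EXECUTE
        acc = operand
    elif opcode == "ADD":
        acc += operand
    elif opcode == "HALT":
        break                       # result already stored in ACC
print(acc)  # prints 8
```

Notice that the PC is updated as part of every cycle, so the machine automatically moves through the program until it reaches HALT.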
The various units of the CPU include:
ARITHMETIC LOGICAL UNIT (ALU):
@It is used to perform the various arithmetic and logical
operations.
@Arithmetic operations like +,-,*, / are performed by ALU
@Logical operations like >, <, =, =/= are performed by the ALU
with the help of logical gates like NOT, OR, AND, etc.
@The three basic concepts of the ALU include:
@Opcodes: The operations to be performed on the data
@Operands: The data on which the operation is to be executed
@Format code: The format in which the data is represented. For
example, FIXED POINT or FLOATING POINT, etc.
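The opcode/operand split can be sketched as a toy ALU. The function and the opcode symbols are invented for this illustration:

```python
# The opcode selects the arithmetic or logical operation;
# the operands a and b are the data it acts on.
def alu(opcode, a, b):
    operations = {
        "+": a + b,     # arithmetic operations
        "-": a - b,
        "*": a * b,
        ">": a > b,     # logical comparisons
        "=": a == b,
    }
    return operations[opcode]

print(alu("+", 6, 2))  # prints 8
print(alu(">", 6, 2))  # prints True
```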
CONTROL UNIT (CU):
@The CU controls the flow of information
@It is called THE BRAIN OF THE CPU
@It directs the tasks performed by the ALU and also the
functions carried out by the I/O DEVICES
REGISTERS:
@The CPU contains certain temporary storage units called
REGISTERS.
@The various types of registers and their functions are:
@Program Counter (PC): Stores the address of the next instruction
to be executed
@Information Counter (IC): Stores the information or data to be
processed by the CPU
@Memory Address Register (MAR): Stores the address of the next
location in the memory
@Memory Buffer Register (MBR): Stores the data received from or
sent to the memory
@Memory Data Register (MDR): Stores operands and data
Output units:
@These are electronic or electromechanical devices which give the
desired result or output to the user in a USER-UNDERSTANDABLE
format.
@The various output units include:
*Visual Display Unit (VDU) or Monitor
*Printer
*Computer Output Microfilm (COM)
*Plotter
Monitor:
@The monitor is a visual display unit used to display the output to
the user and is often referred to as the ELECTRONIC MEDIA
Principle:- When the beams of electrons carrying the electrical
signals strike the inner side of the monitor coated with RED, BLUE
and GREEN PHOSPHORS, the color and detail of the object become
visible, depending on the PROPORTIONS OF THE COLORS and the
INTENSITY OF THE ELECTRON BEAMS
@The types of monitor include: Cathode Ray Tube (CRT) and Liquid
Crystal Display (LCD)
@The cathode ray tube gives better picture perception compared to
the LCD
@The CRITICAL PARAMETERS of the monitor include:
*SIZE (length, breadth, thickness)
*RESOLUTION (pixels)
@A video card must be installed for better graphical perception
Printer:
@A printer is used to get a printed copy (HARD COPY) of a document
or image present in the system.
@The types of printers include:
@DOT-MATRIX PRINTERS: High-speed printers, but the quality of
the image is poor
@INKJET PRINTERS: Slower than dot-matrix printers but
have good picture quality
@LASER PRINTERS: These printers have their own ROM, RAM and
MICROPROCESSOR, and hence produce HIGH-QUALITY
IMAGES WITH GOOD SPEED.
Quality of printers is measured using the CRITICAL PARAMETERS:
*DOTS PER INCH (DPI)
*PAGES PER MINUTE (PPM)
Speaker:
@It is an electromechanical device that converts ELECTRICAL
SIGNALS into SOUND WAVES
@An AUDIO DEVICE DRIVER has to be installed for the speaker to
work
@Speaker may be in-built or it may be separately attached to the
computer
@Quality of the speaker is dependent on the SOUND CARD
installed in the computer
@Sophisticated Speakers contain SUB-WOOFER SYSTEM to
increase the BASS OUTPUT
5. MEMORY UNIT OF THE COMPUTER AND VARIOUS TYPES OF
MEMORY (REFER MEMORY UNIT IN THE ORGANISATION OF
COMPUTERS)
6. EVOLUTION OF COMPUTERS:
• In earlier years, people used fingers, stones, pebbles, and
notches in sticks and knots in ropes to perform simple
arithmetic calculations.
• SAND TABLE: (EARLIER PERIOD)
Used stones for calculations
It contained three channels filled with sand, and each
channel could hold a maximum of 10 stones
• ABACUS: (2500 BC)
It was invented in ASIA MINOR around 2500 BC.
It consisted of a wooden frame with strings and
beads which were used for calculations.
• NAPIER'S BONES: (1614)
This was a manual calculating device devised by
JOHN NAPIER in 1614
This device consisted of a board with nine rods
The rod on the left corner consisted of digits from 1
to 9
The rod at the extreme right consisted of ZEROS
and was called the CONSTANT ROD
It was exclusively used for multiplying or dividing two
numbers, only if one of the numbers was a single digit
• SLIDE RULE: (1620)
It was devised by EDMUND GUNTER in 1620
It consisted of two graduated scales sliding over
each other
It was used to perform not only simple arithmetic,
but also complex calculations involving LOGARITHMS,
TRIGONOMETRIC functions, etc.
• PASCALINE: (1642)
It was devised by BLAISE PASCAL in 1642
It was also called the ROTATING WHEEL
CALCULATOR or NUMERICAL WHEEL CALCULATOR
It was designed to handle numbers up to
999,999.999
• STEPPED RECKONER: (1694)
The Pascaline was improved by a German mathematician,
GOTTFRIED WILHELM VON LEIBNIZ, into the stepped
reckoner
It performed multiplication, division and also the SQUARE
ROOT of a number
• DIFFERENCE ENGINE: (1822)
It was devised by CHARLES BABBAGE in 1822
It used features of modern digital
computers like
INPUT, OUTPUT, STORAGE, PROCESSOR and
CONTROL UNITS
It was designed to perform mathematical
calculations by getting two inputs from the user(a)a
set of programs that contains the instructions to be
executed(b)a list of variables on which the operation
is to be performed
It was a digital automatic programmable general
purpose computer
Its disadvantage was that it was a slow engine
taking 3 minutes to multiply 2 numbers of 20 digits
each
• 1833 - The idea of the ANALYTICAL ENGINE was given by
CHARLES BABBAGE
• 1889 - The idea of punched cards as input was introduced by
HERMAN HOLLERITH
• MARK-I (1944):
It was devised by HOWARD AIKEN, an American
mathematician, in 1937 and completed in 1944
It was faster than the difference engine. For example, it
could multiply two numbers of 20 digits each within 6
seconds
But it was slow in delivering the results (RATE OF
RESULTS = ONE RESULT/SECOND)
It was noisy and large in size
• COLOSSUS (1944):
It was devised by ALAN MATHISON TURING, a British
mathematician, in 1944
It was a purely electronic digital programmable
computer
It used VACUUM TUBE TECHNOLOGY
It was designed to perform only specific operations
• ENIAC (1946):
Electronic Numerical Integrator And Computer,
developed by J. Presper Eckert and John Mauchly in
1946
Used vacuum tube technology for basic circuits
Consisted of 17,468 vacuum tubes, 7,200 crystal
diodes, 10,000 capacitors and 1,500 relays
1000 times faster than MARK-I
Performed simple arithmetic and advanced operations
Used the DECIMAL SYSTEM for representing and
processing values
• EDVAC (1949):
Electronic Discrete Variable Automatic Computer,
devised by Eckert and Mauchly in 1949
Worked on the principle of STORED
PROGRAMS (program and data are treated as
strings of BINARY DIGITS)
Units were: magnetic tape, control unit, dispatcher
unit, processor, timer, dual memory, and three
temporary tanks to hold a single word
• EDSAC (1949):
Electronic Delay Storage Automatic Calculator,
developed by Maurice Wilkes in 1949
Vacuum tube technology was used for basic circuits and
MERCURY DELAY LINES for memory construction
Input unit - punched tape
Output unit - teleprinter
Able to carry out 650 instructions per second
• UNIVAC (1951):
Universal Automatic Computer, developed by the
ECKERT-MAUCHLY corporation in 1951
5,200 vacuum tubes were used for the basic logic
circuits and MERCURY DELAY LINES for memory
construction
It could process both numbers and alphabets
Since it provided separate processors for handling
input, output and processing, it was UNIQUE
among the early computers