This document provides a syllabus and lecture materials for an introductory computing skills course. The syllabus covers an introduction to computers, operating systems, Internet usage, Microsoft Word, and Microsoft Excel. The lecture materials define the need for computers and describe their evolution through generations built on vacuum tubes, transistors, integrated circuits, and microprocessors. They also cover different types of computers (analog, digital, and hybrid), as well as computers categorized by size as supercomputers, mainframes, mini computers, and personal computers. The document explains how a computer functions by accepting input, storing and processing data, and providing output.
General features of computer – Evolution of computers; Computer Applications – Data Processing – Information Processing – Commercial – Office Automation – Industry and Engineering – Healthcare – Education – Disruptive technologies.
Overview of a Computer System: Introduction. This unit explores the basics of computer systems, including their evolution, operation, classification, and components.
INTRODUCTION
Today, almost all of us make use of computers in one way or another. Computers find applications in various fields: education, entertainment, agriculture, engineering, medicine, commerce, research, and others.
Not only in these sophisticated areas, but also in our daily lives, computers have become indispensable.
They are present everywhere, in all the devices that we use daily, such as cars, games, washing machines, and microwaves, and in day-to-day activities like banking, reservations, electronic mail, and the Internet.
Nothing epitomizes modern life better than the computer. Computers are such an integral part of everyday life that most people now take them, and what they have added to life, totally for granted; even more so the generation that has grown up from infancy within the global desktop and laptop revolution since the 1980s. The history of computer development is often described in terms of the different generations of computing devices. A generation refers to the state of improvement in the product development process; the term is also used for the successive advances in computer technology. As new technology emerged, it was used in the making of computers. With each new generation, the circuitry has become smaller and more advanced than in the generation before it. As a result of this miniaturization, speed, power, and computer memory have increased proportionally. New discoveries that affect the way we live, work, and play are constantly being made.
2. SYLLABUS FOR SEM 1
1. Introduction to Computers
2. Operating System
3. Internet and purposive surfing
4. Microsoft Word
5. Microsoft Excel
Mukesh N Tekwani (2019)
3. INTRODUCTION TO COMPUTERS
Session 1
1. Need of Computers
2. Evolution of Computers
3. Various Types of Computers
4. How Does a Computer Function
4. INTRODUCTION TO COMPUTERS
Need Of Computers
Where are computers used?
Commercial applications – data processing
Scientific applications – large calculations, simulations
Medical applications – operating medical equipment
Storing data
Communications – Internet, email, instant messaging, social media
Power generation
Teaching and Learning …. (add your points)
5. INTRODUCTION TO COMPUTERS
Need Of Computers
The computer is a number-crunching machine
Video, audio, movie production – all computerised
Weather prediction – relies heavily on computers
Military applications
Space travel and space exploration – not possible without computers
Sports – look at how much technology is used in cricket
Drug manufacturing and research – development of new molecules
6. INTRODUCTION TO COMPUTERS
Characteristics Of Computers
The computer is very fast at doing calculations.
Millions and millions of calculations per second
Very fast at retrieving or accessing data
Does not get “bored” like a human being
Will carry out all instructions given by the programmer
Does not have a “mood”, so its output is consistent
Cannot think on its own – we give the instructions and it will obey them
Does not have a “heart”, “feelings”, “emotions”, or “common sense”
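The claim of millions of calculations per second is easy to check on any modern machine. A minimal sketch using Python's standard `time` module (the choice of ten million additions is arbitrary, and pure-Python loops are far slower than the hardware itself):

```python
import time

# Time ten million integer additions in plain Python.
start = time.perf_counter()
total = 0
for i in range(10_000_000):
    total += i
elapsed = time.perf_counter() - start

print(f"{total:,} computed in {elapsed:.2f} s")
# Even interpreted Python manages millions of additions per second;
# compiled code running directly on the CPU is orders of magnitude faster.
```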
9. INTRODUCTION TO COMPUTERS
Evolution Of Computers
First Generation of Computers – 1940s – 1950s
1. ENIAC – Electronic Numerical Integrator and Computer
2. First electronic computers
3. Used Vacuum Tubes
4. These computers were huge and difficult to use
5. Designed and built for the US Army
6. It did not use binary code; it worked in decimal
11. INTRODUCTION TO COMPUTERS
Evolution Of Computers
Second Generation of Computers – 1955 – 1960
1. These computers used transistors
2. These transistors were reliable and easy to handle
3. They required much less power to operate
4. Magnetic disk storage was used in this period
5. Programming languages like COBOL, FORTRAN, ALGOL and SNOBOL were developed during this period
6. Concepts of multiprogramming and batch-processing operating systems were introduced
7. IBM 1401 was the most popular computer of this period
13. INTRODUCTION TO COMPUTERS
Evolution Of Computers
Third Generation of Computers – 1960s – 1970s
1. These computers used Integrated Circuits (ICs)
2. ICs are very small in size and so computer size reduced
3. Power consumption was much lower compared to the previous two generations
4. IBM System/360 was the most popular computer of this period
5. Minicomputers were introduced in this generation
6. Large-capacity magnetic disks and tapes were used for storing large amounts of data
7. FORTRAN and COBOL became very popular for scientific and business applications
15. INTRODUCTION TO COMPUTERS
Evolution Of Computers
Fourth Generation of Computers – 1971 – 1980s
1. These computers used Very Large Scale Integration (VLSI) circuits
2. Microcomputers of this generation used very small circuits, which reduced the size of the computer
3. Power consumption was lower
4. These computers were more durable, affordable, powerful, compact and reliable
5. Personal Computers (PCs) were developed during this period
6. Time-sharing, real-time networks, and distributed operating systems were used
7. High-level languages like C and C++ were developed and used
16. INTRODUCTION TO COMPUTERS
Evolution Of Computers
Fifth Generation of Computers – 1980s to present
1. This topic is for you to research
2. What do you think are the most important features a computer has today, without which you think it would be very difficult to use a computer?
3. What do you think are the features of a computer that make it not very easy to use?
4. What are the features you would like to see in future computers? Why do you want those features?
17. VARIOUS TYPES OF COMPUTERS
Analog Computers
Digital Computers
Hybrid Computers
18. VARIOUS TYPES OF COMPUTERS
ANALOG COMPUTERS
1. These computers work by measuring analog quantities
2. Analog quantities are those that change continuously
3. Examples of analog quantities are temperature, pressure, motion, volume, and voltage
4. Such a computer works on a continuous supply of electrical signals and displays output continuously
5. Such computers are used for simulations of aircraft, nuclear power plants, and chemical processes
6. Accuracy is low
7. Simple examples: thermometer, light meter, sound meter, medical equipment
19. VARIOUS TYPES OF COMPUTERS
DIGITAL COMPUTERS
1. These computers work with digits.
2. They use the binary digits 0 and 1 to represent EVERYTHING in the computer
3. Processing of these 0 and 1 bits takes place very fast
4. Digital circuits are manufactured with very high precision, and these computers have a long life and high accuracy
5. Most modern computers are digital
6. Mobile phones, tablets, wrist watches, smart TVs… so many devices we use now are all digital devices
7. A computer program operates on data supplied to the computer. An operating system is essential for these computers.
8. This data is processed to give an output
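The point that a digital computer represents everything in binary can be seen directly with Python's built-in conversion functions (a minimal sketch; the number and text chosen here are arbitrary):

```python
# Numbers, text - everything inside a digital computer is bits (0s and 1s).
number = 19
print(bin(number))        # 0b10011 : decimal 19 written in binary
print(int("10011", 2))    # 19      : a binary string converted back to decimal

text = "Hi"
# Each character is stored as a byte, i.e. a pattern of 0s and 1s.
print([bin(b) for b in text.encode("ascii")])   # ['0b1001000', '0b1101001']
```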
20. VARIOUS TYPES OF COMPUTERS
HYBRID COMPUTERS
1. These computers work with digital and analog data.
2. They combine the best features of analog and digital computers
3. These are used for scientific and industrial applications
4. Medical equipment which captures analog data and gives digital output is an example of a hybrid computer
21. TYPES OF COMPUTERS BY SIZE
1. Supercomputer
2. Mainframe Computer
3. Mini Computer
4. Personal Computer
22. SUPERCOMPUTER
1. These are the most powerful computers in use
2. Used to process very large amounts of data
3. Typical applications:
   1. military,
   2. weather forecasting,
   3. large financial transaction processing,
   4. simulations of physical processes,
   5. climate research,
   6. oil and natural gas exploration...
4. Their performance is measured in FLoating-point Operations Per Second (FLOPS)
5. Typical speeds are a quadrillion FLOPS (1 quadrillion = 10^15)
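As a rough worked example of what the FLOPS figure means (the workload size below is hypothetical, chosen only for illustration), a machine rated at 10^15 FLOPS would finish 10^18 floating-point operations in about a quarter of an hour:

```python
# Rough arithmetic sketch (hypothetical workload, not a benchmark):
# a supercomputer rated at 1 quadrillion FLOPS = 10**15 operations/second.
flops = 10**15                 # machine speed, operations per second
workload = 10**18              # a hypothetical simulation: 10**18 operations
seconds = workload / flops
print(seconds)                 # 1000.0 seconds, i.e. about 17 minutes
```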
23. MAINFRAME COMPUTER
1. These are designed to handle large amounts of data for financial, scientific and research purposes
2. Typical applications: railway and airline reservation systems
3. Many terminals are connected to the mainframe computer
4. Terminals are used only for input and output
5. Terminals do not process data; they are used only for i/p and o/p
24. MAINFRAME COMPUTER
6. Terminals send commands to and receive data from the mainframe computer. Terminals use data stored in databases on mainframe computers
7. Mainframe computers have large disk storage and memory
8. They also use magnetic tapes for storing data for longer periods
9. Many printers can be connected to a mainframe
25. MINI COMPUTERS
1. A mini computer has properties and capabilities between those of a mainframe and a personal computer
2. These computers can handle large amounts of data
3. More than one printer and many terminals can be connected to a mini computer
4. These are useful for small business houses as they are cheaper than mainframe computers
26. PERSONAL COMPUTERS
1. Almost all computers that we use today in our homes, colleges, and small offices are personal computers
2. Examples of PCs are desktop and laptop computers
3. Personal computers have a large amount of hard disk space (about 1 TB) and can be connected to the Internet with a modem/router
4. They can be connected to a printer, scanner, digital camera, or audio/video system such as a projector
5. They are used for word processing, spreadsheets, databases, communication, art, the Internet, and entertainment
6. PCs can be interconnected to form a network
27. HOW DOES A COMPUTER FUNCTION?
Every computer performs 5 basic functions:
1. It accepts data and instructions as input
2. It stores data
3. It can process data as required by the user
4. It gives results in the form of output
5. The IPO cycle: Input – Process – Output
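The five basic functions above can be sketched as a tiny program (the function names here are hypothetical, chosen only to mirror the Input – Process – Output cycle; the data and the sorting step are arbitrary examples):

```python
def accept_input():
    # 1. Input: accept data (the program itself carries the instructions)
    return [3, 1, 2]

def process(data):
    # 2, 3. Store and process the data as required by the user (here: sorting)
    return sorted(data)

def give_output(result):
    # 4. Output: give the results back to the user
    print(result)

data = accept_input()
result = process(data)
give_output(result)        # prints [1, 2, 3]
```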