Artificial Intelligence: Introduction, Typical Applications. State Space Search: Depth Bounded DFS, Depth First Iterative Deepening. Heuristic Search: Heuristic Functions, Best First Search, Hill Climbing, Variable Neighborhood Descent, Beam Search, Tabu Search. Optimal Search: A* Algorithm, Iterative Deepening A*, Recursive Best First Search, Pruning the CLOSED and OPEN Lists
This presentation is about the planning process in AI, focusing on Partial Order Planning (POP). Other planning approaches exist as well; here, a worked example is used to briefly explain how planning is done in AI.
Natural language processing provides a way for humans to interact with computers and machines by means of voice. Google Search by voice is a well-known example that makes use of natural language processing.
In the field of artificial intelligence (AI), planning refers to the process of developing a sequence of actions or steps that an intelligent agent should take to achieve a specific goal or solve a particular problem. AI planning is a fundamental component of many AI systems and has applications in various domains, including robotics, autonomous systems, scheduling, logistics, and more. Here are some key aspects of planning in AI:
Definition of Planning: Planning involves defining a problem, specifying the initial state, setting a goal state, and finding a sequence of actions or a plan that transforms the initial state into the desired goal state while adhering to certain constraints.
State-Space Representation: In AI planning, the problem is often represented as a state-space, where each state represents a snapshot of the system, and actions transform one state into another. The goal is to find a path through this state-space from the initial state to the goal state.
Search Algorithms: AI planning typically relies on search algorithms to explore the state-space efficiently. Uninformed search algorithms, such as depth-first search and breadth-first search, can be used, as well as informed search algorithms, like A* search, which incorporates heuristics to guide the search.
Heuristics: Heuristics are used in planning to estimate the cost or distance from a state to the goal. Heuristic functions help inform the search algorithms by providing an estimate of how close a state is to the solution. Good heuristics can significantly improve the efficiency of the search.
Plan Execution: Once a plan is generated, the next step is plan execution, where the agent carries out the actions in the plan to achieve the desired goal. This often requires monitoring the environment to ensure that the actions are executed as planned.
Temporal and Hierarchical Planning: In more complex scenarios, temporal planning deals with actions that have temporal constraints, and hierarchical planning involves creating plans at multiple levels of abstraction, making planning more manageable in complex domains.
Partial and Incremental Planning: Sometimes, it may not be necessary to create a complete plan from scratch. Partial and incremental planning allows agents to adapt and modify existing plans to respond to changing circumstances.
Applications: Planning is used in a wide range of applications, from manufacturing and logistics (e.g., scheduling production and delivery) to robotics (e.g., path planning for robots) and game playing (e.g., chess and video games).
Challenges: Challenges in AI planning include dealing with large search spaces, handling uncertainty, addressing resource constraints, and optimizing plans for efficiency and performance.
AI planning is a critical component in creating intelligent systems that can autonomously make decisions and solve complex problems.
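The state-space search machinery described above can be made concrete with a minimal A* sketch. The graph, step costs, and heuristic values below are purely illustrative; a real planner would derive states and actions from the problem description.

```python
import heapq

def a_star(start, goal, neighbors, h):
    """Generic A* search: neighbors(state) yields (next_state, step_cost);
    h(state) is an admissible heuristic estimate of cost-to-goal."""
    frontier = [(h(start), 0, start, [start])]   # (f, g, state, path)
    closed = set()
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return path, g
        if state in closed:                      # prune revisits (CLOSED list)
            continue
        closed.add(state)
        for nxt, cost in neighbors(state):
            if nxt not in closed:
                heapq.heappush(frontier,
                               (g + cost + h(nxt), g + cost, nxt, path + [nxt]))
    return None, float("inf")

# Toy state space with hand-picked heuristic values.
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)],
         "C": [("D", 1)], "D": []}
h = {"A": 3, "B": 2, "C": 1, "D": 0}
path, cost = a_star("A", "D", lambda s: graph[s], lambda s: h[s])
```

Here A* finds the cheap detour A-B-C-D (total cost 3) rather than the direct but expensive edges, because the heuristic guides expansion toward low f = g + h states.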
Knowledge representation and reasoning (KR) is the field of artificial intelligence (AI) dedicated to representing information about the world in a form that a computer system can utilize to solve complex tasks, such as diagnosing a medical condition or having a dialog in a natural language.
An introduction to the Transformers architecture and BERT (Suman Debnath)
The transformer is one of the most popular state-of-the-art (SOTA) deep learning architectures, used mostly for natural language processing (NLP) tasks. Ever since its advent, the transformer has replaced RNNs and LSTMs for various tasks. It also created a major breakthrough in the field of NLP and paved the way for revolutionary new architectures such as BERT.
Much data is sequential – think speech, text, DNA, stock prices, financial transactions and customer action histories. Modern methods for modelling sequence data are often deep learning-based, composed of either recurrent neural networks (RNNs) or attention-based Transformers. A tremendous amount of research progress has recently been made in sequence modelling, particularly in the application to NLP problems. However, the inner workings of these sequence models can be difficult to dissect and intuitively understand.
This presentation/tutorial will start from the basics and gradually build upon concepts in order to impart an understanding of the inner mechanics of sequence models – why do we need specific architectures for sequences at all, when you could use standard feed-forward networks? How do RNNs actually handle sequential information, and why do LSTM units help longer-term remembering of information? How can Transformers do such a good job at modelling sequences without any recurrence or convolutions?
In the practical portion of this tutorial, attendees will learn how to build their own LSTM-based language model in Keras. A few other use cases of deep learning-based sequence modelling will be discussed – including sentiment analysis (prediction of the emotional valence of a piece of text) and machine translation (automatic translation between different languages).
The goals of this presentation are to provide an overview of popular sequence-based problems, impart an intuition for how the most commonly-used sequence models work under the hood, and show that quite similar architectures are used to solve sequence-based problems across many domains.
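The question of how RNNs handle sequential information can be made concrete with a minimal forward pass in NumPy. Sizes and random weights below are illustrative; a real model would be trained (e.g. in Keras, as the practical portion describes).

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 4, 8                       # illustrative dimensions
Wx = rng.normal(0, 0.1, (d_hid, d_in))   # input-to-hidden weights
Wh = rng.normal(0, 0.1, (d_hid, d_hid))  # hidden-to-hidden weights (the recurrence)
b = np.zeros(d_hid)

def rnn_forward(xs):
    """Vanilla RNN forward pass: h_t = tanh(Wx x_t + Wh h_{t-1} + b)."""
    h = np.zeros(d_hid)
    states = []
    for x in xs:                         # the same weights are reused at every step
        h = np.tanh(Wx @ x + Wh @ h + b)
        states.append(h)
    return states

seq = [rng.normal(size=d_in) for _ in range(5)]
states = rnn_forward(seq)
```

The key point the sketch shows is weight sharing across time: each hidden state depends on the current input and the previous hidden state, which is what lets the network carry information along the sequence (and what LSTMs improve on for long-range dependencies).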
The Text Classification slides contain research results about possible natural language processing algorithms. Specifically, they give a brief overview of the natural language processing steps, the common algorithms used to transform words into meaningful vectors/data, and the algorithms used to learn from and classify the data.
High level introduction to text mining analytics, which covers the building blocks or most commonly used techniques of text mining along with useful additional references/links where required for background/literature and R codes to get you started.
Machine Translation (MT) refers to the use of computers for the task of translating automatically from one language to another. The differences between languages, and especially the inherent ambiguity of language, make MT a very difficult problem. Traditional approaches to MT have relied on humans supplying linguistic knowledge in the form of rules to transform text in one language to another. Given the vastness of language, this is a highly knowledge-intensive task. Statistical MT is a radically different approach that automatically acquires knowledge from large amounts of training data. This knowledge, which is typically in the form of probabilities of various language features, is used to guide the translation process. This report provides an overview of MT techniques, and looks in detail at the basic statistical model.
2. Syllabus
Natural Language Processing
Natural Language Models
Syntactic Analysis
Augmented Grammar
Semantic Interpretation
Machine Translation
Ambiguity and Disambiguation
Discourse Understanding
Grammar Induction
3. Natural Language Processing (NLP)
NLP allows computer interaction with humans
A field in Artificial Intelligence
[Fig 1: NLP at the intersection of computer science and linguistics]
[Fig 2: NLP as a research area within AI and ML]
4. Trending Topics in Natural Language Processing
Natural Language Understanding
Natural Language Generation
Text Extraction
Language Translation
Parsing
Parts of Speech Tagging
[Fig 3: Projection of NLP projects]
8. Research Topics in Natural Language Processing
Named Entity Recognition
Hamming Problem
Neural network-based transition parsing
Ontology
Dependency parsing
Query entity recognition & disambiguation
Sentiment analysis & mining
Text Categorization and Summarization
Online Browsing
Text Mining
Plagiarism Detection
Information retrieval
Machine translation
Speech recognition
Deep learning in NLP
Opinion analysis & mining
Text-to-3D scene generation
Sentence completion
9. Applications of Natural Language Processing
Biomedical
Forensic Science
Advertisement
Education
Politics
E-governance
Business Development
Marketing
10. Tools & Software: Purpose
Stanford NLP: provides model files for analysis of English; written in Java
Apache OpenNLP: supports common NLP tasks such as tokenization and sentence segmentation
JGibbLDA: used for LDA parameter estimation & inference; implemented in Java
ScalaNLP: umbrella project for several libraries, including Breeze and Epic
Apache Lucene Core: full-featured text search engine library implemented in Java
GATE NLP: Java suite of tools including information-extraction support for various languages
NLTK: build Python programs to work with human language
11. NLP Stages
Editors: Jupyter Notebook, Google Colab, PyCharm
Software libraries: NLTK, TensorFlow, Keras, PyTorch
Stages (from lowest to highest level): Lexical Analysis, Syntax Analysis, Semantic Analysis, Discourse Integration, Pragmatic Analysis
33. Parsing & Approaches – Syntax Analysis
Parsing: taking input text and giving a structural representation to it after checking the syntax as per a formal grammar (rules).
Top-down Parsing: the parser starts constructing the parse tree from the start symbol and then tries to transform the start symbol into the input.
Bottom-up Parsing: the parser starts with the input symbols and tries to construct the parse tree up to the start symbol.
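Top-down parsing can be sketched as a recursive-descent recognizer. The toy grammar and lexicon below are illustrative; note that a plain recursive-descent parser like this cannot handle left-recursive rules.

```python
# Toy grammar: RHS alternatives per non-terminal; lowercase strings are terminals.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["flight"], ["meal"]],
    "V":   [["includes"]],
}

def parse(symbol, words, i):
    """Try to derive `symbol` from words[i:]; return the new position or None."""
    if symbol not in GRAMMAR:                    # terminal: must match the next word
        return i + 1 if i < len(words) and words[i] == symbol else None
    for rhs in GRAMMAR[symbol]:                  # try each expansion, top-down
        j = i
        for sym in rhs:
            j = parse(sym, words, j)
            if j is None:
                break
        else:
            return j                             # whole RHS matched
    return None

def accepts(sentence):
    words = sentence.lower().split()
    return parse("S", words, 0) == len(words)
```

Starting from S, the recognizer expands non-terminals until it reaches terminals, exactly mirroring the "start symbol down to the input" description of top-down parsing.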
35. How to Get/Solve This Structure: the Grammar of POS
Parts of Speech (POS)? Subject, Object, Predicate!
Grammar?
36. Grammar – Example
S -> NP VP
S -> VP NP
S -> NP VP NP
NP -> Det Noun | Noun | Nominal
VP -> Verb NP | V | Verb NP PP | V PP
V -> Verb
Det -> Det | Article | Aux
(The LHS of each rule is a non-terminal; the RHS may mix terminals and non-terminals.)
37. Context-Free Grammar / Backus–Naur Form / Phrase Structure Grammar
A CFG has 4 components: G = {V, T, P, S}
Set of Non-Terminals (denoted by V): syntactic variables that denote sets of strings, such as verb phrase or noun phrase (the LHS of a grammar rule).
Set of Terminals (denoted by T): strings that cannot be subdivided further, like noun, verb, determinant, article, auxiliary (the RHS of a grammar rule).
Set of Production Rules (denoted by P): rules defining how terminals and non-terminals can be combined. Every production consists of a non-terminal, an arrow, and terminals/non-terminals.
Start Symbol (denoted by S): the derivation begins from the start symbol. A non-terminal symbol is always designated as the start symbol.
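The four components G = {V, T, P, S} map directly onto code. The sketch below uses an illustrative grammar, derives V and T from the productions, and generates a sentence by expanding from S.

```python
import random

# P: production rules, non-terminal -> list of right-hand sides (illustrative).
P = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["flight"], ["meal"]],
    "V":   [["includes"], ["serves"]],
}
V = set(P)                                       # non-terminals: the LHS symbols
T = {s for rhss in P.values() for rhs in rhss
     for s in rhs if s not in P}                 # terminals: RHS-only symbols
S = "S"                                          # start symbol

def generate(symbol=S):
    """Expand a symbol into a string of terminals by random derivation."""
    if symbol in T:
        return [symbol]
    rhs = random.choice(P[symbol])
    return [w for sym in rhs for w in generate(sym)]

sentence = " ".join(generate())
```

Every derivation from this particular grammar yields a five-word Det N V Det N sentence, which makes the terminal/non-terminal split easy to see.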
38. CFG – Construct the Parse
To solve: "The flight includes a meal"
S -> NP VP NP
S -> NP VP
S -> VP NP
VP -> V NP
NP -> Det N
V -> Verb
Verb -> includes
Det -> the
Det -> a
N -> flight
N -> meal
Resulting parse tree (bracketed form):
[S [NP [Det The] [N flight]] [VP [V includes] [NP [Det a] [N meal]]]]
39. Some Difficult Examples
From the newspapers:
"Squad helps dog bite victim."
"Helicopter powered by human flies."
"Levy won't hurt the poor."
"Once-sagging cloth diaper industry saved by full dumps."
Ambiguities:
Lexical: meanings of "hot", "back".
Syntactic: "I heard the music in my room."
Referential: "The cat ate the mouse. It was ugly."
40. CKY Uses Chomsky Normal Form
When a sentence contains ambiguity or recursive/repeated substructure, how do we resolve it? With CNF.
Allowed rules in CNF:
A -> b (a single terminal)
A -> B C (exactly two non-terminals)
So NP -> the N is incorrect in CNF, and we introduce a dummy variable:
NP -> Det N, with Det -> the (dummy variable); similarly NP -> N PP
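Given a grammar in CNF, the CKY algorithm fills a triangular table of non-terminals over ever-wider spans. A minimal recognizer sketch, with an illustrative grammar:

```python
from itertools import product

# CNF grammar: binary rules A -> B C and lexical rules A -> word (illustrative).
BINARY = {("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}, ("V", "NP"): {"VP"}}
LEXICAL = {"the": {"Det"}, "a": {"Det"}, "flight": {"N"},
           "meal": {"N"}, "includes": {"V"}}

def cky(words):
    """CKY recognition: table[i][j] holds non-terminals deriving words[i:j]."""
    n = len(words)
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        table[i][i + 1] = set(LEXICAL.get(w, ()))
    for span in range(2, n + 1):                 # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):            # try every split point
                for B, C in product(table[i][k], table[k][j]):
                    table[i][j] |= BINARY.get((B, C), set())
    return "S" in table[0][n]
```

Because every cell stores a set of non-terminals, ambiguous and repeated substructures are handled once per span rather than re-derived, which is exactly what CNF buys the algorithm.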
44. Augmented Grammar
If, on the fly, a given sentence generates a new grammar rule, add that rule to the rule table; this is referred to as AG.
A -> B, B -> C, A -> C
e.g. Students like coffee. / Todd likes coffee. / *Todd like coffee. (agreement violation)
Examples:
S -> NP[number] VP[number]
NP[number] -> N[number]
N[number=singular] -> "Todd"
N[number=plural] -> "students"
VP[number] -> V[number] NP
V[number=singular] -> "likes"
V[number=plural] -> "like"
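The number agreement encoded by the [number] feature above can be sketched as a lexicon lookup plus a unification-style equality check (the lexicon mirrors the slide's rules and is illustrative):

```python
# Toy lexicon carrying the [number] feature from the augmented grammar.
NOUNS = {"todd": "singular", "students": "plural"}
VERBS = {"likes": "singular", "like": "plural"}

def agrees(sentence):
    """Check S -> NP[number] VP[number]: subject and verb features must match."""
    subj, verb, *rest = sentence.lower().split()
    n_feat = NOUNS.get(subj)
    v_feat = VERBS.get(verb)
    return n_feat is not None and n_feat == v_feat
```

This is the essence of feature augmentation: the plain CFG rule S -> NP VP is kept, but the derivation only succeeds when the features carried by each constituent unify.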
46. WordNet
Contains a database of nouns, verbs, adjectives & adverbs.
Ambiguity: a single word having multiple meanings (e.g. bank)
Synonyms: similar words (e.g. big/large, fare/price)
Antonyms: (big/small, good/bad, fast/slow)
Complementary pairs: (male/female, alive/dead, present/absent)
Relational pairs: (married = not single, single = not married)
Hyponymy: a kind-of relation (car is a kind of vehicle)
Meronymy: part of a whole (apple <- apple tree)
Holonymy: whole to a part (apple tree -> apple)
47. Semantic Analysis – Building Blocks
Entities: represent individuals such as a particular person, location, etc. For example, Haryana, India, and Ram are all entities.
Concepts: represent the general category of individuals, such as a person, city, etc.
Relations: represent the relationship between entities and concepts. For example, Ram is a person.
Predicates: represent the verb structures. For example, semantic roles and case grammar are examples of predicates.
48. Semantic Analysis
Two approaches: FOL / WordNet
First-Order Logic examples:
"Flight 707 serves lunch": S -> NP VP (DCL(NP VP))
"Serve lunch": S -> VP (IMP(VP NP))
"Does Flight 707 serve lunch?": S -> Aux NP VP (YNQ(NP VP))
"Which flight serves lunch?": S -> (WHQ(NP VP))
"Atlanta's airport": S -> N VIP (GN(N))
"I told Harry to go to the queen" (infinitive verb phrase): S -> NP VP NP (S -> NP λ VP NP)
50. Discourse Integration
Let S0 and S1 represent the meanings of two related sentences (text coherence).
Result: infers that the state asserted by S0 could cause the state asserted by S1.
e.g. Ram was caught in the fire. His skin burned.
Explanation: infers that the state asserted by S1 could cause the state asserted by S0.
e.g. Ram fought with Shyam's friend. He was drunk.
Parallel: infers p(a1,a2,…) from S0 and p(b1,b2,…) from S1, where ai and bi are similar for all i.
e.g. Ram wanted a car. Shyam wanted money.
Elaboration: infers the same proposition P from both assertions S0 and S1.
e.g. Ram was from Chandigarh. Shyam was from Kerala.
Occasion: a change of state can be inferred from the assertion of S0, the final state of which can be inferred from S1, and vice versa.
e.g. Ram picked up the book. He gave it to Shyam.
51. Example of Discourse Integration
S1: Ram went to the bank to deposit money.
S2: He then took a train to Shyam's cloth shop.
S3: He wanted to buy some clothes.
S4: He did not have new clothes for the party.
S5: He also wanted to talk to Shyam regarding his health.
52. Grammar Induction
Unsupervised learning of a language's syntax from a corpus of observed sentences:
– ability to uncover an underlying grammar
– ability to parse
– ability to judge grammaticality
Solve using parsing (1. CFG, 2. CKY form) to generate a parse tree, or with language models such as a Markov chain model.
Demonstration of sentence completion using grammar induction with a neural network.
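The Markov chain language model mentioned above can be sketched for sentence completion. The tiny corpus below is illustrative; a real model would be estimated from a large corpus.

```python
from collections import defaultdict
import random

corpus = ("the flight includes a meal . the flight serves lunch . "
          "ram went to the bank . he wanted to buy some clothes .").split()

# First-order Markov chain: sample P(next | current) from bigram counts.
chain = defaultdict(list)
for w, nxt in zip(corpus, corpus[1:]):
    chain[w].append(nxt)

def complete(word, max_len=8, seed=0):
    """Complete a sentence by walking the bigram chain from a seed word."""
    random.seed(seed)
    out = [word]
    while out[-1] in chain and out[-1] != "." and len(out) < max_len:
        out.append(random.choice(chain[out[-1]]))
    return " ".join(out)
```

Each generated word is guaranteed to have followed its predecessor somewhere in the corpus, which is the whole inductive bias of the model: local word-to-word statistics stand in for an explicit grammar.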
54. Speech Recognition
Human languages are limited to a set of about 40 to 50 distinct sounds called phones, e.g.:
[eh] bet
[ah] but
[oy] boy
[em] bottom
[en] button
These phones are characterized in terms of acoustic features, e.g., frequency and amplitude, that can be extracted from the sound waves.
55. Difficulties
Why isn't this easy? Just develop a dictionary of pronunciation:
e.g., coat = [k] + [ow] + [t] = [kowt]
But: "recognize speech" can be heard as "wreck a nice beach".
Problems:
Homophones: different fragments sound the same (e.g., rec and wreck)
Segmentation: determining breaks between words (e.g., "nize speech" vs. "nice beach")
Signal processing problems
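The segmentation problem can be illustrated with a greedy longest-match segmenter over an unsegmented stream of characters. The lexicon is illustrative, and note how the greedy choice mis-segments the classic example:

```python
# Greedy longest-match word segmentation over an unsegmented stream.
LEXICON = {"wreck", "a", "nice", "beach", "recognize", "speech", "an", "ice"}

def max_match(stream):
    """Repeatedly peel off the longest lexicon word from the front of the stream."""
    words, i = [], 0
    while i < len(stream):
        for j in range(len(stream), i, -1):      # longest candidate first
            if stream[i:j] in LEXICON:
                words.append(stream[i:j])
                i = j
                break
        else:
            return None                          # no lexicon word matches here
    return words
```

On "wreckanicebeach" the greedy segmenter prefers "an" over "a" and produces "wreck an ice beach", showing why segmentation needs more than dictionary lookup, e.g. language-model scores over candidate segmentations.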
56. Speech Recognition Architecture
Large vocabulary, continuous speech (words not separated), speaker-independent.
Pipeline:
Speech Waveform
-> Feature Extraction (Signal Processing)
-> Spectral Feature Vectors
-> Phone Likelihood Estimation (Gaussians or Neural Networks; a neural net can supply the likelihoods)
-> Phone Likelihoods P(o|q)
-> Decoding (Viterbi or Stack Decoder, using an N-gram Grammar and HMM Lexicon)
-> Words
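The Viterbi decoding step in the pipeline can be sketched over a toy HMM. The two phone states, transition probabilities, and per-frame observation log-likelihoods below are all illustrative:

```python
import math

def viterbi(obs_loglik, states, log_trans, log_init):
    """Most likely hidden-state sequence given per-frame observation
    log-likelihoods log P(o_t | q) and HMM transition/initial log-probs."""
    best = {s: log_init[s] + obs_loglik[0][s] for s in states}
    backptrs = []
    for frame in obs_loglik[1:]:
        prev, best, ptr = best, {}, {}
        for s in states:                         # best predecessor for each state
            p, r = max((prev[q] + log_trans[q][s], q) for q in states)
            best[s] = p + frame[s]
            ptr[s] = r
        backptrs.append(ptr)
    state = max(best, key=best.get)              # best final state
    path = [state]
    for ptr in reversed(backptrs):               # trace back through pointers
        state = ptr[state]
        path.append(state)
    return path[::-1]

# Toy 2-phone HMM: frame 1 sounds like [k], frame 2 sounds like [ow].
lg = math.log
obs = [{"k": lg(0.9), "ow": lg(0.1)}, {"k": lg(0.2), "ow": lg(0.8)}]
init = {"k": lg(0.6), "ow": lg(0.4)}
trans = {"k": {"k": lg(0.3), "ow": lg(0.7)},
         "ow": {"k": lg(0.5), "ow": lg(0.5)}}
path = viterbi(obs, ["k", "ow"], trans, init)
```

Working in log space keeps the products of small probabilities numerically stable; in a full recognizer the same dynamic program runs over phone HMMs constrained by the lexicon and N-gram grammar.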
57. Signal Processing
Sound is an analog energy source resulting from pressure waves striking an eardrum or microphone.
A device called an analog-to-digital converter can be used to record the speech sounds.
Sampling rate: the number of times per second that the sound level is measured.
Quantization factor: the maximum number of bits of precision for the sound-level measurements.
e.g., telephone: 3 kHz (3000 times per second)
e.g., speech recognizer: 8 kHz with 8-bit samples, so that 1 minute takes about 500K bytes.
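The storage figure on the slide follows directly from the sampling parameters:

```python
# Storage for one minute of speech at an 8 kHz sampling rate with 8-bit samples.
sampling_rate = 8_000        # samples per second
bits_per_sample = 8          # quantization: 1 byte per sample
seconds = 60

total_bytes = sampling_rate * (bits_per_sample // 8) * seconds
# 8000 samples/s * 1 byte * 60 s = 480,000 bytes, roughly the "500K bytes" quoted
```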