Control Systems (CS)
Lecture-1 & 2
Introduction to the Subject
Course Outline
Classical Control
• System Modelling
• Transfer Function
• Block Diagrams
• Signal Flow Graphs
• System Analysis
• Time Domain Analysis (Test Signals)
• Frequency Domain Analysis
• Bode Plots
• Nyquist Plots
• Nichols Chart
Modern Control
• State Space Modelling
• Eigenvalue Analysis
• Observability and Controllability
• Solution of State Equations (State Transition Matrix)
• State Space to Transfer Function
• Transfer Function to State Space
• Direct Decomposition of Transfer Function
• Cascade Decomposition of Transfer Function
• Parallel Decomposition of Transfer Function
Text Books
1. Modern Control Engineering (5th Edition), by Katsuhiko Ogata (Professor Emeritus, Mechanical Engineering, University of Minnesota).
2. Control Systems Engineering (6th Edition), by Norman S. Nise (Professor Emeritus, Electrical and Computer Engineering Department, California State Polytechnic University).
Reference Books
1. Feedback Control Systems (3rd Edition), Stefani and Savant, Oxford University Press.
2. Modern Control Systems (12th Edition), Richard C. Dorf and Robert H. Bishop.
3. Feedback Control of Dynamic Systems (5th Edition), Franklin and Powell, Pearson.
Prerequisites
• For Classical Control Theory
– Differential Equations
– Laplace Transform
– Basic Physics
– Ordinary and semi-logarithmic graph papers
• For Modern Control Theory: all of the above, plus
– Linear Algebra
– Matrices
Practical Sessions
• Software Based (Lab)
– Matlab
– Simulink
– Control System Toolbox
• Hardware Based (Instrumentation & Measurement Lab)
– Pressure Control Trainer
• Practicals are divided into two sessions
What is a Control System?
• A system that controls the operation of another system.
• A system that can regulate itself and another system.
• A control system is a device, or set of devices, that manages, commands, directs, or regulates the behaviour of other devices or systems.
Definitions
System – An interconnection of elements and devices for a desired purpose.
Plant – A plant performs a particular operation. It is the physical object to be controlled.
Control System – An interconnection of components forming a system configuration that will provide a desired response.
Process – The device, plant, or system under control. The input–output relationship represents the cause-and-effect relationship of the process.
[Diagram: Input → Process → Output]
Definitions
Controlled Variable – The quantity or condition that is measured and controlled. Normally, the controlled variable is the output of the control system.
Manipulated Variable – The quantity or condition that is varied by the controller so as to affect the value of the controlled variable.
Control – Control means measuring the value of the controlled variable of the system and applying the manipulated variable to the system to correct or limit the deviation of the measured value from a desired value.
Feedback Control – Refers to an operation that, in the presence of disturbances, tends to reduce the difference between the output of a system and some reference input, and does so on the basis of that difference.
Definitions
Disturbance – A signal that tends to adversely affect the value of the output of the system. It is an unwanted input to the system.
• If a disturbance is generated within the system, it is called an internal disturbance, while an external disturbance is generated outside the system.
[Block diagram: Input (set point / reference) → Controller → Manipulated Variable → Process → Output (controlled variable)]
Point of View
Control systems are made up of components, each of which:
• has input and output signal(s) (examples of signals: position, velocity, acceleration, temperature, voltage, current, concentration, etc.);
• processes its inputs to produce its outputs (outputs are caused by the interaction of the input and the component);
• is described by differential equations.
• Engineers produce mathematical models of the components so they
can design controllers.
Types of Control System
• Natural Control System
– Universe
– Human Body
• Man-made Control System
– Vehicles
– Aeroplanes
Types of Control System
• Manual Control Systems
– Room temperature regulation via electric fan
– Water level control
• Automatic Control Systems
– Room temperature regulation via air conditioner
– Human body temperature control
Open-Loop Control Systems utilize a controller or control actuator to obtain the desired response.
• The output has no effect on the control action.
• In other words, the output is neither measured nor fed back.
[Block diagram: Input → Controller → Process → Output]
Examples: washing machine, toaster, electric fan
Types of Control System
Open-Loop Control Systems
• Since in open-loop control systems the reference input is not compared with the measured output, for each reference input there is a fixed operating condition.
• Therefore, the accuracy of the system depends on calibration.
• The performance of an open-loop system is severely affected by the presence of disturbances, or by variation in operating/environmental conditions.
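The calibration-dependence of open-loop control can be sketched in a few lines. The toaster model, function names, and calibration values below are hypothetical, chosen only to show that a disturbance (here, a supply-voltage drop) changes the output and nothing corrects it:

```python
# Minimal open-loop control sketch (hypothetical toaster model).
# The controller picks a fixed heating time from a calibration table;
# the output (toast darkness) is never measured or fed back.

CALIBRATION = {"light": 60, "medium": 90, "dark": 120}  # setting -> seconds (assumed)

def toast_darkness(heat_seconds, line_voltage=230.0):
    """Plant model: darkness grows with heating time and supply voltage."""
    return heat_seconds * (line_voltage / 230.0) * 0.01

def open_loop_toaster(setting, line_voltage=230.0):
    heat_time = CALIBRATION[setting]                # control action fixed by calibration
    return toast_darkness(heat_time, line_voltage)  # output has no effect on the action

nominal = open_loop_toaster("medium")            # voltage as calibrated
disturbed = open_loop_toaster("medium", 207.0)   # 10% brown-out: same heat time applied
print(nominal, disturbed)                        # the disturbance goes uncorrected
```

The same heating time is applied regardless of conditions, which is exactly why open-loop accuracy depends entirely on calibration.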
Closed-Loop Control Systems utilize feedback to compare the actual output to the desired output response.
Examples: refrigerator, iron
Types of Control System
Closed-Loop Control Systems
[Block diagram: Input → Comparator → Controller → Process → Output, with a Measurement block feeding the output back to the Comparator]
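The comparator-controller-process loop can be sketched as a short simulation. The first-order plant model and the proportional gain below are illustrative assumptions, not part of the lecture:

```python
# Sketch of a closed-loop (feedback) system: a proportional controller
# drives a simple first-order process toward the set point.
# Plant model (dy/dt = -y + u) and gain kp are illustrative assumptions.

def simulate(setpoint, kp, steps=200, dt=0.05):
    y = 0.0  # measured output (controlled variable)
    for _ in range(steps):
        error = setpoint - y   # comparator: reference minus measurement
        u = kp * error         # controller: manipulated variable
        y += dt * (-y + u)     # process dynamics, forward-Euler step
    return y

print(simulate(setpoint=1.0, kp=10.0))  # settles near 10/11 of the set point
```

Note the residual steady-state error: pure proportional feedback on this plant settles at kp/(1 + kp) of the set point, a classic motivation for integral action.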
Multivariable Control System
Types of Control System
[Block diagram: a multivariable system in which a Comparator and Controller drive a Process with several outputs (temperature, humidity, pressure), each fed back through Measurements]
Feedback Control System
Types of Control System
• A system that maintains a prescribed relationship between the output and some reference input by comparing them and using the difference (i.e. the error) as a means of control is called a feedback control system.
• Feedback can be positive or negative.
[Block diagram: Input → (+/−) error → Controller → Process → Output, with a Feedback path from the Output to the summing junction]
Servo System
Types of Control System
• A Servo System (or servomechanism) is a feedback control system in
which the output is some mechanical position, velocity or acceleration.
[Figures: Antenna Positioning System; Modular Servo System (MS150)]
Linear Vs Nonlinear Control System
Types of Control System
• A control system in which the output varies linearly with the input is called a linear control system.
For example, a process described by y(t) = 3u(t) + 5 is linear: as u(t) ranges from 0 to 10, y(t) rises along a straight line from 5 to 35. Likewise, y(t) = -2u(t) + 1 falls along a straight line from 1 to -19 over the same input range.
[Plots: y(t) versus u(t) for y = 3u(t) + 5 and y = -2u(t) + 1, each a straight line]
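Taking the slide's example y(t) = 3u(t) + 5, a quick sketch can verify that the output varies linearly with the input: equal input increments give equal output increments (slope 3), independent of the operating point. The input values are the ones shown on the plot axis:

```python
# Check that y(t) = 3*u(t) + 5 responds linearly to the input:
# every step of 2 in u produces the same step of 6 in y.

def y(u):
    return 3 * u + 5

inputs = [0, 2, 4, 6, 8, 10]
outputs = [y(u) for u in inputs]
increments = [b - a for a, b in zip(outputs, outputs[1:])]
print(outputs)      # [5, 11, 17, 23, 29, 35]
print(increments)   # constant slope: equal steps in, equal steps out
```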
Linear Vs Nonlinear Control System
Types of Control System
• When the input and output have a nonlinear relationship, the system is said to be nonlinear.
[Plot: Adhesion Characteristics of Road — adhesion coefficient (0 to 0.4) versus creep (0 to 0.08), a nonlinear curve]
Linear Vs Nonlinear Control System
Types of Control System
• Linear control systems do not exist in practice.
• Linear control systems are idealized models fabricated by the analyst purely for the simplicity of analysis and design.
• When the magnitudes of the signals in a control system are limited to a range in which the system components exhibit linear characteristics, the system is essentially linear.
Linear Vs Nonlinear Control System
Types of Control System
• Temperature control of petroleum product in a distillation column.
[Plot: temperature (°C) versus valve position (0% to 100% open), with 500°C marked at 25% open — a nonlinear characteristic]
Time invariant vs Time variant
Types of Control System
• When the characteristics of the system do not depend upon time itself, the system is said to be a time-invariant control system.
• A time-varying control system is a system in which one or more parameters vary with time.
For example, y(t) = 2u(t) + 1 is time-invariant, whereas y(t) = 2u(t) + 3t is time-varying, since its output depends explicitly on time.
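Time invariance can be tested directly: delay the input by d and check whether the output is simply the original response delayed by d. The sketch below takes the two example systems as y(t) = 2u(t) + 1 (time-invariant) and y(t) = 2u(t) + 3t (time-varying); the test input and delay are arbitrary choices:

```python
# Time-invariance test: does delaying the input merely delay the output?

def ti_system(u, t):       # y(t) = 2*u(t) + 1   (time-invariant)
    return 2 * u(t) + 1

def tv_system(u, t):       # y(t) = 2*u(t) + 3*t (time-varying)
    return 2 * u(t) + 3 * t

u = lambda t: t ** 2       # an arbitrary test input
d = 1.5                    # delay applied to the input
t = 4.0                    # evaluation instant

u_delayed = lambda t_: u(t_ - d)
print(ti_system(u_delayed, t) == ti_system(u, t - d))  # True: shifting commutes
print(tv_system(u_delayed, t) == tv_system(u, t - d))  # False: explicit t breaks it
```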
Lumped parameter vs Distributed Parameter
Types of Control System
• Control systems that can be described by ordinary differential equations are lumped-parameter control systems.
• Distributed-parameter control systems, on the other hand, are described by partial differential equations.
For example, the mass-spring-damper model

M d²x/dt² + C dx/dt + kx = f(t)

is an ordinary differential equation (lumped parameters), whereas

f₁ ∂²x/∂y² + f₂ ∂²x/∂z² = g ∂x/∂z

is a partial differential equation (distributed parameters).
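Because the lumped-parameter model M·x'' + C·x' + k·x = f(t) involves time derivatives only, it can be simulated by stepping in time alone (a PDE would also need discretization in space). Below is a forward-Euler sketch; the parameter values are illustrative assumptions:

```python
# Forward-Euler integration of the lumped-parameter (ODE) model
#   M*x'' + C*x' + k*x = f(t),  with constant force f.
# M, C, k and the force are illustrative assumptions.

M, C, k = 1.0, 0.8, 4.0

def simulate(force, t_end=10.0, dt=0.001):
    x, v = 0.0, 0.0                        # position and velocity
    for _ in range(int(t_end / dt)):
        a = (force - C * v - k * x) / M    # acceleration from the ODE
        x += dt * v
        v += dt * a
    return x

print(simulate(force=2.0))  # settles near the static deflection f/k = 0.5
```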
Continuous Data Vs Discrete Data System
Types of Control System
• In a continuous-data control system, all system variables are functions of a continuous time t.
• A discrete-time control system involves one or more variables that are known only at discrete instants of time.
[Plots: a continuous-time signal x(t) versus t, and a discrete-time sequence x[n] versus n]
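The relation between the two is sampling: a discrete sequence x[n] records a continuous signal x(t) only at t = nT. The sinusoid and sampling period below are assumptions for illustration:

```python
# Sampling a continuous-time signal x(t) to get a discrete sequence x[n].
import math

def x_continuous(t):
    return math.sin(2 * math.pi * t)  # defined for every instant t

T = 0.125                             # sampling period (assumed)
x_discrete = [x_continuous(n * T) for n in range(8)]  # known only at t = n*T

print([round(v, 3) for v in x_discrete])
```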
Deterministic vs Stochastic Control System
Types of Control System
• A control system is deterministic if the response to an input is predictable and repeatable.
• If not, the control system is a stochastic control system.
[Plots: repeatable deterministic responses x(t) and y(t), and a stochastic response z(t), each versus t]
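The repeatability criterion can be illustrated directly: feed the same input twice and compare the responses. The gain and the Gaussian noise model below are illustrative assumptions:

```python
# Deterministic vs stochastic: the same input gives the same response
# only in the deterministic case (noise model is an assumption).
import random

def deterministic(u):
    return [2 * v for v in u]                       # repeatable mapping

def stochastic(u, rng):
    return [2 * v + rng.gauss(0.0, 0.1) for v in u]  # noise enters each sample

u = [1.0, 2.0, 3.0]
print(deterministic(u) == deterministic(u))  # True: predictable and repeatable
rng = random.Random()
print(stochastic(u, rng) == stochastic(u, rng))  # almost surely False
```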
Types of Control System
Adaptive Control System
• The dynamic characteristics of most control systems are not constant
for several reasons.
• The effect of small changes in the system parameters is attenuated in
a feedback control system.
• An adaptive control system is required when the changes in the
system parameters are significant.
• Adaptive control is the control method used by a controller which
must adapt to a controlled system with parameters which vary, or are
initially uncertain.
• For example, as an aircraft flies, its mass will slowly decrease as a
result of fuel consumption; a control law is needed that adapts itself to
such changing conditions.
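The adaptation idea can be sketched in miniature: a plant gain drifts over time (like the aircraft losing fuel mass), and the controller re-estimates it from measurements instead of trusting a fixed calibration. The plant model, drift rate, and control law below are all illustrative assumptions:

```python
# Sketch of adaptive control: re-identify a drifting plant gain from
# input/output data and use the estimate in an inverse-model control law.
# All numbers are illustrative assumptions.

def run(adapt, steps=50, setpoint=1.0):
    plant_gain = 2.0       # true (unknown) gain, drifting over time
    estimate = 2.0         # controller's current model of the gain
    y = 0.0
    for _ in range(steps):
        u = setpoint / estimate    # control law based on the model
        y = plant_gain * u         # plant response
        if adapt:
            estimate = y / u       # re-identify the gain from measurements
        plant_gain *= 0.99         # slow parameter drift
    return y

print(run(adapt=True))    # keeps tracking the set point despite drift
print(run(adapt=False))   # fixed controller drifts away from the set point
```

With adaptation the output stays within one drift step of the set point; without it, the error accumulates as the parameter drifts.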
Types of Control System
Learning Control System
• A control system that can learn from the environment in which it is operating is called a learning control system.
Classification of Control Systems
Control Systems → Natural, Man-made
Man-made → Manual, Automatic
Automatic → Open-loop, Closed-loop
Open-loop and Closed-loop → Non-linear, Linear
Linear → Time variant, Time invariant
The linear, time-invariant branch comprises the LTI control systems (linear time-invariant control systems).
Examples of Control Systems
Water-level float regulator
Examples of Modern Control Systems
END OF LECTURES-1 & 2