Sliding Mode Control.



Submitted By:
Shibayan Chatterjee


Introduction.
Definition.
Control theory is a combined branch of mathematics and engineering that deals with dynamical systems. The desired output of a system is called the reference. When one or more outputs are required to follow the reference over time, a controller is designed; it manipulates the input to the system so as to produce the desired output.
The role of control theory can be understood with the help of a simple example. Consider the driving system of an automobile. The desired speed can be changed or maintained by controlling the pressure on the accelerator pedal. This automobile system constitutes a control strategy in which the input is the pressure on the accelerator pedal, which causes the carburetor valve to open or close so as to increase or decrease the fuel flowing to the engine, thus bringing the automobile speed under control (figure 1).
The diagrammatic representation of the above example is shown below. Here each block represents an element, device, plant or mechanism. Each block has an input and an output signal which are linked to each other by a particular relationship.

Types of Control Systems.
Control systems are basically of two types: open-loop control systems and closed-loop control systems. An open-loop control system deals with those physical systems which do not automatically correct for variations in their output. For these kinds of systems the output remains invariant for a particular input until the external conditions are altered. The output can only be changed by changing the internal parameters or external conditions of the system. Open-loop control is normally used in those cases where the system is capable of withstanding such variations.
A closed-loop system, on the other hand, is one where the input to the system depends on the output: the output signal is tapped and compared with the desired output, and the difference generates an error signal which is fed to the controller. The feedback element is usually a sensor or a transducer which keeps continuous track of the output signal. Such controllers are normally insensitive to changes in external factors because the controller always acts on the error signal itself (figures 2, 3).
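A minimal sketch of such a closed loop in Python, reusing the automobile speed example above (the first-order plant model, the numerical values and the proportional control law are assumptions for illustration, not taken from this report):

# Closed-loop speed control: the measured speed is fed back, compared with the
# reference, and the error signal drives a proportional controller.
dt = 0.1                 # simulation time step [s]
m, b = 1000.0, 50.0      # assumed vehicle mass [kg] and drag coefficient [N*s/m]
Kp = 800.0               # proportional controller gain [N per (m/s) of error]
v, v_ref = 0.0, 20.0     # current speed and desired (reference) speed [m/s]

for step in range(600):          # simulate 60 s
    error = v_ref - v            # error signal from the feedback comparison
    u = Kp * error               # controller output: engine force [N]
    v += dt * (u - b * v) / m    # plant: m * dv/dt = u - b * v

print(round(v, 2))               # settles near, but below, 20 m/s (steady-state error)

A purely proportional controller leaves a small steady-state error; adding an integral term to the controller would remove it.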
For the design of any control system it is essential to know the physical characteristics of the system to be controlled. To know the physical system it is essential to know its input and output parameters. An idealized physical system is called a physical model. A physical model should be made to a certain degree of accuracy for the specific type of problem at hand. The next thing to be done is to obtain a mathematical formula describing the system. This may be a differential equation, an integral equation, or any other input-output equation which completely defines the system. This equation defines the control block and hence is used for control system problems.
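As a minimal illustration of such a mathematical model, using the simplified automobile example above (the symbols m, b, v and u are assumptions for illustration), the speed dynamics could be written as the first-order differential equation

\[
m\,\dot{v}(t) + b\,v(t) = u(t),
\]

where v(t) is the speed, u(t) the force produced by the engine in response to the accelerator pedal, m the vehicle mass and b a lumped drag coefficient. This single input-output equation completely defines the control block for the speed-control problem.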


Control theory is basically of two types: Linear Control Theory and Non-Linear Control Theory.
Linear Control Systems.
Linear control theory deals with those systems which are linear in nature: the output is proportional to the input. It is mainly concerned with single-input, single-output (SISO) systems.
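A minimal sketch of such a linear SISO model, with assumed constant coefficients a and b, is

\[
\dot{y}(t) + a\,y(t) = b\,u(t), \qquad G(s) = \frac{Y(s)}{U(s)} = \frac{b}{s + a},
\]

where scaling the input u by any factor scales the output y by the same factor, and responses to separate inputs simply add (superposition).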
Non-Linear Control Systems.
Non-linear control theory deals with those systems which are non-linear, time-variant, or both, where the output does not follow any specific pattern with the input, and which may have multiple inputs and multiple outputs. Most systems found physically are non-linear in nature, so it becomes very important to study non-linear systems.
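A standard illustrative example of a non-linear system (not taken from this report) is a pendulum of mass m and length l driven by a torque u:

\[
m l^{2}\,\ddot{\theta}(t) + m g l\,\sin\theta(t) = u(t).
\]

The sine term breaks superposition: doubling the input torque does not simply double the angular response.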
Single-input single-output (SISO) systems are concerned with only a single input and its corresponding output. Multiple-input multiple-output (MIMO) systems deal with multiple inputs and their outputs at specific times. Usually a MIMO system can be analyzed using SISO techniques, but with a certain degree of approximation: only one input is considered at a time, all other inputs are set to zero, and the output is noted. This is then repeated for all the other inputs and their outputs are noted, as in the sketch below.
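The sketch below illustrates this idea for a purely linear, static two-input two-output system (the gain matrix G and the inputs are hypothetical, not from this report). For a linear system the input-by-input decomposition is exact because of superposition; for a non-linear system it is only an approximation.

import numpy as np

# Hypothetical static 2x2 linear MIMO system: y = G @ u.
G = np.array([[2.0, 0.5],
              [1.0, 3.0]])
u = np.array([1.0, 4.0])

y_full = G @ u                          # both inputs applied together
y_one  = G @ np.array([u[0], 0.0])      # only the first input, second set to zero
y_two  = G @ np.array([0.0, u[1]])      # only the second input, first set to zero

print(np.allclose(y_full, y_one + y_two))   # True: the single-input responses add up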
Controllability and Observability.
Controllability and observability are main issues in the analysis of a system before deciding the best control strategy to be applied, or whether it is even possible to control or stabilize the system. Controllability is related to the possibility of forcing the system into a particular state by using an appropriate control signal. If a state is not controllable, then no signal will ever be able to control the state. If a state is not controllable, but its dynamics are stable, then the state is termed Stabilizable.
Observability instead is related to the possibility of "observing", through output measurements, the state of a system. If a state is not observable, the controller will never be able to determine the behavior of an unobservable state and hence cannot use it to stabilize the system. However, similar to the stabilizability condition above, if a state cannot be observed it might still be detectable.
Solutions to the problems of an uncontrollable or unobservable system include adding actuators and sensors.
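For a linear state-space model dx/dt = Ax + Bu, y = Cx, controllability and observability can be checked from the rank of the controllability and observability matrices. A minimal sketch in Python (the matrices A, B and C form a hypothetical example system, not one from this report):

import numpy as np

def ctrb(A, B):
    # Controllability matrix [B, AB, A^2 B, ...]
    n = A.shape[0]
    return np.hstack([np.linalg.matrix_power(A, i) @ B for i in range(n)])

def obsv(A, C):
    # Observability matrix [C; CA; CA^2; ...]
    n = A.shape[0]
    return np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(n)])

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

print("controllable:", np.linalg.matrix_rank(ctrb(A, B)) == A.shape[0])
print("observable:  ", np.linalg.matrix_rank(obsv(A, C)) == A.shape[0])

Full rank of the controllability matrix means every state can be driven by the input; full rank of the observability matrix means every state can be reconstructed from the output.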


Control Strategies.
Every control system must first guarantee the stability of the closed-loop behavior. For linear systems this can be obtained by directly placing the closed-loop poles, as in the sketch below. Non-linear control systems use specific theories to ensure stability without regard to the inner dynamics of the system. The possibility of fulfilling different specifications varies with the model considered and the control strategy chosen. A summary of the main control techniques follows the sketch.
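A minimal sketch of pole placement for a linear system, using SciPy (the plant matrices A and B and the chosen pole locations are assumptions, not from this report):

import numpy as np
from scipy.signal import place_poles

# Hypothetical open-loop unstable second-order plant.
A = np.array([[0.0, 1.0], [2.0, -1.0]])
B = np.array([[0.0], [1.0]])

# Compute a state-feedback gain K so the eigenvalues (poles) of A - B K
# sit at the chosen stable locations.
result = place_poles(A, B, poles=[-2.0, -3.0])
K = result.gain_matrix

print("closed-loop poles:", np.linalg.eigvals(A - B @ K))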
Adaptive control
Adaptive control uses on-line identification of the process parameters, or modification of controller gains, thereby obtaining strong robustness properties. Adaptive controls were applied for the first time in the aerospace industry in the 1950s, and have found particular success in that field.

Hierarchical control
A hierarchical control system is a type of control system in which a set of devices and governing software is arranged in a hierarchical order. When the links in the tree are implemented by a computer network, the hierarchical control system is also a form of networked control system.
Intelligent control
Intelligent control uses various AI computing approaches like neural networks, Bayesian probability, fuzzy logic, machine learning, evolutionary computation and genetic algorithms to control a dynamic system.
Optimal control
Optimal control is a particular control technique in which the control signal optimizes a certain "cost index": for example, in the case of a satellite, the jet thrusts needed to bring it to the desired trajectory while consuming the least amount of fuel. Two optimal control design methods have been widely used in industrial applications, as it has been shown they can guarantee closed-loop stability: Model Predictive Control (MPC) and Linear-Quadratic-Gaussian (LQG) control.
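A minimal sketch of the linear-quadratic part of this idea (the double-integrator plant and the weights Q and R are assumptions, not from this report): the state-feedback gain minimizes the quadratic cost index J = integral of (x'Qx + u'Ru) dt.

import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical double-integrator plant: x1 = position, x2 = velocity.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting in the cost index
R = np.array([[1.0]])      # control-effort weighting

P = solve_continuous_are(A, B, Q, R)   # solve the algebraic Riccati equation
K = np.linalg.inv(R) @ B.T @ P         # optimal state-feedback gain, u = -K x

print("LQR gain K =", K)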
Robust control
Robust control deals explicitly with uncertainty in its approach to controller design. Controllers designed using robust control methods tend to be able to cope with small differences between the true system and the nominal model used for design.


Stochastic control
Stochastic control deals with control design with uncertainty in the model. In typical stochastic control problems, it is assumed that there exist random noise and disturbances in the model and the controller, and the control design must take into account these random deviations.
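A minimal sketch of this, reusing the assumed cruise-control loop from earlier (the noise level and all numbers are illustrative assumptions, not from this report): a random disturbance force acts on the plant, and the feedback controller keeps the speed fluctuating around its nominal value.

import numpy as np

rng = np.random.default_rng(0)
dt, m, b, Kp = 0.1, 1000.0, 50.0, 800.0   # same assumed plant and gain as above
v, v_ref = 0.0, 20.0
speeds = []

for step in range(600):
    w = rng.normal(0.0, 200.0)          # random disturbance force [N]
    u = Kp * (v_ref - v)                # controller acts only on the error signal
    v += dt * (u - b * v + w) / m       # plant driven by control plus disturbance
    speeds.append(v)

print(round(float(np.mean(speeds[-100:])), 2))   # fluctuates around roughly 18.8 m/s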


