IC 1003 OPTIMAL CONTROL 3 0 0 100

AIM
To gain knowledge on the formulation and application of optimal control problems.

OBJECTIVES
i. To study various performance measures and programming techniques.
ii. To study the computational procedure for solving optimal control problems.
iii. To study the calculus of variations.
iv. To study the variational approach to optimal control.
v. To study the applications of Pontryagin’s minimum principle.
1. INTRODUCTION 9
Statement of optimal control problem – Problem formulation and forms of optimal control – Performance measures for optimal control – Selection of performance measure – Various methods of optimization – Linear programming – Non-linear programming – Dynamic programming.
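The performance measures covered in this unit typically take the general Bolza form; as an illustrative sketch (the symbols below are the standard textbook ones, not fixed by this syllabus):

```latex
J = h\bigl(x(t_f), t_f\bigr) + \int_{t_0}^{t_f} g\bigl(x(t), u(t), t\bigr)\, dt
```

Here $h$ is the terminal cost and $g$ the running cost; special choices recover the common measures (e.g. $g \equiv 1$, $h \equiv 0$ gives the minimum-time problem, while quadratic $g$ in $x$ and $u$ gives the regulator problem).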
2. DYNAMIC PROGRAMMING 9
Principle of optimality – Recurrence relation of dynamic programming for optimal control problems – Computational procedure for solving optimal control problems – Characteristics of the dynamic programming solution – Hamilton-Jacobi-Bellman equation – Application to a continuous linear regulator problem.
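The recurrence relation of this unit can be made concrete for the discrete-time linear regulator, where the cost-to-go is quadratic and the backward recursion reduces to a Riccati iteration. A minimal sketch (the double-integrator plant and weights below are hypothetical example values, not from the syllabus):

```python
import numpy as np

def lqr_backward_recursion(A, B, Q, R, Qf, N):
    """Dynamic-programming recurrence for the discrete-time LQR.

    For x_{k+1} = A x_k + B u_k and cost sum of x'Qx + u'Ru with terminal
    cost x'Qf x, the optimal cost-to-go is x' P_k x. P_k is obtained by
    iterating the Riccati recurrence backwards from P_N = Qf, and the
    optimal control is the linear feedback u_k = -K_k x_k.
    """
    P = Qf
    gains = []
    for _ in range(N):
        # K_k = (R + B' P B)^{-1} B' P A
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # P_{k} = Q + A' P A - A' P B K_k
        P = Q + A.T @ P @ A - A.T @ P @ B @ K
        gains.append(K)
    gains.reverse()  # gains[0] is the gain for the first time step
    return gains, P

# Hypothetical double-integrator example, discretized with step dt
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = np.eye(2)
R = np.array([[1.0]])
gains, P0 = lqr_backward_recursion(A, B, Q, R, Qf=np.eye(2), N=50)
```

For a long enough horizon the gains converge and the closed-loop matrix A - B K is stable, which mirrors the infinite-horizon regulator solution.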
3. CALCULUS OF VARIATIONS 9
Fundamental concepts – Functionals of a single function – Functionals involving several independent functions – Piecewise smooth extremals – Constrained extrema.
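The central necessary condition of this unit, for a functional of a single function $J(x) = \int_{t_0}^{t_f} g\bigl(x(t), \dot{x}(t), t\bigr)\, dt$, is the Euler-Lagrange equation (a standard result, stated here as an illustrative sketch):

```latex
\frac{\partial g}{\partial x}\bigl(x^*(t), \dot{x}^*(t), t\bigr)
- \frac{d}{dt}\,\frac{\partial g}{\partial \dot{x}}\bigl(x^*(t), \dot{x}^*(t), t\bigr) = 0
```

An extremal $x^*$ must satisfy this differential equation together with the appropriate boundary conditions; the several-function and constrained cases generalize it component-wise and via Lagrange multipliers.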
4. VARIATIONAL APPROACH TO OPTIMAL CONTROL 9
Necessary conditions for optimal control – Linear regulator problems – Pontryagin’s minimum principle and state inequality constraints.
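For dynamics $\dot{x} = a(x, u, t)$ and running cost $g$, the Hamiltonian $H = g + p^{\top} a$ summarizes the necessary conditions of this unit; a standard-form sketch of Pontryagin’s minimum principle:

```latex
\dot{x}^* = \frac{\partial H}{\partial p}, \qquad
\dot{p}^* = -\frac{\partial H}{\partial x}, \qquad
H\bigl(x^*, u^*, p^*, t\bigr) \le H\bigl(x^*, u, p^*, t\bigr)
\quad \forall\, u \in U
```

The third condition is the key generalization over the classical variational conditions: the optimal control minimizes $H$ over the admissible set $U$ pointwise in time, which remains valid when $U$ is bounded.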
5. APPLICATIONS OF PONTRYAGIN’S MINIMUM PRINCIPLE 9
Minimum time problems – Minimum control effort problems: minimum fuel problem, minimum energy problem – Singular intervals in optimal control problems.
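For the minimum-time problem with linear dynamics $\dot{x} = Ax + Bu$ and bounded controls, minimizing the Hamiltonian pointwise yields the bang-bang law (an illustrative sketch in standard notation):

```latex
u_j^*(t) = -\operatorname{sgn}\!\bigl(b_j^{\top} p^*(t)\bigr), \qquad |u_j(t)| \le 1
```

where $b_j$ is the $j$-th column of $B$. When the switching function $b_j^{\top} p^*(t)$ vanishes over a finite interval, the sign is undefined and the minimum principle fails to determine $u_j^*$ there; such stretches are the singular intervals studied in this unit.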
L = 45 Total = 45
TEXT BOOKS
1. B. Sarkar, ‘Control System Design – The Optimal Approach’, Wheeler Publishing, New Delhi, 1997.
2. M. Gopal, ‘Modern Control System Theory’, New Age International Ltd., 2002.
REFERENCES
1. Donald E. Kirk, ‘Optimal Control Theory – An Introduction’, Pearson Education, 1970.
2. Kemin Zhou, J.C. Doyle, ‘Robust & Optimal Control’, Pearson Education, 1996.