A New $H_2$ Bound for $H_{\infty}$ Entropy

  • Zhang, Hui (State Key Laboratory of Industrial Control Technology, Institute of Industrial Process Control, Department of Control Science & Engineering, Zhejiang University)
  • Sun, Youxian (State Key Laboratory of Industrial Control Technology, Institute of Industrial Process Control, Department of Control Science & Engineering, Zhejiang University)
  • Published: 2008.08.31

Abstract

The $H_{\infty}$ entropy in $H_{\infty}$ control theory is studied by investigating information transmission in continuous-time linear stochastic systems. It is proved that stabilizing feedback does not change the time-average information transmission between the system input and output, and that the $H_{\infty}$ entropies of the open- and closed-loop stable transfer functions are bounded by the mutual information rate between input and output in the open-loop system. Furthermore, a new $H_2$ upper bound for the $H_{\infty}$ entropy is introduced and illustrated with a numerical example. The $H_{\infty}$ entropy of a stable transfer function is thus sandwiched between the $H_2$ norms of the original system and of a static feedback system.
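
For orientation, assuming the standard Mustafa-Glover definition, the $H_{\infty}$ entropy of a stable transfer function $G$ with $\|G\|_{\infty}<\gamma$ is

$$
I(G,\gamma) \;=\; -\frac{\gamma^{2}}{2\pi}\int_{-\infty}^{\infty}\ln\left|\det\left(I-\gamma^{-2}\,G(j\omega)^{*}G(j\omega)\right)\right|\,d\omega ,
$$

and a classical result of minimum entropy $H_{\infty}$ control is the lower bound $\|G\|_{2}^{2}\le I(G,\gamma)$, with $I(G,\gamma)\to\|G\|_{2}^{2}$ as $\gamma\to\infty$. In that notation the sandwich described above can be read as

$$
\|G\|_{2}^{2} \;\le\; I(G,\gamma) \;\le\; \|G_{K}\|_{2}^{2},
$$

where $G_{K}$ is a placeholder symbol (not taken from the paper) for the static feedback system, and the squared-norm form is assumed here only for consistency with the classical lower bound.

As a minimal numerical sketch of the lower-bound half of this relation, using a toy system rather than the paper's numerical example, the code below evaluates both quantities for the scalar system $G(s)=1/(s+1)$, for which $\|G\|_{2}^{2}=0.5$ and $\|G\|_{\infty}=1$, with $\gamma=2$:

```python
# Minimal sketch (assumed toy example, not taken from the paper):
# compare the H_inf entropy I(G, gamma) of G(s) = 1/(s+1) with ||G||_2^2.
import numpy as np
from scipy.integrate import quad

def G_mag_sq(w):
    """|G(jw)|^2 for G(s) = 1/(s+1)."""
    return 1.0 / (1.0 + w**2)

def h2_norm_sq():
    """||G||_2^2 = (1/(2*pi)) * integral of |G(jw)|^2 over all frequencies."""
    val, _ = quad(G_mag_sq, -np.inf, np.inf)
    return val / (2.0 * np.pi)

def hinf_entropy(gamma):
    """Scalar Mustafa-Glover entropy, valid because ||G||_inf = 1 < gamma."""
    integrand = lambda w: np.log(1.0 - G_mag_sq(w) / gamma**2)
    val, _ = quad(integrand, -np.inf, np.inf)
    return -(gamma**2) / (2.0 * np.pi) * val

if __name__ == "__main__":
    gamma = 2.0
    print("||G||_2^2   =", h2_norm_sq())         # ~0.500
    print("I(G, gamma) =", hinf_entropy(gamma))  # ~0.536, above the 0.500 lower bound
```

For this toy system the entropy evaluates to $4-2\sqrt{3}\approx 0.536$, consistent with $\|G\|_{2}^{2}=0.5\le I(G,\gamma)$, and it approaches $0.5$ as $\gamma$ increases.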

