Infrared Sensitive Camera Based Finger-Friendly Interactive Display System

  • Ghimire, Deepak (Computer Engineering Department, Chonbuk National University) ;
  • Kim, Joon-Cheol (Department of Electronics Engineering, Seonam University) ;
  • Lee, Kwang-Jae (Department of Electronics Engineering, Seonam University) ;
  • Lee, Joon-Whoan (Computer Engineering Department, Chonbuk National University)
  • Received : 2010.07.28
  • Accepted : 2010.12.20
  • Published : 2010.12.28

Abstract

In this paper we present a system that enables the user to interact with a large display even without touching the screen. With two infrared-sensitive cameras mounted at the bottom left and bottom right of the display and pointing upwards, the fingertip position within a selected region of interest of each camera view is found using the vertical intensity profile of the background-subtracted image. The finger positions in the left and right camera images are mapped to display-screen coordinates using pre-determined matrices, which are calculated by interpolating sample finger positions captured while the user points at known coordinate positions on the display. The screen is then manipulated according to the calculated position and depth of the fingertip with respect to the display. Experimental results demonstrate efficient, robust and stable human-computer interaction.
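The abstract describes two computational steps: locating the fingertip from the vertical intensity profile of a background-subtracted region of interest, and mapping the pair of fingertip positions from the left and right cameras to screen coordinates through pre-determined matrices estimated from calibration samples. The sketch below (Python with OpenCV/NumPy) illustrates one plausible form of each step; it is not the authors' implementation, and the function names, ROI layout, noise threshold, and the use of cv2.findHomography as a stand-in for the perspective-transform estimation mentioned in the abstract are all assumptions made for illustration.

```python
import numpy as np
import cv2


def fingertip_column(frame, background, roi, noise_floor=25):
    """Locate the fingertip column inside one camera's region of interest.

    The ROI is assumed to be a horizontal strip of the upward-looking camera
    image just above the display surface. After background subtraction, the
    column-wise (vertical) intensity profile is computed and the strongest
    column is taken as the fingertip position.
    """
    x, y, w, h = roi
    diff = cv2.absdiff(frame, background)[y:y + h, x:x + w]
    if diff.ndim == 3:
        diff = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    profile = diff.sum(axis=0)                # one value per column
    if profile.max() < noise_floor * h:       # no finger above the noise level
        return None
    return x + int(profile.argmax())          # column in full-image coordinates


def fit_mapping(image_samples, screen_samples):
    """Fit a 3x3 perspective transform from calibration samples.

    image_samples  : list of (left_x, right_x) fingertip columns observed
                     while the user pointed at known screen positions.
    screen_samples : the corresponding known (screen_x, screen_y) positions.
    """
    src = np.asarray(image_samples, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(screen_samples, dtype=np.float32).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst)       # least-squares fit, needs >= 4 samples
    return H


def to_screen(H, left_x, right_x):
    """Map one (left_x, right_x) observation to screen coordinates."""
    p = np.array([[[left_x, right_x]]], dtype=np.float32)
    sx, sy = cv2.perspectiveTransform(p, H)[0, 0]
    return float(sx), float(sy)
```

Under these assumptions, a calibration pass over a small grid of known on-screen targets would supply the samples for fit_mapping; at run time each frame pair yields one (left_x, right_x) observation that to_screen converts to a cursor position, with the fingertip's depth relative to the screen deciding whether that position is treated as a hover or a touch.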
