Issues in Control of a Robotic Spatial Augmented Reality System

  • Received : 2011.06.23
  • Accepted : 2011.10.04
  • Published : 2011.12.01

Abstract

A robotic spatial augmented reality (RSAR) system combines robotics technology with a spatial augmented reality (SAR) system, in which cameras recognize real objects and projectors augment information and user interfaces directly on the surfaces of the recognized objects, rather than relying on handheld display devices. Moreover, a robotic module is actively used to discover and exploit the context of users and environments. Controlling an RSAR system involves issues from several technical fields: classical inverse kinematics of the motors on which projector-camera pairs are mounted, inverse projection problems of finding appropriate internal/external parameters of the projectors and cameras, and image warping in the graphics pipeline to compensate for kinematic constraints. In this paper, we investigate various control issues related to an RSAR system and propose basic approaches to handle them, with a particular focus on the prototype RSAR system developed at ETRI.
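The image-warping step mentioned above — pre-distorting the projector image so that it appears undistorted on a recognized surface — reduces, for a planar target, to applying a 2D homography. The following is a minimal pure-Python sketch of the four-point direct linear transform; the function names and corner coordinates are illustrative assumptions, not taken from the ETRI prototype.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # pivot row
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """3x3 homography mapping 4 source points to 4 destination points.

    Fixes H[2][2] = 1 and solves the resulting 8x8 linear system
    (the standard 4-point DLT formulation).
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def warp(H, pt):
    """Apply homography H to a 2D point (projective division included)."""
    x, y = pt
    d = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / d,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / d)
```

In practice the resulting matrix would be applied per pixel (or as a texture transform in the graphics pipeline) so that the projector's output quadrilateral coincides with the target region on the object surface.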

References

  1. Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B., "Recent Advances in Augmented Reality," IEEE Computer Graphics and Applications, Vol. 21, No. 6, pp. 34-47, 2001. https://doi.org/10.1109/38.963459
  2. Hainich, R., The End of Hardware: Augmented Reality and Beyond, BookSurge, 2009.
  3. Bimber, O. and Raskar, R., Spatial Augmented Reality: Merging Real and Virtual Worlds, AK Peters, 2004.
  4. Raskar, R., Majumder, A., Lensch, H. P. A., Bimber, O., "Projectors for Graphics," SIGGRAPH Course Notes, SIGGRAPH, LA, 2008.
  5. Mistry, P. and Maes, P., "SixthSense: A Wearable Gestural Interface," ACM SIGGRAPH ASIA'09 Sketches, Article 11, 2009.
  6. Raskar, R., Welch, G., Chen, W.-C., "Table-Top Spatially-Augmented Reality: Bringing Physical Models to Life with Projected Imagery," IEEE and ACM International Workshop on Augmented Reality, pp. 64, 1999.
  7. Raskar, R., van Baar, J., Beardsley, P., Willwacher, T., Rao, S., Forlines, C., "iLamps: Geometrically Aware and Self-Configuring Projectors," Proc. ACM SIGGRAPH'03, 2003.
  8. Raskar, R., Beardsley, P., van Baar, J., Wang, Y., Dietz, P., Lee, J., Leigh, D., Willwacher, T., "RFIG Lamps: Interacting with a Self-describing World via Photosensing Wireless Tags and Projectors," Proc. ACM SIGGRAPH'04, 2004.
  9. Linder, N. and Maes, P., "LuminAR: Portable Robotic Augmented Reality Interface Design and Prototype," Adjunct Proc. ACM UIST'10, 2010.
  10. Lee, J.-E., Miyashita, S., Azuma, K., Lee, J.-H., Park, G.-T., "Anamorphosis Projection by Ubiquitous Display in Intelligent Space," Proc. Int. Conf. Universal Access in Human-Computer Interaction (UAHCI '09), pp. 209-217, 2009.
  11. Yang, R., Gotz, D., Hensley, J., Towles, H. and Brown, M. S., "PixelFlex: A Reconfigurable Multi-projector Display System," Proc. Visualization'01, pp. 167, 2001.
  12. Ziola, R., Grampurohit, S., Landes, N., Fogarty, J. and Harrison, B., "OASIS: Examining a Framework for Interacting with General-Purpose Object Recognition," Intel Labs Seattle Technical Report, 2010.
  13. Lai, K., Bo, L., Ren, X. and Fox, D., "A Large-Scale Hierarchical Multi-View RGB-D Object Dataset," IEEE International Conference on Robotics and Automation, 2011.
  14. Wilson, A. D. and Benko, H., "Combining Multiple Depth Cameras and Projectors for Interactions on, Above and between Surfaces," Proc. ACM UIST'10, 2010.
  15. Jones, B. R., Sodhi, R., Campbell, R. H., Garnett, G. and Bailey, B. P., "Build Your World and Play In It: Interacting with Surface Particles on Complex Objects," Proc. Int. Symp. Mixed and Augmented Reality'10, pp. 165-174, 2010.
  16. Hisada, M., Yamamoto, K., Kanaya, I. and Sato, K., "Free-form Shape Design System Using Stereoscopic Projector: HyperReal 2.0," SICE-ICASE Int. Joint Conf., pp. 4832-4835, 2006.
  17. Holman, D. and Benko, H., "SketchSpace: Designing Interactive Behaviors with Passive Materials," Proc. Human Factors in Computing Systems (CHI EA '11), ACM, 2011.
  18. Gruber, L., Gauglitz, S., Ventura, J., Zollmann, S., Huber, M., Schlegel, M., Klinker, G., Schmalstieg, D. and Hollerer, T., "The City of Sights: Design, Construction and Measurement of an Augmented Reality Stage Set," IEEE and ACM Int. Symp. Mixed and Augmented Reality, Seoul, Korea, Oct. 13-16, 2010.
  19. Suh, Y.-H., Kim, H., Lee, J.-H., Cho, J., Lee, M., Yeom, J. and Cho, E.-S., "Future Robotic Computer: A New Type of Computing Device with Robotic Functions," Proc. Int. Conf. Human-Robot Interaction (HRI '11), pp. 261-262, 2011.
  20. FRC Demo Movie, http://www.youtube.com/watch?v=-KZgEZgUAuw.
  21. Lee, J.-H., "Inverse Perspective Projection of Convex Quadrilaterals," ETRI Technical Report, 2011.
  22. Park, S.-Y. and Park, G. G., "Active Calibration of Camera-Projector Systems based on Planar Homography," Proc. Int. Conf. Pattern Recognition'10, pp. 320-323, 2010.
  23. Fiala, M., "Automatic Projector Calibration Using Self-Identifying Patterns," Proc. IEEE CVPR'05, pp. 113, 2005.
  24. Ericson, C., Real-Time Collision Detection, Morgan Kaufmann, 2005.
  25. Nakamura, Y., Hanafusa, H., Yoshikawa, T., "Task-priority Based Redundancy Control of Robot Manipulators," The International Journal of Robotics Research, Vol. 6, No. 2, pp. 3-15, 1987. https://doi.org/10.1177/027836498700600201
  26. Lee, J.-H., "Inverse Perspective Projection of Convex Quadrilaterals," Proc. Asian Conference on Design and Digital Engineering (ACDDE 2011), Shanghai, China, 2011.