Motion-capture-based walking simulation of digital human adapted to laser-scanned 3D as-is environments for accessibility evaluation

  • Maruyama, Tsubasa (Graduate School of Information Science and Technology, Hokkaido University) ;
  • Kanai, Satoshi (Graduate School of Information Science and Technology, Hokkaido University) ;
  • Date, Hiroaki (Graduate School of Information Science and Technology, Hokkaido University) ;
  • Tada, Mitsunori (National Institute of Advanced Industrial Science and Technology)
  • Received : 2015.09.17
  • Accepted : 2016.03.21
  • Published : 2016.07.01

Abstract

Owing to our rapidly aging society, accessibility evaluation to enhance the ease and safety of access to indoor and outdoor environments for the elderly and disabled is increasing in importance. Accessibility must be assessed not only against general standards but also in terms of physical and cognitive friendliness for users of different ages, genders, and abilities. Meanwhile, human behavior simulation has been progressing in the areas of crowd behavior analysis and emergency evacuation planning. However, in human behavior simulation, environment models represent only "as-planned" situations. In addition, existing pedestrian models cannot generate the detailed articulated movements of people of different ages and genders in the simulation. Therefore, the final goal of this research was to develop a virtual accessibility evaluation that combines realistic human behavior simulation using a digital human model (DHM) with "as-is" environment models. To achieve this goal, we developed an algorithm for generating human-like DHM walking motions whose strides, turning angles, and footprints adapt to laser-scanned 3D as-is environments, including slopes and stairs. The DHM motion was generated based only on motion-capture (MoCap) data for flat walking. Our implementation constructed as-is 3D environment models from laser-scanned point clouds of real environments and enabled a DHM to walk autonomously in various environment models. The difference in joint angles between the DHM and the MoCap data was evaluated. Demonstrations of our environment modeling and walking simulation in indoor and outdoor environments, including corridors, slopes, and stairs, are presented in this study.
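To make the footprint-adaptation idea above concrete, the minimal Python sketch below plans footprints along a straight path over a toy height-field environment, shortening the stride where the ground steepens. This is only an illustrative outline under assumed simplifications, not the authors' algorithm; the height-field representation, the function names (ground_height, adapt_stride, plan_footprints), and the stride-adaptation rule are all hypothetical.

import math

def ground_height(x, y):
    """Toy as-is environment: a flat corridor, then a slope, then stairs (hypothetical stand-in for a scanned surface)."""
    if x < 2.0:
        return 0.0                                    # flat corridor
    if x < 4.0:
        return 0.15 * (x - 2.0)                       # 15% slope
    return 0.3 + 0.18 * math.floor((x - 4.0) / 0.3)   # stairs: 0.3 m tread, 0.18 m riser

def adapt_stride(flat_stride, slope):
    """Hypothetical rule: shorten the flat-walking stride as the local slope steepens."""
    return flat_stride / (1.0 + 2.0 * abs(slope))

def plan_footprints(start_x, goal_x, y, flat_stride=0.65):
    """Place alternating footprints along a straight path, adapting stride length and foot height to the ground surface."""
    footprints, x, left = [], start_x, True
    while x < goal_x:
        # estimate the local slope with a forward finite difference
        slope = (ground_height(x + 0.05, y) - ground_height(x, y)) / 0.05
        x = min(x + adapt_stride(flat_stride, slope), goal_x)
        footprints.append(("L" if left else "R", x, y, ground_height(x, y)))
        left = not left
    return footprints

if __name__ == "__main__":
    for foot, x, y, z in plan_footprints(0.0, 6.0, 0.0):
        print("%s foot at x = %.2f m, z = %.2f m" % (foot, x, z))

In the actual system, the walkable surfaces come from laser-scanned point clouds rather than an analytic height field, and the planned footprints drive the adaptation of the flat-walking MoCap motion, as described above.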

Keywords

References

  1. ISO 21542, Building construction - Accessibility and usability of the built environment, 2011.
  2. ISO/IEC Guide 71 second edition, Guide for addressing accessibility in standards, 2014.
  3. Thalmann D, Musse SR. Crowd Simulation. London: Springer; 242 p.
  4. Duives DC, Daamen W, Hoogendoorn SP. State-of-the-art crowd motion simulation models. J. Transp. Res. Part C: Emerg. Technol. 2013;35:193-209. https://doi.org/10.1016/j.trc.2013.07.008
  5. Kakizaki T, Urii J, Endo M. Post-Tsunami evacuation simulation using 3D kinematic digital human models and experimental verification. J. Comput. Inf. Sci. Eng. 2014;14(2):021010-1-9. https://doi.org/10.1115/1.4026896
  6. T. Maruyama, S. Kanai, H. Date, MoCap-based adaptive human-like walking simulation in laser-scanned large-scale as-built environments, Lecture Notes in Computer Science, vol. 9185, 2015, pp. 193-204.
  7. Helbing D, Farkas I, Vicsek T. Simulating dynamical features of escape panic. Nature 2000;407:487-90. https://doi.org/10.1038/35035023
  8. F. Tecchia, C. Loscos, R. Conroy, Y. Chrysanthou, Agent behavior simulator (ABS): a platform for urban behavior development, in: Proceedings of the ACM Games Technology Conference, Hong Kong, Jan. 17-20, 2001, pp. 17-21.
  9. J. Pettre, J.P. Laumond, D. Thalmann, A navigation graph for real-time crowd animation on multi-layered and uneven terrain, in: Proceedings of the First International Workshop on Crowd Simulation (V-CROWDS'05), Lausanne, Nov. 24-25, 2005, pp. 81-90.
  10. Oesau S, Lafarge F, Alliez P. Indoor scene reconstruction using feature sensitive primitive extraction. ISPRS J. Photogramm. Remote Sens. 2014;90:68-82. https://doi.org/10.1016/j.isprsjprs.2014.02.004
  11. Nuchter A, Hertzberg J. Towards semantic maps for mobile robots. J. Robot. Auton. Syst. 2008;56(11):915-26. https://doi.org/10.1016/j.robot.2008.08.001
  12. Xiong X, Adan A, Akinci B, Huber D. Automatic creation of semantically rich 3D building models from laser scanned data. J. Autom. Constr. 2013;31:325-37. https://doi.org/10.1016/j.autcon.2012.10.006
  13. Rusu RB, Marton ZC, Blodow N, Dolha M, Beetz M. Towards 3D point cloud based object maps for household environments. J. Robot. Auton. Syst. 2008;56(11):927-41. https://doi.org/10.1016/j.robot.2008.08.005
  14. Xiao J, Furukawa Y. Reconstructing the world's museums. Int. J. Comput. Vis. 2014;110(3):243-58. https://doi.org/10.1007/s11263-014-0711-y
  15. Tang P, Huber D, Akinci B, Lipman R, Lytle A. Automatic reconstruction of as-is building information models from laser-scanned point clouds: a review of related techniques. J. Autom. Constr. 2010;19:829-43. https://doi.org/10.1016/j.autcon.2010.06.007
  16. Troje NF. Decomposing biological motion: a framework for analysis and synthesis of human gait patterns. J. Vis. 2002;2(5):371-87.
  17. J. Min, H. Liu, J. Chai, Synthesis and editing of personalized stylistic human motion, in: Proceedings of the 2010 ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, Bethesda, Feb. 19-21, 2010, pp. 39-46.
  18. T. Mukai, Motion rings for interactive gait synthesis, in: Proceedings of the ACM Symposium on Interactive 3D Graphics and Games, San Francisco, Feb. 18-20, 2011, pp. 125-132.
  19. Grochow K, Martin SL, Hertzmann A, Popovic Z. Style-based inverse kinematics. ACM Trans. Graph. 2004;23(3):522-31. https://doi.org/10.1145/1015706.1015755
  20. Yin KK, Loken K, Panne M. SIMBICON: Simple Biped Locomotion Control. ACM Trans. Graph. 2007;26(3):105. https://doi.org/10.1145/1276377.1276509
  21. Coros S, Beaudoin P, Panne M. Generalized biped walking control. ACM Trans. Graph. 2010;29(4):130. https://doi.org/10.1145/1778765.1781156
  22. Xiang Y, Arora JS, Abdel-Malek K. Physics-based modeling and simulation of human walking: a review of optimization-based and other approaches. J. Struct. Multidiscip. Optim. 2010;42(1):1-23. https://doi.org/10.1007/s00158-010-0496-8
  23. R.A. Al-Asqhar, T. Komura, M.G. Choi, Relationship descriptors for interactive motion adaptation, in: Proceedings of the 12th ACM SIGGRAPH/Eurographics Symposium on Computer Animation, California, July 19-21, 2013, pp. 45-53.
  24. M.P. Reed, Modeling ascending and descending stairs using the Human Motion Simulation Framework, in: Proceedings of the SAE Digital Human Modeling for Design and Engineering Conference, Goteborg, June 9-11, 2009, 2009-01-2282.
  25. Rusu RB. Semantic 3D Object Maps for Everyday Robot Manipulation. Berlin, Heidelberg: Springer; 225 p.
  26. Ramer U. An iterative procedure for the polygonal approximation of plane curves. J. Comput. Graph. Image Process. 1972;1(3):244-56. https://doi.org/10.1016/S0146-664X(72)80017-0
  27. VICON, [cited 2015 July 10], Available from: .
  28. C-Motion -Visual 3D, [cited 2015 July 10], Available from: .
  29. Y. Kobayashi, M. Mochimaru, AIST Gait Database 2013, 2013, [cited 2015 July 10], Available from: .
  30. H. Pan, X. Hou, C. Gao, Y. Lei, A method of real-time human motion retargeting for 3D terrain adaptation, in Proceedings of the 13th IEEE Joint International Computer Science and Information Technology Conference, Chongqing, Aug. 20-22, 2011, pp. 1-5.
  31. Perry J, Burnfield JM. Gait Analysis: Normal and Pathological Function, 2nd ed., New Jersey: SLACK Inc.; 551 p.
  32. PCL -Point Cloud Library, [cited 2015 July 10], Available from: .

Cited by

  1. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine vol.17, no.2, 2016, https://doi.org/10.3390/s17020299
  2. Simulation-Based Evaluation of Ease of Wayfinding Using Digital Human and As-Is Environment Models vol.6, no.9, 2016, https://doi.org/10.3390/ijgi6090267