Fig. 1. Multi-modal sensor system configuration.
Fig. 2. Multi-modal sensor module interface.
Fig. 3. Multi-modal sensor module head design.
Fig. 4. Wire-based vibration damper.
Fig. 5. Wire-based vibration damper.
Fig. 6. The multi-modal sensor module mounted on the robot platform.
Fig. 7. An example of multi-modal sensor data.
Fig. 8. Results of vibration damping: (a) original, (b) proposed method, (c) experimental platform.
Fig. 9. Take 1: Daytime and nighttime database in the lawn playground.
Fig. 10. Take 2: Lying-down database in the driveway.
Fig. 11. Take 3: Heavy rain and thunder database in the driveway.
Fig. 12. Take 4: Daytime database in the park.
Table 1. Behavior definitions in normal and abnormal situations
Table 2. The database configuration