http://dx.doi.org/10.7746/jkros.2020.15.3.293

Transparent Manipulators Accomplished with RGB-D Sensor, AR Marker, and Color Correction Algorithm  

Kim, Dong Yeop (KETI (Korea Electronics Technology Institute))
Kim, Young Jee (KETI)
Son, Hyunsik (KETI)
Hwang, Jung-Hoon (KETI)
Publication Information
The Journal of Korea Robotics Society, vol. 15, no. 3, 2020, pp. 293-300
Abstract
The purpose of our sensor system is to transparentize the large hydraulic manipulators of a six-ton dual-arm excavator in the operator's camera view. Almost 40% of the camera view is blocked by the manipulators; in other words, the operator loses 40% of the visual information that might be useful in many manipulator control scenarios, such as clearing debris at a disaster site. The proposed method is based on 3D reconstruction. By overlaying the camera image taken from the front top of the cabin with point cloud data from RGB-D (red, green, blue, and depth) cameras mounted on the outer side of each manipulator, a manipulator-free camera image can be obtained. Two additional algorithms are proposed to further enhance the productivity of dual-arm excavators. First, a color correction algorithm is proposed to cope with the different color distributions of the RGB and RGB-D sensors used in the system. Second, an edge overlay algorithm is proposed: although the manipulators often limit the operator's view, visual feedback of the manipulators' configurations or states may still be useful to the operator, so this algorithm draws the edges of the manipulators on the camera image. The experimental results show that the proposed transparentization algorithm helps the operator obtain information about the environment and objects around the excavator.
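The abstract does not spell out how the color correction aligns the color distributions of the main RGB camera and the RGB-D sensors. A common way to do this is channel-wise histogram matching; the sketch below is a minimal illustration of that idea under the assumption of 8-bit RGB frames, and the function names and the choice of histogram matching are illustrative, not the authors' exact method.

```python
import numpy as np

def match_channel(src, ref):
    """Remap one uint8 channel so its intensity histogram matches ref."""
    src_vals, src_counts = np.unique(src.ravel(), return_counts=True)
    ref_vals, ref_counts = np.unique(ref.ravel(), return_counts=True)
    # Cumulative distribution functions of both channels.
    src_cdf = np.cumsum(src_counts).astype(np.float64) / src.size
    ref_cdf = np.cumsum(ref_counts).astype(np.float64) / ref.size
    # For each source intensity, pick the reference intensity with the
    # closest cumulative probability (linear interpolation on the CDFs).
    mapped = np.interp(src_cdf, ref_cdf, ref_vals)
    lut = np.zeros(256, dtype=np.uint8)
    lut[src_vals] = np.round(mapped).astype(np.uint8)
    return lut[src]

def correct_colors(rgbd_frame, camera_frame):
    """Match the RGB-D color image to the main camera image, per channel.

    Both inputs are H x W x 3 uint8 arrays; only the color statistics of
    rgbd_frame change, its content is untouched.
    """
    return np.stack(
        [match_channel(rgbd_frame[..., c], camera_frame[..., c])
         for c in range(3)],
        axis=-1)
```

In a pipeline like the one described here, such a correction would be applied to each RGB-D color frame before its point cloud is rendered over the cabin camera image, so the filled-in region blends with the surrounding view.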
Keywords
Transparent Manipulators; See-Through; Heavy Instruments; Hydraulic Manipulators; Dual-Arm Manipulator
Citations & Related Records
연도 인용수 순위
  • Reference
1 D. Y. Kim, Y. J. Kim, H. Son, Y. J. Choi, E. Kim, and J.-H. Hwang, "Manipulator transparent visualization for special purpose machinery using AR marker, RGB-D Sensor, and edge detector," 2019 16th International Conference on Ubiquitous Robots (UR), Jeju, Korea, pp. 560-562, 2019, [Online], https://ras.papercept.net/conferences/conferences/UR19/program/UR19_ContentListWeb_2.html.
2 Y. J. Kim, D. Y. Kim, H. S. Son, and J.-H. Hwang, "To overcome constraint of view occurred by manipulator based on AR marker and 3D depth sensor," The Korean Society of Precision Engineering Conference, pp. 317, 2018, [Online], http://www.riss.kr/link?id=A105814523.
3 Y. J. Kim, D. Y. Kim, H. S. Son, and J.-H. Hwang, "Sensor module impact test for special purpose machinery for the application for disaster," The Korean Society of Precision Engineering Conference, pp. 848-849, 2017, [Online], http://www.riss.kr/link?id=A105061998.
4 KETI 1 min. technology - transparent manipulator technology, [Online], https://www.youtube.com/watch?v=0TUgn0W-erg, Accessed: June 11, 2019.
5 T. Yanagi, C. L. Fernando, M. H. D. Y. Saraiji, K. Minamizawa, S. Tachi, and N. Kishi, "Transparent cockpit using telexistence," 2015 IEEE Virtual Reality (VR), pp. 311-312, 2015, DOI: 10.1109/VR.2015.7223420.
6 Land Rover UK, Land Rover Reveals Transparent Bonnet Concept, [Online], https://www.youtube.com/watch?v=1OlqditIsoM, Accessed: June 11, 2019.
7 Land Rover UK, Land Rover's Transparent Trailer and Cargo Sense technologies, [Online], https://www.youtube.com/watch?v=lIUB1aApNEY, Accessed: June 11, 2019.
8 R. P. Boggs, P. J. Goergen, G. A. Harrison, E. T. Sorokowsky, and E. T. Grant, "See-through augmented reality system," U.S. Patent No. 9,581,819. Feb. 28, 2017, [Online], https://patents.google.com/patent/US9581819.
9 How F-35A fighter pilots are harnessing high-tech 'see-through' helmets, [Online] https://www.foxnews.com/tech/how-f-35afighter-pilots-are-harnessing-high-tech-see-through-helmets, Accessed: March 3, 2019.
10 F. Rameau, H. Ha, K. Joo, J. Choi, K. Park, and I. S. Kweon, "A realtime augmented reality system to see-through cars," IEEE transactions on visualization and computer graphics, vol. 22, no. 11, pp. 2395-2404, Nov. 2016, DOI: 10.1109/TVCG.2016.2593768.   DOI
11 A. Prusak, R, Hubert Roth, and R. Schwarte, "Application of 3d-pmd video cameras for tasks in the autonomous mobile robotics," IFAC Proceedings Volumes, vol. 38, no. 1, pp. 138-143, 2005, DOI: 10.3182/20050703-6-CZ-1902.02075.
12 M.-J. Jung, H. Myung, H.-K. Lee, and S. W. Bang, "Ambiguity resolving in structured light 2D range finder for SLAM operation for home robot applications." IEEE Workshop on Advanced Robotics and its Social Impacts, Nagoya, Japan, 2005, DOI: 10.1109/ARSO.2005.1511613.
13 Kinect for Windows SDK 2.0, [Online], https://developer.microsoft.com/en-us/windows/kinect, Accessed: July 5, 2019.
14 Intel RealSense Depth Camera D435i, [Online], https://www.intelrealsense.com/depth-camera-d435i/, Accessed: July 5, 2019.
15 P. Lindemann and G. Rigoll, "Examining the Impact of See-Through Cockpits on Driving Performance in a Mixed Reality Prototype," The 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct, pp. 83-87, 2017, DOI: 10.1145/3131726.3131754.
16 S.-H. Kim, C. Jung, and J. Park, "Three-dimensional visualization system with spatial information for navigation of tele-operated robots," Sensors, vol. 19, no. 3, 2019, DOI: 10.3390/s19030746.
17 Y. Xiong and K. Pulli, "Color correction for mobile panorama imaging," The 1st International Conference on Internet Multimedia Computing and Service, 2009, DOI: 10.1145/1734605.1734657.
18 A. Atapour-Abarghouei, S. Akcay, G. P. de La Garanderie, and T. P. Breckon, "Generative adversarial framework for depth filling via wasserstein metric, cosine transform and domain transfer," Pattern Recognition, pp. 232-244, 2019, DOI: 10.1016/j.patcog.2019.02.010.