• Title/Summary/Keyword: Robot hands

Manipulator with Camera for Mobile Robots (모바일 로봇을 위한 카메라 탑재 매니퓰레이터)

  • Lee Jun-Woo;Choe, Kyoung-Geun;Cho, Hun-Hee;Jeong, Seong-Kyun;Bong, Jae-Hwan
    • The Journal of the Korea Institute of Electronic Communication Sciences / v.17 no.3 / pp.507-514 / 2022
  • Mobile manipulators are drawing attention in the field of home automation because they combine mobility with manipulation capability. In this paper, we developed a small manipulator system that can be mounted on a mobile robot, as a preliminary study toward a mobile manipulator. The developed manipulator has four degrees of freedom. The end-effector carries a camera and a gripper for recognizing and manipulating objects. One of the four degrees of freedom is a vertical linear motion, which allows better interaction with human hands located higher than the mobile manipulator. The manipulator was designed with its four actuators placed close to the base to reduce rotational inertia, which improves manipulation stability and reduces the risk of rollover. The developed manipulator repeatedly performed a pick-and-place task and successfully manipulated objects within its workspace.
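The design rationale of placing the actuators near the base can be made concrete with a point-mass approximation of the arm's rotational inertia; the expression below is illustrative and not taken from the paper.

```latex
% Point-mass approximation (illustrative, not from the paper): the inertia of
% the arm about the base axis grows with the square of each mass's distance
% r_i from that axis, so mounting heavy actuators near the base lowers both
% the drive torque needed and the tipping moment transmitted to the mobile base.
\[
  I_{\text{base}} \;\approx\; \sum_i m_i\, r_i^{2},
  \qquad
  \tau_{\text{base}} \;=\; I_{\text{base}}\,\ddot{\theta}.
\]
```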

Puppet Control System Optimized in the Number of Motors and the Size (구동기 수와 크기에서 최적화된 줄 인형 제어 시스템)

  • Kim, Byeong-Yeol;Han, Young-Jun;Hahn, Hun-Soo
    • The Journal of Korea Robotics Society / v.5 no.4 / pp.318-325 / 2010
  • This paper proposes a new string controller for marionettes that is optimized in terms of the number of motors and its size. To minimize the number of motors needed to generate the essential puppet motions, bending a leg is implemented with a single string, and the two-legged walking motion is driven by a single motor. To minimize the space the controller occupies while generating these motions, cylindrical and articulated joints are used in the controller. The proposed controller was implemented and used in various puppet shows, demonstrating that it is small enough for two puppets to stand close together and shake hands, and fast enough to reproduce fast dance motions.

Robot Gesture Recognition System Based on PCA Algorithm (PCA 알고리즘 기반의 로봇 제스처 인식 시스템)

  • Youk, Yui-Su;Kim, Seung-Young;Kim, Sung-Ho
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2008.04a / pp.400-402 / 2008
  • Human-computer interaction (HCI) technology, which plays an important role in the exchange of information between humans and computers, is a key field of information technology. Recently, studies on controlling robots and other devices with the movements of a person's body or hands, without conventional input devices such as a keyboard and mouse, have been carried out from diverse perspectives, and their importance has been steadily increasing. This study proposes a method for recognizing user gestures by applying measurements from an acceleration sensor to the PCA algorithm.
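A minimal sketch of the kind of pipeline this abstract describes is given below: windows of 3-axis accelerometer data are projected with PCA and classified with a simple nearest-neighbour classifier. The window length, number of components, and gesture classes are assumptions for illustration, not values from the paper.

```python
# Sketch: PCA on accelerometer windows, then nearest-neighbour classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def make_features(windows):
    """Flatten fixed-length accelerometer windows (N, T, 3) into feature vectors."""
    return windows.reshape(len(windows), -1)

# Placeholder training data: 60 gesture windows, 50 samples each, 3 axes.
rng = np.random.default_rng(0)
train_windows = rng.normal(size=(60, 50, 3))
train_labels = rng.integers(0, 3, size=60)      # three hypothetical gesture classes

pca = PCA(n_components=10)                       # keep the dominant components
train_feats = pca.fit_transform(make_features(train_windows))
clf = KNeighborsClassifier(n_neighbors=3).fit(train_feats, train_labels)

test_window = rng.normal(size=(1, 50, 3))
print("predicted gesture class:", clf.predict(pca.transform(make_features(test_window)))[0])
```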

A Study on Development and Realization of a Control Algorithm for a Robot Hand Using Master Hand and Slave Hand (Master hand와 slave hand를 이용한 로봇 손의 제어 알고리즘 개발 및 구현에 관한 연구)

  • Lee, Seung;Choi, Kyung-Sam;Lee, Jong-Soo
    • Proceedings of the KIEE Conference / 2002.07d / pp.2430-2432 / 2002
  • We built a master hand that can be used as a tool for collecting grasping data. Using the data from the master hand, we analyzed the grasping patterns of human hands. Based on these results, we developed a grasping algorithm for particular hand actions. To develop the algorithm, we wrote 3D simulation software in Visual C++, and we built a slave hand to validate the proposed algorithm.

Task-Based Analysis on Number of Robotic Fingers for Compliant Manipulations

  • Kim, Byoung-Ho
    • International Journal of Fuzzy Logic and Intelligent Systems / v.9 no.4 / pp.333-338 / 2009
  • This paper presents a task-based analysis of the number of independent robotic fingers required for compliant manipulation. Based on the stiffness relation between the operational space and the fingertip space of a multi-fingered object-manipulating system, we describe a technique for modulating the fingertip stiffness without inter-finger coupling so as to achieve the desired stiffness specified in the operational space. This provides a guideline for how many fingers are fundamentally required for successful multi-fingered compliant tasks. Consequently, the paper enables the number of fingers to be assigned effectively for various compliant manipulations by robot hands.
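The stiffness relation referred to in the abstract is commonly written with a grasp matrix mapping fingertip forces to the operational-space wrench; the form below is the standard expression with our own notation, so the paper's exact formulation may differ.

```latex
% Standard stiffness mapping assumed here (notation is ours, not the paper's):
%   G   : grasp matrix relating fingertip forces to the operational-space wrench
%   K_f : block-diagonal fingertip stiffness (no inter-finger coupling)
%   K_o : stiffness realized in the operational space
\[
  K_o \;=\; G\, K_f\, G^{\mathsf{T}},
  \qquad
  K_f \;=\; \operatorname{blockdiag}\!\left(K_{f,1},\,\dots,\,K_{f,n}\right).
\]
% Requiring K_f to stay block-diagonal while K_o matches a desired operational
% stiffness is what constrains how many independent fingers the task needs.
```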

The Behavioral Patterns of Neutral Affective State for Service Robot Using Video Ethnography (비디오 에스노그래피를 이용한 서비스 로봇의 대기상태 행동패턴 연구)

  • Song, Hyun-Soo;Kim, Min-Joong;Jeong, Sang-Hoon;Suk, Hyeon-Jeong;Kwon, Dong-Soo;Kim, Myung-Suk
    • Science of Emotion and Sensibility / v.11 no.4 / pp.629-636 / 2008
  • In recent years, a large number of robots have been developed in several countries, built to appeal to users through well-designed human-robot interaction. The robots developed so far show appropriate reactions only when given a certain input; they do nothing in a standby state, when there is no input. In other words, if a robot makes no motion in standby, users may feel that it is turned off or even broken. Social service robots in particular remain in a standby state after finishing a task. If, during this time, a robot can produce human-like behavioral patterns, like a person at a help desk, it is expected to make people feel that it is alive and make them more likely to interact with it. Even without interaction with others or the environment, people normally react to internal or external stimuli of their own making, such as moving their eyes or bodies. To create robotic behavioral patterns for the standby state, we analyze the actual facial expressions and behavior of people in a neutral affective state using an ethnographic methodology and apply the extracted characteristics to our robots. Using robots that can show this series of expressions and actions, our research then examines whether people feel that the robots are alive.

Hand Interface using Intelligent Recognition for Control of Mouse Pointer (마우스 포인터 제어를 위해 지능형 인식을 이용한 핸드 인터페이스)

  • Park, Il-Cheol;Kim, Kyung-Hun;Kwon, Goo-Rak
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.5 / pp.1060-1065 / 2011
  • In this paper, the proposed method recognizes hands using color information from the camera input image and controls the mouse pointer with the recognized hand. In addition, specific commands can be performed with the mouse pointer. Most users feel uncomfortable with existing interactive multimedia systems because they depend on particular external input devices such as pens and mice; the proposed method compensates for this shortcoming by using the hand alone, without external input devices. In the experiments, hand regions and the background are separated using color information from the camera image, and the mouse pointer coordinates are determined from the center of the separated hand region. The mouse pointer is placed in a predefined area using these coordinates, and the robot then moves and executes the corresponding command. The experimental results show that the proposed method recognizes the hand accurately but is still sensitive to changes in lighting color.
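A minimal sketch of color-based hand detection driving a pointer position is shown below, using OpenCV; the HSV skin-color thresholds and the webcam source are assumed for illustration and are not the paper's values.

```python
# Sketch: skin-color segmentation in HSV, then the centroid of the largest
# hand-colored region is used as the pointer position.
import cv2
import numpy as np

LOWER_SKIN = np.array([0, 48, 80], dtype=np.uint8)     # assumed lower HSV bound
UPPER_SKIN = np.array([20, 255, 255], dtype=np.uint8)  # assumed upper HSV bound

def hand_center(frame_bgr):
    """Return (x, y) of the largest skin-colored region, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("pointer position:", hand_center(frame))  # map to screen coordinates as needed
cap.release()
```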

Grasping a Target Object in Clutter with an Anthropomorphic Robot Hand via RGB-D Vision Intelligence, Target Path Planning and Deep Reinforcement Learning (RGB-D 환경인식 시각 지능, 목표 사물 경로 탐색 및 심층 강화학습에 기반한 사람형 로봇손의 목표 사물 파지)

  • Ryu, Ga Hyeon;Oh, Ji-Heon;Jeong, Jin Gyun;Jung, Hwanseok;Lee, Jin Hyuk;Lopez, Patricio Rivera;Kim, Tae-Seong
    • KIPS Transactions on Software and Data Engineering / v.11 no.9 / pp.363-370 / 2022
  • Grasping a target object among cluttered objects without collision requires machine intelligence, including environment recognition, target and obstacle recognition, collision-free path planning, and object-grasping intelligence for robot hands. In this work, we implement such a system in simulation and hardware to grasp a target object without collision. We use an RGB-D image sensor to recognize the environment and objects. Various path-finding algorithms were implemented and tested to find collision-free paths. Finally, for an anthropomorphic robot hand, object-grasping intelligence is learned through deep reinforcement learning. In our simulation environment, grasping a target out of five cluttered objects showed an average success rate of 78.8% and a collision rate of 34% without path planning, whereas our system combined with path planning showed an average success rate of 94% and an average collision rate of 20%. In our hardware environment, grasping a target out of three cluttered objects showed an average success rate of 30% and a collision rate of 97% without path planning, whereas our system combined with path planning showed an average success rate of 90% and an average collision rate of 23%. Our results show that grasping a target object in clutter is feasible with vision intelligence, path planning, and deep RL.
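The abstract does not name which path-finding algorithms were tested; as one illustrative possibility, the sketch below runs a basic goal-biased 2D RRT around assumed circular obstacles to produce a collision-free path.

```python
# Sketch: goal-biased RRT in the unit square with assumed circular obstacles.
import math
import random

OBSTACLES = [((0.5, 0.5), 0.15), ((0.3, 0.8), 0.10)]  # assumed (center, radius) pairs

def collides(p):
    """True if point p lies inside any obstacle."""
    return any(math.dist(p, c) < r for c, r in OBSTACLES)

def steer(a, b, step=0.05):
    """Move from a toward b by at most `step`."""
    d = math.dist(a, b)
    if d <= step:
        return b
    t = step / d
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def rrt(start, goal, iters=2000, goal_tol=0.05):
    """Return a list of waypoints from start to goal, or None if none found."""
    nodes, parent = [start], {0: None}
    for _ in range(iters):
        sample = goal if random.random() < 0.1 else (random.random(), random.random())
        nearest = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
        new = steer(nodes[nearest], sample)
        if collides(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = nearest
        if math.dist(new, goal) < goal_tol:
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
    return None

waypoints = rrt((0.1, 0.1), (0.9, 0.9))
print("found path with", len(waypoints) if waypoints else 0, "waypoints")
```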

Technological Trend of Endoscopic Robots (내시경 로봇의 기술동향)

  • Kim, Min Young;Cho, Hyungsuck
    • Journal of Institute of Control, Robotics and Systems / v.20 no.3 / pp.345-355 / 2014
  • Since the beginning of the 21st century, the emergence of innovative technologies in robotic and telepresence surgery has revolutionized minimal-access surgery and has continued to advance it in recent years. One such surgery is endoscopic surgery, in which an endoscope and endoscopic instruments are inserted into the body through small incisions or natural openings and operations are carried out by a laparoscopic procedure. Given the vast amount of development in this technology, this review article describes only the technological state of the art and trends of endoscopic robots, limited further to their key components, functional requirements, and operational procedure in surgery. In particular, it first describes technological limitations in the development of key components and then focuses on the performance required for their functions, including position control, tracking, navigation, and manipulation of the flexible endoscope body and its end-effector. Despite this rapid progress in functional components, endoscopic surgical robots should become much smaller, less expensive, and easier to operate, and should seamlessly integrate emerging technologies for intelligent vision and dexterous hands, not only from surgical and ergonomic points of view but also from that of safety. We believe that in these respects the medical robotic technology related to endoscopic surgery will continue to be revolutionized in the near future, enough to replace almost all kinds of current endoscopic surgery. This issue remains to be addressed elsewhere in other review articles.

EEG Analysis Following Change in Hand Grip Force Level for BCI Based Robot Arm Force Control (BCI 기반 로봇 손 제어를 위한 악력 변화에 따른 EEG 분석)

  • Kim, Dong-Eun;Lee, Tae-Ju;Park, Seung-Min;Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.23 no.2 / pp.172-177 / 2013
  • With a brain-computer interface (BCI) system, a person with a disabled limb can use brain signals such as electroencephalography (EEG) directly to control a device such as an artificial arm. Precise force control of the artificial arm is necessary for such an artificial limb system. To understand the relationship between the controlling EEG signal and hand grip force, we measured EEG changes at three grades (25%, 50%, 75%) of hand-grip MVC (maximal voluntary contraction). The acquired EEG signal was filtered to obtain the power of three frequency bands (alpha, beta, gamma) using the fast Fourier transform (FFT), and the power spectrum was computed. The power spectra of the three bands for the three classes (MVC 25%, 50%, 75%) were then classified using PCA (principal component analysis) and LDA (linear discriminant analysis). The results showed that the EEG power spectrum at MVC 75% is larger than at MVC 25%, and the correct classification rate was 52.03% for the left hand and 77.7% for the right hand.
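A minimal sketch of the described band-power, PCA, and LDA pipeline is given below; the sampling rate, epoch length, channel count, and placeholder data are assumptions for illustration, not the paper's recording setup.

```python
# Sketch: FFT band power (alpha/beta/gamma) per epoch, then PCA + LDA classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256  # assumed sampling rate in Hz
BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(epoch):
    """Mean spectral power per band for one EEG epoch of shape (channels, samples)."""
    freqs = np.fft.rfftfreq(epoch.shape[-1], d=1.0 / FS)
    psd = np.abs(np.fft.rfft(epoch, axis=-1)) ** 2
    return np.concatenate([
        psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in BANDS.values()
    ])

# Placeholder data: 90 epochs, 4 channels, 2-second epochs, three MVC classes.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(90, 4, 2 * FS))
labels = np.repeat([0, 1, 2], 30)  # MVC 25% / 50% / 75%

features = np.array([band_powers(e) for e in epochs])
features = PCA(n_components=5).fit_transform(features)
clf = LinearDiscriminantAnalysis().fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```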