• Title/Summary/Keyword: Skin Color Detection

289 search results

An Accurate Forward Head Posture Detection using Human Pose and Skeletal Data Learning

  • Jong-Hyun Kim
    • Journal of the Korea Society of Computer and Information
    • /
    • v.28 no.8
    • /
    • pp.87-93
    • /
    • 2023
  • In this paper, we propose a system that accurately and efficiently determines forward head posture through network learning by analyzing the user's skeletal posture. Forward head posture syndrome is a condition in which the head position is shifted forward by keeping the neck bent forward for a long time, causing pain in the back, shoulders, and lower back; correcting daily posture habits is known to be more effective than surgery or drug treatment. Existing methods apply convolutional neural networks to webcam images, and these approaches are affected by the brightness, lighting, skin color, and other properties of the image, so they work reliably only for specific people. To alleviate this problem, this paper extracts the skeleton from the image and trains on data corresponding to the side view rather than the frontal view, finding the forward head posture more efficiently and accurately than the previous method. The results show that accuracy is improved over the previous method in various experimental scenes.
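The paper's network, dataset, and thresholds are not given in the abstract; as a rough illustration of the side-view skeletal criterion such a system relies on, the craniovertebral angle between the ear and C7 (base-of-neck) keypoints can be computed from any pose estimator's output. The 50° cutoff below is a commonly cited clinical rule of thumb, not the paper's value:

```python
import math

def craniovertebral_angle(ear, c7):
    """Angle in degrees between the horizontal and the C7-to-ear line.

    A smaller angle indicates a more forward head position.
    ear, c7: (x, y) image coordinates with y increasing downward.
    """
    dx = ear[0] - c7[0]
    dy = c7[1] - ear[1]  # flip sign so positive dy means the ear sits above C7
    return math.degrees(math.atan2(dy, abs(dx)))

def is_forward_head(ear, c7, threshold_deg=50.0):
    """Flag forward head posture when the angle falls below the cutoff."""
    return craniovertebral_angle(ear, c7) < threshold_deg
```

This works on side-view keypoints, matching the abstract's point that the lateral view is more informative for this condition than the frontal one.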

Efficient Intermediate Joint Estimation using the UKF based on the Numerical Inverse Kinematics (수치적인 역운동학 기반 UKF를 이용한 효율적인 중간 관절 추정)

  • Seo, Yung-Ho;Lee, Jun-Sung;Lee, Chil-Woo
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.47 no.6
    • /
    • pp.39-47
    • /
    • 2010
  • Research on image-based articulated pose estimation faces several problems, such as detecting human features, estimating the pose precisely, and achieving real-time performance. In particular, various methods are currently presented for recovering the many joints of the human body. We propose a novel numerical inverse kinematics method improved with the UKF (unscented Kalman filter) in order to estimate the human pose in real time. Existing numerical inverse kinematics requires many iterations to reach the optimal estimate and suffers from problems such as singularity of the Jacobian matrix and local minima. To solve these problems, we combine the UKF, as a tool for optimal state estimation, with numerical inverse kinematics. Combining the solution of the numerical inverse kinematics with the sampling-based UKF provides stability and rapid convergence to the optimal estimate. In order to estimate the human pose, we extract the human body of interest using both background subtraction and a skin color detection algorithm, and localize its 3D position with the camera geometry. Next, using the UKF-based numerical inverse kinematics, we generate the intermediate joints that cannot be detected from the images. The proposed method remedies the defects of numerical inverse kinematics, namely its computational complexity and estimation accuracy.
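The UKF-augmented solver cannot be reproduced from the abstract alone, but the baseline it improves on, Jacobian-based numerical inverse kinematics, can be sketched for a planar two-link chain. The damping term below is the standard damped-least-squares remedy for the Jacobian-singularity problem the abstract mentions; link lengths and iteration counts are illustrative assumptions:

```python
import numpy as np

def fk(thetas, lengths):
    """Forward kinematics of a planar chain: end-effector (x, y)."""
    angles = np.cumsum(thetas)
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def jacobian(thetas, lengths):
    """Partial derivatives of the end-effector position w.r.t. each joint."""
    angles = np.cumsum(thetas)
    J = np.zeros((2, len(thetas)))
    for i in range(len(thetas)):
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

def solve_ik(target, lengths, thetas=None, damping=0.1, iters=200):
    """Damped least-squares IK: the damping keeps the update well
    conditioned even when the Jacobian approaches singularity."""
    if thetas is None:
        thetas = np.full(len(lengths), 0.3)  # arbitrary non-singular start
    for _ in range(iters):
        err = target - fk(thetas, lengths)
        J = jacobian(thetas, lengths)
        dtheta = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), err)
        thetas = thetas + dtheta
    return thetas
```

The paper replaces this iterative refinement with UKF sigma-point sampling to gain stability and faster convergence; the sketch above only shows the numerical baseline being improved.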

Hand Gesture Recognition Algorithm Robust to Complex Image (복잡한 영상에 강인한 손동작 인식 방법)

  • Park, Sang-Yun;Lee, Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.13 no.7
    • /
    • pp.1000-1015
    • /
    • 2010
  • In this paper, we propose a novel algorithm for hand gesture recognition. The hand detection method is based on human skin color, and we use boundary energy information to locate the hand region accurately; the moment method is then employed to locate the center of the palm. Hand gesture recognition is separated into two steps. First, hand posture recognition: we employ parallel neural networks to deal with the problem of hand posture recognition. The pattern of a hand posture is extracted with a fitting-ellipses method, which divides the detected hand region with 12 ellipses and calculates the rate of white pixels on each ellipse line. The pattern is input to a neural network with 12 input nodes and 4 output nodes; each output node produces a value within 0~1, so the posture is represented by the composition of the 4 output codes. Second, hand gesture tracking and recognition: we employ the Kalman filter to predict the position of the gesture and create a position sequence, and the distance relationship between positions is used to confirm the gesture. Simulations were performed on Windows XP to evaluate the efficiency of the algorithm. For hand posture recognition, we used 300 training images to train the recognizer and 200 images to test it; the number of correct results reached 194. For the hand tracking and recognition part, we performed 1200 gestures (400 times for each gesture), of which 1002 were recognized correctly. These results show that the proposed gesture recognition algorithm achieves acceptable performance in detecting the hand and its gestures.
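The abstract does not give the exact skin color rule, so the detection stage can only be sketched with a widely used YCbCr box (Cb in [77, 127], Cr in [133, 173]), a common rule of thumb rather than the paper's thresholds:

```python
import numpy as np

def skin_mask_ycbcr(rgb):
    """Boolean skin mask from an HxWx3 uint8 RGB image.

    Converts to YCbCr with the standard BT.601 coefficients and keeps
    pixels inside a fixed chrominance box; the box bounds are a common
    heuristic, not values taken from the paper.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)
```

Working in chrominance alone discards most of the luminance variation, which is why YCbCr-style rules are preferred over raw RGB thresholds for hand detection.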

Vision-based Motion Control for the Immersive Interaction with a Mobile Augmented Reality Object (모바일 증강현실 물체와 몰입형 상호작용을 위한 비전기반 동작제어)

  • Chun, Jun-Chul
    • Journal of Internet Computing and Services
    • /
    • v.12 no.3
    • /
    • pp.119-129
    • /
    • 2011
  • Vision-based human-computer interaction is an emerging field of science and industry that provides a natural way for humans and computers to communicate. In particular, recently increasing demand for mobile augmented reality requires the development of efficient interaction technologies between the augmented virtual object and users. This paper presents a novel approach to constructing and controlling a marker-less mobile augmented reality object. Replacing a traditional marker, the human hand is used as the interface for the marker-less mobile augmented reality system. In order to implement the marker-less mobile augmented system within the limited resources of a mobile device, compared with desktop environments, we propose a method that extracts an optimal hand region, which plays the role of the marker, and augments the object in real time using the camera attached to the mobile device. The optimal hand region detection is composed of detecting the hand region with a YCbCr skin color model and extracting the optimal rectangular region with the Rotating Calipers algorithm. The extracted optimal rectangular region takes the role of a traditional marker. The proposed method resolves the problem of losing track of the fingertips when the hand is rotated or occluded in hand-marker systems. From the experiment, we show that the proposed framework can effectively construct and control the augmented virtual object in mobile environments.
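The Rotating Calipers step can be illustrated independently of the mobile AR pipeline: the minimum-area enclosing rectangle of a point set (here, the detected hand region) always shares an edge direction with the convex hull, so it suffices to test each hull edge. A pure-Python sketch, assuming the hand region is given as a list of 2D points:

```python
import math

def convex_hull(points):
    """Andrew's monotone chain convex hull, counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def min_area_rect(points):
    """Minimum-area enclosing rectangle via rotating calipers.

    For each hull edge, rotate all hull points into that edge's frame
    and measure the axis-aligned bounding box; the smallest box wins.
    Returns (area, angle of the optimal rectangle's edge).
    """
    hull = convex_hull(points)
    best = None
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        theta = math.atan2(y2 - y1, x2 - x1)
        c, s = math.cos(theta), math.sin(theta)
        xs = [c * x + s * y for x, y in hull]
        ys = [-s * x + c * y for x, y in hull]
        area = (max(xs) - min(xs)) * (max(ys) - min(ys))
        if best is None or area < best[0]:
            best = (area, theta)
    return best
```

The rectangle's orientation is exactly what lets the hand stand in for a printed marker: it gives a stable, rotation-aware frame to anchor the augmented object.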

Detection of Chlorotoluene and Nitrotoluene Compounds by Recombinant Microbial Biosensors (재조합 미생물 바이오센서를 이용한 chlorotoluene과 nitrotoluene 화합물의 검출)

  • Lee, Da Young;Cho, Jae Ho;Lim, Woon Ki;Shin, Hae Ja
    • Journal of Life Science
    • /
    • v.24 no.1
    • /
    • pp.54-60
    • /
    • 2014
  • Aromatic hydrocarbons are toxic environmental pollutants that are detrimental to the ecosystem and human health. Among them, chlorotoluene and nitrotoluene are toxic to aquatic life and irritate the skin, eyes, and respiratory organs of humans. We herein report the development of recombinant microbial biosensors for cheap and rapid monitoring of chlorotoluene and nitrotoluene compounds. Plasmids were constructed by inserting the xylR regulatory gene for BTEX (benzene, toluene, ethylbenzene, and xylene) degradation upstream of Po' (the DmpR activator promoter Po with its own upstream activating sequences deleted) or Pu (the cognate promoter of XylR)::lacZ (the β-galactosidase gene) and transformed into Escherichia coli DH5α. In the presence of inducers, the biosensor cells immobilized in agarose developed a red color in 1-2 h due to the hydrolysis of chlorophenol red β-D-galactopyranoside (CPRG), a substrate of the β-galactosidase expressed by the inducers. Among BTEX, high responses were specifically observed with o-, m-, and p-chlorotoluene (0.1 μM-100 mM) and o-, m-, and p-nitrotoluene (0.1 mM-100 mM). Po' demonstrated higher responses than Pu. The biosensors immobilized in agarose showed good stability after 21 days' storage at 4°C and responded in untreated wastewater spiked with chlorotoluene and nitrotoluene, suggesting they can be used to detect these compounds in wastewater.

Histochemical Analysis of the Cutaneous Wound Healing in the Amphibian (양서류 피부 상처회복과정에 대한 조직화학적 분석)

  • Lim, Do-Sun;Jeong, Soon-Jeong;Jeong, Je-O;Park, Joo-Cheol;Kim, Heung-Joong;Moon, Myung-Jin;Jeong, Moon-Jin
    • Applied Microscopy
    • /
    • v.34 no.1
    • /
    • pp.1-11
    • /
    • 2004
  • Wound healing is a very complex biological process including inflammation, reepithelialization, and matrix construction. Healing ability differs greatly across biological systematic categories; in general, the healing ability of lower animal groups is known to be superior to that of higher groups. Therefore, lower animals have been used as experimental models to explore the mechanism of wound healing or repair. To verify the histochemical characteristics of wound healing, we used the skin of the frog (Bombina orientalis), a common amphibian. At days 1, 10, and 16, mucous substance was synthesized very actively and was strongly positive by PAS and Alcian blue (pH 2.5) staining. At day 10 after wounding, the margin of the wound gradually became strongly positive by PTAH staining for the detection of collagen synthesis. At 3 to 6 hours and days 23 to 27, we found through MG-P staining that cell division was active, with the concentration and division of DNA in the nucleus appearing green to deep blue.

Quality certification plan of heshouwu (하수오(何首烏)의 품질인증(品質認證) 방안(方案))

  • Shin, Mi-Kyung;Roh, Seong-Soo;Kil, Ki-Jeong;Seo, Bu-il;Seo, Young-Bae
    • Journal of Haehwa Medicine
    • /
    • v.13 no.2
    • /
    • pp.205-212
    • /
    • 2004
  • Many substitutes and counterfeit articles are now used in Korea instead of heshouwu. To use heshouwu correctly, we prepared a quality certification plan for heshouwu by investigating the relevant literature, records, and documents, and reached the following conclusions. 1) Source of plant: Heshouwu is the root tuber of a perennial herb, Polygonum multiflorum Thunberg (Family: Polygonaceae). 2) Harvest: After planting for 3-5 years, harvest in autumn, wash the mud off cleanly, cut a big heshouwu into halves or sections, and dry it in a sunny place or over a low fire. When harvesting, take only the big roots; transfer the small ones to a field and harvest the big roots after culturing for another 1-2 years. Harvesting is usually done in autumn after 3 years. When collecting seed, the heshouwu must be harvested the following year. 3) Process: Heshouwu must be processed with a decoction of black beans: the heshouwu is soaked in the decoction and heated with steam in an iron pot. Black beans are used for every 100 kg of heshouwu. 4) Quality: (1) Functional standards: Good heshouwu is heavy, its outer skin is yellow-brown, and its cut surface is light red, powdery, and shows a cloud-like figure. (2) Physicochemical standards: The components of heshouwu change in various ways during processing. We think it is necessary to add a detection standard for 2,3,5,4'-tetrahydroxystilbene-2-O-β-D-glucoside, a water-soluble component of heshouwu, to the current authentic documents. Loss on drying must be less than 14.0%, ash content less than 5.0%, acid-insoluble ash content less than 1.5%, and extract content more than 17.0%. The assayed quantity of 2,3,5,4'-tetrahydroxystilbene-2-O-β-D-glucoside must be more than 1.0%. Heavy metal content must be less than 30 ppm, and there must be no residual agricultural chemicals.


Usefulness of External Monitoring Flap in the Buried Jejunal Free Flap (유리 공장 피판술 후 외부 감시 피판의 유용성)

  • Kim, Baek Kyu;Chang, Hak;Minn, Kyung Won;Hong, Joon Pio;Koh, Kyung Suck
    • Archives of Plastic Surgery
    • /
    • v.34 no.4
    • /
    • pp.432-435
    • /
    • 2007
  • Purpose: The jejunal free flap has a shorter ischemic time than other flaps and requires a laparotomy to harvest. As evaluation of the perfusion of a buried flap is very important, the perfusion of the buried jejunal free flap requires monitoring for its salvage. We tried to improve the monitoring flap method for the jejunal free flap and examined its usefulness. Methods: From March 2002 to March 2006, the monitoring flap method was applied to 4 cases among 8 jejunal free flaps for pharyngeal and cervical esophageal reconstructions. The distal part of the jejunal flap was exposed without suture fixation through the cervical wound so that its perfusion could be monitored. The status of perfusion was judged by the color change of the jejunal mucosa and mesentery, and a pin prick test was performed if necessary. Doppler sonography was applied to the mesenteric pedicle of the monitoring flap in cases of suspected abnormal circulation. Results: The monitoring flap showed no change in 3 cases, but congestion occurred in one case at 12 hours after the operation. This congestion was caused by twisting or kinking of the mesenteric pedicle of the monitoring flap, so we fixed the monitoring flap to the adjacent cervical skin to prevent rotation. Ultimately, the main part of the transferred jejunal flap was intact. Conclusion: The success of a jejunal free flap depends on close postoperative monitoring and early detection of vascular compromise, so various monitoring methods have been tried: direct visualization using a fiberoptic pharyngoscope or through a Silastic window placed in the neck flap, external surface monitoring with Doppler sonography, and use of a buried monitoring probe. However, all of the above have their own shortcomings in simplicity, non-invasiveness, reliability, etc. In our experience, the monitoring flap can be an accurate and reliable method.

3D Facial Animation with Head Motion Estimation and Facial Expression Cloning (얼굴 모션 추정과 표정 복제에 의한 3차원 얼굴 애니메이션)

  • Kwon, Oh-Ryun;Chun, Jun-Chul
    • The KIPS Transactions:PartB
    • /
    • v.14B no.4
    • /
    • pp.311-320
    • /
    • 2007
  • This paper presents a vision-based 3D facial expression animation technique and system that provide robust 3D head pose estimation and real-time facial expression control. Much research on 3D face animation has focused on facial expression control itself rather than on 3D head motion tracking; however, head motion tracking is one of the critical issues to be solved for developing realistic facial animation. In this research, we developed an integrated animation system that performs 3D head motion tracking and facial expression control at the same time. The proposed system consists of three major phases: face detection, 3D head motion tracking, and facial expression control. For face detection, with a non-parametric HT skin color model and template matching, we can detect the facial region efficiently from each video frame. For 3D head motion tracking, we exploit a cylindrical head model that is projected onto the initial head motion template. Given an initial reference template of the face image and the corresponding head motion, the cylindrical head model is created and the full head motion is traced based on the optical flow method. For facial expression cloning we utilize a feature-based method: the major facial feature points are detected using the geometric information of the face with template matching and are traced by optical flow. Since the locations of the varying feature points comprise both head motion and facial expression information, the animation parameters that describe the variation of the facial features are acquired from the geometrically transformed frontal head pose image. Finally, facial expression cloning is done by a two-step fitting process: the control points of the 3D model are varied by applying the animation parameters to the face model, and the non-feature points around the control points are changed by use of Radial Basis Functions (RBF). From the experiment, we show that the developed vision-based animation system can create realistic facial animation with robust head pose estimation and facial variation from the input video images.
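The last step of the pipeline, propagating control-point motion to the surrounding non-feature vertices with Radial Basis Functions, can be sketched with a Gaussian kernel; the kernel choice and width are assumptions, since the abstract does not specify them:

```python
import numpy as np

def rbf_deform(controls, displacements, queries, sigma=0.5):
    """Propagate control-point displacements to nearby vertices.

    Fits a Gaussian RBF interpolant that reproduces each control
    point's displacement exactly, then evaluates it at the queries.
    controls: (n, d) control-point positions
    displacements: (n, d) how each control point moved
    queries: (m, d) non-feature vertices to deform
    """
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    K = kernel(controls, controls)
    # tiny ridge term keeps the solve stable for near-coincident controls
    weights = np.linalg.solve(K + 1e-9 * np.eye(len(controls)), displacements)
    return queries + kernel(queries, controls) @ weights
```

Because the kernel decays with distance, vertices far from any control point stay put while nearby ones follow the facial features smoothly, which is the behavior the cloning step needs.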