• Title/Summary/Keyword: Augmented Human

Hand Gesture Interface for Manipulating 3D Objects in Augmented Reality (증강현실에서 3D 객체 조작을 위한 손동작 인터페이스)

  • Park, Keon-Hee;Lee, Guee-Sang
    • The Journal of the Korea Contents Association / v.10 no.5 / pp.20-28 / 2010
  • In this paper, we propose a hand gesture interface for manipulating augmented objects in 3D space using a camera. Generally, a marker is used to detect 3D movement in 2D images. However, marker-based systems have obvious drawbacks: the markers must always appear in the image, or additional equipment is required to control the objects, which reduces immersion. To overcome this problem, we replace the marker with a planar hand shape by estimating the hand pose. A Kalman filter is used for robust tracking of the hand shape. The experimental results indicate the feasibility of the proposed algorithm for hand-based AR interfaces.
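
To make the tracking step concrete, here is a minimal sketch of a constant-velocity Kalman filter smoothing a detected hand centroid with OpenCV. The state layout, noise covariances, and the centroid-only measurement are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (assumptions: constant-velocity model, centroid-only measurement).
import numpy as np
import cv2

kf = cv2.KalmanFilter(4, 2)                      # state [x, y, vx, vy], measurement [x, y]
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

def track(measured_xy):
    """Predict the next hand position, then correct with the detected centroid (if any)."""
    prediction = kf.predict()
    if measured_xy is not None:                  # detection may fail on some frames
        kf.correct(np.array([[measured_xy[0]], [measured_xy[1]]], np.float32))
    return float(prediction[0, 0]), float(prediction[1, 0])
```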

Glycine induces enhancement of bactericidal activity of neutrophils

  • Kang, Shin-Hae;Ham, Hwa-Yong;Hong, Chang-Won;Song, Dong-Keun
    • The Korean Journal of Physiology and Pharmacology / v.26 no.4 / pp.229-238 / 2022
  • Severe bacterial infections are frequently accompanied by depressed neutrophil functions. Thus, agents that increase the microbicidal activity of neutrophils could complement direct antimicrobial therapy. Lysophosphatidylcholine augments neutrophil bactericidal activity via the glycine (Gly)/glycine receptor (GlyR) α2/TRPM2/p38 mitogen-activated protein kinase (MAPK) pathway. However, the direct effect of glycine on neutrophil bactericidal activity has not been reported. In this study, the effect of glycine on neutrophil bactericidal activity was examined. Glycine augmented the bactericidal activity of human neutrophils (EC50 = 238 μM) in a strychnine (a GlyR antagonist)-sensitive manner. Glycine augmented bacterial clearance in mice, which was also blocked by strychnine (0.4 mg/kg, s.c.). Glycine enhanced NADPH oxidase-mediated reactive oxygen species (ROS) production and TRPM2-mediated [Ca2+]i increase in neutrophils that had taken up E. coli. Glycine augmented Lucifer yellow uptake (fluid-phase pinocytosis) and azurophil granule-phagosome fusion in neutrophils that had taken up E. coli in an SB203580 (a p38 MAPK inhibitor)-sensitive manner. These findings indicate that glycine augments neutrophil microbicidal activity by enhancing azurophil granule-phagosome fusion via the GlyRα2/ROS/calcium/p38 MAPK pathway. We suggest that glycine could be a useful agent for increasing neutrophil bacterial clearance.

Establishing the Framework of Industry Metaverse based on Digital Twin through Case Studies (디지털트윈 기반의 인더스트리 메타버스 : 사례분석을 통한 프레임워크의 정립)

  • Yang, Kyung Ran;Yoon, Sung Chul;Park, Soo Kyung;Lee, Bong Gyou
    • Journal of Korea Multimedia Society / v.25 no.8 / pp.1122-1135 / 2022
  • With the development of digital technology and the influence of the global pandemic, the metaverse, a three-dimensional virtual world, is receiving attention across society, the economy, and industry in general, and the manufacturing industry is also adopting it as a major strategic agenda for digital transformation. In this study, the concept of the industry metaverse is therefore defined from the perspective of the manufacturing industry, and the industry metaverse is classified into four types that reflect the characteristics of manufacturing, based on the general metaverse scenarios presented in previous studies: virtual behavior simulation, augmented operation of business objects, virtual experience simulation, and augmented decision of business subjects. In addition, a case analysis of solutions used in the manufacturing industry confirms that the central technology of the industry metaverse is the digital twin, and that it is implemented by converging the digital twin with major digital technologies such as virtual reality, augmented reality, digital humans, and AI. This study can provide guidelines for future research on the metaverse from the perspective of the manufacturing industry and for establishing digital transformation strategies for the industry.

Dynamic Manipulation of a Virtual Object in Marker-less AR system Based on Both Human Hands

  • Chun, Jun-Chul;Lee, Byung-Sung
    • KSII Transactions on Internet and Information Systems (TIIS) / v.4 no.4 / pp.618-632 / 2010
  • This paper presents a novel approach to robustly controlling augmented reality (AR) objects in a marker-less AR system through fingertip tracking and hand pattern recognition. One of the promising ways to develop a marker-less AR system is to use parts of the human body, such as the hand or face, in place of traditional fiducial markers. This paper introduces a real-time method to dynamically manipulate the overlaid virtual objects in a marker-less AR system using both hands and a single camera. The bare left hand serves as a virtual marker in the marker-less AR system, and the right hand is used as a hand mouse. To build the marker-less system, we use a skin-color model for hand shape detection and curvature-based fingertip detection on the input video image. Using the detected fingertips, the camera pose is estimated so that virtual objects can be overlaid on the hand coordinate system. To manipulate the rendered virtual objects dynamically, a vision-based hand control interface is developed, which exploits fingertip tracking for moving the objects and pattern matching for initiating hand commands. The experiments show that the proposed system can control the objects dynamically and conveniently.
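
As a rough illustration of the detection pipeline described above, the sketch below segments the hand with a fixed skin-color range and takes convex-hull points as fingertip candidates. The color thresholds are assumed values, and the hull step is a simplified stand-in for the paper's curvature-based fingertip detector; camera pose estimation and the hand-mouse logic are omitted.

```python
# Illustrative only: fixed skin-color range and convex-hull fingertip candidates.
import cv2
import numpy as np

def detect_fingertip_candidates(frame_bgr):
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))        # assumed skin range
    skin = cv2.morphologyEx(skin, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    if not contours:
        return []
    hand = max(contours, key=cv2.contourArea)                        # largest skin blob = hand
    hull = cv2.convexHull(hand, returnPoints=True)
    return [tuple(int(v) for v in p[0]) for p in hull]               # candidate fingertip points
```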

Discriminant Analysis of Human's Implicit Intent based on Eyeball Movement (안구운동 기반의 사용자 묵시적 의도 판별 분석 모델)

  • Jang, Young-Min;Mallipeddi, Rammohan;Kim, Cheol-Su;Lee, Minho
    • Journal of the Institute of Electronics and Information Engineers / v.50 no.6 / pp.212-220 / 2013
  • Recently, there has been a tremendous increase in human-computer/machine interaction systems, whose goal is to provide an appropriate service to the user at the right time with minimal human input, for human augmented cognition. To develop an efficient human augmented cognition system based on human-computer/machine interaction, it is important to interpret the user's implicit intention, which is vague, in addition to the explicit intention. According to cognitive visual-motor theory, human eye movements and pupillary responses are rich sources of information about human intention and behavior. In this paper, we propose a novel approach for identifying implicit visual search intention based on eye movement patterns and pupillary analysis, including pupil size, the gradient of pupil size variation, and fixation length/count for the area of interest. The proposed model classifies the user's implicit intention into three types: navigational intent generation, informational intent generation, and informational intent disappearance. Navigational intent refers to searching for something interesting in an input scene with no specific instructions, while informational intent refers to searching for a particular target object at a specific location in the input scene. Based on the eye movement patterns and pupillary analysis, we use a hierarchical support vector machine that can detect the transitions between the different implicit intents, i.e., from navigational intent generation to informational intent generation and informational intent disappearance.
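
A hedged sketch of how a two-stage (hierarchical) SVM over the gaze features named in the abstract might look is given below; the feature ordering, label encoding, and the exact split of the hierarchy are assumptions rather than the authors' model.

```python
# Two-stage SVM sketch; labels: stage 1 (0 = navigational, 1 = informational),
# stage 2 (0 = intent generation, 1 = intent disappearance). All assumed.
import numpy as np
from sklearn.svm import SVC

svm_stage1 = SVC(kernel="rbf")   # navigational vs. informational intent
svm_stage2 = SVC(kernel="rbf")   # informational intent: generation vs. disappearance

def fit(X, y_nav_info, y_gen_disap):
    """X rows: [pupil_size, pupil_size_gradient, fixation_length, fixation_count]."""
    X = np.asarray(X)
    svm_stage1.fit(X, y_nav_info)
    info = np.asarray(y_nav_info) == 1                 # informational samples only
    svm_stage2.fit(X[info], np.asarray(y_gen_disap)[info])

def predict(x):
    x = np.atleast_2d(x)
    if svm_stage1.predict(x)[0] == 0:
        return "navigational intent generation"
    return ("informational intent generation" if svm_stage2.predict(x)[0] == 0
            else "informational intent disappearance")
```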

A Background Segmentation and Feature Point Extraction Method of Human Motion Recognition (동작인식을 위한 배경 분할 및 특징점 추출 방법)

  • You, Hwi-Jong;Kim, Tae-Young
    • Journal of Korea Game Society / v.11 no.2 / pp.161-166 / 2011
  • In this paper, we propose a novel background segmentation and feature point extraction method for human motion in augmented reality games. First, our method transforms the input image from RGB to HSV color space and segments the skin-colored area using a double threshold on the H and S values. It also segments the moving area using time-difference images and then removes noise from that area using the Hessian affine region detector. The skin-colored area intersected with the moving area is segmented as the human motion region. Next, feature points for the human motion are extracted by calculating the center point of each block in the previously obtained image. Experiments on various input images show that our method performs correct background segmentation and feature point extraction at 12 frames per second.
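
The sketch below combines an HSV double-threshold skin mask with a frame-difference motion mask, mirroring the two cues the method fuses. The threshold values are assumed, and the Hessian affine denoising and block-center feature extraction are omitted for brevity.

```python
# Assumed thresholds; Hessian affine denoising and block-center features omitted.
import cv2
import numpy as np

def segment_human_motion(curr_bgr, prev_gray, h_range=(0, 25), s_range=(40, 255)):
    hsv = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2HSV)
    h, s, _ = cv2.split(hsv)
    skin = (((h >= h_range[0]) & (h <= h_range[1]) &
             (s >= s_range[0]) & (s <= s_range[1])).astype(np.uint8) * 255)

    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)                 # time-difference image
    _, moving = cv2.threshold(diff, 15, 255, cv2.THRESH_BINARY)

    human = cv2.bitwise_and(skin, moving)                    # skin-colored AND moving
    return human, curr_gray                                  # pass curr_gray as prev_gray next frame
```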

QA Pair Passage RAG-based LLM Korean chatbot service (QA Pair Passage RAG 기반 LLM 한국어 챗봇 서비스)

  • Joongmin Shin;Jaewwook Lee;Kyungmin Kim;Taemin Lee;Sungmin Ahn;JeongBae Park;Heuiseok Lim
    • Annual Conference on Human and Language Technology / 2023.10a / pp.683-689 / 2023
  • The field of natural language processing has recently made remarkable progress, and the emergence of extremely large language models in particular has had a major impact on the field. Models such as GPT show high performance on a variety of NLP tasks and are especially important in the chatbot domain. However, these models also have several limitations and problems, one of which is that the model may generate unexpected results. Among the various approaches to address this, the Retrieval-Augmented Generation (RAG) method has attracted attention. This paper proposes a way to improve the efficiency of a domain-specific question-answering system through integration with a knowledge base, and a way to correct and update chatbot answers by modifying the vector database. The main contributions of this paper are as follows: 1) a new RAG system using QA Pair Passage RAG, with an analysis of its performance improvement; 2) performance measurements and limitations of existing LLM and RAG systems; 3) a chatbot control methodology using RDBMS-based vector search and updates.
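
For orientation, here is a minimal sketch of the retrieval step in a QA-pair RAG pipeline: embed the question, pull the most similar stored passages, and prepend them to the LLM prompt. embed() and ask_llm() are placeholders, and the cosine-similarity search here stands in for the paper's RDBMS-based vector search.

```python
# Placeholders: embed() and ask_llm() stand in for any embedding model and LLM call.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder for a sentence-embedding model."""
    raise NotImplementedError

def retrieve(question, passages, passage_vecs, k=3):
    """passages: stored QA-pair passages; passage_vecs: their precomputed embeddings (N x d)."""
    q = embed(question)
    sims = passage_vecs @ q / (np.linalg.norm(passage_vecs, axis=1) * np.linalg.norm(q) + 1e-9)
    top = np.argsort(-sims)[:k]
    return [passages[i] for i in top]

def answer(question, passages, passage_vecs, ask_llm):
    context = "\n".join(retrieve(question, passages, passage_vecs))
    return ask_llm(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
```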

Implementation of AR based Assembly System for Car C/pad Assembly (차체 C/Pad 조립을 위한 증강현실 기반의 조립시스템 구현)

  • Park, Hong-Seok;Choi, Hung-Won;Park, Jin-Woo
    • Journal of the Korean Society for Precision Engineering / v.25 no.8 / pp.37-44 / 2008
  • Nowadays, increasing global competition forces manufacturers to reduce the cost and time required to implement manufacturing systems. AR (augmented reality) technology, as a new human-machine interface, offers a noteworthy perspective for the design of new manufacturing systems. Using AR technology, a physically existing production environment can be superimposed with virtual planning objects, so planning tasks can be validated within a short process-planning time without modeling the surrounding environment of the production domain. In this paper, we introduce the construction of an AR browser and determine the optimal environment parameters for field application of the AR system through extensive tests. Several methods, such as a multi-marker coordinate system and the division of virtual objects, are proposed to solve the problems identified in the initial field test. Based on these tests and results, a test bed for the C/Pad assembly system is configured, and the robot program for C/Pad assembly is generated based on the AR system.
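
One way to read the "multi-marker coordinate system" idea is that every marker has a known rigid offset from a shared assembly frame, so the camera pose relative to the assembly can be recovered from whichever marker is currently visible. The sketch below illustrates that transform bookkeeping; the matrix conventions and function names are assumptions, not the paper's code.

```python
# Assumed convention: T_a_b is the 4x4 homogeneous transform taking frame-b coordinates to frame a.
import numpy as np

def camera_from_assembly(visible_ids, T_camera_marker, T_assembly_marker):
    """T_camera_marker[i]: pose of marker i as seen by the camera this frame.
    T_assembly_marker[i]: fixed, pre-measured offset of marker i in the assembly frame."""
    poses = [T_camera_marker[i] @ np.linalg.inv(T_assembly_marker[i]) for i in visible_ids]
    return poses[0] if poses else None   # any visible marker gives the assembly pose; several could be averaged
```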

Interactive Dynamic Simulation Schemes for Articulated Bodies through Haptic Interface

  • Son, Wook-Ho;Kim, Kyung-Hwan;Jang, Byung-Tae;Choi, Byung-Tae
    • ETRI Journal / v.25 no.1 / pp.25-33 / 2003
  • This paper describes interactive dynamic simulation schemes for articulated bodies in virtual environments, where user interaction is allowed through a haptic interface. We incorporated these schemes into our dynamic simulator I-GMS, which was developed in an object-oriented framework for simulating the motions of free bodies and complex linkages, such as those needed for robotic systems or human body simulation. User interaction is achieved by performing push and pull operations with the PHANToM haptic device, which runs as an integrated part of I-GMS. We use the forward and inverse dynamics of articulated bodies for haptic interaction through the push and pull operations, respectively. We demonstrate the user-interaction capability of I-GMS through on-line editing of trajectories for 6-dof (degrees of freedom) articulated bodies.
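
The push/pull split can be illustrated on a toy single-link example: forward dynamics turns an applied haptic torque into motion (push), while inverse dynamics computes the torque needed for a desired acceleration (pull). The sketch below is a deliberately simplified stand-in for I-GMS, with assumed inertia and damping values.

```python
# Toy single revolute link; inertia, damping, and time step are assumed values.
I_LINK, DAMPING, DT = 0.05, 0.02, 0.001      # [kg m^2], [N m s/rad], [s]

def forward_dynamics_step(theta, omega, applied_torque):
    """Push: integrate the motion produced by the haptic torque (semi-implicit Euler)."""
    alpha = (applied_torque - DAMPING * omega) / I_LINK
    omega += alpha * DT
    theta += omega * DT
    return theta, omega

def inverse_dynamics_torque(omega, desired_alpha):
    """Pull: torque required to realize the desired joint acceleration."""
    return I_LINK * desired_alpha + DAMPING * omega
```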

Hand Movement Tracking and Recognizing Hand Gestures (핸드 제스처를 인식하는 손동작 추적)

  • Park, Kwang-Chae;Bae, Ceol-Soo
    • Journal of the Korea Academia-Industrial cooperation Society / v.14 no.8 / pp.3971-3975 / 2013
  • This paper introduces an augmented reality system that recognizes hand gestures and presents the results of its evaluation. The user can interact with artificial objects and manipulate their positions and motions simply through hand gestures. Hand gesture recognition is based on Histograms of Oriented Gradients (HOG): salient features of hand appearance are detected by HOG blocks, and blocks of different sizes are tested to determine the most suitable configuration. To select the most informative blocks for classification, a multiclass AdaBoost-SVM algorithm is applied. The evaluated recognition rate of the algorithm is 94.0%.
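
A minimal sketch of the HOG-plus-SVM classification path is shown below; it assumes fixed-size grayscale hand crops, and a single SVC replaces the paper's multiclass AdaBoost-SVM and block-selection step.

```python
# Single SVC stands in for the paper's multiclass AdaBoost-SVM; crops assumed 64x64 grayscale.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(gray_crop):
    return hog(gray_crop, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

clf = SVC(kernel="rbf")

def train(crops, labels):
    clf.fit(np.array([hog_features(c) for c in crops]), labels)

def predict_gesture(crop):
    return clf.predict([hog_features(crop)])[0]
```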