• Title/Summary/Keyword: User recognition

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan; Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology, v.18 no.4, pp.116-119, 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person-shooter (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS games based on the user's gesture recognition and VR controllers. We examine the conventional VR FPS interaction interface and design the player interaction for a user wearing a head-mounted display with two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy immersive FPS play and gives them a new gaming experience.
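
The interaction design above splits responsibilities between a hand-level sensor and body trackers. A minimal sketch of that routing idea follows; it is not the authors' implementation, and the `HandPose`, `TrackerPose`, and handler names are hypothetical stand-ins for the Leap Motion and VIVE tracker SDK data.

```python
# Illustrative sketch only: the Leap Motion / VIVE SDK calls are replaced by
# hypothetical data classes, since the paper does not publish its code.
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HandPose:           # hypothetical: per-frame hand data (e.g. from a Leap Motion)
    palm_position: Vec3
    grab_strength: float  # 0.0 (open hand) .. 1.0 (fist)

@dataclass
class TrackerPose:        # hypothetical: per-frame joint data (e.g. from a VIVE tracker)
    joint_name: str
    position: Vec3

class FPSInteractionRouter:
    """Routes fine hand interaction and coarse body motion to separate handlers."""

    def on_hand(self, pose: HandPose) -> str:
        # Low-level physics interaction: grab/trigger decided from hand state.
        if pose.grab_strength > 0.8:
            return "fire_weapon"
        return "idle_hand"

    def on_tracker(self, pose: TrackerPose) -> str:
        # Player joint movement drives the avatar in the VR world.
        return f"move_joint:{pose.joint_name}"

if __name__ == "__main__":
    router = FPSInteractionRouter()
    print(router.on_hand(HandPose((0.0, 1.2, 0.3), grab_strength=0.9)))
    print(router.on_tracker(TrackerPose("left_foot", (0.1, 0.0, 0.2))))
```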

Improved Multi-layer Authentication Scheme by Merging One-time Password with Voice Biometric Factor

  • Alruwaili, Amal; Hendaoui, Saloua
    • International Journal of Computer Science & Network Security, v.21 no.9, pp.346-353, 2021
  • In this proposal, we aim to enhance the security of system accounts by improving the authentication technique. We mainly intend to improve the accuracy of one-time passwords by incorporating voice biometrics and recognition techniques. Recognition is performed on the server to prevent voice signatures from being redirected by hackers. Further, to enhance data privacy and to ensure that the active user is legitimate, we propose periodically updating activated sessions using a user-selected biometric factor. Finally, we recommend adding a pre-transaction re-authentication step, which guarantees enhanced security for sensitive operations. The main novelty of this proposal is the use of the voice factor in verifying the one-time password and the multiple levels of authentication for a full security guarantee. The improvement provided by this proposal is mainly intended for sensitive applications. Simulation findings demonstrate the efficiency of the proposed scheme in reducing the probability of hacking users' sessions.
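
A minimal sketch of the layered idea, assuming a standard HMAC-based one-time password plus a server-side voice check, is shown below; `voice_matches` is a hypothetical stub, not the authors' speaker-verification algorithm.

```python
# Minimal sketch, not the paper's protocol: a standard TOTP check combined with a
# placeholder voice-verification step that must also pass before login succeeds.
import hmac, hashlib, struct, time

def totp(secret: bytes, step: int = 30, digits: int = 6, now=None) -> str:
    counter = int((time.time() if now is None else now) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def voice_matches(sample: bytes, enrolled_template: bytes, threshold: float = 0.8) -> bool:
    """Hypothetical speaker-verification stub; a real system would compare voice
    embeddings on the server, as the abstract suggests."""
    score = 1.0 if sample == enrolled_template else 0.0  # placeholder similarity score
    return score >= threshold

def authenticate(secret: bytes, submitted_otp: str, voice_sample: bytes, template: bytes) -> bool:
    # Both factors must pass: the one-time password and the voice biometric.
    return hmac.compare_digest(totp(secret), submitted_otp) and voice_matches(voice_sample, template)

if __name__ == "__main__":
    key = b"shared-secret"
    print(authenticate(key, totp(key), b"voiceprint", b"voiceprint"))  # True
```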

A Study of Hybrid Automatic Interpret Support System (하이브리드 자동 통역지원 시스템에 관한 연구)

  • Lim, Chong-Gyu; Gang, Bong-Gyun; Park, Ju-Sik; Kang, Bong-Kyun
    • Journal of Korean Society of Industrial and Systems Engineering, v.28 no.3, pp.133-141, 2005
  • Previous research has focused mainly on the individual technologies of voice recognition, voice synthesis, translation, and bone transmission. Recently, commercial models have been produced using these technologies. In this research, a new automated interpretation support system concept is proposed by combining established bone transmission technology with a wireless system. The proposed system has three major components. First, a hybrid unit consisting of a headset, a bone transmission device, and related technologies recognizes the user's voice. Second, the recognized voice is converted into a digital signal by a small server attached to the user and translated into the other user's language by a translation algorithm. Third, the translated message is transmitted wirelessly to the other party, whose computer converts the received signal back into voice using the same hybrid system. This hybrid system transmits a clear message regardless of the environmental noise level or the user's hearing ability, and by using network technology, communication between users can be transmitted clearly despite the distance.
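
The three-stage flow described above (recognize, translate, transmit) can be outlined as follows; every function is a hypothetical stub standing in for the recognition engine, translation algorithm, and wireless link the paper assumes.

```python
# Illustrative pipeline only; each stage is a stub for the components the paper assumes
# (bone-transmission headset input, a translation engine, and a wireless link).

def recognize_speech(audio: bytes) -> str:
    """Stage 1 (hypothetical): convert the captured voice signal to text."""
    return "hello"

def translate(text: str, src: str, dst: str) -> str:
    """Stage 2 (hypothetical): translate recognized text into the listener's language."""
    return {"hello": "안녕하세요"}.get(text, text) if dst == "ko" else text

def transmit(payload: str, peer: str) -> None:
    """Stage 3 (hypothetical): send the translated text over the wireless link."""
    print(f"-> {peer}: {payload}")

def interpret_and_send(audio: bytes, src_lang: str, dst_lang: str, peer: str) -> None:
    transmit(translate(recognize_speech(audio), src_lang, dst_lang), peer)

if __name__ == "__main__":
    interpret_and_send(b"\x00\x01", "en", "ko", "peer-device")
```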

Motion-based Controlling 4D Special Effect Devices to Activate Immersive Contents (실감형 콘텐츠 작동을 위한 모션 기반 4D 특수효과 장치 제어)

  • Kim, Kwang Jin; Lee, Chil Woo
    • Smart Media Journal, v.8 no.1, pp.51-58, 2019
  • This paper describes a gesture-based application for controlling the special-effect devices of 4D content using the PWM (Pulse Width Modulation) method. User motions recognized by an infrared sensor are interpreted as commands for 3D content control, several of which drive the devices that generate special effects, delivering physical stimuli to the user. Because the content is controlled through an NUI (Natural User Interface) technique, the user is drawn directly into an immersive experience, which leads to a higher degree of interest and attention. To measure the efficiency of the proposed method, we implemented a PWM-based real-time linear control system that manages the parameters of the motion recognition and animation controller using the infrared sensor and transmits the resulting events.
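
The core mapping, turning a recognized motion parameter into a PWM duty cycle that drives an effect device, can be sketched as below; the intensity range and the `set_duty_cycle` callback are illustrative assumptions, not the paper's controller.

```python
# Minimal PWM mapping sketch: a normalized motion intensity (e.g. derived from an
# infrared motion sensor) is turned into a duty cycle for a 4D effect device such as a fan.

def motion_to_duty_cycle(intensity: float, min_duty: float = 0.0, max_duty: float = 100.0) -> float:
    """Linearly map a motion intensity in [0, 1] to a PWM duty cycle in percent."""
    intensity = max(0.0, min(1.0, intensity))
    return min_duty + (max_duty - min_duty) * intensity

def apply_effect(intensity: float, set_duty_cycle) -> None:
    """set_duty_cycle is a hypothetical driver callback (e.g. a GPIO PWM channel)."""
    set_duty_cycle(motion_to_duty_cycle(intensity))

if __name__ == "__main__":
    apply_effect(0.75, set_duty_cycle=lambda d: print(f"PWM duty cycle: {d:.1f}%"))
```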

The Development of the User-Customizable Favorites-based Smart Phone UX/UI Using Tap Pattern Similarity (탭 패턴 유사도를 이용한 사용자 맞춤형 즐겨찾기 스마트 폰 UX/UI개발)

  • Kim, Yeongbin; Kwak, Moon-Sang; Kim, Euhee
    • Journal of the Korea Society of Computer and Information, v.19 no.8, pp.95-106, 2014
  • In this paper, we design a smart phone UX/UI and a tap pattern recognition algorithm that recognizes the patterns a user taps with their fingers on the screen, and we implement an application that provides user-customizable smart phone services based on those tap patterns. A user can generate a pattern by tapping the input pad several times and register it with the smart phone's favorites program. More specifically, when the user enters a tap pattern on the input pad, the proposed application searches for a similar stored tap pattern by measuring tap pattern similarity and runs the service registered to it. Our experimental results show that the proposed method achieves a high recognition rate and short input times for a variety of tap patterns.
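
The paper does not publish its similarity measure; one plausible sketch compares normalized inter-tap intervals and matches against registered favorites with an assumed threshold.

```python
# Hypothetical similarity sketch: compare two tap patterns by their normalized
# inter-tap intervals; the paper's actual measure may differ.

def intervals(timestamps):
    """Seconds between consecutive taps."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def normalize(seq):
    total = sum(seq) or 1.0
    return [x / total for x in seq]

def similarity(pattern_a, pattern_b):
    ia, ib = normalize(intervals(pattern_a)), normalize(intervals(pattern_b))
    if len(ia) != len(ib) or not ia:
        return 0.0
    return 1.0 - sum(abs(x - y) for x, y in zip(ia, ib)) / len(ia)

def match(pattern, registered, threshold=0.9):
    """Return the registered favorite whose stored pattern is most similar, if any."""
    best = max(registered.items(), key=lambda kv: similarity(pattern, kv[1]), default=None)
    if best and similarity(pattern, best[1]) >= threshold:
        return best[0]
    return None

if __name__ == "__main__":
    favorites = {"call_home": [0.0, 0.2, 0.4, 1.0], "open_camera": [0.0, 0.5, 0.6, 0.7]}
    print(match([0.0, 0.21, 0.41, 1.02], favorites))  # -> call_home
```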

Development of An Interactive System Prototype Using Imitation Learning to Induce Positive Emotion (긍정감정을 유도하기 위한 모방학습을 이용한 상호작용 시스템 프로토타입 개발)

  • Oh, Chanhae; Kang, Changgu
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology, v.14 no.4, pp.239-246, 2021
  • In the fields of computer graphics and HCI, there are many studies on systems that create characters and interact with users naturally. Such studies have focused on the character's response to the user's behavior, while generating character behavior that elicits positive emotions from the user remains a difficult problem. In this paper, we develop a prototype of an interaction system that elicits positive emotions from users through the movement of a virtual character, using artificial intelligence technology. The proposed system is divided into face recognition and motion generation for the virtual character. A depth camera is used for face recognition, and the recognized data are passed to motion generation. We use imitation learning as the learning model. In motion generation, random actions are performed in response to the user's initial facial expression data, and actions that elicit positive emotions from the user are learned through continuous imitation learning.
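
The abstract's loop (perform an action, observe the user's expression, prefer actions that drew positive responses) is reduced below to a simple reward-averaging selection loop for illustration; this is not the paper's imitation-learning model, and the action names and simulated expression feedback are assumptions.

```python
# Reduced sketch of the interaction cycle: the character tries motions, scores the user's
# facial response, and gradually prefers motions that drew positive expressions.
import random

ACTIONS = ["wave", "nod", "dance", "bow"]  # illustrative motion set

def observed_reward(action: str) -> float:
    """Stand-in for face recognition: 1.0 for a detected positive expression, else 0.0."""
    return 1.0 if action in ("wave", "dance") and random.random() < 0.8 else 0.0

def train(episodes: int = 500, epsilon: float = 0.2):
    value = {a: 0.0 for a in ACTIONS}
    counts = {a: 0 for a in ACTIONS}
    for _ in range(episodes):
        # Mostly pick the best-scoring motion, sometimes explore a random one.
        a = random.choice(ACTIONS) if random.random() < epsilon else max(value, key=value.get)
        r = observed_reward(a)
        counts[a] += 1
        value[a] += (r - value[a]) / counts[a]  # incremental mean of observed rewards
    return value

if __name__ == "__main__":
    print(train())  # motions associated with positive expressions end up with the highest scores
```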

Emotional Expression Technique using Facial Recognition in User Review (사용자 리뷰에서 표정 인식을 이용한 감정 표현 기법)

  • Choi, Wongwan; Hwang, Mansoo; Kim, Neunghoe
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.22 no.5, pp.23-28, 2022
  • Today, the online market has grown rapidly due to the development of digital platforms and the pandemic. Unlike the existing offline market, the distinct nature of the online market prompts users to check online reviews. Several prior studies have established that reviews play a significant part in influencing users' purchase intentions. However, the current way of writing reviews, which conveys emotion only through elements such as tone and word choice, makes it difficult for other users to understand the writer's emotions. If the writer wants to emphasize something, manually bolding passages or changing their colors to reflect emotion is cumbersome. Therefore, in this paper, we propose a technique that detects the user's emotions through camera-based facial expression recognition, automatically assigns a color to each emotion based on existing research on emotions and colors, and applies those colors according to the user's intent.
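
The mapping step, detected emotion to a highlight color applied to review text, can be shown with a simple lookup; the emotion labels and hex values here are placeholder assumptions rather than the palette the paper derives from prior color-emotion research.

```python
# Illustrative emotion-to-color mapping for review text; labels and hex values are
# placeholder assumptions, not the paper's palette.
EMOTION_COLORS = {
    "joy": "#FFD54F",
    "anger": "#E53935",
    "sadness": "#1E88E5",
    "neutral": "#9E9E9E",
}

def colorize(text: str, emotion: str) -> str:
    """Wrap a review sentence in an HTML span colored by the detected emotion."""
    color = EMOTION_COLORS.get(emotion, EMOTION_COLORS["neutral"])
    return f'<span style="color:{color}">{text}</span>'

if __name__ == "__main__":
    print(colorize("The delivery was incredibly fast!", "joy"))
```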

Travel mode classification method based on travel track information

  • Kim, Hye-jin
    • Journal of the Korea Society of Computer and Information, v.26 no.12, pp.133-142, 2021
  • Travel pattern recognition is widely used in many areas such as user trajectory queries, user behavior prediction, interest recommendation based on user location, user privacy protection, and municipal transportation planning. Because current recognition accuracy cannot meet application requirements, travel pattern recognition is a focus of trajectory data research. With the popularization of GPS navigation technology and intelligent mobile devices, a large amount of user mobility data can be obtained, and many meaningful studies can be carried out on that basis. In current travel-pattern research methods, trajectory feature extraction is limited to the basic attributes of the trajectory (speed, angle, acceleration, etc.). In this paper, permutation entropy, an attribute that measures the complexity of a time series, was used as a trajectory feature for classification. Velocity permutation entropy and angle permutation entropy were used as trajectory features in the classification of travel patterns, and the permutation-entropy-based attribute classification used in this paper reached an accuracy of 81.47%.
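
Permutation entropy, the trajectory feature highlighted above, counts the ordinal patterns of consecutive samples; a straightforward implementation over a 1-D series (e.g. the speed sequence) follows, with the embedding order and delay as the usual free parameters.

```python
# Permutation entropy of a 1-D series (e.g. the speed or angle sequence of a trajectory),
# following the standard Bandt-Pompe definition.
import math
from collections import Counter

def permutation_entropy(series, order: int = 3, delay: int = 1, normalize: bool = True) -> float:
    n = len(series) - (order - 1) * delay
    if n <= 0:
        raise ValueError("series too short for the chosen order and delay")
    patterns = Counter()
    for i in range(n):
        window = [series[i + j * delay] for j in range(order)]
        # Ordinal pattern: the argsort of the samples within the window.
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    probs = [count / n for count in patterns.values()]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order)) if normalize else h

if __name__ == "__main__":
    speeds = [1.0, 1.2, 0.9, 1.4, 1.1, 1.3, 0.8, 1.5]
    print(round(permutation_entropy(speeds, order=3), 3))
```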

Human-Computer Natural User Interface Based on Hand Motion Detection and Tracking

  • Xu, Wenkai; Lee, Eung-Joo
    • Journal of Korea Multimedia Society, v.15 no.4, pp.501-507, 2012
  • Human body motion is a non-verbal form of interaction and movement that can connect the real world and the virtual world. In this paper, we present a study on a natural user interface (NUI) for human hand motion recognition using RGB color information and depth information from the Microsoft Kinect camera. To achieve this goal, hand tracking and gesture recognition are designed to have no major dependence on the work environment, lighting, or users' skin color; libraries intended for natural interaction with the Kinect device, which provides RGB images of the environment and the depth map of the scene, were used. An improved CamShift tracking algorithm is used to track hand motion; experimental results show that it performs better than the standard CamShift algorithm, with higher stability and accuracy.
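
For reference, a baseline CamShift loop in OpenCV (the standard algorithm the paper improves on) looks roughly like this; it tracks a hue-histogram back-projection from a webcam and assumes an initial hand window, rather than reproducing the paper's Kinect-based variant.

```python
# Baseline CamShift tracking loop with OpenCV; the paper's improved variant and its
# Kinect depth input are not reproduced here. Assumes an initial hand region is known.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if not ok:
    raise RuntimeError("camera not available")

x, y, w, h = 200, 150, 100, 100          # assumed initial hand window
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
# Hue histogram of the hand region, masked to reasonably saturated/bright pixels.
mask = cv2.inRange(hsv_roi, (0, 60, 32), (180, 255, 255))
hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

track_window = (x, y, w, h)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, criteria)
    pts = cv2.boxPoints(rot_rect).astype(np.int32)
    cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow("hand tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:     # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```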

Intelligent Modeling of User Behavior based on FCM Quantization for Smart home (FCM 이산화를 이용한 스마트 홈에서 행동 모델링)

  • Chung, Woo-Yong; Lee, Jae-Hun; Yon, Suk-Hyun; Cho, Young-Wan; Kim, Eun-Tai
    • Journal of Institute of Control, Robotics and Systems, v.13 no.6, pp.542-546, 2007
  • In the vision of a ubiquitous computing environment, smart objects communicate with each other and provide many kinds of information about the user and their surroundings in the home. This information enables smart objects to recognize context and to provide active and convenient services to the customer. In most cases, however, context-aware services are available only through expert systems. In this paper, we present a generalized activity recognition application for the smart home based on a naive Bayesian network (BN) and fuzzy clustering. We quantize continuous sensor data with fuzzy c-means clustering to simplify the BN and reduce the size of its conditional probability tables, and we apply mutual information to learn the BN structure efficiently. We show that this system can recognize user activities with about 80% accuracy in a web-based virtual smart home.
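
The discretization step described above, fuzzy c-means on continuous sensor values with the winning cluster index used as the discrete symbol fed to the Bayesian network, can be sketched with a small 1-D FCM; the cluster count and fuzzifier are assumed values.

```python
# Sketch of the quantization step: 1-D fuzzy c-means assigns each continuous sensor
# reading to the cluster with the nearest center, yielding the discrete symbols a
# naive Bayesian network can consume. Cluster count c and fuzzifier m are assumptions.
import random

def fcm_1d(values, c=3, m=2.0, iters=100):
    centers = random.sample(list(values), c)
    for _ in range(iters):
        # Membership of each value in each cluster (standard FCM update).
        u = []
        for x in values:
            dists = [abs(x - ck) or 1e-12 for ck in centers]
            u.append([1.0 / sum((di / dj) ** (2.0 / (m - 1.0)) for dj in dists) for di in dists])
        # Center update as the membership-weighted mean.
        centers = [
            sum((u[i][k] ** m) * values[i] for i in range(len(values)))
            / sum(u[i][k] ** m for i in range(len(values)))
            for k in range(c)
        ]
    return centers, u

def quantize(values, centers):
    """Map each reading to the index of its nearest FCM center (the discrete symbol)."""
    return [min(range(len(centers)), key=lambda k: abs(v - centers[k])) for v in values]

if __name__ == "__main__":
    readings = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9, 9.8, 10.1, 10.0]
    centers, _ = fcm_1d(readings, c=3)
    print(quantize(readings, centers))  # e.g. [0, 0, 0, 1, 1, 1, 2, 2, 2] up to label order
```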