• Title/Summary/Keyword: computer interface


An Authoring Framework for Emotion-Aware User Interface of Mobile Applications (모바일 어플리케이션의 감정 적응형 사용자 인터페이스 저작 프레임워크)

  • Lee, Eunjung;Kim, Gyu-Wan;Kim, Woo-Bin
    • Journal of Korea Multimedia Society / v.18 no.3 / pp.376-386 / 2015
  • Since affective computing was introduced in the 1990s, affect recognition technology has made substantial progress. However, applying user emotion recognition to software user interfaces is still in its early stages. In this paper, we describe a new approach for developing mobile user interfaces that react differently depending on the user's emotional state. First, an emotion reaction model is presented which determines the user interface reaction for each emotional state. We introduce a pair of mappings from user states to different user interface versions, where the reacting versions are implemented as a set of variations of a view. Further, we present an authoring framework that helps developers and designers create emotion-aware reactions based on the proposed emotion reaction model. The authoring framework alleviates the burden of creating and handling multiple versions of views during development. A prototype implementation is presented as an extension of the existing authoring tool DAT4UX, and a proof-of-concept application featuring an emotion-aware interface is developed using the tool.
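
    The emotion reaction model described in this abstract amounts to a mapping from recognized user states to variants of the same view. A minimal Python sketch of that idea follows; all state names and variant fields are illustrative assumptions, not the DAT4UX API:

    ```python
    # Sketch of an emotion reaction model: each recognized user state
    # maps to a variation of the same view. Names are hypothetical.
    EMOTION_TO_VARIANT = {
        "neutral":    {"palette": "default", "font_scale": 1.0, "animation": "none"},
        "frustrated": {"palette": "calm",    "font_scale": 1.2, "animation": "slow"},
        "happy":      {"palette": "vivid",   "font_scale": 1.0, "animation": "playful"},
    }

    def select_view_variant(emotion: str) -> dict:
        """Return the UI variation for the detected emotion state,
        falling back to the neutral variant for unknown states."""
        return EMOTION_TO_VARIANT.get(emotion, EMOTION_TO_VARIANT["neutral"])
    ```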

Study on Communication Interface of Multiple Smartphones for Unit Control in a PC (PC에서 유닛 제어를 위한 다중 스마트폰의 통신 인터페이스 연구)

  • Jung, Hahmin;Kim, Dong Hun
    • Journal of Institute of Control, Robotics and Systems / v.19 no.6 / pp.520-526 / 2013
  • This study presents unit control of a PC (Personal Computer) program using multiple smartphones based on ad hoc communication. We propose a design for the data packet that multiple smartphones send to a computer, and a framework that manages and controls the units in the computer after analyzing the packet. As a result, multiple users can control their own units with their smartphones while watching the monitor connected to the computer; in other words, multiple users can share the same game on a computer, or control units embedded in a system, using their smartphones. An experimental result shows that a PC racing game can be realized with the proposed communication interface, where four iPhones control their units in the computer. Thus, the proposed framework can be used effectively for unit control in a PC with multiple smartphones.
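
    The paper's contribution centers on a fixed data packet that each phone sends to the PC. The abstract does not specify the layout, so the following Python sketch assumes a plausible one (player id, command code, and two control axes):

    ```python
    import struct

    # Hypothetical fixed-size packet a phone might send to the PC:
    # player id (1 byte), command code (1 byte), and two signed 16-bit
    # control values, in network byte order. Layout is an assumption.
    PACKET_FMT = "!BBhh"

    def pack_control(player_id: int, command: int, x: int, y: int) -> bytes:
        """Serialize one control update for transmission to the PC."""
        return struct.pack(PACKET_FMT, player_id, command, x, y)

    def unpack_control(data: bytes) -> tuple:
        """Parse a received packet back into (player_id, command, x, y)."""
        return struct.unpack(PACKET_FMT, data)
    ```

    On the PC side, the framework would dispatch each parsed packet to the unit owned by `player_id`.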

Haptic Modeler using Haptic User Interface (촉감 사용자 인터페이스를 이용한 촉감 모델러)

  • Cha, Jong-Eun;Oakley, Ian;Kim, Yeong-Mi;Kim, Jong-Phil;Lee, Beom-Chan;Seo, Yong-Won;Ryu, Je-Ha
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집) / 2006.02a / pp.1031-1036 / 2006
  • Haptics is widely studied in medicine, education, the military, and broadcasting, as it adds the sense of touch to displayed content. In medicine, products are already commercialized, such as Reachin's laparoscopic surgery training software, which lets trainees practice procedures while feeling the same forces as in real surgery. Although haptics offers more realistic and natural interaction by providing touch in addition to audiovisual information, it is still unfamiliar to general users. One reason is the lack of content supporting haptic interaction. Haptic content generally consists of computer graphics models, so the content is created with a general graphics modeler, but the haptic information must then be added to the file by hand or programmed separately for each application. Because graphic modeling and haptic modeling do not proceed simultaneously, creating haptic content takes a long time and adding haptic information is not intuitive. In graphic modeling, the user can manipulate content by hand while watching it; in haptic modeling, the user must feel the touch sensation and manipulate the content at the same time, which requires a dedicated interface. This paper describes a haptic modeler that lets users intuitively create and manipulate content supporting haptic interaction. In the modeler, the user creates and manipulates three-dimensional content while touching it in real time with a 3-DOF haptic device, and intuitively edits the content's surface haptic properties through a haptic user interface. Unlike conventional two-dimensional graphical user interfaces operated with a mouse, the haptic user interface is three-dimensional and consists of buttons, radio buttons, sliders, and a joystick operated with the haptic device. The user changes a surface haptic property value by manipulating each component, and can set the value intuitively by touching part of the haptic user interface and feeling the resulting sensation in real time. In addition, an XML-based file format is provided, so created content can be saved, loaded, or added to other content.
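
    The abstract mentions an XML-based file format for surface haptic properties but not its schema. A minimal Python sketch of such a serialization, with hypothetical element names, might look like this:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical serialization of per-surface haptic properties;
    # the element names and property set are assumptions, not the
    # modeler's actual schema.
    def haptic_content_xml(name: str, stiffness: float,
                           damping: float, friction: float) -> str:
        """Build an XML string describing one object's surface haptics."""
        root = ET.Element("hapticContent", name=name)
        surf = ET.SubElement(root, "surface")
        ET.SubElement(surf, "stiffness").text = str(stiffness)
        ET.SubElement(surf, "damping").text = str(damping)
        ET.SubElement(surf, "friction").text = str(friction)
        return ET.tostring(root, encoding="unicode")
    ```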


A Study on Movement Interface in Mobile Virtual Reality (모바일 가상현실에서의 이동 인터페이스에 관한 연구)

  • Hong, Seunghyun;Na, Giri;Cho, Yunsik;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society / v.27 no.3 / pp.55-63 / 2021
  • This study proposes an interface for providing movement interaction suitable for mobile virtual reality (VR) and analyzes it through comparative experiments. For accessibility and usability, the proposed interface assumes no additional equipment beyond the mobile head-mounted display (HMD), and the interface that controls movement through the user's gaze is designed in two phases. The key is to minimize negative factors, such as VR sickness, that straight-line movement in virtual reality can cause. To this end, two designs are compared: an interface composed of forward/backward buttons reached by directing the gaze toward the ground, and an interface composed of left and right buttons placed in front, reflecting the gaze changes of real walking motion. An application for comparing and analyzing movement interactions through the proposed interface is produced, and a survey is conducted to analyze users' satisfaction with the interface experience and the negative effects of the movement process. We confirmed that the proposed movement interaction reduced negative effects such as VR sickness while giving users a satisfactory interface experience.
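
    Gaze-button interfaces of this kind reduce, in essence, to dispatching the head's pitch/yaw to movement commands. The Python sketch below illustrates one plausible dispatch; all angle zones and command names are assumptions, not the paper's design values:

    ```python
    # Sketch of dispatching head gaze to movement commands: buttons near
    # the ground drive forward/backward, buttons in front turn left or
    # right. Angle thresholds are illustrative only.
    def movement_command(pitch_deg: float, yaw_deg: float) -> str:
        """Map head gaze (negative pitch = looking down) to a command."""
        if pitch_deg < -30:                    # looking toward the ground
            return "forward" if yaw_deg < 0 else "backward"
        if abs(yaw_deg) > 25:                  # front left/right buttons
            return "turn_left" if yaw_deg < 0 else "turn_right"
        return "idle"
    ```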

A Study on the Implementation of Physical Interfaces (피지컬 인터페이스의 구현에 관한 연구)

  • 오병근
    • Archives of Design Research / v.16 no.2 / pp.131-140 / 2003
  • The input for computer interaction design is very limited when users can control the interface only with a keyboard and mouse. However, with basic electrical engineering, input design can depart from the existing methods. Interactive art using computer technology, completed by people's participation, has recently emerged. The electric signals transmitted in digital and analog form from a people-controlled interface to the computer can be used in multimedia interaction design. Electric circuit design is necessary to transmit a stable electric signal from the interface. Electric switches, sensors, and camera technologies can be applied to input interface design as alternative physical interactions without a computer keyboard and mouse. This type of interaction design, using the human body's language and gestures, conveys the richness of humanity.
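
    Turning a raw switch or sensor line into a clean input event, as this kind of physical interface requires, usually involves debouncing on the computer side. A small Python sketch of that step (sampling scheme and thresholds are assumptions):

    ```python
    # Sketch of software-debouncing a switch signal: report a press or
    # release only after the line is stable for N consecutive samples,
    # so electrical noise does not trigger spurious input events.
    def debounce(samples, stable_count=3):
        """Yield 'press'/'release' events from a stream of 0/1 samples
        (idle-high switch: 0 means pressed)."""
        events, run, state, last = [], 0, 1, 1
        for s in samples:
            run = run + 1 if s == last else 1
            last = s
            if run >= stable_count and s != state:
                state = s
                events.append("press" if s == 0 else "release")
        return events
    ```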


Measurement of Interface Trapped Charge Densities $(D_{it})$ in 6H-SiC MOS Capacitors

  • Lee Jang Hee;Na Keeyeol;Kim Kwang-Ho;Lee Hyung Gyoo;Kim Yeong-Seuk
    • Proceedings of the IEEK Conference / summer / pp.343-347 / 2004
  • The high oxidation temperature of SiC tends to form carbide at the interface, which results in poor MOSFET transfer characteristics. We therefore developed oxidation processes to obtain low interface charge densities. N-type 6H-SiC MOS capacitors were fabricated by different oxidation processes: dry, wet, and dry-reoxidation. The gate oxidation and Ar anneal temperature was $1150^{\circ}C$; Ar annealing was performed for 30 minutes after gate oxidation. The dry-reoxidation condition was $950^{\circ}C$ in an H2O ambient for 2 hours. The gate oxide thicknesses of the dry, wet, and dry-reoxidation samples were 38.0 nm, 38.7 nm, and 38.5 nm, respectively. Mo was adopted as the gate electrode. To investigate the quality of these gate oxide films, high-frequency C-V curves, gate oxide leakage currents, and interface trapped charge densities ($D_{it}$) were measured. The interface trapped charge density measured by the conductance method was about $4\times10^{10}\,[cm^{-2}eV^{-1}]$ for dry and wet oxidation, the lowest ever reported, and $1\times10^{11}\,[cm^{-2}eV^{-1}]$ for dry-reoxidation.
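
    The conductance method referenced here extracts $D_{it}$ from the peak of the normalized parallel conductance via the standard approximation $D_{it} \approx (2.5/q)\,(G_p/\omega)_{max}$. A worked Python sketch (the input value below is illustrative, not the paper's measured data):

    ```python
    # Worked sketch of the conductance method: D_it is obtained from the
    # peak of the parallel conductance normalized by angular frequency,
    # D_it ≈ (2.5 / q) * (Gp/ω)_max, giving units of cm^-2 eV^-1 when
    # (Gp/ω)_max is expressed per cm^2 of capacitor area.
    Q = 1.602e-19  # elementary charge [C]

    def dit_from_conductance(gp_over_omega_max: float) -> float:
        """Return D_it [cm^-2 eV^-1] from (Gp/ω)_max [F/cm^2]."""
        return 2.5 * gp_over_omega_max / Q
    ```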


A Research on EEG Synchronization of Movement Cognition for Brain Computer Interface (뇌 컴퓨터 인터페이스를 위한 뇌파와 동작 인지와의 동기화에 관한 연구)

  • Whang, Min-Cheol;Kim, Kyu-Tae;Goh, Sang-Tae;Jeong, Byung-Yong
    • Journal of the Ergonomics Society of Korea / v.26 no.2 / pp.167-171 / 2007
  • A brain-computer interface is a next-generation interface technology. Recently, attempts have been made to recognize user intention for interfacing with a computer, and EEG plays an important role in developing practical applications in this area. Much research has focused on extracting EEG commands generated by human movement; ERD/ERS is generally accepted as an important EEG parameter for predicting human movement. However, there is a time difference between the initial movement indicated by ERD/ERS and the real movement. Therefore, this study determined these time differences for a brain interface by ERD/ERS. Five university students performed ten repetitive movements. ERD/ERS was determined according to movement execution, and the significant pattern showed the difference between movement execution and the movement indication of ERD/ERS.
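
    The ERD/ERS measure this kind of study relies on is a relative band-power change between a rest window and an analysis window. A minimal Python sketch of the conventional formula (the windowing and band filtering that precede it are omitted):

    ```python
    # ERD/ERS as a relative band-power change:
    #   ERD/ERS% = (A - R) / R * 100,
    # where R is band power in a reference (rest) window and A is band
    # power in the analysis window. Negative values indicate event-related
    # desynchronization (ERD); positive values indicate synchronization (ERS).
    def erd_ers_percent(reference_power: float, event_power: float) -> float:
        """Return the ERD/ERS value in percent."""
        return (event_power - reference_power) / reference_power * 100.0
    ```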

A Brain-Computer Interface Based Human-Robot Interaction Platform (Brain-Computer Interface 기반 인간-로봇상호작용 플랫폼)

  • Yoon, Joongsun
    • Journal of the Korea Academia-Industrial cooperation Society / v.16 no.11 / pp.7508-7512 / 2015
  • We propose a brain-machine interface (BMI) based human-robot interaction (HRI) platform that operates machines by interpreting intentions captured from brain waves. The platform consists of capture, processing/mapping, and action parts: a noninvasive brain wave sensor, a PC, and a robot avatar/LED(s)/motor(s) serve as the capture, processing/mapping, and action parts, respectively. Various investigations were carried out to establish the relations between intentions and sensed brain waves. Case studies of an interactive game, on-off control of LEDs, and motor control are presented to show the design and implementation of the new BMI-based HRI platform.
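
    The capture → processing/mapping → action structure described here can be sketched as a small pipeline. In this Python illustration the sensor read and the actuator are stand-in callables, not a real device API, and the attention threshold is an assumption:

    ```python
    # Sketch of the three-part BMI platform: capture (brain-wave sensor),
    # processing/mapping (threshold on a derived signal), and action
    # (LED / motor / robot avatar). All concrete values are hypothetical.
    def run_pipeline(read_attention, act, threshold=0.6) -> str:
        """Read one attention level, map it to on/off, and act on it."""
        level = read_attention()                        # capture part
        command = "on" if level >= threshold else "off" # mapping part
        act(command)                                    # action part
        return command
    ```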

Computer Interface Using Head-Gaze Tracking (응시 위치 추적 기술을 이용한 인터페이스 시스템 개발)

  • 이정준;박강령;김재희
    • Proceedings of the IEEK Conference / 1999.06a / pp.516-519 / 1999
  • Gaze detection finds the position on a monitor screen where a user is looking, using image processing and computer vision technology. We developed a computer interface system using this gaze detection technology. The system enables a user to control the computer without using their hands, so it can help the handicapped use a computer and is also useful for people whose hands are busy with another job, especially in factory tasks. For practical use, a command signal like mouse clicking is necessary, and we used eye winking to give this command signal to the system.
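
    Using a wink as the click signal requires separating deliberate winks from normal blinks, typically by how long the eye stays closed. The paper does not give its criterion; the Python sketch below assumes a simple duration window:

    ```python
    # Sketch of turning eye closure into a click command: count a click
    # only when the eye stays closed for a deliberate-wink duration.
    # Both thresholds and the frame rate are illustrative assumptions.
    def detect_click(closed_frames: int, fps: int = 30) -> bool:
        """Report a click if the eye was closed between 0.2 s and 1.0 s
        (shorter reads as a normal blink, longer as resting eyes)."""
        duration = closed_frames / fps
        return 0.2 <= duration <= 1.0
    ```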


Human-Computer Interaction Based Only on Auditory and Visual Information

  • Sha, Hui;Agah, Arvin
    • Transactions on Control, Automation and Systems Engineering / v.2 no.4 / pp.285-297 / 2000
  • One of the research objectives in the area of multimedia human-computer interaction is the application of artificial intelligence and robotics technologies to the development of computer interfaces. This involves utilizing many forms of media, integrating speech input, natural language, graphics, hand-pointing gestures, and other methods for interactive dialogues. Although current human-computer communication methods include computer keyboards, mice, and other traditional devices, the two basic ways by which people communicate with each other are voice and gesture. This paper reports on research focusing on the development of an intelligent multimedia interface system modeled on the manner in which people communicate. The work explores interaction between humans and computers based only on the processing of speech (words uttered by the person) and the processing of images (hand-pointing gestures). The purpose of the interface is to control a pan/tilt camera, pointing it to a location specified by the user through uttered words and hand pointing. The system utilizes a second, stationary camera to capture images of the user's hand and a microphone to capture the user's words. Upon processing the images and sounds, the system responds by pointing the camera. Initially, the interface uses hand pointing to locate the general position the user is referring to; it then uses the user's voice commands to fine-tune the location and to change the camera's zoom if requested. The image of the location is captured by the pan/tilt camera and sent to a color TV monitor to be displayed. This type of system has applications in tele-conferencing and other remote operations, where the system must respond to users' commands in a manner similar to how the user would communicate with another person. The advantage of this approach is the elimination of the traditional input devices the user must otherwise utilize to control a pan/tilt camera, replacing them with more "natural" means of interaction. A number of experiments were performed to evaluate the interface system with respect to its accuracy, efficiency, reliability, and limitations.
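
    The coarse-then-fine control loop described here (pointing sets the initial pan/tilt; speech nudges it) can be sketched compactly. In this Python illustration the step size and the set of spoken command words are assumptions, not values from the paper:

    ```python
    # Sketch of the fine-tuning stage: after hand pointing sets an
    # initial (pan, tilt) in degrees, each recognized voice command
    # nudges the camera by a fixed step. Step size is hypothetical.
    STEP_DEG = 2.0

    def refine(pan: float, tilt: float, word: str) -> tuple:
        """Apply one spoken command to the current pan/tilt angles;
        unrecognized words leave the camera where it is."""
        moves = {"left": (-STEP_DEG, 0.0), "right": (STEP_DEG, 0.0),
                 "up": (0.0, STEP_DEG), "down": (0.0, -STEP_DEG)}
        dp, dt = moves.get(word, (0.0, 0.0))
        return pan + dp, tilt + dt
    ```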
