• Title/Summary/Keyword: Human/System Interface

A Method to Select Human-System Interfaces for Nuclear Power Plants

  • Hugo, Jacques V.;Gertman, David I.
    • Nuclear Engineering and Technology
    • /
    • v.48 no.1
    • /
    • pp.87-97
    • /
    • 2016
  • The new generation of nuclear power plants (NPPs) will likely make use of state-of-the-art technologies in many areas of the plant. The analysis, design, and selection of advanced human-system interfaces (HSIs) constitute an important part of power plant engineering. Designers need to consider the new capabilities afforded by these technologies in the context of current regulations and new operational concepts, which is why they need a more rigorous method by which to plan the introduction of advanced HSIs in NPP work areas. Much of current human factors research stops at the user interface and fails to provide a definitive process for integration of end user devices with instrumentation and control and operational concepts. The current lack of a clear definition of HSI technology, including the process for integration, makes characterization and implementation of new and advanced HSIs difficult. This paper describes how new design concepts in the nuclear industry can be analyzed and how HSI technologies associated with new industrial processes might be considered. It also describes a basis for an understanding of human as well as technology characteristics that could be incorporated into a prioritization scheme for technology selection and deployment plans.
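
The prioritization scheme the abstract alludes to can be pictured as a weighted multi-criteria score over candidate technologies. The sketch below is an illustrative assumption, not the paper's actual method; the criteria names and weights are invented for demonstration.

```python
# Illustrative sketch (not the paper's method): rank candidate HSI
# technologies by a weighted score over human-factors and technical
# criteria. Criteria and weights are assumptions for demonstration.

WEIGHTS = {"usability": 0.4, "regulatory_fit": 0.35, "maturity": 0.25}

def score(candidate):
    """Weighted sum of per-criterion ratings (each on a 0-10 scale)."""
    return sum(WEIGHTS[c] * candidate[c] for c in WEIGHTS)

def prioritize(candidates):
    """Sort (name, ratings) pairs from highest to lowest score."""
    return sorted(candidates, key=lambda kv: score(kv[1]), reverse=True)

candidates = {
    "large_overview_display": {"usability": 8, "regulatory_fit": 7, "maturity": 9},
    "tablet_procedures":      {"usability": 9, "regulatory_fit": 5, "maturity": 6},
}
ranked = prioritize(list(candidates.items()))
print([name for name, _ in ranked])
```

In a real selection study the criteria and weights would come from the human-factors analysis and regulatory review the paper describes, not from fixed constants.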

Network human-robot interface at service level

  • Nguyen, To Dong;Oh, Sang-Rok;You, Bum-Jae
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.1938-1943
    • /
    • 2005
  • Network human-robot interface is an important research topic. In home applications, users access the robotic system directly via voice or gestures, or remotely through the network. Users explore a system by using the services it provides, and to some extent they can participate in a service as partners. A service may be provided by a robot, a group of robots, or robots together with other network-connected systems (distributed sensors, information systems, etc.). All these services run in a network environment, where uncertainty exists, such as unstable network connections or the availability of the partners in a service. Moreover, these services are controlled by several users, accessing at different times and by different methods. Our research aims to solve this problem by providing a highly available, flexible coordination system. In this paper, a multi-agent framework is proposed. This framework is validated using our new concept of slave agents, a responsive multi-agent environment, a virtual directory facilitator (VDF), and a task allocation system based on the contract net protocol. Our system uses a model that mixes distributed and centralized approaches. It uses a centralized agent management system (AMS) to control the overall system; however, the partners and users may be distributed agents connected to the center through agent communication, or centralized in the AMS container using slave agents to represent the physical agents. The system is able to determine the task allocation for a group of robots working as a team to provide a service. A number of experiments have been conducted successfully in our lab environment using the Issac robot, a PDA as the user agent, and a wireless network, operated under the control of our multi-agent framework. The experiments show that this framework works well and provides some advantages over existing systems.
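
The contract-net task allocation the abstract mentions can be sketched in a few lines: a manager announces a task, each agent returns a cost bid, and the cheapest bidder wins the contract. Names such as `Agent` and `announce_task` are illustrative, not from the paper.

```python
# Minimal sketch of contract-net task allocation for a robot team,
# assuming each agent bids a scalar cost (here, distance to the task).

class Agent:
    def __init__(self, name, position):
        self.name = name
        self.position = position

    def bid(self, task_position):
        # Cost proposal: Manhattan distance from agent to task.
        return (abs(self.position[0] - task_position[0])
                + abs(self.position[1] - task_position[1]))

def announce_task(task_position, agents):
    """Manager announces a task; each agent bids; the lowest cost wins."""
    bids = {agent.name: agent.bid(task_position) for agent in agents}
    winner = min(bids, key=bids.get)
    return winner, bids

agents = [Agent("robot_a", (0, 0)), Agent("robot_b", (5, 5))]
winner, bids = announce_task((1, 1), agents)
print(winner)  # robot_a is closer, so it wins the contract
```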

Development of Human-machine Interface based on EMG and EOG (근전도와 안전도 기반의 인간-기계 인터페이스기술)

  • Gang, Gyeong Woo;Kim, Tae Seon
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.12
    • /
    • pp.129-137
    • /
    • 2013
  • As the usage of computer-based systems continues to increase in daily life, there are constant efforts to enhance the accessibility of information for handicapped people. For this, it is essential to develop new interface methods for physically disabled people by means of a human-computer interface (HCI) or human-machine interface (HMI). In this paper, we developed an HMI using electromyogram (EMG) and electrooculogram (EOG) signals for people with physical disabilities. The developed system is composed of two modules: a hardware module for signal sensing and a software module for feature extraction and pattern classification. To maximize ease of use, only two skin-contact electrodes are attached at both ends of the brow, and EOG and EMG are measured simultaneously through these two electrodes. From the measured signal, nine kinds of command patterns are extracted and defined using signal processing and pattern classification methods. Through a Java-based real-time monitoring program, the developed system showed a command recognition rate of 92.52%. In addition, to show the capability of the developed system in real applications, five different types of commands were used to control an ER1 robot. The results show that the developed system can serve as a novel interface for disabled persons with quadriplegia.
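
A minimal sketch of how one two-electrode signal could be split into EOG- and EMG-driven commands, assuming slow eye movements shift the mean level while muscle bursts raise the variance. The thresholds and command names below are invented; the paper's nine actual command patterns are not specified here.

```python
# Hypothetical sketch of the two-electrode command scheme: a single
# signal carries both EOG (slow drifts) and EMG (fast bursts), and
# simple feature thresholds map the signal to commands.

def extract_features(samples):
    """Crude features: mean level (EOG drift) and variance (EMG burst)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return mean, var

def classify(samples, eog_thresh=0.5, emg_thresh=0.2):
    mean, var = extract_features(samples)
    if var > emg_thresh:
        return "clench"          # strong EMG activity
    if mean > eog_thresh:
        return "look_up"         # sustained positive EOG drift
    if mean < -eog_thresh:
        return "look_down"
    return "rest"

print(classify([0.9] * 50))       # sustained drift -> look_up
print(classify([0.0, 1.0] * 25))  # high variance -> clench
```

A real classifier would work on filtered, windowed signals with learned decision boundaries rather than fixed thresholds.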

A Study on the interface of information processing system on Human enhancement fire fighting helmet (휴먼 증강 소방헬멧 정보처리 시스템 인터페이스 연구)

  • Park, Hyun-Ju;Lee, Kam-Yeon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2018.10a
    • /
    • pp.497-498
    • /
    • 2018
  • At a fire scene, it is difficult to see even 1 m ahead because of power failure, smoke, and toxic gas, even with a thermal imaging camera and a xenon searchlight. Analysis of the smoke particles at a fire scene shows that even when the smoke particle size is 5 μm or less, it is difficult to obtain a front view with a conventional thermal imaging camera if the visual distance exceeds 1 m. In the case of black smoke with a particle size of 5 μm or more, in which chemical materials, gas, and water molecules are mixed, a space-penetration sensing technology using multiple sensors rather than a single sensor is required. Firefighters need smoke detection technology and spatial information visualization for a safe forward view. In this paper, we design the interface of an information processing system with a 32-bit CPU core and peripheral circuits. We also implemented and simulated the interface with a lidar sensor. Through this, we provide an interface on which an information processing system for a human-enhancement fire helmet can be implemented in the future.
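
The multi-sensor approach the abstract calls for can be illustrated with a simple fusion rule: combine a noisy thermal-camera range with a lidar range by inverse-variance weighting, so the less noisy sensor dominates in dense smoke. This is an assumption for illustration, not the paper's interface design.

```python
# Illustrative sketch (not from the paper): fuse a thermal-camera
# distance estimate with a lidar range by inverse-variance weighting.

def fuse(thermal_dist, thermal_var, lidar_dist, lidar_var):
    """Inverse-variance weighted average of two range estimates."""
    w_t = 1.0 / thermal_var
    w_l = 1.0 / lidar_var
    return (w_t * thermal_dist + w_l * lidar_dist) / (w_t + w_l)

# In heavy smoke the thermal estimate is very noisy (large variance),
# so the fused range stays close to the lidar reading.
print(fuse(3.0, 4.0, 1.2, 0.1))
```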

Development of advanced walking assist system employing stiffness sensor

  • Kim, Seok-Hwan;Shunji, Moromugi;Ishimatsu, Takakazu
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.1638-1641
    • /
    • 2004
  • Many walking stands and assistive tools have been developed for people with lower-limb disabilities, to prevent the diseases that follow a bedridden state and to help them walk again. But many of these devices require the user to have some physical strength or balancing ability. In our previous research, we developed a walking assist system for people with lower-limb disabilities. With the system, the user is assisted by actuators and does not have to worry about falling down. The system adopted a unique closed-link structure with four servomotors, three PICs as controllers, and four limit switches as the HMI (human-machine interface). We confirmed the adaptability of the system by experiment. In this research, a muscle stiffness sensor was tested as an advanced HMI for the walking assist system, and its adaptability was confirmed by experiment. Since the muscle stiffness sensor can measure muscle activity, the user can interface with any device he wants to control. Experimental results with the muscle stiffness sensor showed that the user could easily control the walking assist system at will, simply by changing his muscle tension.
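
One way a muscle stiffness sensor could drive the assist system is to map normalized stiffness to a speed command with a dead zone, so resting muscle tension produces no motion. The mapping and thresholds below are assumptions, not the paper's values.

```python
# Illustrative sketch: map a normalized muscle-stiffness reading to a
# walking-assist speed command, with a dead zone at resting tension.

def stiffness_to_speed(stiffness, rest=0.2, max_stiffness=1.0, max_speed=0.5):
    """Map normalized stiffness in [0, 1] to a speed command in m/s."""
    if stiffness <= rest:
        return 0.0                      # dead zone: no motion at rest
    ratio = (stiffness - rest) / (max_stiffness - rest)
    return min(ratio, 1.0) * max_speed

print(stiffness_to_speed(0.1))   # resting muscle -> no motion
print(stiffness_to_speed(1.0))   # maximum tension -> full speed
```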

Data Transformation System Implementation for the Automation of PCB Product (PCB 생산 자동화를 위한 데이터 변환 시스템 구현)

  • Lee Seung-Hyuk;Kim Gui-Jung;Han Jung-Soo
    • The Journal of the Korea Contents Association
    • /
    • v.5 no.5
    • /
    • pp.17-25
    • /
    • 2005
  • In this paper, we design a data transformation interface for the automation of PCB production. Data designed in CAD cannot be used directly by the assembly line, so we construct an automation system that converts it. To do this, we analyze the information of PCB components and build a database of IC component information. We also develop two kinds of algorithms: one to detect human error, and another to convert the data into a form suitable for the PCB assembly line. We design the data transformation interface to allow addition and revision of information for the PCB assembly line. By automating the existing manual processing, we are able to shorten access time, enhance the reliability of the data, and make the PCB assembly line more efficient.
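
The two algorithms the abstract describes can be sketched as follows, with hypothetical field names: one pass flags CAD records whose part number is missing from the IC database (a likely human error), and another converts valid records into an assembly-line format.

```python
# Hypothetical sketch of the two algorithms: error detection against the
# IC component database, then conversion to an assembly-line format.
# Field names (part, package, x, y) are illustrative assumptions.

IC_DB = {"R100": {"package": "0603"}, "C220": {"package": "0805"}}

def detect_errors(cad_records):
    """Return records whose part number is missing from the IC database."""
    return [r for r in cad_records if r["part"] not in IC_DB]

def to_assembly_format(cad_records):
    """Convert valid CAD records into (part, package, x, y) tuples."""
    return [(r["part"], IC_DB[r["part"]]["package"], r["x"], r["y"])
            for r in cad_records if r["part"] in IC_DB]

cad = [{"part": "R100", "x": 1.5, "y": 2.0},
       {"part": "R999", "x": 0.0, "y": 0.0}]   # unknown part: human error
print(detect_errors(cad))       # flags the R999 record
print(to_assembly_format(cad))  # only the valid R100 record is converted
```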

Human body learning system using multimodal and user-centric interfaces (멀티모달 사용자 중심 인터페이스를 적용한 인체 학습 시스템)

  • Kim, Ki-Min;Kim, Jae-Il;Park, Jin-Ah
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2008.02a
    • /
    • pp.85-90
    • /
    • 2008
  • This paper describes a human body learning system using a multi-modal user interface. Through our learning system, students can study human anatomy interactively. Existing learning methods use one-way materials such as images, text, and movies. We instead propose a new learning system that includes 3D organ surface models, a haptic interface, and a hierarchical data structure of human organs, to provide enhanced learning that utilizes sensorimotor skills.

Stereo-Vision-Based Human-Computer Interaction with Tactile Stimulation

  • Yong, Ho-Joong;Back, Jong-Won;Jang, Tae-Jeong
    • ETRI Journal
    • /
    • v.29 no.3
    • /
    • pp.305-310
    • /
    • 2007
  • If a virtual object in a virtual environment represented by a stereo vision system could be touched by a user with some tactile feeling on his/her fingertip, the sense of reality would be heightened. To create a visual impression as if the user were directly pointing to a desired point on a virtual object with his/her own finger, we need to align virtual space coordinates and physical space coordinates. Also, if there is no tactile feeling when the user touches a virtual object, the virtual object would seem to be a ghost. Therefore, a haptic interface device is required to give some tactile sensation to the user. We have constructed such a human-computer interaction system in the form of a simple virtual reality game using a stereo vision system, a vibro-tactile device module, and two position/orientation sensors.
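
Aligning virtual and physical coordinates, as the abstract requires, can be reduced to solving a per-axis affine map from calibration points. A real system would estimate a full 3-D rigid transform; this two-point, axis-wise version is a minimal illustrative assumption.

```python
# Sketch of aligning physical (sensor) and virtual coordinates via a
# per-axis affine map solved from two calibration points.

def solve_axis(p0, p1, v0, v1):
    """Find scale a and offset b such that v = a * p + b."""
    a = (v1 - v0) / (p1 - p0)
    b = v0 - a * p0
    return a, b

def make_mapper(phys_pts, virt_pts):
    """Build a physical-to-virtual mapper from two 3-D point pairs."""
    params = [solve_axis(phys_pts[0][i], phys_pts[1][i],
                         virt_pts[0][i], virt_pts[1][i]) for i in range(3)]
    def to_virtual(p):
        return tuple(a * p[i] + b for i, (a, b) in enumerate(params))
    return to_virtual

to_virtual = make_mapper([(0, 0, 0), (10, 10, 10)],
                         [(-1, -1, 0), (1, 1, 2)])
print(to_virtual((5, 5, 5)))  # physical midpoint maps to virtual midpoint
```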

Teleoperated Control of a Mobile Robot Using an Exoskeleton-Type Motion Capturing Device Through Wireless Communication (Exoskeleton 형태의 모션 캡쳐 장치를 이용한 이동로봇의 원격 제어)

  • Jeon, Poong-Woo;Jung, Seul
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.10 no.5
    • /
    • pp.434-441
    • /
    • 2004
  • In this paper, an exoskeleton-type motion capturing system is designed and implemented. The device is designed with 12 degrees of freedom in total to represent human arm motions. Forward and inverse kinematics of the device are analyzed to identify its singular positions. With the designed model parameters, simulation studies are conducted to verify that the designed motion capturing system is effective in representing human motions within the workspace. As a counterpart of the exoskeleton system, a mobile robot is built to follow human motion in a limited manner. Experimental studies of teleoperation, from the exoskeleton device to control of the mobile robot, are carried out to show a feasible application of a wireless man-machine interface.
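
Forward kinematics of the kind the paper analyzes can be illustrated on a reduced, planar two-link arm; the 12-DOF device would chain more of the same transforms. Link lengths and angles here are assumptions, not the paper's parameters.

```python
# Illustrative sketch: forward kinematics of a planar two-link arm, a
# reduced version of the 12-DOF chain the paper analyzes.
import math

def forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """End-effector (x, y) for joint angles in radians."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Fully extended along the x axis: the end effector is at (2, 0).
x, y = forward_kinematics(0.0, 0.0)
print(round(x, 6), round(y, 6))

# theta2 = pi folds the arm back on itself: a singular configuration,
# the kind the kinematic analysis must identify.
x, y = forward_kinematics(0.0, math.pi)
print(round(x, 6), round(y, 6))
```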

MPEG-U based Advanced User Interaction Interface System Using Hand Posture Recognition (손 자세 인식을 이용한 MPEG-U 기반 향상된 사용자 상호작용 인터페이스 시스템)

  • Han, Gukhee;Lee, Injae;Choi, Haechul
    • Journal of Broadcast Engineering
    • /
    • v.19 no.1
    • /
    • pp.83-95
    • /
    • 2014
  • Hand posture recognition is an important technique for enabling a natural and familiar interface in the HCI (human-computer interaction) field. In this paper, we introduce a hand posture recognition method that uses a depth camera. Moreover, the hand posture recognition method is incorporated into an MPEG-U based advanced user interaction (AUI) interface system, which can provide a natural interface with a variety of devices. The proposed method initially detects the positions and lengths of all open fingers, and then recognizes the hand posture from the pose of one or two hands and the number of folded fingers when the user makes a gesture representing a pattern of the AUI data format specified in MPEG-U Part 2. The AUI interface system represents the user's hand posture as a compliant MPEG-U schema structure. Experimental results show the performance of the hand posture recognition, and it is verified that the AUI interface system is compatible with the MPEG-U standard.
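
The recognition step the abstract describes (counting open fingers and mapping hand pose to an AUI pattern) can be sketched as follows. The pattern names and thresholds are placeholders, not the actual MPEG-U Part 2 identifiers.

```python
# Hypothetical sketch of the recognition step: given detected fingertip
# lengths for one or two hands, count extended fingers and map the
# total to a placeholder AUI pattern name.

AUI_PATTERNS = {0: "fist", 1: "point", 2: "victory", 5: "open_hand"}

def count_extended(finger_lengths, min_length=30):
    """A finger counts as extended if its detected length (in pixels of
    the depth image) exceeds a threshold."""
    return sum(1 for length in finger_lengths if length >= min_length)

def recognize(hands):
    """hands: list of per-hand fingertip length lists."""
    total = sum(count_extended(h) for h in hands)
    return AUI_PATTERNS.get(total, "unknown")

print(recognize([[40, 45]]))         # two extended fingers -> victory
print(recognize([[5, 0, 0, 0, 0]]))  # all folded -> fist
```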