• Title/Summary/Keyword: Mobile interaction

Image Browsing in Mobile Devices Using User Motion Tracking (모바일 장치를 위한 동작 추적형 이미지 브라우징 시스템)

  • Yim, Sung-Hoon; Hwang, Ja-Ne; Choi, Seung-Moon; Kim, Joung-Hyun
    • Journal of the HCI Society of Korea, v.3 no.1, pp.49-56, 2008
  • Most recent mobile devices can store a massive number of images. However, the typical user interface of mobile devices, such as a small 2D display and discrete input buttons, makes browsing and manipulating images cumbersome and time-consuming. As an alternative, we adopt motion-based interaction along with a 3D layout of images, expecting that such intuitive and natural interaction may facilitate these tasks. We designed and implemented a motion-based interaction scheme for image browsing on an ultra-mobile PC and evaluated its usability against that of traditional button-based interaction. The effects of data layouts (tiled and fisheye cylindrical layouts) were also investigated to see whether they can enhance the effectiveness of the motion-based interaction (see the layout sketch below this entry).

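As a rough illustration of the idea in the abstract above, the Python sketch below maps a hypothetical device tilt angle to a focused image index and places thumbnails on a fisheye cylindrical layout, magnifying those near the focus. All function names, parameters, and the decay curve are assumptions for illustration, not details taken from the paper.

```python
import math

def fisheye_cylinder_layout(n_images, focus_idx, radius=1.0, max_scale=2.0, falloff=3.0):
    """Place thumbnails around a cylinder and magnify the ones near the focused image.

    Returns a list of (x, z, scale) tuples: (x, z) is a thumbnail position on a
    cylinder of the given radius, and scale is a fisheye magnification that decays
    with distance from the focus. Parameter names and the decay curve are
    illustrative assumptions, not values taken from the paper.
    """
    step = 2.0 * math.pi / n_images
    layout = []
    for i in range(n_images):
        angle = (i - focus_idx) * step              # rotate the ring so the focus faces the viewer
        distance = abs(i - focus_idx)
        scale = 1.0 + (max_scale - 1.0) / (1.0 + distance / falloff)
        layout.append((radius * math.sin(angle), radius * math.cos(angle), scale))
    return layout

def tilt_to_focus(tilt_deg, n_images, deg_per_item=10.0):
    """Map a device tilt angle (e.g. from an accelerometer) to the focused image index."""
    return int(round(tilt_deg / deg_per_item)) % n_images
```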

The Fourth Industrial Revolution and Multimedia Converging Technology: Pervasive AR Platform Construction using a Mobile Robot based Projection Technology (4차 산업혁명과 멀티미디어 융합 기술 : 모바일 로봇 기반 이동형 프로젝션 기술을 이용한 Pervasive AR 플랫폼 구축)

  • Chae, Seungho; Yang, Yoonsik; Han, Tack-Don
    • Journal of Korea Multimedia Society, v.20 no.2, pp.298-312, 2017
  • The fourth industrial revolution is expected to bring technological innovation that crosses the boundaries between fields through convergence and integration. With the development and convergence of digital technology, users can receive information anywhere in the world. In this paper, we propose an adaptive interaction concept for diverse environments using a mobile robot based on projection augmented reality (AR). Most previous studies have relied on a fixed projector or on projection designed for a pre-arranged environment, and thus provide only limited information. To overcome this problem, we provide adaptive information by implementing a projection AR system that can be mounted on a mobile robot; we define this mobile-robot-based projection system as Pervasive AR. Pervasive AR consists of a pervasive display, a pervasive interface, and seamless interaction. The technology enables the user to access information immediately by expanding the display area into real space, creating an environment of intuitive and convenient interaction through an expanded user interface. The system can be applied to various areas, such as home environments and public spaces.

An Automatic and Scalable Application Crawler for Large-Scale Mobile Internet Content Retrieval

  • Huang, Mingyi; Lyu, Yongqiang; Yin, Hao
    • KSII Transactions on Internet and Information Systems (TIIS), v.12 no.10, pp.4856-4872, 2018
  • The mobile internet has grown ubiquitous across the globe with the widespread use of smart devices. However, the designs of modern mobile operating systems and their applications limit content retrieval from mobile applications: the mobile internet is not as accessible as the traditional web, has more man-made restrictions, and lacks a unified approach to crawling and content retrieval. In this study, we propose an automatic and scalable mobile application content crawler that recognizes the interaction paths of mobile applications, represents them as interaction graphs, and automatically collects content according to the graphs in a parallel manner. The crawler was verified by retrieving content from 50 non-game applications from the Google Play Store on the Android platform. The experiment showed the efficiency and scalability potential of our crawler for large-scale mobile internet content retrieval (see the traversal sketch below this entry).
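
To make the crawling idea above concrete, here is a minimal Python sketch of a breadth-first traversal over an app's interaction graph, where screens are states and UI actions are edges. The callables get_actions, perform, and extract_content stand in for driver-specific automation code and are assumptions for illustration; the paper's actual crawler, including its parallel collection, is not reproduced here.

```python
from collections import deque

def crawl_app(root_state, get_actions, perform, extract_content):
    """Breadth-first traversal of an app's interaction graph.

    States are screens and edges are UI actions. get_actions(state) lists the
    actionable elements on a screen, perform(state, action) returns the resulting
    screen, and extract_content(state) pulls text or media from it. All the
    callables are placeholders for driver-specific automation code (assumptions
    for illustration, not the paper's API).
    """
    graph = {}        # state -> list of (action, next_state) edges
    contents = {}     # state -> extracted content
    visited = {root_state}
    queue = deque([root_state])
    while queue:
        state = queue.popleft()
        contents[state] = extract_content(state)
        graph[state] = []
        for action in get_actions(state):
            next_state = perform(state, action)
            graph[state].append((action, next_state))
            if next_state not in visited:          # each screen is crawled once
                visited.add(next_state)
                queue.append(next_state)
    return graph, contents
```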

Elementary Teacher's Science Class Analysis using Mobile Eye Tracker (이동형 시선추적기를 활용한 초등교사의 과학 수업 분석)

  • Shin, Won-Sub; Kim, Jang-Hwan; Shin, Dong-Hoon
    • Journal of Korean Elementary Science Education, v.36 no.4, pp.303-315, 2017
  • The purpose of this study is to analyze elementary teachers' science classes objectively and quantitatively using a mobile eye tracker. The mobile eye tracker is worn like eyeglasses and the experiments are recorded as video, so it is well suited to capturing objective data on the teacher's classroom situation in real time. The participants were two elementary teachers teaching sixth-grade science in Seoul. Each participant taught a 40-minute class while wearing the mobile eye tracker. Eye movements were collected at 60 Hz, and the collected eye-movement data were analyzed using SMI BeGaze 3.7. Areas related to the lesson were set as areas of interest (AOIs), the teachers' visual occupancy of each AOI was analyzed, and the linguistic interaction between teacher and students was also examined. The results are as follows. First, the visual occupancy of meaningful areas in teaching-learning activities was analyzed for each stage of the class. Second, analysis of eye movements during teacher-student interaction showed that teacher A looked at students' faces at a high rate, whereas teacher B showed high visual occupancy in areas unrelated to the lesson. Third, the participants' linguistic interaction was analyzed across categories including questions, attention-focusing language, elementary science teaching terminology, everyday interaction, humor, and unnecessary words. This study shows that elementary science classes can be analyzed objectively and quantitatively through visual-occupancy analysis with mobile eye tracking, and teachers' visual attention during teaching activities is expected to serve as an index for analyzing forms of linguistic interaction (see the occupancy sketch below this entry).
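
As a rough illustration of the visual-occupancy measure mentioned above, the Python sketch below computes the fraction of gaze samples falling inside each rectangular area of interest. The sample format, rectangular AOIs, and function name are simplifying assumptions for illustration; in the study itself these statistics come from SMI BeGaze 3.7.

```python
def visual_occupancy(gaze_samples, aois):
    """Fraction of gaze samples that fall inside each area of interest (AOI).

    gaze_samples is a sequence of (x, y) gaze points (e.g. recorded at 60 Hz);
    aois maps an AOI name to a bounding box (x_min, y_min, x_max, y_max).
    Rectangular AOIs and the sample format are simplifying assumptions.
    """
    counts = {name: 0 for name in aois}
    total = 0
    for x, y in gaze_samples:
        total += 1
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
                break                               # count each sample toward one AOI at most
    return {name: counts[name] / total for name in counts} if total else counts
```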

Interaction Technique in Smoke Simulations using Mouth-Wind on Mobile Devices (모바일 디바이스에서 사용자의 입 바람을 이용한 연기 시뮬레이션의 상호작용 방법)

  • Kim, Jong-Hyun
    • Journal of the Korea Computer Graphics Society, v.24 no.4, pp.21-27, 2018
  • In this paper, we propose a real-time interaction method for mobile devices that uses the user's mouth wind. User interaction technology is important in mobile and virtual reality, but diverse user interface methods are still lacking; most existing interaction techniques rely on touch-screen input or motion recognition. In this study, we propose an interface technology that enables real-time interaction using the user's mouth wind. The direction of the wind is determined from the angle and position between the user and the mobile device, and its strength is calculated from the magnitude of the user's blowing. To show the effectiveness of the proposed technique, we visualize the flow of the resulting vector field in real time by integrating the mouth-wind interface with the Navier-Stokes equations. The results are demonstrated on mobile devices, but the technique can also be applied in augmented reality (AR) and virtual reality (VR) applications that require such interface technology (see the wind-injection sketch below this entry).
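
The Python sketch below illustrates, under stated assumptions, how a mouth-wind force might be added to a 2D velocity grid before a Navier-Stokes solver step: direction from the user-to-device geometry, magnitude from the estimated blow strength, and a Gaussian falloff around the blow location. The grid layout, parameter names, and falloff are illustrative choices, not the paper's implementation.

```python
import math

def inject_mouth_wind(velocity_field, mouth_pos, blow_point, blow_strength, radius=3):
    """Add a mouth-wind force to a 2D velocity grid around the blow location.

    velocity_field is a list of rows of (u, v) tuples; mouth_pos and blow_point are
    (x, y) positions in grid coordinates (the blow point would come from the angle
    and position between user and device, and blow_strength e.g. from microphone
    amplitude). All names and the Gaussian falloff are illustrative assumptions;
    the paper couples such a force with a Navier-Stokes solver.
    """
    h, w = len(velocity_field), len(velocity_field[0])
    dx, dy = blow_point[0] - mouth_pos[0], blow_point[1] - mouth_pos[1]
    norm = math.hypot(dx, dy) or 1.0
    dir_x, dir_y = dx / norm, dy / norm              # wind direction: mouth -> device
    cx, cy = int(blow_point[0]), int(blow_point[1])
    for j in range(max(0, cy - radius), min(h, cy + radius + 1)):
        for i in range(max(0, cx - radius), min(w, cx + radius + 1)):
            falloff = math.exp(-((i - cx) ** 2 + (j - cy) ** 2) / float(radius ** 2))
            u, v = velocity_field[j][i]
            velocity_field[j][i] = (u + blow_strength * dir_x * falloff,
                                    v + blow_strength * dir_y * falloff)
    return velocity_field
```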

Mobile Interaction Using Smartphones and Kinect in a Global Space (키넥트와 스마트폰을 활용한 공용 공간상에서 모바일 상호작용)

  • Kim, Min Seok; Lee, Jae Yeol
    • Journal of Korean Institute of Industrial Engineers, v.40 no.1, pp.100-107, 2014
  • This paper presents a co-located, mobile interaction technique that uses smartphones in a global space. To effectively detect the locations and orientations of the smartphones, the proposed approach utilizes a Kinect sensor, which captures RGB images as well as 3D depth information. Based on the locations and orientations of the smartphones, the approach can support direct, collaborative, and private interactions with the global space, and thus provides more effective mobile interaction for local space exploration and collaboration (see the pointing sketch below this entry).
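
The paper tracks smartphone positions and orientations with a Kinect; one plausible way such a pose can drive interaction with a shared surface is to ray-cast from the device onto the display plane. The Python sketch below shows only that geometry; the plane-at-z=0 setup, the names, and the idea of pointing-based selection are assumptions made for illustration, not the paper's exact interaction model.

```python
def pointing_target(device_pos, device_dir, screen_z=0.0):
    """Ray-cast from a tracked smartphone pose onto a display plane at z = screen_z.

    device_pos is the phone's 3D position (e.g. from Kinect depth data) and
    device_dir its forward direction (e.g. from the phone's IMU). Returns the
    (x, y) intersection point, or None if the phone points away from the plane.
    The planar-display setup is an illustrative assumption.
    """
    px, py, pz = device_pos
    dx, dy, dz = device_dir
    if abs(dz) < 1e-9:
        return None                  # ray parallel to the display plane
    t = (screen_z - pz) / dz
    if t <= 0:
        return None                  # pointing away from the display
    return (px + t * dx, py + t * dy)
```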

Interactive 3D-View Image Service on Web and Mobile Phone (웹 및 모바일 폰에서의 인터랙티브 3D-View 이미지 서비스 기술)

  • Jeon, Kyeong-Won; Kwon, Yong-Moo; Jo, Sang-Woo; Ki, Jeong-Seok
    • Proceedings of the HCI Society of Korea Conference, 2007.02a, pp.518-523, 2007
  • This paper presents web and mobile-phone services built around research on a virtual URS (Ubiquitous Robotic Space). We model the URS, locate the robot within the virtual URS from the web and from a mobile phone, and control the robot's view with the mobile phone. The paper addresses the concept of the virtual URS, introduces interaction between the robot in the virtual URS and a human through web and mobile-phone services, and then presents a case of the service on a mobile phone.


A Study on Developmental Direction of Interface Design for Gesture Recognition Technology

  • Lee, Dong-Min; Lee, Jeong-Ju
    • Journal of the Ergonomics Society of Korea, v.31 no.4, pp.499-505, 2012
  • Objective: To examine how interaction between mobile devices and users is being transformed, through an analysis of current trends in gesture interface technology. Background: For smooth interaction between machines and users, interface technology has evolved from the command line to the mouse, and touch and gesture recognition are now being researched and used. In the future, the technology is expected to evolve into multi-modal interfaces, which fuse the visual and auditory senses, and 3D multi-modal interfaces, which use three-dimensional virtual worlds and brain waves. Method: Trends and developments in actively researched gesture interfaces and related technologies are surveyed within the evolution of computer interfaces for mobile devices. Based on how gesture information is gathered, the interfaces are divided into four categories: sensor, touch, visual, and multi-modal gesture interfaces. Each category is examined through technology trends and existing examples, and the transformation of interaction between mobile devices and humans is studied on this basis. Conclusion: Gesture-based interface technology brings intelligent communication to the interaction between previously static machines and users, and is therefore an important element technology for making human-machine interaction more dynamic. Application: The results of this study may help in developing the gesture interface designs currently in use (see the sensor-gesture sketch below this entry).
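
Of the four categories above, the sensor-based one is the easiest to illustrate in a few lines. The Python sketch below detects a simple shake gesture from accelerometer magnitudes; the threshold, peak count, and function name are assumptions for illustration, not examples drawn from the study.

```python
import math

def detect_shake(accel_samples, threshold=2.5, min_peaks=3):
    """Tiny sensor-based gesture example: detect a shake from accelerometer readings.

    accel_samples is a sequence of (ax, ay, az) values in g; a shake is reported
    when the acceleration magnitude exceeds threshold at least min_peaks times.
    Threshold and peak count are illustrative assumptions, not values from the study.
    """
    peaks = sum(1 for ax, ay, az in accel_samples
                if math.sqrt(ax * ax + ay * ay + az * az) > threshold)
    return peaks >= min_peaks
```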