HaptiSole: Wearable Haptic System in Vibrotactile Guidance Shoes for Visually Impaired Wayfinding

  • Slim Kammoun (LaTICE Research Laboratory, University of Tunis) ;
  • Rahma Bouaziz (ReDCAD Laboratory, University of Sfax) ;
  • Faisal Saeed (DAAI Research Group, College of Computing and Digital Technology, Birmingham City University) ;
  • Sultan Noman Qasem (Computer Science Department, College of Computer and Information Sciences, Imam Mohammed Ibn Saud Islamic University (IMSIU)) ;
  • Tawfik Al-Hadhrami (School of Science and Technology, Nottingham Trent University)
  • Received : 2023.07.12
  • Accepted : 2023.10.26
  • Published : 2023.11.30

Abstract

During the last decade, several Electronic Orientation Aid devices have been proposed to solve the autonomy problems of visually impaired people. Since hearing is the primary sense of Visually Impaired (VI) people and is generally loaded with information from the environment, the tactile sense can be used to transmit directional information. This paper presents a new wearable haptic system based on four motors implemented in shoes, with which six directions can be played. This study aims to introduce an interface design and investigate an appropriate means of spatial information delivery through the haptic sense. The first experiment with the proposed system was performed with 15 users in an indoor environment. The results showed that the users were able to recognize, with high accuracy, the directions displayed on their feet. The second experiment was conducted in an outdoor environment with five blindfolded users who were guided along 120 meters. The users, guided only by the haptic system, successfully reached their destinations. The potential of tactile-foot stimulation to help VI people understand Electronic Orientation Aid (EOA) instructions was discussed, and future challenges were defined.

Keywords

1. Introduction

During the last decades, several studies have been conducted to propose integrated systems that can help visually impaired (VI) people in their daily travel. The state of the art in electronic mobility aids for VI people includes about 140 commercialized products [1]. These systems are divided into two main categories. The first class, called Electronic Travel Aids (ETAs), aims at helping the VI in the task of obstacle avoidance. Some of these systems use ultrasonic echolocation [2] or laser telemeters to detect the distance to obstacles and restitute the information through tactile vibrations on the fingers or by producing a sound [3]. Electronic Orientation Aids (EOAs) form the second category, which aims at helping VI people in the task of mobility and orientation. They provide users with awareness of their current situation and directions in unfamiliar environments [4-6].

EOAs are a category of assistive devices for VI people dedicated to orientation assistance and composed of at least three components. The first one is a localization system (e.g., GPS). The second component is a digitized map together with software designed to select routes, track the traveler's path, and provide navigation information to the user. The last one is a non-visual User Interface (UI) to ensure communication between the user and the system. Fig. 1 presents an overview of the EOA architecture as presented by [4].


Fig. 1. EOA architecture adapted from [4]

This system is composed of three main components: a positioning module that localizes the user, a Geographical Information System (GIS) that defines the itinerary between the starting point and the destination, and finally the user interface.
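To make this decomposition concrete, the three components can be sketched as minimal software interfaces. The following C++ sketch is purely illustrative; the type names and method signatures are assumptions introduced here and do not correspond to any particular published EOA implementation.

// Illustrative C++ sketch of the three EOA components described above.
// All names and signatures are hypothetical.
#include <vector>

struct GeoPoint { double lat; double lon; };

// (1) Positioning module: localizes the user (e.g., GPS, or GNSS fused with vision).
class Positioning {
public:
    virtual GeoPoint currentPosition() const = 0;
    virtual double headingDegrees() const = 0;    // user orientation
    virtual ~Positioning() = default;
};

// (2) Geographical Information System: selects a route and tracks the traveler's path.
class GeoDatabase {
public:
    virtual std::vector<GeoPoint> computeRoute(GeoPoint start, GeoPoint destination) const = 0;
    virtual ~GeoDatabase() = default;
};

// (3) Non-visual user interface: conveys guidance through audio or haptics.
class UserInterface {
public:
    virtual void presentDirection(double relativeBearingDegrees) = 0;
    virtual ~UserInterface() = default;
};

In such a design, the EOA periodically reads the position, compares it with the computed route, and forwards the resulting direction to the non-visual interface.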

To ensure the user is guided from a start point to their destination, the system must use a localization process accurate to around one meter, in real time and without any loss. Positioning is crucial to determine which sidewalk the user is on (right or left), whether they are in front of a pedestrian crossing, whether they have already started to cross the street, etc. To overcome these limitations, several studies suggested using Differential GPS (DGPS) [7], [8] or combining GPS signals with inertial sensors through dead-reckoning algorithms [9]. Other approaches proposed combining embedded vision and GNSS to obtain a more accurate user position [10-13]. Hence, using all these techniques, the positioning component can provide the system with more accurate localization to improve the user guidance process.

The second component is the Geographical Information System, which includes a geographical database and route selection software. It also provides assistance and tracks the traveler's path [14]. Maps are important sources of information for indicating the location of remote environmental features (e.g., landmarks, points of interest) and help in understanding the spatial configuration of the surroundings. To improve positioning, geospatial data are used via map-matching approaches. These functions are not usable without improving the resolution of the maps and adding suitable information to the database (e.g., pedestrian crossings and the presence of walking pathways such as sidewalks) [15]. Little work has been done to identify key features concerning pedestrian mobility in the absence of sight, and the problems of data collection and data extraction must be overcome first [16], [17].

The third component is the user interface (UI); it works as the mediating component between the requirements of the users and the functions of the system. It is one of the most important components of a system dedicated to people with impairments. Obviously, for VI people, interfaces based on visual output are excluded; audio-based interfaces and somatosensory interaction techniques remain the only solutions. In light of these considerations, a navigational aid based on the sense of touch may be the most appropriate in acoustically rich surroundings. By transmitting navigational commands via touch, the two sources of information (navigational commands and environmental sounds) would be decoupled, preventing physical and attentional interference.

By reviewing previous studies, the limitations of existing systems have been identified. Most of the existing tactile user interfaces for VI navigation focus on the macro-navigation aspect. These systems aim to roughly guide VI pedestrians through cities by providing turn-by-turn directions towards a destination. The evaluation of these systems has become a challenging task for user interface designers due to positioning errors and the absence of adapted GIS. To overcome these limitations, we developed a new platform that can help haptic interface designers test their designed solutions with a Wizard-of-Oz technique. The developed system differs from existing systems in terms of design, implementation, and evaluation. It is embedded in the user's shoes. Instead of communicating directions via auditory interaction, the sole contains a few vibration motors, which can be turned on to recommend a direction to take. The advantage of simple vibration patterns is that they use an otherwise unused communication channel and do not interfere with the users' ability to perceive the surrounding environment. Also, this system can be used as an evaluation platform to help EOA designers evaluate their user interfaces without positioning errors.

The purpose of this study is to investigate blind people's capacity to process simple navigational directions through a haptic shoe using a small number of vibrators. In other words, the proposed system provides directional instructions to VI pedestrians through a tactile-based interface in a safe and less confusing manner.

The key contributions of this paper are as follows:

1. Present the design, implementation, and experimental evaluation of the HaptiSole system, which is a new wearable haptic-based interface for blind pedestrians.

2. Conduct a user study with blind participants to evaluate the feasibility and usability of a haptic guidance system for blind pedestrians.

3. Propose an evaluation platform that can help EOA designers evaluate their user interfaces without positioning errors.

The rest of the paper is organized as follows: Section 2 reviews related works on the different types of interfaces that can be used with wayfinding systems. Section 3 presents the proposed system. Section 4 gives a field trial evaluation of the proposed system and shows the efficiency of the prototype and the initial feasibility of the proposed approach. Finally, the conclusion summarizes the research contributions and discusses future challenges associated with haptic interfaces for VI people.

2. Related works

In the absence of vision, the auditory and somatosensory modalities are the only ways to provide VI people with directional instructions. Over the last 20 years, many research works have focused on developing adapted interfaces that can be used by VI people.

2.1. Interfaces based on Audio

For blind people, interfaces based on audio seem to be the most natural way to receive navigational instructions because they do not require any learning. Instructions are presented to the user through Text-to-Speech (TTS), a technique that translates a text message into a voice message. This mechanism is highly preferred in commercialized device solutions [18], [19]. The second approach is based on binaural rendering. These systems propose binaural 3D audio for guidance and navigational information, relying on the human capacity for hearing, especially spatial audition [20-22]. Through a spatialization engine and over headphones, the 3D sound module provides binaural rendering. The main principle of this approach is to produce informational auditory content at the spatial position of a specific target. Therefore, to create binaural 3D sound, stereo headphones must be used. The role of the user is then to follow the sound to reach the desired destination. High levels of intrusion and noisy environments are among the major difficulties faced by users of audio output instructions. To overcome these limitations, the MoBIC project [23] proposed to allow the user to control the range of the output, from the point where the system offers continuous navigation instructions to the point where all but warnings are dismissed. The NAVIG system [6] was proposed to define several user preferences according to the journey; each preference contains different levels of information detail and defines how this information is presented to the user. Indeed, sound interfaces block, partially or totally, the sounds from the surrounding environment, which can be unsafe for visually impaired travelers.

2.2 Vibrotactile Based Interface

Tactile language is defined by [24] as a set of tactile information used to build a communication system. Like spoken languages, tactile languages include a set of rules that define how tactile information is used to form phrases with sequential meaning. In the context of haptic interfaces, the vibrotactile approach consists of using body locations as mechanoreceptor sites. Vibrators provide continuous and non-intrusive navigation guidance. During the guidance process, the user is not disturbed by system sounds and stays focused on the sounds coming from the surrounding environment [25].

Haptic feedback for VI guidance systems has been investigated over the last two decades. Erp et al. [9] proposed a display belt that includes eight vibration motors attached to the user's waist. The coding scheme translated distance into vibration rhythm, while direction was translated into vibration location. The study, conducted with 12 blindfolded pedestrians, revealed that mapping waypoint direction onto the location of vibration is an effective coding scheme that requires no training. However, the distance encoding did not improve performance over control conditions without distance information. Based on this previous work [26], the authors proposed the Tactile Wayfinder, a belt with six vibrators in which the tactile cues are distributed evenly around the waist of the user. Directions were indicated by triggering the corresponding motors. As with the previous work, the results show the effectiveness of the belt in indicating direction information. The user's back was also investigated in [27], where nine vibrators on the user's back were used to show the direction of the next waypoint. A street-crossing scenario was used as a critical test setting in which to evaluate the proposed interface, which was found appropriate for the wayfinding situation. Distance to a target can also be displayed on the user's back using vibration, as presented in [28]. All these methods utilize several vibrators and need specific hardware. The authors of [29] propose recommendations for the design of vibrotactile patterns for use on the human back.

Using a single vibrator to indicate directional instructions has also been investigated with the expansion of mobile phone use, where the phone vibrator plays the role of a direction indicator. With one vibrator, a specific vibrotactile language must be defined to code the different directions. NaviRadar [30] is an Android mobile application that provides direction information over a 360° circle. A radar sweep rotates clockwise, and tactile feedback is given on every sweep, indicating both the user's current direction and the direction the user must travel. An evaluation study showed that people could understand the concept of NaviRadar and identify the intended direction with a mean deviation of 37° out of the 360° provided. The TactileCompass is also a mobile application running on an Android device that uses a single vibration motor and provides direction information by varying the lengths of two subsequent pulses within a single pattern. Distances are given via the pause between subsequent patterns, with short pauses corresponding to short distances. This pattern has also been used in the Pocket Navigator, a wayfinding mobile application that used the vibrator of a mobile phone to indicate different directions [31].

In Azenkot et al. [32], the researchers used vibration on smartphones to evaluate three proposed methods for giving blind and low-vision people walking directions. One method defined vibration patterns of 1 to 4 pulses to indicate the nearest intersection at which the user must turn. A user study with ten VI pedestrians demonstrated that a single vibrator is enough to communicate direction information without requiring the user's auditory attention or special hardware.

Haptic wristbands have been utilized for guiding blind people in real environments. Weber et al. [10] reported that a wrist bracelet was more beneficial than verbal feedback for indicating the user's rotations. Researchers in [33] confirmed that tactile feedback is effective in keeping blindfolded travelers on a straight path. In addition to belts, pockets, and wrists, shoe-based vibrotactile interfaces were also investigated in [34], [35]. In [36], the researchers conducted experiments with 20 sighted volunteers and 5 blind volunteers to assess tactile sensation by the human foot and the tactile sensitivity of the sole surface. The results revealed that certain information was identifiable and that tactile foot stimulation can be utilized for a variety of human-machine interaction applications. The somatosensory modality has also been examined for sighted users to enhance the presentation of information without visual support, i.e., in the event of situational impairment.

Another work, presented in [37], designed a haptic sole. This device includes four vibrating actuators in a commercial foam shoe insole. For each direction, a corresponding vibrotactile pattern was defined, with a total length of 7 s per pattern. A preliminary evaluation conducted in an outdoor environment involved two blind users who were guided over distances ranging from 380 to 420 m. The authors reported that the proposed system improves independent and safe navigation for VI people, demonstrating the potential of tactile-foot stimulation in assistive devices. However, a complex vibration pattern was used, which can add cognitive load when recognizing the meaning of these vibrations.

In order to understand the effect of surrounding noise on the achievement of the journey, the study in [38] explored a prototype of a tactile navigational aid that provides turn signals using vibrations on a hip-worn belt. Twelve blind users participated in the evaluation of this belt as they navigated a series of paths under the direction of either the tactile belt or conventional auditory turn commands. Results indicate that background street noise compromised users' ability to navigate with auditory instructions more than it compromised their ability to navigate with tactile commands. In the same context, the authors in [25] found that a vibrotactile belt enabled closer path following, at the cost of a reduced average speed for the blind pedestrian. Jimenez and Jimenez [39] confirmed that navigation was slower and more error-prone in the tactile condition.

Audio and haptic interfaces have also been investigated in the context of blind navigation and itinerary preparation [40]. Researchers in [41] presented a study examining the accuracy of the cognitive map developed through the use of an audio-tactile map. The study aims to prove that audio-tactile maps can help VI people understand their surroundings and plan their journeys.

2.3 Motivation and Significance of this Work

As shown in this brief review, prior research works focus on audio-based interfaces and vibrotactile displays dedicated to VI navigation, but there is not enough knowledge about appropriate foot-based interfaces.

The efficiency of single-vibrator systems became acceptable due to the widespread adoption of smartphones. However, the drawback of a single-vibrator system is that, once at a waypoint, the user is required to actively rotate their body to determine the direction to the next waypoint.

Tactile interfaces have proved effective in VI guidance systems. They can be worn on different parts of the body with diverse design methods. Interfaces mounted on the torso may include an array of vibrators on the back that creates straight-line patterns, or a waist belt with vibrators intended to create absolute point vibrations. Such tactile interactions are particularly effective at representing spatial information in drawing tasks and provide adequate support for VI pedestrians as they cross streets. Waist-belt interfaces, in turn, are used for directional indication by triggering the vibration of the motor at the appropriate point around the user's waist. These tactile interfaces are especially useful for communicating directional information in an operational environment.

All the initiatives to date have shown that there is no standard, widely accepted EOA output interface, which leaves room for creativity and innovation in this area. In this research, we bridge this gap in the literature by proposing a new wearable haptic interface that uses four vibrators to indicate six different directions.

The shoe-based interface is supported by the fact that human skin naturally comprises biological sensors that can be stimulated by vibration and contact. The benefit of these sensors is that they are dispersed across the surface of the skin, allowing users to perceive changes quickly and accurately. In addition, a shoe-based interface is easy to implement, inexpensive, easy to learn, and can be worn by VI users without being noticed.

3. The Proposed System: HaptiSole

To improve the usability of wayfinding systems and especially haptic interfaces, our aim is to design a natural, discreet, easy-to-use haptic interface that provides sufficient resolution for pedestrian navigation purposes. The developed haptic interface can be used as an output interface for GPS-based systems for VI people. Our first goal is to evaluate the ability of pedestrians to make the right decision when receiving directional instructions through the proposed interface. As presented in Fig. 2, HaptiSole is an interface that relies heavily on an external EOA, which computes guidance instructions from the position and orientation of the pedestrian and then displays this information via the proposed interface. The main requirements of the proposed interface are:


Fig. 2. The basic concept of the proposed interface.

⦁ Ergonomics. HaptiSole must be comfortable, integrated into the shoe, and easy to put on and take off.

⦁ Self-sufficiency. For wireless operation, this prototype must have its own power, communication, and control units.

⦁ Resolution. HaptiSole must integrate enough actuators to provide high-resolution directions. However, we must be careful about foot sensitivity and the problems that arise if the number of actuators is increased.

⦁ Aesthetics. The HaptiSole must be hidden and not cumbersome.

Considering these four requirements, we present and discuss the design methods below. The remainder of this section is divided into subsections detailing the system architecture, the direction encoding, and the hardware.

3.1 System Architecture

The HaptiSole directs the user by activating the vibrator in the sole aligned with the position of the next waypoint. The system assumes that the path is defined as a list of waypoints (each waypoint corresponds to a turning point). When the user is close enough to a waypoint, the vibrator indicates the next waypoint to be reached. To reach their destination, users should reach all waypoints sequentially. Fig. 2 shows the overall architecture of the proposed system.

3.2. Direction Encoding

3.2.1. Foot Sensitivity

Human skin is made up of biological sensors that can be triggered by touch or vibration. These sensors are evenly dispersed across the skin's surface, and people are capable of perceiving variations in stimulation quickly and accurately. A spatial display is created by attaching tactile signal transducers to the outside of the body. The skin contains three primary groups of sensors, each grouped by biological function: (1) thermoreceptors, which detect heat; (2) nociceptors, which detect pain; and (3) mechanoreceptors, which detect mechanical stimuli and skin deformation [42]. External forces cause physical deformation, which is detected and transmitted to the nervous system by mechanoreceptors.

3.2.2 Vibrators Distribution within the Sole

As presented in previous works, directions were communicated over the full range of 360° [30], [43], [25] by generating an absolute vibration at a corresponding location on the user's body. However, we think that in the absence of vision, a pedestrian who uses an EOA together with a traditional aid (i.e., a white cane) does not require this level of accuracy. For display purposes, the fewer vibrators used, the better the device fits on the foot. A goal for the haptic sole design is therefore to use the minimum possible number of vibrators. For this research, four vibrators were selected as a suitable solution for macro-navigation purposes. This technique permits a precision of 60°. One of the challenges we are trying to solve is to reduce the cognitive load required by blind users to understand direction instructions. Based on the interpolation method proposed by [44], [25], we define a new presentation method that displays directions using one or two stimuli at the same time. To simplify the direction display, we chose to present a maximum of six directions. Fig. 3 illustrates how each of the four vibrators is responsible for displaying a range of directions around 0°, 90°, 180°, and 270°, as shown by the highlighted areas.


Fig. 3. Design steps towards an accurate and non-obtrusive direction presentation. One vibrator is used for each main direction (1, 2, 3, and 4 for the forward, right, backward, and left directions, respectively). The pairs 1-2 and 1-4 are used for the composite up-right and up-left directions, respectively.

The directions that fall in the patterned area between vibrators 1 and 2 are displayed by activating both of them. This allows the full 360° to be covered. With only six choices (forward, backward, right, left, up-left, and up-right), the user is informed which way to go.
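To illustrate this encoding, the mapping from a relative bearing to the set of active vibrators could be written as follows. The six 60° sectors and the vibrator numbering follow Fig. 3, but the exact boundary values and function names are assumptions made for this sketch.

// Illustrative mapping from a relative bearing (0-360 degrees, 0 = straight ahead,
// measured clockwise) to one of the six HaptiSole directions and its vibrators.
// Vibrator ids follow Fig. 3: 1 = forward, 2 = right, 3 = backward, 4 = left.
#include <utility>

enum class Direction { Forward, UpRight, Right, Backward, Left, UpLeft };

Direction quantizeBearing(double deg) {
    while (deg < 0.0)    deg += 360.0;   // normalize to [0, 360)
    while (deg >= 360.0) deg -= 360.0;
    if (deg < 30.0 || deg >= 330.0) return Direction::Forward;   // 60-degree sectors (assumed)
    if (deg < 90.0)  return Direction::UpRight;
    if (deg < 150.0) return Direction::Right;
    if (deg < 210.0) return Direction::Backward;
    if (deg < 270.0) return Direction::Left;
    return Direction::UpLeft;
}

// A composite direction activates two adjacent vibrators (e.g., 1 and 2 for up-right);
// a main direction activates a single vibrator (0 means "no second vibrator").
std::pair<int, int> vibratorsFor(Direction d) {
    switch (d) {
        case Direction::Forward:  return {1, 0};
        case Direction::UpRight:  return {1, 2};
        case Direction::Right:    return {2, 0};
        case Direction::Backward: return {3, 0};
        case Direction::Left:     return {4, 0};
        case Direction::UpLeft:   return {1, 4};
    }
    return {0, 0};
}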

3.3 Hardware

HaptiSole was designed around an Arduino Uno board and four vibration motors (see Fig. 4). Each vibrator is linked directly to the Arduino board and placed on the sole. The vibrators are activated when driven through the Arduino board.


Fig. 4. A) The vibrator disk motor used. B) The vibrators attached to the sole and linked to the Arduino board. C) The sole placed in a shoe.

For each vibrator, several parameters can be defined: (i) frequency (perceptible frequency range: 20-1000 Hz), (ii) amplitude (that is, intensity), (iii) waveform and duration, and (iv) rhythm. As shown in Fig. 4, all vibrators are located on the sole and linked to the Arduino board, which has a Bluetooth connection. This connection allows the haptic display to be linked to an electronic orientation aid or a GPS-based guidance application.

The vibrators used provide vibration frequencies between 20 and 100 Hz. Each vibrator can be independently controlled with a specific vibration frequency command sent to the Arduino board. Each vibrator ensures a 130 mm2 contact surface with the foot sole. Fig. 5 illustrates the display of the up-right direction by activating vibrators 1 and 2 simultaneously.


Fig. 5. Displaying the up-right direction by activating vibrators 1 and 2 simultaneously. Vibrators 3 and 4 are switched off when the up-right direction is played.
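A minimal Arduino-style sketch of this control scheme is shown below. The pin assignments, the one-character Bluetooth command protocol, and the use of PWM for intensity are assumptions made for illustration; they are not the firmware actually deployed on the prototype.

// Illustrative Arduino sketch: receive a direction command over the Bluetooth
// serial link and drive the four vibration motors. Pin assignments and the
// one-character protocol ('F','R','B','L','1'=up-right,'2'=up-left,'S'=stop)
// are assumptions, not the published firmware.
const int MOTOR_PIN[4] = {3, 5, 6, 9};   // PWM pins for vibrators 1..4 (assumed)
const int INTENSITY    = 180;            // PWM duty cycle, i.e., vibration intensity (assumed)

void setAll(bool v1, bool v2, bool v3, bool v4) {
  bool on[4] = {v1, v2, v3, v4};
  for (int i = 0; i < 4; i++) {
    analogWrite(MOTOR_PIN[i], on[i] ? INTENSITY : 0);
  }
}

void setup() {
  Serial.begin(9600);                    // serial Bluetooth module on the UART (assumed)
  for (int i = 0; i < 4; i++) pinMode(MOTOR_PIN[i], OUTPUT);
  setAll(false, false, false, false);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    switch (cmd) {
      case 'F': setAll(true,  false, false, false); break;  // forward  (vibrator 1)
      case 'R': setAll(false, true,  false, false); break;  // right    (vibrator 2)
      case 'B': setAll(false, false, true,  false); break;  // backward (vibrator 3)
      case 'L': setAll(false, false, false, true ); break;  // left     (vibrator 4)
      case '1': setAll(true,  true,  false, false); break;  // up-right (vibrators 1 and 2)
      case '2': setAll(true,  false, false, true ); break;  // up-left  (vibrators 1 and 4)
      case 'S': setAll(false, false, false, false); break;  // stop all vibrations
    }
  }
}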

For the user's convenience, the tactile display is wearable and can be placed in the shoe of the right, left, or dominant foot. The Arduino board, the battery, and the Bluetooth transmission module are all embedded in a small black box that can be carried easily; it is attached to the user's ankle and covered by clothing, or carried in a pocket. The HaptiSole is then inserted into the shoe, and it therefore becomes an unremarkable, visually hidden assistive device. This is the second version of the foot tactile interface that we introduced previously.

Based on this proposed solution, we can further improve the design of the HaptiSole. Taking advantage of the ability to apply different intensities, frequencies, rhythms, or waveforms to the tactile transducers, we can encode more options, such as points of interest or landmarks along the route, or an alert message when GPS signals are lost, for example. Also, depending on individual foot sensitivity, these settings can be adapted and adjusted according to users' preferences.

4. The Experimental Design

To evaluate this interface while avoiding positioning errors and the absence of adapted GIS, an evaluation platform was built (see Fig. 6). Two experiments were performed: the first was carried out in a prepared indoor environment, while the second was performed outdoors, where the participants walked a predetermined route. As a first step, a Wizard of Oz simulation [45] was used to evaluate the proposed interface. By doing so, we aimed to ensure that instructions sent through the HaptiSole interface are understandable and usable before switching to the implementation phase. Additionally, the knowledge gained in this step contributed to improving the interaction methods.


Fig. 6. Experiment setup. The VI pedestrian (3) is guided by an experimenter (1) through the HaptiSole. The experimenter sends direction instructions using an Android application connected to the HaptiSole via Bluetooth.

This method has often been used to explore implementation and testing approaches for numerous applications, including speech systems, natural language, command languages, imaging systems, and others. In the next sections, the performed experiments are presented in detail.

4.1 First Experiment: Accuracy of Direction Perception

Preliminary tests were carried out to evaluate the system behavior. Before the outdoor navigation evaluations, the HaptiSole was first confirmed to be able to transmit discernible tactile information in a controlled indoor environment, to make sure that the direction encoding is well understood and the vibrations can be felt comfortably.

4.1.1 Evaluation Method

Fifteen blindfolded volunteer users participated in this first experiment (13 women and 2 men). None of them reported any sight problems, and none had experience with any haptic devices or any known tactile sensory or cognitive deficit. The participants' average age was 33 years. Participants were invited individually, and each session lasted between 25 and 30 minutes. At the beginning of each session, the experimenter explained the objectives of the assessment and the composition of the prototype. General instructions on the task were given in full, and each volunteer was given a short time to become familiar with the device and the user interface. Each participant was asked to wear the prototype on their foot while sitting. During the test, all users had complete freedom to select the preferred foot (that is, the right or left foot) on which the test would be run. The actuators' vibration frequency was the same for all users and was set at 60 Hz from the beginning; we chose 60 Hz to ensure that the vibrations would be clearly perceived by all users. The experimenter played each direction 5 times (forward, backward, right, left, up-right, and up-left), so a total of 30 random directions were played per session. After each direction was played, the user filled in a checkbox table to indicate the perceived direction (see Fig. 7). All participants agreed in writing to participate in the research.


Fig. 7. First experiment scenario.
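The stimulus sequence used in each session (five repetitions of each of the six directions, i.e., 30 trials in random order) can be generated with a short script such as the one below; this is an illustrative sketch, not the actual test program used in the experiment.

// Illustrative generation of the 30-trial sequence for the first experiment:
// each of the six directions repeated 5 times, then shuffled.
#include <algorithm>
#include <iostream>
#include <random>
#include <string>
#include <vector>

int main() {
    const std::vector<std::string> directions =
        {"forward", "backward", "right", "left", "up-right", "up-left"};

    std::vector<std::string> trials;
    for (const auto& d : directions)
        for (int rep = 0; rep < 5; ++rep) trials.push_back(d);   // 6 x 5 = 30 trials

    std::mt19937 rng(std::random_device{}());
    std::shuffle(trials.begin(), trials.end(), rng);             // randomize presentation order

    for (std::size_t i = 0; i < trials.size(); ++i)
        std::cout << "Trial " << (i + 1) << ": play " << trials[i] << '\n';
    return 0;
}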

4.2 Second Experiment: Outdoor Navigation Task

The objective of this experiment was to evaluate the performance of the HaptiSole prototype in a real navigation scenario. Thus, we wanted to make sure that the directional encoding was understandable and usable in an actual navigation task. Besides, the knowledge gained in this step helped improve the interaction techniques. A simple guidance algorithm was implemented, as presented below.

Algorithm 1: Guidance_process

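Consistent with the waypoint-following behavior described in Section 3.1, the guidance process can be summarized by the sketch below. The 3 m arrival radius and the helper functions for positioning, heading, and haptic output are assumptions standing in for the actual implementation.

// Illustrative guidance loop: when the user is close enough to the current
// waypoint, advance to the next one and display the relative bearing on the
// HaptiSole. The threshold and helper names are assumptions, not published values.
#include <cstddef>
#include <vector>

struct GeoPoint { double lat; double lon; };

// Assumed external hooks provided by the positioning module and the sole.
double distanceMeters(GeoPoint a, GeoPoint b);        // e.g., haversine distance
double bearingDegrees(GeoPoint from, GeoPoint to);     // absolute bearing to a point
double userHeadingDegrees();                           // user orientation
GeoPoint userPosition();                               // current user position
void displayDirection(double relativeBearingDegrees);  // quantized to the six directions

void guidanceProcess(const std::vector<GeoPoint>& waypoints) {
    const double ARRIVAL_RADIUS_M = 3.0;               // assumed arrival threshold
    std::size_t next = 0;
    while (next < waypoints.size()) {
        GeoPoint pos = userPosition();
        if (distanceMeters(pos, waypoints[next]) < ARRIVAL_RADIUS_M) {
            ++next;                                     // waypoint reached
            if (next == waypoints.size()) break;        // destination reached
        }
        double rel = bearingDegrees(pos, waypoints[next]) - userHeadingDegrees();
        displayDirection(rel);                          // play the corresponding vibration
    }
}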

The second experiment was performed in an outdoor environment, where the participants walked a predetermined route. The objectives of this experiment were to make sure that each direction can be felt clearly and accurately throughout a mobility task, to observe whether the blindfolded participants can reach their destination using the proposed haptic interface, and to verify the efficiency of the interface.

4.2.1. Evaluation Method

A predefined path of 117 meters, including several turns, was defined (see Fig. 8). Five blindfolded users were involved in this second evaluation, with an average age of 31 years (two of the five did not participate in the first experiment). The system was introduced clearly to them before the beginning of the experiments. Fig. 9 shows how the interface was worn by the volunteers.


Fig. 8. The outdoor predefined path. Red dots identify the points at which the experimenter, walking alongside the user, sends a direction during an evaluation.


Fig. 9. A user wearing the haptic shoes during the second experiment.

The evaluator walked alongside the participants to give them instructions via the Android application in order to follow the predefined path. The link between the interface and the Android application is made via Bluetooth, as presented in Fig. 6. Fourteen waypoints were defined. At each waypoint, the experimenter sends the direction to the next waypoint to the user.

None of the participants knew the path before the experiment. To record the users' performance, each participant carried a GPS logger in their pocket. When the user reaches an itinerary point (red points in Fig. 8), the experimenter sends the direction to the next one, until the endpoint is reached.

5. Results and Discussion

5.1. Results of the First Experiment

As presented in Fig. 10, the four main directions (forward, backward, right, and left) were recognized with accuracy rates above 94%. Nevertheless, the recognition accuracy of the combined directions, up-right and up-left, was 82.67% and 76%, respectively. Table 1 presents the evaluation results in detail.

Table 1. The direction recognition rate for each one of the 6 directions.



Fig. 10. The direction recognition accuracy.

The recognition rates are high for the four main directions. This is due to an optimized tactile rendering method based on a small number of stimuli with optimal intensity. Regarding the composite directions, when two vibrators were activated simultaneously, the recognition rates were lower (82.67% and 76%). An interview was then performed with all users to get their feedback and to identify possible improvements.

The evaluation showed that participants were able to recognize the displayed directions with an overall accuracy of 90% across all directions. For up-right and up-left, the recognition rates were 82.67% and 76%, respectively. At first, it seemed difficult for the participants to give feedback about the direction displayed by the HaptiSole for these two directions: it was challenging for them to differentiate between right/up-right and left/up-left. After several discussions, we believe this rate can be improved by moving the left vibrator 1 cm further to the left or by raising the intensity so it can be felt more clearly. This design choice was taken into consideration in the next experiments. Overall, all participants showed a high level of satisfaction and enthusiasm.

5.2. Results of the Second Experiment

All users completed the tasks successfully with no errors. The participants' navigation was performed on sidewalks. The itinerary was chosen without typical urban obstacles, such as other people, trees, street poles, and dust bins. The users reported that the vibrations were very clear and that the directions were given very well; no one was given a wrong direction. The duration and distance of the performed path are shown in Table 2.

Table 2. Navigational times and distance for the outdoor experiment.


All users were able to complete the proposed path under the guidance of the experimenter. For instance, the task completion time for user #1 was 4 minutes and 0.8 seconds.

Our main goal was to assess the ability of pedestrians to make the right decisions when given directional instructions. All five participants reached the end of the set travel route, and they recognized the meaning of each vibration. Multiple outages were observed for all participants. The second goal was to evaluate vibration detection at various points on the sole during walking. A short interview was held at the end of the session. Participants said that the tactile instructions were well received during walking and that they were not bothered by disturbances. They reported that the tactile cues could be perceived without interfering with the walking task and without losing awareness of the environment.

5.3 Comparison with a State-of-the-Art Study

The GPS heading is the primary means of obtaining the user's orientation. Indeed, using a Wizard-of-Oz technique to send directional instructions helps us avoid positioning errors. According to [35], an approximate 2-4 m resolution of the GPS data was observed during a similar test. Thus, by eliminating positioning errors, we were able to focus on the directional instructions displayed at each waypoint rather than on whether the right instruction was displayed or not.

In [35], the authors presented a tactile interface composed of a four-point array of actuators that displays 4 directions (forward, backward, right, and left). The prototype was tested to evaluate the interface's ability to transmit tactile information to the user and the user's level of comprehension of this feedback. The results showed high recognition rates. These results are corroborated by the HaptiSole, which confirms that the medial, lateral, and tibial plantar areas of the foot can be efficiently stimulated by actuators in order to display directions. In addition to the four main directions, HaptiSole was designed to add two more directions, up-left and up-right. In the short interview held at the end of each session, participants were asked about the up-right and up-left directions. Their responses confirm our assumption that these composite directions are more difficult to understand than the four main directions. However, participants said that the composite directions can help them at turning points, for example, or in micro-navigation tasks where the target is near them.

Therefore, it is found that interpolated direction presentation leads to a more accurate perception of the direction, while the discrete direction presentation was easier to perceive and process for waypoint navigation.

6. Conclusion and future works

The goal of this paper was to design a new wearable haptic interface that provides VI people with guidance instructions in a safer, non-intrusive, and fashionable way. To do so, four vibration motors were embedded into a sole connected to an Arduino board. The current research establishes the feasibility of vibrotactile guidance shoes for blind people. The haptic sole successfully steered users to their desired waypoints while leaving the auditory modality free to focus on surrounding sounds. The proposed interface was tested in two different stages. The first was performed in a controlled indoor environment, aiming to validate the design choices and to ensure that the stimuli would be identified and felt by pedestrian users. Participants expressed satisfaction with the intensity and frequency used; moreover, the proposed platform can be used to easily evaluate such options. The second experiment was performed in an open environment with five blindfolded users to evaluate the pertinence of the stimuli used in an outdoor walking scenario. The results revealed that the haptic sole can effectively direct pedestrians without demanding the user's auditory attention. We believe that these results need to be confirmed in real-world situations with VI users guided by an EOA, and in different mobility situations.

Besides the HaptiSole, a platform for evaluating diverse design options was proposed. The design of the sole can easily be changed (e.g., vibrator placement, number of vibrators at each level, etc.) for a new evaluation. To explore these questions, an evaluation toolkit will be integrated into the Android application to log all prototype usage. For future work, the presented interface will be evaluated by VI people. Various environmental and ground conditions should be tested to ensure that such changes do not affect the use of the sole. Users will also be able to define their own parameters (such as vibration frequency and intensity) according to their preferences. Previous studies have demonstrated that tactile acuity increases with practice [46] and that blind people can interpret touch information more effectively [47]. The integration of the HaptiSole within an adapted guidance application for VI people is our next goal. The proposed system will rely on a GPS system for user positioning, which can be improved by using an inertial measurement unit.

Acknowledgement

The authors extend their appreciation to the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) for funding and supporting this work through Research Partnership Program no RP-21-07-09.

Funding Statement

This paper is funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University through Research Partnership Program no RP-21-07-09.

References

  1. U. Roentgen, G. J. Gelderblom, M. Soede and L. De Witte, "Inventory of electronic mobility aids for persons with visual impairments: a literature review," Journal of Visual Impairment & Blindness, vol. 102, no. 11, pp. 702-723, 2008. https://doi.org/10.1177/0145482X0810201105
  2. P. H. Cheng, "Wearable ultrasonic guiding device with white cane for the visually impaired: a preliminary verisimilitude experiment," Assistive Technology, vol. 28, no. 3, pp. 127-136, 2016. https://doi.org/10.1080/10400435.2015.1123781
  3. D. Dakopoulos and N. G. Bourbakis, "Wearable obstacle avoidance electronic travel aids for Blind: a survey," IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 40, no. 1, pp. 25-35, 2010. https://doi.org/10.1109/TSMCC.2009.2021255
  4. J. M. Loomis, R. G. Golledge, R. L. Klatzky, J. M. Speigle and J. Tietz, "Personal guidance system for the visually impaired," in Proc. of the first annual ACM conference on Assistive technologies, pp. 85-91, 1994.
  5. A. Helal, S. E. Moore and B. Ramachandran, "Drishti: an integrated navigation system for visually impaired and disabled," in Proc. of Fifth International. Symposium on Wearable Computers, Zurich, Switzerland, no. 45, pp. 149-156, 2001.
  6. S. Kammoun, G. Parseihian, O. Gutierrez, A. Brilhault, A. Serpa, M. Raynal, B. Oriola, M.M. Mace, M. Auvray, M. Denis and S.J. Thorpe, "Navigation and space perception assistance for the visually impaired: The NAVIG project," Irbm, vol. 33, no. 2, pp. 182-189, 2012. https://doi.org/10.1016/j.irbm.2012.01.009
  7. B. Hofmann-Wellenhof, K. Legat, M. Wieser, "Navigation: principles of positioning and guidance," Springer, Vienna, 2003. 
  8. L. Ran, S. Helal and S. Moore, "Drishti: an integrated indoor/outdoor blind navigation system and service," in Proc. of Second IEEE Annual Conferance on Pervasive Computing and Communications, Orlando, FL, USA, pp. 23-30, 2004.
  9. B. Mayerhofer, B. Pressl and M. Wieser, "ODILIA - a mobility concept for the visually impaired," in Proc. of ICCHP 2008, Linz, Austria, pp. 1109-1116, 2008. 
  10. S. Bhatlawande, A. Sunkari, M. Mahadevappa, J. Mukhopadhyay, M. Biswas, D. Das, and S. Gupta, "Electronic bracelet and vision-enabled waist-belt for mobility of visually impaired people," Assistive Technology, vol. 26, no. 4, pp. 186-195, 2014. https://doi.org/10.1080/10400435.2014.915896
  11. S. Treuillet and E. Royer, "Outdoor/Indoor vision-based localization for blind pedestrian navigation assistance," International Journal of Image and Graphics, vol. 10, no. 4, pp. 481-496, 2010. https://doi.org/10.1142/S0219467810003937
  12. A. Brilhault, S. Kammoun, O. Gutierrez, P. Truillet and C. Jouffrais, "Fusion of artificial vision and GPS to improve blind pedestrian positioning," in Proc. of 4th IFIP International Conference on New Technologies, Mobility and Security, Paris, France, pp.1-5, 2011.
  13. M.B. Coco-Martin, M. Pichel-Mouzo, J.C. Torres, R. Vergaz, R. Cuadrado, J. Pinto-Fraga and R.M. Coco, "Development and evaluation of a head-mounted display system based on stereoscopic images and depth algorithms for patients with visual impairment," Displays, vol. 56, pp. 49-56, 2019. https://doi.org/10.1016/j.displa.2019.01.002
  14. R. G. Golledge, J. M. Loomis, R. L. Klatzky, A. Flury and X.-L. Yang, "Designing a personal guidance system to aid navigation without sight: progress on the GIS component," International Journal of Geographical Information System, vol. 5, no. 4, pp. 373-395, 1991. https://doi.org/10.1080/02693799108927864
  15. S. Zimmermann-Janschitz, "The Application of Geographic Information Systems to Support Wayfinding for People with Visual Impairments or Blindness," in Visual Impairment and Blindness, London, UK: IntechOpen, 2019. 
  16. M. Srikulwong and E. O. Neill, "Tactile Representation of Landmark Types for Pedestrian Navigation: User Survey and Experimental Evaluation," in Proc. of Workshop on using Audio and Haptics for Delivering Spatial Information via Mobile Devices at MobileHCI, Lisbon, Portugal, pp. 18-21, 2010. 
  17. S. Kammoun, M. J.-M. Mace, B. Oriola and C. Jouffrais, "Towards a geographic information system facilitating navigation of visually impaired users," in Proc. of 13th International Conference, ICCHP, Linz, Austria, Part 2, vol. 7383, 2012. 
  18. S. Al-Khalifa and M. Al-Razgan, "Ebsar: Indoor guidance for the visually impaired," Computers & Electrical Engineering, vol. 54, pp. 26-39, 2016. https://doi.org/10.1016/j.compeleceng.2016.07.015
  19. J. Wilson, B. N. Walker, J. Lindsay, C. Cambias and F. Dellaert, "SWAN: system for wearable audio navigation," in Proc. of IEEE International Symposium on Wearable Computers, Boston, MA, USA, pp. 1-8, 2007.
  20. B. F. G. Katz and L. Picinali, "Spatial audio applied to research with the Blind," in Advances in Sound Localization, no. 1991, P. Strumillo, Ed. InTech, 2011, pp. 225-250. 
  21. B. F. G. Katz, S. Kammoun, G. Parseihian, O. Gutierrez, A. Brilhault, M. Auvray, P. Truillet, M. Denis, S. Thorpe and C. Jouffrais, "NAVIG: augmented reality guidance system for the visually impaired," Virtual Reality, vol. 16, no. 3. pp. 253-269, 2012. https://doi.org/10.1007/s10055-012-0213-6
  22. B. N. Walker and J. Lindsay, "Navigation performance with a virtual auditory display: Effects of beacon sound, capture radius, and practice," Human Factors, vol. 48, no. 2, pp. 265-278, 2006. https://doi.org/10.1518/001872006777724507
  23. H. Petrie, V. Johnson, T. Strothotte, A. Raab, S. Fritz and R. Michel, "MoBIC: Designing a travel aid for blind and elderly people," Journal of Navigation, vol. 49, no. 01, pp. 45-52, 1996. https://doi.org/10.1017/S0373463300013084
  24. E. Pissaloux and R. Velazquez, "Constructing Tactile Languages for Situational Awareness Assistance of Visually Impaired People," in Mobility of Visually Impaired People: Fundamentals and ICT Assistive Technologies, 2018, pp. 1-652. 
  25. G. Flores, S. Kurniawan, R. Manduchi, E. Martinson, L.M. Morales and E.A. Sisbot, "Vibrotactile guidance for wayfinding of blind walkers," IEEE Transactions on Haptics, vol. 8, no. 3, pp. 306-317, 2015. https://doi.org/10.1109/TOH.2015.2409980
  26. W. Heuten, N. Henze, S. Boll and M. Pielot, "Tactile wayfinder: a non-visual support system for wayfinding," in Proc. of the 5th Nordic Conference on Human-computer Interaction: Building Bridges, New York, NY, United States, pp. 172-181, 2008.
  27. D. A. Ross and B. B. Blasch, "Wearable interfaces for orientation and wayfinding," in Proc. of the Fourth International ACM Conference on Assistive technologies - Assets '00, Arlington, Virginia, USA, pp. 193-200, 2000.
  28. Nicula, A., Longo, M.R., "Perception of tactile distance on the back," Perception, 50(8), 677-689, 2021. https://doi.org/10.1177/03010066211025384
  29. Kappers, A. M. L., & Plaisier, M. A., "Guidance for the Design of Vibrotactile Patterns for Use on the Human Back," in Proc. of 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics, pp. 39-47, 2022.
  30. S. Rumelin, E. Rukzio and R. Hardy, "NaviRadar: a novel tactile information display for pedestrian navigation," in Proc. of the 24th Annual ACM Symposium on User Interface Software and Technology, Santa Barbara, California, USA, pp. 293-302, 2011.
  31. M. Pielot, B. Poppinga, W. Heuten and S. Boll, "A tactile compass for eyes-free pedestrian navigation," Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), vol. 6947 LNCS, pp. 640-656, 2011. 
  32. S. Azenkot and R. Ladner, "Smartphone haptic feedback for nonvisual wayfinding," in Proc. of The 13th International ACM SIGACCESS Conference on Computers and Accessibility, Dundee Scotland, UK, pp. 281-282, 2011.
  33. A. Brock, S. Kammoun, M. Mace and C. Jouffrais, "Using wrist vibrations to guide hand movement and whole body navigation," Journal I-Com, vol. 13, no. 3, pp. 19-28, 2014. https://doi.org/10.1515/icom.2014.0026
  34. R. Velazquez and O. Bazan, "Foot-based interfaces for navigational assistance of the visually impaired," Pan American Health Care Exchanges, vol. 3, no. 11, pp. 1-6, 2013. https://doi.org/10.1109/PAHCE.2013.6568206
  35. S. Papetti, M. Civolani and F. Fontana, "Rhythm'n'Shoes: a wearable foot tapping interface with audio-tactile feedback," in Proc. of the International Conference on New Interfaces for Musical Expression, Oslo, Norway, pp. 473-476, 2011. 
  36. R. Velazquez, O. Bazan, J. Varona, C. Delgado-Mata and C. A. Gutierrez, "Insights into the Capabilities of Tactile-Foot Perception," International Journal of Advanced Robotic Systems, vol. 9, 2017.
  37. R. Velazquez, E. Pissaloux, P. Rodrigo, M. Carrasco, N.I. Giannoccaro and A. Lay-Ekuakille, "An outdoor navigation system for blind pedestrians using GPS and tactile-foot feedback," Applied Science, vol. 8, no. 4, p. 578, 2018.
  38. A. Bharadwaj, S. B. Shaw and D. Goldreich, "Comparing tactile to auditory guidance for blind individuals," Frontiers in Human Neuroscience, vol. 13, pp.443, 2019.
  39. R. Jimenez and A. M. Jimenez, "Blind waypoint navigation using a computer controlled vibrotactile belt," in Proc. of Advances in Human Factors and System Interactions: Proceedings of the AHFE 2016 International Conference on Human Factors and System Interactions, Florida, USA, pp. 3-13, 2017. 
  40. Kappers, A.M.L., Plaisier, M.A., "Hands-Free devices for displaying speech and language in the tactile modality - Methods and approaches," IEEE Trans. Haptics, 14(3), 465-478, 2021. https://doi.org/10.1109/TOH.2021.3051737
  41. K. Papadopoulos, E. Koustriava and P. Koukourikos, "Orientation and mobility aids for individuals with blindness: Verbal description vs. audio-tactile map," Assistive Technology, vol. 30, no. 4, pp. 191-200, 2018. https://doi.org/10.1080/10400435.2017.1307879
  42. P. M. Kennedy and J. T. Inglis, "Distribution and behaviour of glabrous cutaneous receptors in the human foot sole," Journal of Physiology, vol. 538, pp. 995-1002, 2002. https://doi.org/10.1113/jphysiol.2001.013087
  43. M. Srikulwong, "A comparison of two wearable tactile interfaces with a complementary display in two orientations," Haptic Audio Interact. Des., 2010. 
  44. M. Pielot, N. Henze, W. Heuten, and S. Boll, "Evaluation of continuous direction encoding with tactile belts," in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 5270 LNCS, 2008, pp. 1-10. 
  45. J. F. Kelley, "An iterative design methodology for user-friendly natural language office information applications," ACM Trans. Inf. Syst., vol. 2, no. 1, pp. 26-41, Jan. 1984. https://doi.org/10.1145/357417.357420
  46. M. Wong, R. M. Peters, and D. Goldreich, "A Physical Constraint on Perceptual Learning: Tactile Spatial Acuity Improves with Training to a Limit Set by Finger Size," Journal of Neuroscience, vol. 33, no. 22, pp. 9345-9352, May 2013. https://doi.org/10.1523/JNEUROSCI.0514-13.2013
  47. A. Bhattacharjee, A. J. Ye, J. A. Lisak, M. G. Vargas, and D. Goldreich, "Vibrotactile Masking Experiments Reveal Accelerated Somatosensory Processing in Congenitally Blind Braille Readers," Journal of Neuroscience, vol. 30, no. 43, pp. 14288-14298, 2010. https://doi.org/10.1523/JNEUROSCI.1447-10.2010