Open Access
Research article

Evaluating the Usability and Effectiveness of a Special Education Campus Navigation System for Students with Visual Impairment

Solomon Babatunde Olaleye 1*,
Benedictus Adekunle Adebiyi 2,
Aminat Abdulsalaam 1,
Florence Chika Nwosu 3,
Abosede Olayinka Adeyanju 4,
Hassana Mamman Ambi 5,
Clement Omolayo 6
1 Department of Computer Science, Federal College of Education (Special), 211102 Oyo, Nigeria
2 Department of Education for Learners with Visual Impairment, Federal College of Education (Special), 211102 Oyo, Nigeria
3 Department of Linguistics and Nigerian Languages, University of Ilorin, 240003 Ilorin, Nigeria
4 Department of Yoruba Language, Emmanuel Alayande College of Education, 211101 Oyo, Nigeria
5 Department of Hausa, Federal College of Education (Special), 211102 Oyo, Nigeria
6 Management Information System Unit, Federal College of Education (Special), 211102 Oyo, Nigeria
Journal of Intelligent Systems and Control | Volume 3, Issue 2, 2024 | Pages 84-92
Received: 03-15-2024, Revised: 06-04-2024, Accepted: 06-19-2024, Available online: 06-29-2024

Abstract:

This study evaluates the usability and effectiveness of a newly developed special education (SPED) campus navigation system designed for students with visual impairment (SVI) at the Federal College of Education (Special), Oyo, Oyo State, Nigeria. The primary objective was to assess the system's capability to facilitate self-navigation for SVI and identify challenges encountered in a campus environment. A mixed-methods approach, combining quantitative data from questionnaires and qualitative insights from interviews, was employed. Twenty SVI, selected through purposive sampling, used the system over a five-week period. The findings indicate significant improvements in the orientation and mobility of SVI, resulting in increased confidence in navigating the campus. Participants reported that the navigation system effectively aided in locating key areas, detecting obstacles, and ensuring safety. However, several critical challenges were identified, such as the system's voice being drowned out in noisy environments and the need to recharge the battery every five days. Participants suggested enhancements, including the incorporation of volume control to accommodate various environmental conditions and regular device charging to prevent battery depletion. These improvements are deemed essential for enhancing the system's reliability and usability for SVI.
Keywords: Special education navigation system, Visual impairment, Special education campus, Orientation, Blind mobility

1. Introduction

According to the World Health Organization (WHO), approximately 2.2 billion people worldwide have some form of visual impairment (VI) [1]. Among the tertiary institutions of higher learning that accommodate students with visual impairment (SVI), the Federal College of Education (Special), Oyo, has the largest SVI population. Moving around such a large campus presents immense challenges to SVI. On large educational campuses, SVI traditionally rely on navigation aids such as white canes, human guides, tactile maps, and guide dogs. Advances in technology, however, have made it possible to develop a special education (SPED) campus navigation system that improves the mobility of SVI. The executives and members of the Nigeria Association of the Blind, Oyo State chapter, Nigeria, all expressed their interest in and willingness to support the research, emphasizing the necessity of the proposed SPED campus navigation system.

Effective navigation is a fundamental aspect of daily activities such as going to work and school and shopping. Few would disagree that sight plays a very important role in moving from one place to another. Familiar terrain, such as a bedroom or an office, may be easily navigable even by a blind person; however, finding the way between unfamiliar places poses a considerable problem [2]. At the Federal College of Education (Special), Oyo, students with blindness rely on orientation and mobility skills acquired during their first semester in the 100-level course. These skills provide the competence needed for safe and efficient mobility within the campus. Orientation involves understanding one's current location and intended destination, whether moving from one section of the college to another or locating a lecture room. Mobility, in contrast, refers to safe and efficient movement between places. It includes moving around the hostel without tripping, crossing streets within the campus safely, and using public transport efficiently [3], [4]. The Federal College of Education (Special), Oyo, was chosen for field testing because it has the largest concentration of SVI among colleges of education in Nigeria. The college was established on October 5, 1977, and is the only one of its kind in Nigeria and Sub-Saharan Africa dedicated to training professionals to support individuals with various disabilities.

This study aims to assess a SPED campus navigation system designed for SVI at the Federal College of Education (Special), Oyo. Prior to the main research, a pilot study [5] was carried out with six SVI who were available and willing to participate. A pilot study examines a small portion of a larger planned project; that is, it is a small-scale trial used to decide whether to launch the full-scale study. The creation of a novel assistive product, a smart walking stick equipped with a Global Positioning System (GPS) receiver and voice directions, was described in this study. The culturally sensitive talking tool delivers voice prompts in the accent of an indigenous Nigerian language. Focusing on the mobility concerns of SVI, especially those at the Federal College of Education (Special), Oyo State, Nigeria, usability testing and field performance evaluations of the device were conducted, which demonstrated its effectiveness in assisting SVI with navigation. The inclusion of GPS for precise location tracking and voice directions offers users greater flexibility and comfort during navigation [6], [7]. This study evaluates the developed SPED campus navigation system for SVI based on three research questions, answered through questionnaires, interviews, and the field experiences of users. Subsequent sections cover related work, the research questions, the methodology adopted, results, discussion of findings, conclusions, and recommendations.

2. Related Work

A mobile digital map can address many of the navigation challenges faced by individuals with VI. The digital map communicates with users through three primary methods: speech, vibration, and sound. The model features simple visuals and primarily uses speech to explain the content. It was evaluated through usability tests, which demonstrated its significant impact on users, who responded positively to the map's features. Participants offered several suggestions to enhance usability, such as incorporating GPS to pinpoint the user's location [8].

Many blind people rely on technology to perform their daily activities. These technologies are purpose-built devices that combine sensors, buzzers, speakers, software, and other components to assist the blind in their daily routines [9]. A key challenge associated with such technologies is their usability, as blind individuals may struggle to use them due to unfamiliarity [10]. Researchers have highlighted tools such as a smart stick that notifies users of obstacles and aids in task performance [11] and an eye device designed to detect colors (pending medical evaluation) [12]. In addition, Griffin-Shirley et al. [13] aimed to understand how blind individuals use mobile devices to perceive the accessibility of their environment. The results of an online survey of 259 participants revealed that most participants found these applications useful, accessible, and satisfactory.

Individuals with VI often require guidance to travel or move into a new environment. To address this, Gintner et al. [14] designed a system that guides individuals with VI using geographical features and a technology device that translates text to voice. The system was evaluated using six individuals with VI, and the results indicated positive feedback regarding its accessibility and usability. Parker et al. [15] described wayfinding devices for blind indoor and outdoor navigation. The investigation revealed participants’ characteristics in real-world environments. The results showed that smart technologies greatly assisted the blind during indoor and outdoor navigation. Several researchers used mixed methods to assess the effectiveness of smart technologies for blind movement [16], [17], [18], [19].

Shera et al. [20] used a user-centred approach to assess a device designed for blind navigation. The investigation combined qualitative and quantitative data to establish the most suitable research approach. The user-centred process involved elements such as sampling, prototype development, user experience, and overall evaluation. A checklist was used to address and report substantive concerns about the study, highlighting the experiences of blind users. Based on this paradigm, parameters concerning blind users, such as effectiveness, satisfaction, and efficiency, were validated. In total, ten self-generated questions were posed to the blind participants following the task, and the post-task assessments were rated on a Likert scale. The assessment benchmarks were 68 for above average, 72 for good, and 92 for excellent [21].

Messaoudi et al. [22] investigated various cellphone technologies designed for individuals with VI. After reviewing several assistive technology (AT) options, they suggested that future efforts should aim to integrate these tools into a cohesive system, providing comprehensive support for individuals with VI to navigate with ease like their sighted counterparts. They recommended producing a robust device that can provide optimal assistance and support for the blind. Several researchers [23], [24] further proposed the concept of a smart cane navigation system, as shown in Figure 1, which includes a camera, microcontrollers, and accelerometers that provide audio messages. It utilizes a cloud service to support blind navigation from one place to another. This system is primarily designed to assist individuals with VI in finding and following the shortest path. The smart cane detects obstacles and alerts the user through a built-in speaker, and it can also identify whether the environment is dark or bright [24]. However, the concept was a proposal that was neither implemented in real life nor evaluated to determine how effective its functionalities were.

Figure 1. Smart cane navigation system [22]

Numerous studies have explored various navigational systems designed to assist individuals with VI. Although most approaches are theoretically sound, they are excessively complicated or cumbersome in practice. Navigating through different environments is challenging for individuals with VI, who need to be aware of nearby objects and terrain. The inability to effectively navigate these situations negatively impacts their sense of freedom, limiting their ability to discover their way in new surroundings [22].

Several researchers [25], [26] reported that AT can assist blind navigation. However, many of these technologies are limited with respect to the human aspects of the user experience, as their functionalities have rarely been evaluated for usability. Additionally, there are challenges in translating research models into real-world solutions. According to Al-Ataby et al. [27], AT for blind navigation can be divided into two categories: traditional technology (such as eyeglasses and white canes) and mobile technology (such as blind navigation devices). Since individuals with VI face challenges with visually demanding devices, several researchers have explored alternative possibilities for AT development. Non-visual sensory modalities, such as speech recognition [28], text-to-speech [29], haptic feedback [30], multimodal feedback [31], and gesture identification [32], have been employed to make AT more accessible for individuals with VI. Uptake of AT developed for individuals with VI remains low when the technology has not been evaluated for user experience. Therefore, any developed AT device should be evaluated by its users, namely individuals with VI. The literature reviewed indicates that existing ATs are predominantly foreign-made and feature foreign accents; none offer a native accent. Hence, it is necessary to develop a SPED campus navigation system with a Nigerian accent to fill this research gap in a real-life scenario.

3. Research Questions

Three research questions were raised to evaluate the usability and effectiveness of a SPED campus navigation system for SVI at the Federal College of Education (Special), Oyo.

i. How does the SPED navigation system improve the daily campus mobility of SVI?

ii. What are the most common challenges faced by blind users when using the system?

iii. How can the challenges in the second research question be mitigated?

4. Methodology

A pilot study was conducted prior to the main research to evaluate the initial usability and effectiveness of the SPED campus navigation system, and the preliminary feedback informed the design of the main study. Six willing SVI were selected, and the pilot study was carried out for one week on the SPED campus. After adequate orientation on the usage of the device, the six participants were able to use it, and their feedback and the issues observed were recorded and used to prepare for the main research. Questionnaires and interviews were used to collect feedback from these participants, and the collected data were analysed qualitatively and quantitatively. The results contributed significantly to improvements in the device's voice functionality and to the adjustment of the obstacle detection distance from 30 meters to 50 meters for the main study.

A mixed-methods approach was employed for the main research, involving quantitative surveys and qualitative interviews with the 20 SVI selected. A purposive sampling technique was used to select participants from various academic disciplines within the Federal College of Education (Special), Oyo. Participants were recruited through the college's SPED unit. Data were collected through a combination of questionnaires, field tests, and interviews. Surveys were administered before and after use of the developed device to assess participants' initial expectations and their experiences after using it. For the field tests, the 20 participants were asked to navigate to predetermined locations on campus using the developed navigation system. Their routes, time taken, and difficulties encountered were recorded. In addition, semi-structured interviews were conducted to gather in-depth feedback on the device's usability and effectiveness. The navigation device can inform SVI about their locations; the integrated software announces this information at regular intervals as the user moves around the campus. The navigation device, called an object detection white cane, was adopted because SVI were accustomed to navigating with traditional white canes. The navigation system was developed using components such as a microcontroller, ultrasonic sensors, label surface detection, and a buzzer. Real-time obstacle detection was achieved through the embedded ultrasonic sensors. The collected data were recorded and analysed using simple percentages.
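To make the obstacle-detection behaviour concrete, the following is a minimal Arduino-style sketch of the sensing loop. It is an illustrative sketch only, not the authors' firmware: the pin assignments, the HC-SR04-style trigger/echo interface, the beep pattern, and the use of the paper's 50-meter figure as a constant are assumptions.

```cpp
// Minimal illustrative sketch of an ultrasonic obstacle-detection loop.
// Pin numbers, sensor interface, and beep pattern are assumptions, not the
// authors' actual firmware.

const int TRIG_PIN = 9;                 // ultrasonic trigger (assumed wiring)
const int ECHO_PIN = 10;                // ultrasonic echo (assumed wiring)
const int BUZZER_PIN = 8;               // buzzer output (assumed wiring)
const float ALERT_DISTANCE_CM = 5000.0; // 50 m figure from the paper; a typical
                                        // HC-SR04 has a much shorter practical range

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT);
}

// Measure distance by timing the ultrasonic echo (speed of sound ~0.0343 cm/us).
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL); // 30 ms timeout
  if (echoUs == 0) return -1.0;                            // no echo received
  return (echoUs * 0.0343f) / 2.0f;
}

void loop() {
  float distance = readDistanceCm();
  if (distance > 0 && distance <= ALERT_DISTANCE_CM) {
    tone(BUZZER_PIN, 2000, 200);  // short beep when an obstacle is in range
  }
  delay(100);                     // poll roughly ten times per second
}
```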

Figure 2 shows the flowchart of the obstacle detection as implemented in the real world, while Figure 3 shows the flowchart of campus location identification at the Federal College of Education (Special), Oyo. Figure 4 shows the coupled components, consisting of an 8-GB memory card, a lithium-ion battery, a 5-watt speaker, an ultrasonic sensor, an Arduino Nano microcontroller, a GPS antenna, and other parts. Figure 5 shows the SPED campus navigation system with the obstacle detection device during development, while Figure 6 shows the finished product of the SPED campus navigation system, ready for use. The device is started with the switch on the white box attached to the white cane (1 for ON and 0 for OFF).

Figure 2. Flowchart of obstacle detection
Figure 3. Flowchart of campus location
Figure 4. Coupled components of the SPED campus navigation system
Figure 5. Development of the SPED campus navigation system
Figure 6. The finished SPED campus navigation system

Immediately after the device is turned on, the code in the Arduino microcontroller initializes and activates the GPS module, which receives data from satellites. The GPS module provides location information and supports real-time tracking. It is connected to the Arduino microcontroller through a Universal Asynchronous Receiver-Transmitter (UART) interface, a serial communication protocol used to exchange data between the microcontroller and the GPS module. The GPS receiver picks up satellite signals in the National Marine Electronics Association (NMEA) format, which carry location information as American Standard Code for Information Interchange (ASCII) text. The Arduino microcontroller computes the distance between the latitude and longitude obtained from the satellites and the campus locations stored in the database, and compares it to the stored threshold value. If the calculated distance is less than or equal to the threshold, the master controller commands the slave controller, a PIC18F4525, to activate the speaker and call out the location on campus. In addition, the device can detect any obstacle within a range of 50 meters through the attached sensors [6], [7].
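The location-announcement logic described above can be sketched as follows. This is a minimal, hedged illustration rather than the authors' code: it assumes the TinyGPS++ library for NMEA parsing, a hypothetical landmark table with placeholder coordinates, an assumed 20-meter announcement threshold, and an announceLocation() stub standing in for the command the master controller sends to the PIC18F4525 slave controller.

```cpp
// Illustrative sketch of GPS-based location announcement (not the authors' firmware).
// Assumes the TinyGPS++ library, a software-serial UART to the GPS module,
// a hypothetical landmark table, and a placeholder announceLocation() call.

#include <SoftwareSerial.h>
#include <TinyGPS++.h>

struct Landmark {
  const char* name;
  double lat;
  double lng;
};

// Hypothetical campus landmarks; the real device stores coordinates in its database.
const Landmark LANDMARKS[] = {
  {"School of Science", 7.8526, 3.9470},  // placeholder coordinates
  {"College Library",   7.8531, 3.9458},  // placeholder coordinates
};
const double THRESHOLD_METERS = 20.0;     // assumed announcement radius

SoftwareSerial gpsSerial(4, 3);           // RX, TX pins to the GPS module (assumed wiring)
TinyGPSPlus gps;

void announceLocation(const char* name) {
  // Placeholder: in the real device the master controller signals the slave
  // controller, which plays the Nigerian-accented voice prompt through the speaker.
  Serial.print("You are near: ");
  Serial.println(name);
}

void setup() {
  Serial.begin(9600);
  gpsSerial.begin(9600);                  // NMEA sentences arrive as ASCII text over UART
}

void loop() {
  while (gpsSerial.available()) {
    gps.encode(gpsSerial.read());         // feed NMEA characters to the parser
  }
  if (gps.location.isUpdated() && gps.location.isValid()) {
    for (const Landmark& lm : LANDMARKS) {
      double d = TinyGPSPlus::distanceBetween(
          gps.location.lat(), gps.location.lng(), lm.lat, lm.lng);
      if (d <= THRESHOLD_METERS) {
        announceLocation(lm.name);        // within threshold: call out the place
      }
    }
  }
}
```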

5. Results

Table 1 shows the results of SVI's experience with the developed SPED campus navigation system. The results show that 100% of the participants agreed with the following: the navigation system is effective in identifying important places on campus; the device can easily identify obstacles while navigating on campus; they feel safe while using the navigation system on campus; the device has increased their independence in campus activities; and they are satisfied with the navigation system compared to other navigation aids they have used.

During the interviews with the 20 SVI, three challenges were mentioned.

i. When SVI used the system in a crowded environment, it was difficult for them to hear the system’s voice clearly.

ii. The battery was weak after five days.

iii. SVI wanted to know what could be done in case of any problems with the device.

To mitigate the challenges of the navigation system, several strategies can be implemented as follows:

i. A volume control function should be added to adjust the volume level, thereby meeting the users' needs.

ii. The device should be charged regularly to ensure its proper functionality.

iii. Support services should be offered to the SVI in a known centre to address maintenance issues promptly.

Table 1. SVI's experience of the SPED campus navigation system

SN | Item | Yes (Quantity and Percentage) | No (Quantity and Percentage)
1 | Is the SPED campus navigation system effective in identifying important places on campus? | 20 (100%) | 0 (0%)
2 | Can the device easily identify obstacles while navigating on campus? | 20 (100%) | 0 (0%)
3 | Do you feel safe using the SPED campus navigation system on campus? | 20 (100%) | 0 (0%)
4 | Has the SPED campus navigation system increased your independence in daily campus activities? | 20 (100%) | 0 (0%)
5 | Are you satisfied with the SPED campus navigation system compared to other navigation aids you have used? | 20 (100%) | 0 (0%)

6. Discussion

The SPED campus navigation system shows great potential for enhancing the campus experience for SVI. Although the system effectively improves navigational independence and reduces anxiety, technical refinements and infrastructural improvements are necessary. The 20 SVI were trained for one day, which was sufficient for them to use the system effectively. Consistent user feedback on their experience enhanced the device's reliability and functionality. For instance, during field testing, one of the devices produced no voice output; after checking, it was found that the speaker had stopped working, and voice output resumed once the speaker was replaced. In addition, two GPS modules did not output location information to SVI, although their sensors were working and could detect obstacles; on inspection, the GPS modules were found not to have been properly fixed. During field testing, one of the devices also fell, and the box detached from the white cane. This issue was addressed by using adhesive to securely attach the device to the white cane.

According to the responses to the first research question, all the participants (SVI) supported the use of the SPED campus navigation system because it effectively identified important places and obstacles on campus. The participants further reported that they felt safe while using the navigation system and that the system increased their independence during campus activities. In addition, they were satisfied with the voice output during navigation compared to other navigation aids they had used. These results are consistent with several earlier studies [8], [9], [11], [14], [15], [16], [20], [21], [22], [23], [25], [26], which reported that individuals with VI needed a smart stick to notify them of obstacles, aid their navigation, and provide instant feedback in real time. According to several studies [25], [32], usage of AT developed for individuals with VI was low because it was not evaluated by them for user experience. Hence, AT should be evaluated by individuals with VI because they are the users. In view of these findings, the developed SPED campus navigation system represents an advancement in AT that enhances SVI navigation in Nigeria. The device is capable of assisting SVI in identifying major places on campus as well as obstacles in their path.

As for the second research question, the 20 participants interviewed identified three primary challenges: difficulty hearing the system in noisy environments, depletion of the device's battery after five days of regular use, and concern about device maintenance in case of malfunctions. The participants also suggested ways to mitigate these challenges. To address the noise issue, they recommended allowing users to adjust the volume to suit different noise levels. To handle the battery life problem, it was recommended to charge the device regularly. For maintenance issues, it was recommended to provide prompt support services to SVI to address any problems that arise. This method of evaluation is supported by the studies of several researchers [13], [20], [21], in which blind users rated the ATs intended for them. The work of Griffin-Shirley et al. [13] was assessed through an online survey of 259 blind people, while several researchers [20], [21] assessed selected parameters concerning users with VI by posing ten questions to the participants.

The SPED campus navigation system cannot be used to identify locations outside the Federal College of Education (Special), Oyo, because it was specifically designed for SVI admitted to study at the college. However, the obstacle detection functionality of the device, which can successfully detect obstacles at a distance of 50 meters, remains operational outside the college. In addition, for the device to identify locations beyond the campus, the longitude and latitude of those areas must be added to the device's database.

7. Conclusions

This study analyzed the usability and effectiveness of the SPED campus navigation system for SVI. The findings confirm that the system enhances the freedom and mobility of SVI, helping them navigate and function confidently at the Federal College of Education (Special), Oyo, Oyo State, Nigeria. Nevertheless, some specific limitations were identified that should be addressed in future work to make the tool more efficient. The limitations are as follows: the noisy campus environment at times drowned out the speech output of the system, and the battery requires recharging after five days of average usage. Overall, the implementation of the SPED campus navigation system greatly assisted the independent movement of SVI at the Federal College of Education (Special), Oyo, Oyo State, Nigeria.

8. Recommendations

In line with the assessment of the usability and effectiveness of the developed SPED campus navigation system for SVI, the following recommendations are made:

i. There should be comprehensive training sessions for SVI to familiarize themselves with the device's features and functionalities.

ii. There is a need to establish a robust feedback mechanism for SVI to continuously gather user input and make iterative improvements.

iii. There is a need to offer support services to SVI so that maintenance issues are addressed promptly.

iv. The device runs on a lithium-ion battery and needs to be charged regularly.

v. The device is not waterproof; hence, it should not be used in the rain. Future work may explore how the device can be made waterproof.

vi. There is a need to use a text-to-speech engine with options for different accents to accommodate user preferences.

vii. There is a need to ensure that users can adjust the voice volume.

viii. Future versions should incorporate landmark-based navigation cues such as "turn right after the School of Science."

ix. Future development should incorporate robust error-handling mechanisms to manage situations where SVI deviate significantly from known paths.

Data Availability

The data used to support the research findings are available from the corresponding author upon request.

Acknowledgments

This paper was supported by TETFund Nigeria through the National Research Fund 2021 (Grant No.: TETF/ES/DR&D-CE/NRF-2021/SET1/ICT /00064/VOL.1). The authors also wish to acknowledge the invaluable support of the Nigeria Association of the Blind, Oyo State Chapter, Nigeria.

Conflicts of Interest

The authors declare no conflicts of interest.

References
1.
World Health Organization, “Blindness and vision impairment,” 2024. http://who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment [Google Scholar]
2.
B. Kuriakose, R. Shrestha, and F. E. Sandnes, “Tools and technologies for blind and visually impaired navigation support: A review,” IETE Tech. Rev., vol. 39, no. 1, pp. 1–16, 2020. [Google Scholar] [Crossref]
3.
R. G. Long, “Orientation and mobility research: What is known and what needs to be known,” Peabody J. Educ., vol. 67, no. 2, pp. 89–109, 1990. [Google Scholar] [Crossref]
4.
J. Sanchez, M. Espinoza, M. B. Campos, and L. B. Merabet, “Enhancing orientation and mobility skills in learners who are blind through video gaming,” in Proceedings of the 9th ACM Conference on Creativity and Cognition, New York, NY, USA, 2013, pp. 353–356. [Google Scholar] [Crossref]
5.
S. M. Eldridge, G. A. Lancaster, M. L. Campbell, L. Thabane, S. Hopewell, C. L. Coleman, and C. M. Bond, “Defining feasibility and pilot studies in preparation for randomised controlled trials: Development of a conceptual framework,” PLoS One, vol. 11, no. 3, p. e0150205, 2016. [Google Scholar] [Crossref]
6.
S. B. Olaleye, B. A. Adebiyi, A. Abdulsalaam, F. C. Nwosu, A. O. Adeyanju, H. M. Ambi, and C. Omolayo, “Adaptation of global positioning system (GPS) in Nigerian language for orientation and mobility of students with visual impairment,” Int. J. Res. Spec. Educ., vol. 4, no. 1, pp. 21–29, 2024. [Google Scholar] [Crossref]
7.
S. B. Olaleye, B. A. Adebiyi, A. Abdulsalaam, F. C. Nwosu, A. O. Adeyanju, H. M. Ambi, and C. Omolayo, “Development of blind campus navigation system with obstacle detection device,” Am. J. Sci. Eng. Technol., vol. 9, no. 2, pp. 50–59, 2024. [Google Scholar] [Crossref]
8.
A. Darvishy, H. Hutter, M. Grossenbacher, and D. Merz, “Touch explorer: Exploring digital maps for visually impaired people,” in International Conference on Computers Helping People with Special Needs, 2020, pp. 427–434. [Google Scholar]
9.
M. Al-Razgan, S. Almoaiqel, N. Alrajhi, A. Alhumejani, A. Alshehri, B. Alnefaie, R. Alkhamiss, and S. Rushdi, “A systematic literature review on the usability of mobile applications for visually impaired users,” PeerJ Comput. Sci., vol. 7, pp. 1–36, 2021. [Google Scholar] [Crossref]
10.
A. Csapo, G. Wersenyi, H. Nagy, and T. Stockman, “A survey of assistive technologies and applications for blind users on mobile platforms: A review and foundation for research,” J. Multimodal User Interfaces, vol. 9, pp. 275–286, 2015. [Google Scholar] [Crossref]
11.
D. Bharatia, P. Ambawane, and P. Rane, “Smart electronic stick for visually impaired using Android application and Google’s Cloud Vision,” in 2019 Global Conference for Advancement in Technology (GCAT), Bangalore, India, 2019, pp. 1–6. [Google Scholar] [Crossref]
12.
P. M. Lewis, L. N. Ayton, R. H. Guymer, A. J. Lowery, P. J. Blamey, P. J. Allen, C. D. Luu, and J. V. Rosenfeld, “Advances in implantable bionic devices for blindness: A review,” ANZ J. Surg., vol. 86, no. 9, pp. 654–659, 2016. [Google Scholar] [Crossref]
13.
N. Griffin-Shirley, D. R. Banda, P. Ajawon, J. Cheon, J. Lee, H. Park, and S. Lyngdoh, “A survey on the use of mobile applications for people who are visually impaired,” J. Vis. Impair. Blind., vol. 111, no. 4, pp. 307–323, 2017. [Google Scholar] [Crossref]
14.
V. Gintner, J. Balata, J. Boksansky, and Z. Mikovec, “Improving reverse geocoding: Localization of blind pedestrians using conversational UI,” in 2017 8th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), Debrecen, Hungary, 2017, pp. 145–150. [Google Scholar] [Crossref]
15.
A. T. Parker, M. Swobodzinski, J. D. Wright, K. Hansen, B. Morton, and E. Schaller, “Wayfinding tools for people with visual impairments in real world settings: A literature review of recent studies,” Front. Educ., vol. 6, pp. 1–23, 2021. [Google Scholar] [Crossref]
16.
D. Sato, U. Oh, J. Guerreiro, D. Ahmetovic, K. Naito, and H. Takagi, “NavCog3 in the wild: Large-scale blind indoor navigation assistant with semantic features,” ACM Trans. Access. Comput., vol. 12, no. 3, pp. 1–30, 2019. [Google Scholar] [Crossref]
17.
I. Abu Doush, S. Alshatnawi, A. K. Al-Tamimi, B. Alhasan, and S. Hamasha, “ISAB: Integrated indoor navigation system for the blind,” Interact. Comput., vol. 29, no. 2, pp. 181–202, 2016. [Google Scholar] [Crossref]
18.
J. Van der Bie, S. Ben Allouch, and C. Jashinski, “Communicating multimodal wayfinding messages for visually impaired people via wearables,” in Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, Taipei, Taiwan, 2019, pp. 1–7. [Google Scholar] [Crossref]
19.
M. Saha, A. Fiannaca, M. Kneisel, E. Cutrell, and M. R. Morris, “Closing the gap: Designing for the last-few-meters wayfinding problem for people with visual impairments,” in 21st International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 2019, pp. 222–235. [Google Scholar] [Crossref]
20.
M. A. Shera, M. W. Iqbal, M. R. Naqvi, S. K. Shahzad, M. H. Sajjad, A. R. Shafqat, and M. M. Saeed, “Usability evaluation of blind and visually impaired interface in solving the accessibility problems,” in 2021 IEEE International Conference on Innovative Computing, Lahore, Pakistan, 2021, pp. 1–6. doi: 10.1109/ICIC53490.2021.9693084. [Google Scholar]
21.
M. R. Naqvi, M. Aslam, M. W. Iqbal, S. K. Shahzad, M. Malik, and M. U. Tahir, “Study of block chain and its impact on internet of health things (IoHT): Challenges and opportunities,” in 2020 International Congress on Human Computer Interaction, Optimization and Robotic Applications (HORA), Ankara, Turkey, 2020, pp. 1–6. [Google Scholar] [Crossref]
22.
M. D. Messaoudi, B. A. J. Menelas, and H. Mcheick, “Review of navigation assistive tools and technologies for the visually impaired,” Sensors, vol. 22, pp. 1–29, 2022. [Google Scholar] [Crossref]
23.
H. Chen, K. Wang, and K. Yang, “Improving RealSense by fusing color stereo vision and infrared stereo vision for the visually impaired,” in Proceedings of the 1st International Conference on Information Science and Systems, Zhengzhou, China, 2018, pp. 142–146. [Google Scholar] [Crossref]
24.
Z. Dian, L. Kezhong, and M. Rui, “A precise RFID indoor localization system with sensor network assistance,” China Commun., vol. 12, pp. 13–22, 2015. [Google Scholar] [Crossref]
25.
Z. J. Muhsin, R. Qahwaji, F. Ghanchi, and M. Al-Taee, “Review of substitutive assistive tools and technologies for people with visual impairments, recent advancements and prospects,” J. Multimodal User Interfaces, vol. 18, no. 1, pp. 135–156, 2024. [Google Scholar] [Crossref]
26.
L. Hakobyan, J. Lumsden, D. O’Sullivan, and H. Bartlett, “Mobile assistive technologies for the visually impaired,” Surv. Ophthalmol., pp. 513–528, 2013. doi: 10.1016/j.survophthal.2012.10.004. [Google Scholar]
27.
A. Al-Ataby, O. Younis, W. Al-Nuaimy, M. Al-Taee, Z. Sharaf, and B. Al-Bander, “Visual augmentation glasses for people with impaired vision,” in 2016 9th International Conference on Developments in eSystems Engineering (DeSE), Liverpool, UK, 2016, pp. 24–28. [Google Scholar] [Crossref]
28.
S. A. Edupuganti, V. D. Koganti, C. S. Lakshmi, R. N. Kumar, and R. Paruchuri, “Text and speech recognition for visually impaired people using Google Vision,” in 2021 2nd International Conference on Smart Electronics and Communication (ICOSEC), Trichy, India, 2021, pp. 1325–1330. [Google Scholar] [Crossref]
29.
K. Ragavi, P. Radja, and S. Chithra, “Portable text to speech converter for the visually impaired,” in Proceedings of the International Conference on Soft Computing Systems, New Delhi, India, 2016, pp. 751–758. [Google Scholar] [Crossref]
30.
S. Khusro, B. Shah, I. Khan, and S. Rahman, “Haptic feedback to assist blind people in indoor environment using vibration patterns,” Sensors, vol. 22, no. 1, p. 361, 2022. [Google Scholar] [Crossref]
31.
D. Costa and C. Duarte, “Alternative modalities for visually impaired users to control smart TVs,” Multimed. Tools Appl., vol. 79, no. 43, pp. 31931–31955, 2020. [Google Scholar] [Crossref]
32.
L. Porzi, S. Messelodi, C. M. Modena, and E. Ricci, “A smart watch-based gesture recognition system for assisting people with visual impairments,” in Proceedings of the 3rd ACM International Workshop on Interactive Multimedia on Mobile & Portable Devices, New York, NY, USA, 2013, pp. 19–24. [Google Scholar] [Crossref]

©2024 by the author(s). Published by Acadlore Publishing Services Limited, Hong Kong. This article is available for free download and can be reused and cited, provided that the original published version is credited, under the CC BY 4.0 license.