{"id":244168,"date":"2025-05-23T11:29:03","date_gmt":"2025-05-23T03:29:03","guid":{"rendered":"https:\/\/www.grab.com\/sg\/?post_type=editorial&#038;p=244168"},"modified":"2025-11-27T17:13:16","modified_gmt":"2025-11-27T09:13:16","slug":"grabs-ai-voice-assistant-lets-visually-impaired-users-book-rides-with-ease","status":"publish","type":"editorial","link":"https:\/\/www.grab.com\/sg\/inside-grab\/stories\/grabs-ai-voice-assistant-lets-visually-impaired-users-book-rides-with-ease\/","title":{"rendered":"Grab\u2019s AI Voice Assistant lets visually impaired users book rides with ease"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"244168\" class=\"elementor elementor-244168\" data-elementor-post-type=\"editorial\">\n\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-746352e elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"746352e\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-98e63fa\" data-id=\"98e63fa\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-88bc2dd gr21-boxed-content editorial-gr21-boxed-content elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"88bc2dd\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-d6be047\" data-id=\"d6be047\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap 
elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-3064034 elementor-widget elementor-widget-text-editor\" data-id=\"3064034\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<p>One of Grab\u2019s goals is making cutting-edge AI technology accessible to underserved communities, including individuals with disabilities, the elderly, and those less familiar with technology.<\/p><p>We created Grab\u2019s AI Voice Assistant to offer a more accessible way to book rides\u2014one that\u2019s designed with inclusivity at its core.\u00a0<\/p><p>While Grab\u2019s app interface is user-friendly for most, it can be challenging for people with visual impairments who rely on their phone\u2019s built-in accessibility tools. These tools typically read aloud app elements as users swipe through them.<\/p><p>But for an app like Grab that offers many services and includes interactive elements such as pop-ups and notifications, finding the right button may take time.<\/p><p>AI Voice Assistant allows users to simply speak their intent and get going: faster, easier, and with more independence.<\/p><p>Members of the Singapore Association of the Visually Handicapped (SAVH) participated in focus groups and product testing, helping us design the feature to address their needs.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-647d78e elementor-widget elementor-widget-text-editor\" data-id=\"647d78e\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<h5>Powered by AI and LLMs<\/h5><p>AI Voice Assistant uses OpenAI\u2019s GPT-4.1 large language model and voice recognition to interact with users through voice commands. 
It guides users through the process of booking a ride and keeps them updated on the driver&#8217;s status, from the moment a driver is found to their arrival. Other capabilities include:\u00a0<\/p><ul><li aria-level=\"1\">Predict a passenger\u2019s drop-off &amp; pick-up locations, while allowing them to make changes\u00a0<\/li><li aria-level=\"1\">Suggest ride options to passengers<\/li><li aria-level=\"1\">Provide post-booking status updates\u00a0<\/li><li aria-level=\"1\">Allow users to cancel the ride or call drivers\u00a0<\/li><li aria-level=\"1\">Change payment methods, for example from cash to cashless<\/li><\/ul><h5>Building for local contexts<\/h5><p>Creating a Voice Assistant for the visually impaired requires our AI models to grasp local context and subtleties. For instance, we noticed the model struggled to recognise Singaporean accents accurately.<\/p><p>We fine-tuned OpenAI\u2019s speech-to-text model by training it on 80,000 local voice samples. These were contributed by Singapore-based Grabbers who recited building names, landmarks, and other phrases to improve the model\u2019s understanding of local accents.<\/p><p>This improved the recognition accuracy from 46 per cent to 89 per cent!<\/p><h5>You can help us train the model<\/h5><p>We want to further improve its accuracy, which is why we\u2019re launching a voice donation campaign in June to invite Singapore users to contribute more voice samples.<\/p><p>Apart from fine-tuning our models, we had to put the feature to the test before rolling it out. Over the past few months, we have been conducting user testing with visually impaired users, with the help of SAVH. 
This allows us to understand their unique needs and challenges, and to ensure that our tool is intuitive and easy to use.\u00a0<\/p><p>Voice Assistant is currently being piloted in Singapore and is available to all users who have the TalkBack feature enabled on their phones.\u00a0<\/p><p>We believe that Voice Assistant is a significant step towards a more inclusive app. We will continue expanding its use cases to help more people carry out transactions, connect with loved ones, and access our services with ease.<\/p>\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-f53672e elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"f53672e\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-eb45515\" data-id=\"eb45515\" data-element_type=\"column\">\n\t\t\t<div 
class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"parent":180237,"menu_order":0,"template":"grab21-default","acf":[],"_links":{"self":[{"href":"https:\/\/www.grab.com\/sg\/wp-json\/wp\/v2\/editorial\/244168"}],"collection":[{"href":"https:\/\/www.grab.com\/sg\/wp-json\/wp\/v2\/editorial"}],"about":[{"href":"https:\/\/www.grab.com\/sg\/wp-json\/wp\/v2\/types\/editorial"}],"version-history":[{"count":49,"href":"https:\/\/www.grab.com\/sg\/wp-json\/wp\/v2\/editorial\/244168\/revisions"}],"predecessor-version":[{"id":254694,"href":"https:\/\/www.grab.com\/sg\/wp-json\/wp\/v2\/editorial\/244168\/revisions\/254694"}],"up":[{"embeddable":true,"href":"https:\/\/www.grab.com\/sg\/wp-json\/wp\/v2\/editorial\/180237"}],"wp:attachment":[{"href":"https:\/\/www.grab.com\/sg\/wp-json\/wp\/v2\/media?parent=244168"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}