To enable people with visual impairments (PVI) to explore shopping malls, it is important to provide information that supports both selecting destinations and learning about the surroundings according to the individual's interests. We achieved this through conversational interaction by integrating a large language model (LLM) with a navigation system. ChitChatGuide allows users to plan a tour through contextual conversations, receive personalized descriptions of their surroundings paced by transit time, and make inquiries during navigation. We conducted a study in a shopping mall with 11 PVI, and the results show that the system allowed them to explore the facility with increased enjoyment. The LLM-based conversational interaction, by understanding vague and context-dependent questions, enabled the participants to explore unfamiliar environments effectively. The personalized, in-situ information generated by the LLM was both useful and enjoyable. Based on the limitations we identified, we discuss criteria for integrating LLMs into navigation systems to enhance the exploration experiences of PVI.
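The abstract gives no implementation detail, but the core idea it describes, feeding navigation state (current location, nearby points of interest, remaining transit time) and the user's stated interests into an LLM prompt so that vague, in-situ questions can be answered concisely, can be illustrated with a minimal sketch. Everything below (the NavigationContext fields, the build_prompt helper, and the word-budget heuristic) is a hypothetical illustration under assumed interfaces, not the authors' implementation.

```python
# Hypothetical sketch: folding a navigation system's state and the user's
# interests into a single prompt for a generic chat-completion LLM, so a
# vague question ("anything interesting nearby?") gets a short, personalized
# answer. This is NOT the ChitChatGuide code; all names are illustrative.
from dataclasses import dataclass
from typing import List


@dataclass
class NavigationContext:
    current_area: str          # e.g., "2nd floor, near the north escalator"
    nearby_shops: List[str]    # shops within a short detour of the route
    transit_seconds: int       # time left before the next turn instruction
    user_interests: List[str]  # interests collected during tour planning


def build_prompt(ctx: NavigationContext, question: str) -> str:
    """Combine navigation state, interests, and the question into one prompt.

    Capping the answer length by the remaining transit time is one plausible
    way to realize "descriptions based on transit time".
    """
    max_words = max(20, ctx.transit_seconds // 2)  # rough speech-rate budget
    return (
        "You assist a blind user exploring a shopping mall.\n"
        f"Location: {ctx.current_area}\n"
        f"Nearby shops: {', '.join(ctx.nearby_shops)}\n"
        f"User interests: {', '.join(ctx.user_interests)}\n"
        f"Answer the question below in at most {max_words} words, "
        "mentioning only information relevant to the interests.\n"
        f"Question: {question}"
    )


if __name__ == "__main__":
    ctx = NavigationContext(
        current_area="2nd floor, near the north escalator",
        nearby_shops=["bookstore", "coffee shop", "stationery store"],
        transit_seconds=40,
        user_interests=["coffee", "mystery novels"],
    )
    prompt = build_prompt(ctx, "Is there anything interesting on the way?")
    print(prompt)  # this string would be sent to any chat-completion LLM API
```

The key design choice sketched here is that the LLM never sees raw map data; the navigation system summarizes its state into a few prompt fields, which keeps responses grounded in the current route while letting the model handle vague, context-dependent phrasing.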