
Dr Kyle Keane
BS, MS, PhD
Expertise
I specialise in researching, designing and creating technology that can assist and empower people with disabilities. As I am blind myself, I take a special interest in assistive technology for the visually impaired.
Current positions
Senior Lecturer in Assistive Technologies
School of Computer Science
Biography
Dr Kyle Keane is a Senior Lecturer in Assistive Technologies at the University of Bristol, where he leads research at the intersection of artificial intelligence, human cognition and assistive technology design. Kyle has early-onset blindness caused by a degenerative retinal condition, and his dynamically changing visual impairment gives him unique insight, perspective and motivation to investigate the limits of, and potential for, humans to interpret meaningful information from interactive technologies. With a background in quantum computing and computational physics, he applies the scientific rigour of a physicist to push the boundaries of human perception, and the precision of an engineer to design technologies that translate complex information into interpretable, multi-sensory experiences.
Previously a Lecturer and Research Scientist at MIT, he developed and taught Principles and Practices of Assistive Technology, an internationally recognised course supporting students to co-design assistive devices with individuals with disabilities. His work extends globally: he has led co-design workshops in India, Saudi Arabia and beyond, fostering inclusive technology ecosystems. In AI, he has contributed to human-centred machine learning applications, including developing natural language interfaces for Wolfram|Alpha and integrating AI-powered accessibility solutions.
His research on intersensory perception science explores how information can be sonified, represented tactilely, or synthesised across multiple senses to create meaningful cognitive representations, including pioneering auditory and tactile data representation methods that make computational science accessible to blind users. His goal is to redefine how humans interact with information, drawing on insights from cognitive science, AI and accessibility engineering to augment human capabilities through intelligent, perception-aware systems.
Research interests
My research interests include AI, the design and implementation of assistive technologies, human-computer interaction, perception, acoustics and immersive audio.
Publications
Recent publications
01/01/2025
What the visual system can learn from the non-dominant hand: The effect of graphomotor engagement on visual discrimination
Memory & Cognition
Sense-O-Nary
IDC '24: Proceedings of the 23rd Annual ACM Interaction Design and Children Conference