Mainstream digital design applications (e.g., Adobe products) require high-precision input, typically via traditional devices such as a mouse, keyboard, or stylus. This requirement can actively exclude or deter creatives with physical impairments who find these devices challenging to use.
This lack of inclusivity can affect all levels of design, from disabled students looking to develop their creative careers, to hobbyists and professional designers who have acquired a disability later in life.
To explore new inclusive opportunities for designers with physical impairments, we are investigating alternative input approaches (e.g., eye gaze) to support the production of high-quality design outputs. Our initial work in this area was published in the paper "Multimodal Gaze Interaction for Creative Design", which showcased our "Sakura" research prototype enabling digital design work through eye gaze and a mechanical switch. We were also awarded a grant from the Adobe Fund for Design to develop a plugin for Adobe XD that enabled users to design and prototype interfaces via eye gaze control.
We have also been investigating the use of voice control for manipulating digital assets within a digital canvas; further details can be found in our papers on creative object manipulation and digital asset positioning via speech interaction. In terms of ongoing and future work, we are particularly interested in how intelligent assistants can support the working practices of disabled designers using alternative input devices, facilitating more inclusive design experiences.
Project Team
- Professor Chris Creed
- Professor Ian Williams
- Callum Slowley