About
Multitasking on computers has become ingrained in our work processes. A frequently adopted strategy for managing information and tasks is multi-window multitasking – rapidly switching between different apps to combine multiple sources of information.
Screen reader users who are blind use built-in mechanisms (e.g. ALT+TAB) to switch between windows. This requires them to remember the ordering of windows in these tab switchers and linearizes the switching process. Sighted computer users, on the other hand, use a variety of strategies like positioning windows and tabs within and across screens. For blind and low vision (BLV) users, nonvisual multitasking with multiple windows is inaccessible and inefficient. To address this gap, we explore a solution using a Tangible User Interface (TUI): a physical, tactile cube that is customizable, non-interfering, and improves control, to support non-linear, nonvisual window switching.
Outcome Overview

Our Process
Within the time constraint, we leveraged current research and a literature review to understand users' needs and access gaps. Putting on our maker hats, we ideated and prototyped using Arduino and 3D printing, and developed a controller application running on Mac computers. We then conducted a heuristic evaluation and pilot study with a blind software developer, and analyzed the insights derived from the testing.
01. Understanding BLV users’ current pain points
To start off, we drew on one of our team members' personal experience of multitasking with multiple windows. As a blind developer, he uses built-in mechanisms (e.g. ALT+TAB) to switch between windows. This requires him to remember the ordering of windows in these tab switchers and linearizes the switching process – making multitasking inefficient and unproductive.

02. Secondary research
We then explored what existing solutions and prior work have done and discovered in this area. An in-depth literature review, especially in disability studies, is a quick and efficient way to surface well-evaluated insights from a range of studies. What works? What doesn't? What challenges were faced? With these questions in mind, we explored these themes:
03. Ideating various tangible user interfaces
From our research, recent works have found some success using a novel input device with the form factor of an elementary 3D shape such as a cube or die. To create an even richer nonvisual experience, tactile interactions (using highly distinguishable patterns such as lines, planes, and dots) provide "permanence" and enhance recall when combined with audio in a nonvisual interface. Through sketches and paper and everyday-materials prototyping, we formed an initial concept and, after internal testing, began prototyping at mid-fidelity.

04. Prototyping with Arduino & 3D printing
For its built-in Bluetooth capabilities, we used the Adafruit Bluefruit Feather nRF52 as our main Arduino board, together with a 9-DOF Absolute Orientation breakout board for its accelerometer, gyroscope and magnetometer. Qbit is designed to work with existing screen readers without any modification to them. When the cube is rotated, it sends the corresponding signal to the QBitController Mac application over a Bluetooth connection.

We designed and developed the Qbit model based on our initial concept using OpenSCAD's C-style scripting language, and then 3D printed the cube-shaped model.
3D modelling & printing process

05. Developing and putting it altogether
Qbit employs three main modes of user interaction: tactile, kinesthetic, and auditory.
Using the QBit System
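Of these, the auditory channel is what makes the controller self-voicing: the application announces assignments and switches itself, alongside the user's existing screen reader. The snippet below is a minimal sketch of how such feedback could be produced with macOS's built-in speech synthesis; the class name and spoken phrase are illustrative assumptions rather than the actual QBit code.

```swift
import AppKit

// Minimal sketch of the self-voicing feedback channel using the built-in
// macOS speech synthesizer. Class name and phrasing are illustrative.
final class QbitVoice {
    private let synthesizer = NSSpeechSynthesizer()

    /// Speak a short status message, interrupting any earlier announcement
    /// so the feedback stays in step with the user's latest action.
    func announce(_ message: String) {
        synthesizer.stopSpeaking()
        _ = synthesizer.startSpeaking(message)
    }
}

// Example usage after a successful assignment gesture:
// QbitVoice().announce("Safari assigned to the dotted face")
```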

The QBit controller, developed in Swift, is a self-voicing application running on a Mac computer. It listens for global modifier key (Command, Control, Option, or Shift) press events and uses Apple's APIs to perform application assignment, switching, de-assignment, and other actions such as recall. These key events are sent by the Arduino boards (and gyroscope) placed inside the 3D-printed cube.
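As a rough illustration of that loop, the sketch below shows a global flagsChanged monitor driving app-level assignment and switching. It rests on our own simplifying assumptions (each face is reported as one modifier key, a face's first press assigns the frontmost application, and later presses switch back to it); the real controller also covers tab assignment, de-assignment, recall and window-level control, and a global event monitor like this needs the macOS Accessibility permission.

```swift
import AppKit

// Minimal sketch of the assignment/switching loop. Assumptions for
// illustration only: each cube face is reported as one modifier key,
// the first press on a face assigns the frontmost application, and
// later presses on that face switch back to it.
final class QbitSwitcher {
    // Modifier flag (raw value) -> application assigned to that cube face.
    private var assignments: [UInt: NSRunningApplication] = [:]
    private var keyMonitor: Any?

    func start() {
        // Observe modifier-key changes system-wide. Global monitors watch
        // events delivered to other apps, so macOS requires the Accessibility
        // permission for this to work.
        keyMonitor = NSEvent.addGlobalMonitorForEvents(matching: .flagsChanged) { [weak self] event in
            self?.handle(event)
        }
    }

    private func handle(_ event: NSEvent) {
        // Which of the four face modifiers (if any) was just pressed?
        let candidates: [NSEvent.ModifierFlags] = [.command, .control, .option, .shift]
        guard let face = candidates.first(where: { event.modifierFlags.contains($0) }) else {
            return  // a key release, or an unrelated modifier change
        }

        if let app = assignments[face.rawValue] {
            // Face already has an application: switch back to it.
            app.activate(options: [.activateIgnoringOtherApps])
        } else {
            // Unassigned face: remember whatever application is frontmost now.
            assignments[face.rawValue] = NSWorkspace.shared.frontmostApplication
        }
    }
}
```

Raising an individual window, rather than a whole application, would additionally go through the Accessibility (AXUIElement) APIs; we leave that out of the sketch.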
With that, our prototype was developed and functional, ready for usability testing.
06. Heuristic evaluation & pilot study
We conducted a heuristic evaluation with Qbit to serve as a foundation for future formal testing and to guide the cube's development. Our study protocol focused on the cube's method of tactile interaction, the distinguishability of its surface textures, the hierarchy of its window and tab assignment methods, the usefulness of its auditory feedback, and the overall effect of the cube on the user's nonvisual window management experience. The study also served to test our assumptions about the user's wants and needs around nonvisual window management and multitasking.

07. Key insights
The pilot study was very effective in uncovering tacit needs that we might otherwise have only assumed. In general, the participant was able to assign windows and tabs to the cube with ease. However, the study also surfaced interesting insights that inform the scope of future work:
Undoubtedly, there is scope for future, in-depth work to explore efficient means of nonvisual window management. This project, QBit, was successful in eliciting information needs and the desired efficiency for nonvisual multitasking, as well as making progress on plausible nonvisual interactions to complement the desktop. While QBit allows switching between four applications and their associated windows, future work should be capable of giving BLV users glanceable information about a larger number of windows.
Looking back,
What was helpful
Talking first-hand with the user and letting him explore and use the cube gave us a fresh perspective on his needs and challenges. Before he started on the tasks, we began by understanding his pain points, frustrations, and current multitasking strategies. Starting the project with a great deal of literature review also gave us the context to combine various strands of prior work into a novel device like QBit.
I personally feel that having a diverse team was a great strength for our project. We tapped on each other's perspectives, and these gave rise to insights that contributed to our solution.
