Exploring An Accessible Tangible User Interface for Window Management and Multitasking

Duration: 5 weeks
Tools: Swift, Arduino C Programming, OpenSCAD
Team: Jazz Ang, Venkatesh Potluri, Ana Liu, Della Sigrest


Multitasking on computers has become deeply ingrained in our work processes. A frequently adopted strategy for managing information and tasks is multi-window multitasking: rapidly switching between different apps to combine multiple sources of information.

Screen reader users who are blind use built-in mechanisms (e.g. ALT+TAB) to switch between windows. This requires them to remember the ordering of windows in these tab switchers and linearizes the switching process. Sighted computer users, on the other hand, use a variety of strategies, like positioning windows and tabs within and across screens. For blind and low vision (BLV) users, nonvisual multitasking with multiple windows is inaccessible and inefficient. To address this gap, we explore a solution using a Tangible User Interface (TUI): a physical, tactile cube that is customizable, non-interfering, and gives users more control, to support non-linear, nonvisual window switching.

Outcome Overview

Our Process

Within the time constraint, we leveraged existing research and a literature review to understand users' needs and access gaps. Putting our maker hats on, we ideated and prototyped using Arduino and 3D printing, and developed a controller application running on Mac computers. We then conducted a heuristic evaluation with a blind software developer and analyzed the insights derived from the testing.


Explore current solutions

Understand related work and alternatives


Brainstorm, sketch, and design the first few low-fidelity prototypes


01. Understanding BLV users’ current pain points

To start off, we tapped into one of our team members' personal experience with multitasking across multiple windows. As a blind developer, he uses built-in mechanisms (e.g. alt+tab) to switch between windows. This requires him to remember the ordering of windows in these tab switchers and linearizes the switching process, making multitasking inefficient and unproductive.

02. Secondary research

We delved into what existing solutions and prior work have done and discovered in this area. An in-depth literature review, especially in disability studies, helped us quickly and efficiently surface well-evaluated insights from various studies. What works? What doesn't? What were the challenges faced? With these questions in mind, we explored two themes:

Nonvisual nonlinear screen readers
Window management systems for sighted users


03. Ideating various tangible user interfaces

From our research, recent works have found some success using a novel input device with the form factor of an elementary 3D structure such as a cube or die. To create an even richer nonvisual experience, tactile interactions (using highly distinguishable patterns such as lines, planes, and dots) enable "permanence" and enhance recall when combined with audio in a nonvisual interface. Through various sketches and prototyping with paper and everyday materials, we formed an initial concept to prototype at mid-fidelity (after internal testing).

04. Prototyping with Arduino & 3D printing

With its built-in Bluetooth capabilities, the Adafruit Bluefruit Feather nRF52 served as our main Arduino board, together with a 9-DOF absolute orientation breakout board for its accelerometer, gyroscope, and magnetometer. QBit is designed to work with existing screen readers without modifying them. When the cube is rotated, it sends the corresponding signal to the QBitController Mac application over a Bluetooth connection.

We designed QBit's physical model in OpenSCAD and wrote the firmware in C++ (Arduino), then 3D printed the cube-shaped model based on our initial concept.

3D modelling & printing process


05. Developing and putting it all together

QBit employs three main modes of user interaction: tactile, kinesthetic, and auditory.


Four different textures on four sides of the cube, along with braille labels, help users easily identify the orientation of the cube.


Rotating the cube triggers window switching: the side of the cube facing up opens the window assigned to that particular side.

Using the QBit System

The QBit controller (developed in Swift), a self-voicing application running on a Mac computer, listens for global modifier key (Command, Control, Option, or Shift) press events and performs application assignment, switching, de-assignment, and other actions like recall using Apple's APIs. Rotation events are sent from the Arduino board (with its orientation sensor) housed inside the 3D-printed cube.

With that, our prototype was developed and functional for usability testing.

06. Heuristic evaluation & pilot study

We conducted a heuristic evaluation of QBit to serve as a foundation for future formal testing and to guide the cube's development. Our study protocol focused on exploring the cube's method of tactile interaction, the distinguishability of its surface textures, the hierarchy of its window and tab assignment methods, the usefulness of its audio feedback, and the overall effect of the cube on the user's nonvisual window management experience. The study also served to test our assumptions about the user's wants and needs around nonvisual window management and multitasking.

07. Key insights

The pilot study was very effective in uncovering tacit needs that we might otherwise have assumed away. In general, the participant was able to assign windows and tabs to the cube with ease. However, we discovered interesting insights from the study that will inform the scope of future work:

The kinesthetic and auditory experience was effective for recall

The rotation of the cube was an intuitive way of switching between tabs.

The participant uses more than 50 windows, but the cube only accommodates a small number of them (up to four)

His main problem was not switching between immediate windows (i.e., windows 1-4) but switching from, say, window 1 to window 36

Undoubtedly, there is scope for future, in-depth work to explore efficient means of nonvisual window management. QBit was successful in eliciting the information needs and desired efficiency for nonvisual multitasking, as well as making progress on plausible nonvisual interactions to complement the desktop. While QBit allows switching between four applications and their associated windows, future work should give BLV users glanceable information about a much larger number of windows.

Looking back,

What was helpful

Talking first-hand with the user and letting him explore and use the cube gave us a fresh perspective on his needs and challenges. Before the user started on the tasks, we began by understanding his pain points, frustrations, and current multitasking strategies. Starting the project with a thorough literature review also gave us the context to combine various strands of prior work into a novel device like QBit.

I personally feel that having a diverse team was a real strength for our project. We tapped into each other's perspectives, which gave rise to insights that contributed to our solution.

What was surprising

The pilot study challenged our assumptions and some of the literature we had based our cube on. We had based our notion of multitasking on our own personal experiences, but our user works with a large number of windows (over 50 at a time), and one of his multitasking strategies is to group these windows by function, vastly different from our initial understanding.

What could be done better

In this study, due to the lack of time, we based our hypotheses about the problem space and our understanding of the user on personal experience and the literature review. More qualitative user research from the onset of the project would have been ideal. We could also consider a co-design approach to conceiving a novel device, possibly with multisensory crafting to explore affordances across different forms (beyond a cube) and tactile patterns. This would also help us explore the tangible user interface design space from a disability studies perspective.
