Allen School researchers earn Most Impact Paper Award at IUI 2019 for SUPPLE interface rendering system

Krzysztof Gajos (left) and Daniel Weld

Allen School alumnus Krzysztof Gajos (Ph.D., ‘08) and professor Daniel Weld were recognized with the Most Impact Paper Award at the Association for Computing Machinery’s 24th International Conference on Intelligent User Interfaces (IUI 2019) last month for “SUPPLE: Automatically Generating User Interfaces.” Each year, the IUI organizing committee selects a past paper submitted to the conference that has had the greatest impact and visibility since its initial publication. This year’s winning paper, which Gajos and Weld originally presented at IUI 2004, presents a system for automatically rendering user interfaces based on device characteristics and usage patterns.

When the team began working on SUPPLE, even the most promising solutions for adapting interfaces to different devices and users could not sufficiently handle the growing array of display sizes and interaction styles. Many were unable to cope with situations in which the device constraints could not be anticipated in advance, or in which the functionality had to be generated dynamically. Often, these tools required interface designers to hand-craft templates or explicitly and painstakingly identify which widgets to use and under what constraints, making the process both time-consuming and expensive. And none of them were geared toward addressing the needs of the user, especially the many people with physical disabilities who struggle to use interfaces crafted for an imaginary "average" user.

Gajos and Weld opted to define interface generation as a constrained decision-theoretic optimization problem. They then set about creating a model-driven solution capable of dynamically generating the "optimal" interface for each user, given that user's physical abilities and constraints such as device capabilities. Rather than specifying from the start how certain features should be presented by an interface, which was the approach followed by most existing tools, the team chose to specify what functionality the interface should provide and leave the decision of how it should be presented to the SUPPLE algorithm.
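The separation of "what" from "how" can be sketched, in greatly simplified form, as follows. This is an illustrative assumption, not SUPPLE's actual implementation: the element names, widget options, and cost numbers are invented, and the exhaustive search stands in for the paper's far more sophisticated optimization.

```python
# Illustrative sketch only: interface generation as constrained optimization.
# All names, widths, and effort costs are hypothetical, not from SUPPLE.
from itertools import product

# Functional specification: each element declares a data type, not a widget.
functional_spec = {
    "volume": "bounded-int",   # e.g. an integer in 0..100
    "power":  "bool",
}

# Candidate widgets per data type: (widget name, width in px, estimated effort).
WIDGETS = {
    "bounded-int": [("slider", 120, 1.0), ("spinner", 60, 2.5)],
    "bool":        [("toggle", 40, 1.0), ("checkbox", 30, 1.2)],
}

def best_rendering(spec, max_width):
    """Pick the widget assignment that satisfies the device-width
    constraint while minimizing total estimated user effort."""
    elements = list(spec)
    choices = [WIDGETS[spec[e]] for e in elements]
    best, best_cost = None, float("inf")
    for combo in product(*choices):
        width = sum(w for _, w, _ in combo)
        cost = sum(c for _, _, c in combo)
        if width <= max_width and cost < best_cost:
            best = {e: combo[i][0] for i, e in enumerate(elements)}
            best_cost = cost
    return best, best_cost

# A wide display admits the low-effort slider; a narrow one forces the spinner.
print(best_rendering(functional_spec, max_width=200))
print(best_rendering(functional_spec, max_width=100))
```

The same functional specification yields different renderings as the width constraint changes, which is the core idea: the designer states intent once, and the optimizer adapts the presentation per device.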

Jacob Wobbrock

As it became clear that the algorithm might provide revolutionary benefits to users with physical disabilities, Gajos and Weld teamed up with Jacob Wobbrock, a professor of human-computer interaction in the UW Information School and adjunct faculty member in the Allen School. Together, they added methods for quickly and automatically characterizing a user's physical capabilities. This enabled SUPPLE to generate one interface for a user with muscular dystrophy and a very different one for a person with cerebral palsy: it learned that the former had low strength and could move the pointing device only short distances, while the latter had impaired dexterity and required larger targets for accurate selection. The team's studies showed that SUPPLE could dramatically increase the speed of users with motor impairments while simultaneously decreasing their error rates.

“We wanted every person, regardless of their physical abilities, to be able to easily manipulate computer interfaces,” explained Gajos, who is now a member of the faculty of Harvard University. “One can view the project as advancing the notion of Ability-based Design. This is important because the prevalent one-size-fits-all methodology inevitably leads to some form of discrimination.”

“Our initial objective with SUPPLE was to provide a consistent user experience across a range of online platforms and services, regardless of physical location or the type of device,” said Weld, a member of the Allen School’s Artificial Intelligence group. “By approaching interface generation as an optimization problem, we were able to move beyond generic design, instead creating personalized interfaces for people with different preferences and physical abilities.”

The system automatically selects the optimal elements to display in a particular interface from multiple widget libraries. Here, the "optimal" SUPPLE rendering not only satisfies functional and device constraints but also requires the least amount of user effort, expressed as a cost function. To compute the estimated cost of user effort, the team employed a trace-driven approach to model users' typical interactions, in combination with their tool for quickly eliciting performance profiles from users with physical disabilities.
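A trace-driven cost of this kind might be sketched as below. This is a hedged illustration under invented assumptions: the trace, the per-widget effort numbers in the "performance profile," and the simple frequency weighting are hypothetical stand-ins for SUPPLE's actual cost model.

```python
# Illustrative sketch only: trace-driven expected user effort.
# The trace, profile values, and weighting scheme are hypothetical.
from collections import Counter

# A usage trace: the sequence of interface elements a user manipulated.
trace = ["volume", "volume", "volume", "power", "volume"]

# A per-user performance profile: estimated effort to operate each widget.
# A profile elicited from a motor-impaired user might rate small,
# precision-demanding widgets as far more costly than large ones.
profile = {"slider": 1.0, "spinner": 2.5, "toggle": 0.8, "checkbox": 1.5}

def expected_cost(rendering, trace, profile):
    """Expected effort for a rendering (element -> widget mapping),
    weighting each widget's cost by how often that element is used."""
    freq = Counter(trace)
    total = sum(freq.values())
    return sum(freq[e] / total * profile[w] for e, w in rendering.items())

rendering = {"volume": "slider", "power": "toggle"}
print(expected_cost(rendering, trace, profile))  # 0.8*1.0 + 0.2*0.8 = 0.96
```

Weighting by observed usage means frequently manipulated elements dominate the cost, so the optimizer spends screen space and low-effort widgets where they matter most for that particular user.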

To highlight the lasting value of the team’s contributions, IUI organizers invited an expert panel to reflect on the impact of SUPPLE on the intelligent user interface community at the 2019 conference. The panelists included Henry Lieberman of MIT CSAIL, Jeffrey Nichols of Google, and Simone Stumpf of the Centre for HCI Design at City, University of London.

Read the original research paper here.

Congratulations, Krzysztof and Dan!