Allen School accessibility researchers past and present shine at ASSETS 2019

Galen Weld (left) and Jon Froehlich

The strength and enduring impact of the Allen School and University of Washington’s contributions in accessible technology were on full display at the 21st International ACM SIGACCESS Conference on Computers and Accessibility, known as ASSETS, last month in Pittsburgh. Current and former Allen School researchers — students, faculty, and alumni alike — had a hand in three award-winning papers recognized at the conference.

The Best Student Paper Award was granted to the paper “Deep Learning for Automatically Detecting Sidewalk Accessibility Problems Using Streetscape Imagery.” The authors — Allen School Ph.D. students Galen Weld and Esther Jang, undergraduate Aileen Zeng, professor Kurtis Heimerl of the Information and Communication Technology for Development Lab, professor Jon Froehlich, founder of the Makeability Lab, and University of Maryland Ph.D. student Anthony Li — wrote about the use of deep learning to automatically assess the accessibility of sidewalks found in online imagery.

In an effort to let the public know which sidewalks in a city are accessible to people with disabilities, researchers began using smartphone-based tools to capture sidewalk accessibility problems on a large scale, with users marking accessibility on a map from their phones. This solution was not without flaws: few regions have adopted such programs, the areas they cover are small, and the burden on users is heavy, since they must download an app, take a picture, annotate it, and upload it. To reduce the manual labor and cost, later work applied machine learning to satellite imagery, but those efforts were hampered by small training sets and imprecise models. The Allen School team improved on both fronts, using residual neural networks modified to support both image and non-image features and training on a dataset of more than 300,000 image-based sidewalk accessibility labels collected by Project Sidewalk in the Makeability Lab. The results described in the winning paper were more accurate than previous automated methods and, in some cases, exceeded the accuracy of human-generated labels.
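For readers curious what a network “modified to support image and non-image features” might look like, the sketch below shows one common way to fuse the two: a residual network produces an image embedding, the non-image features are concatenated onto it, and a small classification head makes the prediction. This is a minimal illustration of the general technique, not the authors’ code; the ResNet-18 backbone, the layer sizes, and the example non-image features are all assumptions.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class SidewalkLabelClassifier(nn.Module):
    """Residual network fused with non-image features (illustrative sketch)."""

    def __init__(self, num_nonimage_features: int, num_classes: int):
        super().__init__()
        # ResNet-18 backbone; replacing its final fully connected layer with
        # Identity makes it output a 512-dimensional image embedding.
        self.backbone = models.resnet18()
        self.backbone.fc = nn.Identity()
        # Classification head over the fused image + non-image vector.
        self.head = nn.Sequential(
            nn.Linear(512 + num_nonimage_features, 256),
            nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, image: torch.Tensor, nonimage: torch.Tensor) -> torch.Tensor:
        img_feats = self.backbone(image)                 # (batch, 512)
        fused = torch.cat([img_feats, nonimage], dim=1)  # append non-image features
        return self.head(fused)                          # (batch, num_classes) logits

# Example: a batch of 4 street-level crops plus 3 hypothetical non-image
# features per crop (e.g., the crop's position within its panorama).
model = SidewalkLabelClassifier(num_nonimage_features=3, num_classes=5)
logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 3))
```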

Left to right: Shaun Kane, Jacob Wobbrock and Jeffrey Bigham

Members of the Allen School community also earned the 2019 SIGACCESS ASSETS Paper Impact Award, granted to a paper at least 10 years old that has had a significant impact on computing and information technology addressing the needs of persons with disabilities. Allen School alumnus Jeffrey Bigham (Ph.D., ’09), Information School professor and Allen School adjunct professor Jacob Wobbrock, and Information School alumnus Shaun Kane (Ph.D., ’11) received the award for their paper, “Slide Rule: Making mobile touch screens accessible to blind people using multi-touch interaction techniques.”

Before the iPhone popularized the touchscreen smartphone, blind users could rely on their sense of touch to operate a cell phone’s physical buttons. To enable those users to take advantage of the latest smartphone features, the team designed Slide Rule, an audio-based system that makes touch screen applications accessible through gestures. Slide Rule was the first prototype to demonstrate finger-driven screen reading and a second-finger tap gesture, today called “split tap,” for triggering on-screen targets such as links and buttons. Since the team published its paper, major manufacturers have incorporated these techniques into their commercial products.
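To make the interaction concrete, here is a small, self-contained sketch of split-tap logic; it is a hypothetical illustration, not Slide Rule’s actual implementation, and the event model and 300 ms tap threshold are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchEvent:
    finger_id: int
    kind: str                     # "down" or "up"
    time: float                   # seconds
    target: Optional[str] = None  # on-screen item under the finger, if any

class SplitTapDetector:
    """While a first finger rests on a target, a quick second-finger tap activates it."""

    TAP_MAX_DURATION = 0.3  # assumed threshold, in seconds

    def __init__(self):
        self.primary = None         # (finger_id, target) of the exploring finger
        self.second_down_time = {}  # down timestamps of additional fingers

    def handle(self, ev: TouchEvent) -> Optional[str]:
        if ev.kind == "down":
            if self.primary is None:
                self.primary = (ev.finger_id, ev.target)  # exploring finger lands
            else:
                self.second_down_time[ev.finger_id] = ev.time
        else:  # "up"
            if self.primary and ev.finger_id == self.primary[0]:
                self.primary = None                       # exploring finger lifted
            elif ev.finger_id in self.second_down_time:
                held = ev.time - self.second_down_time.pop(ev.finger_id)
                if self.primary and held <= self.TAP_MAX_DURATION:
                    return self.primary[1]                # split tap: activate target
        return None

# Example: finger 1 rests on a button; finger 2 taps briefly to activate it.
detector = SplitTapDetector()
for ev in [TouchEvent(1, "down", 0.00, target="Play button"),
           TouchEvent(2, "down", 0.50),
           TouchEvent(2, "up", 0.62)]:
    activated = detector.handle(ev)
    if activated:
        print("Activate:", activated)  # -> Activate: Play button
```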

Last but not least, recent Allen School graduate Danielle Bragg (Ph.D., ’18), who worked with Allen School professor Richard Ladner as a student, was lead author of the paper that earned this year’s ASSETS Best Paper Award. Bragg and her co-authors at Microsoft Research were recognized for their paper “Sign language recognition, generation and translation: An interdisciplinary perspective,” which presents the results of an interdisciplinary workshop where researchers in fields such as computer vision, computer graphics, natural language processing, human-computer interaction and linguistics discussed what it would take to develop successful sign language recognition, generation and translation systems.

Congratulations to all of the ASSETS 2019 award recipients!