Earlier this spring, the National Science Foundation recognized nine Allen School student researchers as part of its 2020 Graduate Research Fellowship competition. The honorees — seven Ph.D. students and two undergraduate students — were recognized in the “Comp/IS/Engr” category for their potential to make significant contributions to science and engineering through research, teaching, and innovation. Each of them has already amassed an outstanding track record of pursuing high-impact research in their respective areas, including theoretical computer science, systems, machine learning, computational neuroscience, security and privacy, robotics, and more.
“Allen School Ph.D. students represent the future of high quality research and innovation,” said professor Anna Karlin, associate director for graduate studies at the Allen School. “Their creativity and scholarly excellence is perfectly exemplified by our NSF GRFP honorees.”
Klein focuses on the design of efficient algorithms that yield near-optimal solutions to fundamental NP-hard problems that underpin the theory and practice of computing. His current project aims to find a better approximation algorithm for the Traveling Salesperson Problem (TSP). The TSP is applicable to a large class of planning and decision problems with a variety of real-world applications, from transportation routing, to genome sequencing, to computer chip design. Recently, Klein and his collaborators presented the first sub-3/2 approximation algorithm for what is conjectured to be the most difficult case of TSP — making tangible progress in their quest to improve upon a result that has stood for more than 40 years. Through this work, Klein hopes to advance tools and techniques that will yield new insights into a broad array of optimization problems.
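The longstanding 3/2 bound that Klein and his collaborators improved upon comes from Christofides’ algorithm. For a simpler point of reference, the classic textbook 2-approximation for metric TSP (build a minimum spanning tree, then shortcut a preorder walk of it into a tour) can be sketched in a few lines. This is purely illustrative and is not Klein’s algorithm:

```python
import math

def mst_tsp_2approx(points):
    """Classic 2-approximation for metric TSP: build a minimum
    spanning tree (Prim's algorithm), then shortcut a preorder
    walk of the tree into a Hamiltonian tour. The triangle
    inequality guarantees the tour costs at most twice the MST,
    and the MST costs at most the optimal tour."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    # Prim's algorithm over the complete graph, rooted at vertex 0.
    parent = {}
    best = {v: (dist(0, v), 0) for v in range(1, n)}
    while best:
        v = min(best, key=lambda u: best[u][0])
        _, p = best.pop(v)
        parent[v] = p
        for u in best:
            d = dist(v, u)
            if d < best[u][0]:
                best[u] = (d, v)
    # Preorder DFS over the tree gives the shortcut tour.
    children = {v: [] for v in range(n)}
    for v, p in parent.items():
        children[p].append(v)
    tour, stack = [], [0]
    while stack:
        v = stack.pop()
        tour.append(v)
        stack.extend(reversed(children[v]))
    return tour

def tour_cost(points, tour):
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))
```

On the four corners of a unit square, the heuristic happens to recover the optimal tour of length 4; in general it is only guaranteed to be within a factor of 2 of optimal.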
Second-year Ph.D. student Jialin Li earned a fellowship for her work with professor Tom Anderson in the Computer Systems Lab on a new operating system that will provide performance guarantees for containers in cloud-based services.
Containers are a lightweight computing model that offers a platform-independent way of packaging application dependencies; as such, they have been widely adopted in industry for building microservice-based applications. While existing operating systems provide functional support for containers, they fall short of providing the performance guarantees necessary for satisfying service-level agreements. This typically leads application developers to request more container resources than required, which wastes energy and resources. Li is designing a new operating system, written in the low-level programming language Rust, that will monitor container performance and intelligently reallocate resources based on container loads, thus increasing resource utilization while offering performance guarantees.
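A toy sketch of the load-based reallocation idea (in Python for illustration only; the actual system is written in Rust, and every name and policy detail here is a hypothetical stand-in, not Li’s design): give each container a guaranteed floor so service-level agreements can be met, then split the remaining budget in proportion to recently measured load:

```python
def reallocate(cpu_budget, loads, floors):
    """Toy proportional-share reallocation. Each container keeps a
    guaranteed minimum (its SLA floor); the spare CPU budget is
    divided in proportion to each container's recent load, so busy
    containers get more without starving idle ones.
    cpu_budget: total CPU (e.g. cores) to hand out
    loads:      {container: recent load estimate}
    floors:     {container: guaranteed minimum share}"""
    spare = cpu_budget - sum(floors.values())
    assert spare >= 0, "floors oversubscribe the budget"
    total_load = sum(loads.values()) or 1.0
    return {c: floors[c] + spare * loads[c] / total_load for c in loads}
```

For example, with an 8-core budget, two containers with floors of 1 core each, and recent loads of 3.0 and 1.0, the busy container receives 5.5 cores and the quiet one 2.5.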
Fellowship winner Ashlie Martinez is a second-year Ph.D. student in the Computer Systems Lab working with professor Tom Anderson and affiliate professor Irene Zhang of Microsoft Research to develop a user space file system for distributed storage applications.
Recent advances in storage technologies have significantly increased storage capacity while speeding up input/output (I/O) by orders of magnitude. While storage technologies have evolved to the point where they can service requests in microseconds, developers’ approach to storage, generally speaking, has not — for the most part, they continue to regard I/O as a slow operation best done through an operating system’s file system. The Storage Performance Development Kit (SPDK) is available to bypass the kernel and speed up I/O, but it is difficult to integrate into existing software as the API exposes raw storage devices instead of a file system. To overcome these challenges and improve the performance of today’s distributed storage applications, Martinez is building a kernel-bypass file system, or KBFS, that combines a generic API with strong consistency guarantees. Using this approach, she aims to reduce developer effort while making KBFS faster and easier to maintain compared to existing OS file systems.
Josh Pollock, an undergraduate majoring in computer science, works with professor Zachary Tatlock in the Allen School’s Programming Languages and Software Engineering (PLSE) group. He received a fellowship based on his research at the intersection of programming languages and visualization.
Pollock started his undergraduate research career in verification and formal methods, specifically the development of computerized proof assistants that take advantage of the correspondence between type theory and mathematical logic. As part of this work, Pollock prototyped a compiler between the Coq and Lean proof assistants. He subsequently contributed to Relay, a compiler for machine learning frameworks, as a member of the Allen School’s multidisciplinary SAMPL group. Expanding his interests to include principles of human-centered research, Pollock is designing Sidewinder, a framework for creating visualizations of program execution to help students and developers understand program semantics. Sidewinder employs formal abstract machine definitions to produce complete, continuous, and customizable program semantics visualizations. Pollock aims to build upon this work while pursuing a Ph.D. at MIT starting this fall.
As an undergraduate, Ruth has focused on addressing security and privacy issues associated with emerging augmented reality (AR) technologies that can have a profound impact on users’ perception of the world. In her early work, Ruth focused on mitigating the risks of buggy or malicious output in AR applications that could endanger user safety by enabling the operating system to constrain undesirable output. She subsequently helped conduct a user study to understand concerns around multi-user AR. More recently, Ruth led the development of ShareAR, a tool for developers of AR applications to enable secure sharing of multi-user content. Going forward, Ruth sees the next step in this line of work to be designing a multi-user sharing protocol at the platform level that would mediate cross-app as well as cross-user interactions. Ruth looks forward to pursuing her Ph.D. at Stanford University in the fall.
First-year Ph.D. student Zoë Steine-Hanson earned a fellowship for her research in computational neuroscience with professors Rajesh Rao and Bingni Brunton. Steine-Hanson is working on the development of a new, generalizable brain-computer interface (BCI) using deep learning and transfer learning techniques.
Currently, even the most advanced BCIs require the collection of significant training data on a single human subject, and the majority of BCI research takes place in a laboratory rather than in naturalistic settings. These factors hinder the ability to generalize state-of-the-art BCIs for people’s everyday use. To address this problem, Steine-Hanson is training a deep neural network on electrocorticography (ECoG) and video data collected from multiple human subjects. By applying techniques from transfer learning, she aims to reduce the amount of training data required for each new subject by leveraging the knowledge collected from previous subjects. Her ultimate goal is to improve quality of life for individuals living with neurological impairments through the use of next-generation BCI technologies in real-world settings.
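One minimal way to picture the transfer-learning idea (a hypothetical sketch, not Steine-Hanson’s actual model, which is a deep network trained on ECoG and video): warm-start a new subject’s decoder from weights fit on pooled data from previous subjects, so that far less new-subject data is needed than training from scratch:

```python
import numpy as np

def train_logreg(X, y, w=None, lr=0.1, steps=500):
    """Logistic-regression classifier fit by gradient descent.
    Passing in `w` warm-starts from existing weights: the simplest
    form of transfer, reusing parameters learned on pooled data
    from previous subjects instead of starting from zero."""
    w = np.zeros(X.shape[1]) if w is None else w.copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)       # cross-entropy gradient step
    return w
```

A usage pattern under this sketch: pretrain on a few hundred pooled trials, then fine-tune on only a handful of trials from a new subject with a short training run.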
Fellowship recipient Nick Walker is a second-year Ph.D. student working with professor Maya Cakmak in the Human-Centered Robotics Lab. Walker’s research focuses on human-robot communication with the aim of enabling any user to customize a robot to meet their needs.
Previously, Walker developed techniques for improving natural language interfaces within a robot’s existing capabilities. These included creating embodied language learners that can acquire an understanding of simple words and leveraging neural models to compensate for variations in the phrasing of natural language commands. Walker plans to build upon this past work by leveraging language to enable a robot to perform completely new tasks; to that end, he has turned his attention to the development of natural language programming techniques that will address a variety of robotics use cases. As part of this work, Walker plans to explore questions around people’s perceptions of robot agency and who bears responsibility for a robot learner’s mistakes, in anticipation of a time when home robots will be the personal computers of a future generation.
Second-year Ph.D. student Matthew Schmittle earned an honorable mention for his work with professor Siddhartha Srinivasa in the Personal Robotics Lab on the use of online learning methods to enable lifelong learning in robots.
Schmittle’s latest project focuses on improved techniques for imitation learning (IL), an approach to training dynamical systems that leverages expert feedback and demonstrations rather than requiring the hand-tuning of reward functions. IL offers an advantage over reinforcement learning in robotics, where real-world execution can be expensive or dangerous, due to its greater sample efficiency. However, most IL algorithms demand optimal state-action demonstrations, which can be challenging even for experts. An alternative is to employ corrective feedback, in which users dispense with full demonstrations in favor of making adjustments during robot execution. This approach is easier for a teacher to provide, but the feedback tends to be noisy, and each teacher and task may require a different kind of feedback. To overcome this challenge, Schmittle recognizes that robots must be able to learn from a variety of feedback, and makes the following key insight: the teacher’s policy is latent, and their feedback can be modeled as a stream of loss functions. Based on this insight, he proposes a new corrective-feedback meta-algorithm that can learn from a variety of noisy feedback across different tasks, teachers, and environments.
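The “stream of loss functions” view has a standard algorithmic counterpart in online gradient descent, where the learner takes one gradient step per round of (possibly noisy) feedback. The sketch below is illustrative of that general framing only, not of Schmittle’s meta-algorithm:

```python
import numpy as np

def online_gradient_descent(loss_grads, dim, lr=0.1):
    """Online gradient descent over a stream of loss functions:
    each round, the teacher's noisy corrective feedback arrives as
    the gradient of a per-round loss, and the learner's policy
    parameters take one step against it. Averaging over rounds
    washes out zero-mean noise in the feedback."""
    theta = np.zeros(dim)
    for grad in loss_grads:        # one noisy loss per round
        theta -= lr * grad(theta)
    return theta
```

As a sanity check, if each round’s loss is a quadratic centered on a noisy observation of a fixed target policy, the iterate behaves like an exponential moving average of the observations and settles near the target despite the per-round noise.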
Caleb Ellington, a senior double-majoring in computer science and bioengineering, has pursued undergraduate research in the Baker Lab working with Ph.D. candidate Nao Hiranuma. Ellington earned an honorable mention for his work on machine learning techniques to improve the design of new therapeutics.
Recombinant protein therapeutics have emerged as an area of huge potential in medical research due to their universal biocompatibility and high specificity. They are also significantly harder to design compared to small-molecule drugs, which has caused their development to lag. Inspired by what he encountered as an intern at Nepal’s Annapurna Neurological Institute and Dhulikhel Hospital — where computing and 3D printing are used to produce imaging and surgical tools quickly and inexpensively — Ellington intends to explore the potential for computer science to speed up the design of new protein therapeutics. Specifically, he proposes to apply advances in generative deep convolutional neural networks (DCNNs), which are capable of inferring and correcting data, to the design of protein-ligand interactions. His approach is based on a hypothesis that, under the right conditions, generative models are powerful enough to create entirely new proteins based on a target binding region — a potential breakthrough in protein design that could yield effective new treatments for a variety of diseases. Ellington will pursue this research as a Ph.D. student in computational biology at Carnegie Mellon University.
In addition to the Allen School honorees, students from other UW departments were also recognized by the NSF in the “Comp/IS/Engr” category. Ph.D. students Steven Goodman and Sharon Heung in the Department of Human-Centered Design & Engineering both received fellowships, while fellow HCDE student Andrew Beers and Electrical & Computer Engineering undergraduate Kyle Johnson earned honorable mentions.
Congratulations to all — you make the Allen School and UW proud!