
Allen School’s second building named the Bill & Melinda Gates Center for Computer Science & Engineering

Exterior rendering of the Bill & Melinda Gates Center for Computer Science & Engineering

The University of Washington Board of Regents today approved the naming of the Allen School’s second building as the Bill & Melinda Gates Center for Computer Science & Engineering. The naming of the building in honor of the Gateses was made possible by gifts from Microsoft and a group of local business and philanthropic leaders who are longtime friends and colleagues of the couple.

“There is wonderful symbolism in having the Bill & Melinda Gates Center for Computer Science & Engineering across the street from the Paul G. Allen Center for Computer Science & Engineering on the University of Washington campus,” said Microsoft President Brad Smith in a news release. “As teenagers, Bill and Paul roamed UW computer labs. They went on to change the face of Seattle and the world — first with Microsoft, and later with their philanthropy. I can’t think of a better way for those of us who have had the privilege of working alongside Bill and Melinda to express our gratitude and admiration than to name this building for them.”

Smith and his spouse, Kathy Surace-Smith, are among the group of longtime friends and colleagues of the couple who personally contributed to the naming gift. Smith and fellow donors Charles & Lisa Simonyi spearheaded the fundraising effort to name the building in the Gateses’ honor. Altogether, the Friends of Bill & Melinda contributed more than $30 million toward the project.

Read more about this extraordinary gift in the UW announcement here, and learn more about the building here.

The Bill & Melinda Gates Center is scheduled for completion in December 2018, and will be ready for occupancy in early 2019.

We are tremendously grateful to the Friends of Bill & Melinda for enabling this enduring tribute to the Gateses — and exceedingly proud to have a second home bearing their name. Thank you for giving us the room to grow and deliver a world-class computer science education to more of Washington’s students!

The Friends of Bill & Melinda

  • Microsoft Corporation
  • Jim & Catherine Allchin
  • Rich & Sarah Barton
  • Jeff & MacKenzie Bezos
  • Lloyd & Janet Frink
  • Craig & Marie Mundie
  • Satya & Anu Nadella
  • Jeff & Tricia Raikes
  • Rob Short & Emer Dooley
  • Harry Shum & Ka Yan Chan
  • Brad & Jan Silverberg
  • Charles & Lisa Simonyi
  • Brad Smith & Kathy Surace-Smith
  • John Stanton & Terry Gillespie

Read coverage of the announcement in the Seattle Times, GeekWire, The Wall Street Journal, Bloomberg, and Xconomy. Read more →

15 volunteers, 2 hours, >300 Allen School résumés reviewed

In the run-up to the Paul G. Allen School’s annual fall recruiting fair, 15 industry volunteers reviewed more than 300 student résumés on Tuesday afternoon in the atrium. Many thanks to Amazon’s Greg Geiger and Abigail Gualberto, Whitepages’ Rachel Flanagan, Redfin’s Marissa Carr, Krystin Morgan and Kritin Vij, Microsoft’s Kelsey Saboori, Indeed’s Jason Gabriel and Robert Noble, Qumulo’s Anthony Falsetto, Google’s Zach Spann, Carolyn Balousek and Lauren Woodward, RealSelf’s Finnian Durkan, and Karat’s Aram Greenman! Read more →

Allen School and AWS team up on new NNVM compiler for deep learning frameworks

Tianqi Chen, Thierry Moreau, Haichen Shen, Luis Ceze, Carlos Guestrin, Arvind Krishnamurthy

A team of researchers at the Allen School and AWS has released a new open compiler for deploying deep learning frameworks across a variety of platforms and devices. The NNVM compiler simplifies the design of new front-end frameworks and back-end hardware by making it possible to compile front-end workloads directly to hardware back-ends. The new tool is built upon the TVM stack previously developed by the same Allen School researchers to bridge the gap between deep learning systems optimized for productivity and the programming, performance, and efficiency constraints imposed by different types of hardware.

“While deep learning is becoming indispensable for a range of platforms — from mobile phones and datacenter GPUs, to the Internet of Things and specialized accelerators — considerable engineering challenges remain in the deployment of those frameworks,” noted Allen School Ph.D. student Tianqi Chen. “Our TVM framework made it possible for developers to quickly and easily deploy deep learning on a range of systems. With NNVM, we offer a solution that works across all frameworks, including MXNet and model exchange formats such as ONNX and CoreML, with significant performance improvements.”

With the help of the TVM stack, the NNVM compiler represents and optimizes common deep-learning workloads in standardized computation graphs. It then transforms these high-level graphs, optimizing the data layout while reducing memory utilization and fusing the computation patterns for different hardware back-ends. Finally, NNVM presents an end-to-end compilation pipeline, from the front-end frameworks to bare-metal hardware.
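For readers who want a concrete picture of that pipeline, here is a minimal sketch in the style of the NNVM tutorials of the era: import a pretrained front-end model, compile it for a hardware target, and run it through the lightweight TVM graph runtime. The model choice, input shapes, and target strings are illustrative assumptions, and exact API names and accepted inputs may have differed across releases.

```python
import numpy as np
import mxnet as mx
import nnvm
import nnvm.compiler
import tvm
from tvm.contrib import graph_runtime

# Import a pretrained front-end model (here: ResNet-18 from the MXNet model zoo)
# into NNVM's framework-agnostic computation graph.
block = mx.gluon.model_zoo.vision.resnet18_v1(pretrained=True)
sym, params = nnvm.frontend.from_mxnet(block)

# Compile the high-level graph for a hardware back-end. "llvm" targets the local
# CPU; "cuda" or an ARM target string would aim at an NVIDIA GPU or a Raspberry Pi.
shape_dict = {"data": (1, 3, 224, 224)}
graph, lib, params = nnvm.compiler.build(sym, target="llvm",
                                          shape=shape_dict, params=params)

# Deploy with the TVM graph runtime and run a single inference.
module = graph_runtime.create(graph, lib, tvm.cpu(0))
module.set_input("data", tvm.nd.array(
    np.random.uniform(size=(1, 3, 224, 224)).astype("float32")))
module.set_input(**params)
module.run()
out = module.get_output(0, tvm.nd.empty((1, 1000)))  # class scores
print(out.asnumpy().shape)
```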

“Existing deep learning frameworks package the graph optimization with the deployment runtime,” noted Allen School professor Carlos Guestrin. “NNVM follows the conventional wisdom of compilers, separating the optimization from the deployment runtime. Using this approach, we get substantial optimization while keeping the runtime lightweight.”

While NNVM is still under development, early indications are that the approach is a step forward compared to the current state of the art. The team benchmarked the performance of the new compiler against that of the MXNet framework on two popular hardware configurations: an NVIDIA GPU on AWS and an ARM CPU on a Raspberry Pi. On both benchmarks, the NNVM compiler was faster; on the Raspberry Pi, the code it generated was two times faster for ResNet18 and 11 times faster for MobileNet. With the NNVM compiler, developers will be able to deliver consistent results from multiple frameworks to users on a variety of platforms in less time and with significantly less engineering effort.

Like TVM, the NNVM compiler is the product of a collaboration among researchers in machine learning, systems, and computer architecture. In addition to Chen and Guestrin, Allen School Ph.D. students Thierry Moreau and Haichen Shen, and professors Luis Ceze and Arvind Krishnamurthy worked with the AWS AI team to build the new tool.

Learn more in the detailed overview here, and read the AWS announcement here.

  Read more →

Paul G. Allen School out in force at Grace Hopper Celebration

Roughly 40 Allen School students attended this week’s Grace Hopper Celebration of Women in Computing – a phenomenal event dating to 1994 that this year had 18,000 attendees! Read more →

Ph.D. student Kanit Wongsuphasawat earns Best Paper Award at IEEE VAST

Kanit "Ham" WongsuphasawatAllen School Ph.D. student Kanit “Ham” Wongsuphasawat, who works with professor Jeffrey Heer in the Interactive Data Lab, won the Best Paper Award at the Institute for Electrical and Electronics Engineers’  Conference on Visual Analytics Science & Technology (IEEE VAST) for “Visualizing Dataflow Graphs of Deep Learning Models in TensorFlow.” Wongsuphasawat is the first author on the paper, which is based on work he did as an intern at Google Research with colleagues Daniel Smilkov, James Wexler, Jimbo Wilson, Dandelion Mané, Doug Fritz, Dilip Krishnan, Fernanda B. Viégas, and Martin Wattenberg.

Deep learning is becoming increasingly important in a variety of applications, from scientific research to consumer-facing products and services. Google’s open-source TensorFlow platform provides high-level APIs that simplify the creation of neural networks for deep learning, generating a low-level dataflow graph that supports learning algorithms, distributed computation, and multiple devices. But developers still need to understand the structure of these graphs. One way to do so is through visualization; however, the dataflow graphs of such complicated models contain thousands of heterogeneous, low-level operations, some of which are high-degree nodes connected to many parts of the graph. This level of complexity yields tangled visualizations when rendered with standard layout techniques.

In their award-winning paper, Wongsuphasawat and his collaborators offer a solution in the form of the TensorFlow Graph Visualizer, a tool for producing interactive visualizations of the underlying dataflow graphs of TensorFlow models. The visualizer is shipped as part of TensorBoard, TensorFlow’s official visualization and dashboard tool.  The tool has enabled users of TensorFlow to understand and inspect the high-level structure of their models, with the ability to explore the complex, nested structure on demand.

The visualization takes the form of a clustered graph in which nodes are grouped according to the hierarchical namespaces assigned by the developer. To support detailed exploration, the team employed a novel use of edge bundling to enable stable and responsive expansion of the clustered flow layout. To counteract clutter, the researchers apply heuristics to extract non-critical nodes and introduce new visual encodings that decouple the extracted nodes from the layout. They also built in the ability to detect and highlight repeated structures, while overlaying the graph with quantitative information to assist developers in their inspection. Users who tried the tool found it useful for a variety of tasks, from explaining a model and its application, to highlighting changes during debugging, to illustrating tutorials and articles.
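As a rough illustration of the namespaces the visualizer clusters on, here is a minimal TensorFlow 1.x-style sketch (the scope names and layer sizes are arbitrary) that groups operations under name scopes and writes the resulting graph for TensorBoard’s graph view to render:

```python
import tensorflow as tf

# Group operations into hierarchical name scopes; TensorBoard's graph visualizer
# clusters nodes by these namespaces and lets users expand them on demand.
with tf.name_scope("layer1"):
    x = tf.placeholder(tf.float32, [None, 784], name="x")
    w = tf.Variable(tf.truncated_normal([784, 128], stddev=0.1), name="weights")
    b = tf.Variable(tf.zeros([128]), name="bias")
    h = tf.nn.relu(tf.matmul(x, w) + b, name="activation")

with tf.name_scope("layer2"):
    w2 = tf.Variable(tf.truncated_normal([128, 10], stddev=0.1), name="weights")
    logits = tf.matmul(h, w2, name="logits")

# Write the graph definition so TensorBoard can render it:
#   tensorboard --logdir logs/graph_demo   (then open the "Graphs" tab)
writer = tf.summary.FileWriter("logs/graph_demo", tf.get_default_graph())
writer.close()
```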

Wongsuphasawat and his co-authors are being recognized this week at the IEEE VIS conference in Phoenix, Arizona, where IEEE VAST is co-located with InfoVis and SciVis. Watch a video of Wongsuphasawat’s presentation of the work below.

Congratulations, Ham!

Visualizing Dataflow Graphs of Deep Learning Models in TensorFlow from Kanit W on Vimeo. Read more →

2017 Paul G. Allen School “Women in Computing” reception

Lisa and Charles Simonyi flank Pascale Wallace Patterson, recipient of the inaugural Lisa Simonyi Prize

Each fall we host a reception to celebrate the women of the Paul G. Allen School and of our region’s technology sector, to provide an opportunity for them to interact with one another, and to give a rousing sendoff to the Allen School women whom we and our industry partners will be sending to the Grace Hopper Celebration of Women in Computing (a group of 40 this year!).

In addition, 2017 marked the awarding of the inaugural Lisa Simonyi Prize, which annually will recognize a student who exemplifies our commitment to excellence, to leadership, and to inclusiveness. Congratulations to Pascale Wallace Patterson, the extraordinary recipient!

And thanks to Jennifer Mankoff for a terrific research overview!

Attendees hear from Prof. Jennifer Mankoff about her research in accessibility

Ed Lazowska’s poster from the first Hopper Conference

Read more →

All that jazz: Researchers preserve iconic musical performances in DNA

Illustration of guitarist made up of DNA nucleotide bases

A team of researchers in the Molecular Information Systems Lab, a collaboration between the University of Washington and Microsoft Research, worked with DNA synthesis company Twist Bioscience to encode two archival-quality audio recordings from the world-renowned Montreux Jazz Festival in nature’s perfect storage medium. The preservation of “Smoke on the Water” by Deep Purple and “Tutu” by Miles Davis represents the first time that DNA has been used for long-term archival storage — making the songs not only pieces of musical history, but now pieces of scientific history as well. The project builds upon work by the MISL team to develop a next-generation digital storage system using DNA.

In a media release, Allen School professor Luis Ceze noted that DNA is ideal for archiving precious cultural assets due to its durability, density, and “eternal relevance.”

“Storing items from the Montreux Jazz Festival is a perfect way to show how fast DNA digital data storage is becoming real,” he said.

The team’s latest effort to illustrate the potential of a DNA-based storage system for digital data grew out of a partnership between the Claude Nobs Foundation — curator of the festival’s audio-visual collection — and the École Polytechnique Fédérale de Lausanne (EPFL) on the Montreux Jazz Digital Project, which aims to digitize, store, preserve and share the musical legacy of festival founder Claude Nobs. Whereas existing recordings in the collection may last a decade before they need to be replaced, a DNA-based archival storage system could preserve the same material for thousands of years.

The two songs preserved as a proof-of-concept by UW, Microsoft, and Twist amounted to 140 megabytes of data. According to Microsoft researcher and Allen School affiliate professor Karin Strauss, that represents barely a drop in the bucket when it comes to the potential storage capacity of DNA.

“The amount of DNA used to store these songs is much smaller than one grain of sand,” she noted. “Amazingly, storing the entire six petabyte Montreux Jazz Festival’s collection would result in DNA smaller than one grain of rice.”

Allen School Ph.D. student Lee Organick, MISL lab manager David Ward, and Microsoft researchers Siena Dumas and Yuan-Jyue Chen were part of the team that worked with Twist Bioscience to encode, decode, and analyze the DNA samples in which the iconic recordings were preserved. The team converted the audio files from binary code — 0s and 1s — to the four nucleotide bases that make up a strand of DNA: A, C, G, and T (adenine, cytosine, guanine, and thymine). After the DNA was sequenced, the team decoded and read it back to confirm 100% accuracy.
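To give a flavor of what converting bits to bases means, here is a deliberately simplified sketch that maps every two bits of data to one nucleotide and back. It is an illustration only: the MISL pipeline adds addressing, redundancy, and error correction, and the illustration credited below uses a different per-character triplet code.

```python
# Toy illustration of bits-to-bases encoding; not the MISL team's actual scheme.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn raw bytes into a nucleotide string, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from the nucleotide string."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

original = b"smoke"
strand = encode(original)          # 5 bytes -> 40 bits -> 20 bases
assert decode(strand) == original  # round trip is lossless
print(strand)
```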

The decoded versions were played today at a forum hosted by the ArtTech Foundation in Lausanne, Switzerland. The recordings are part of the Montreux Jazz Festival collection, more than 5,000 hours of concerts inscribed in UNESCO’s Memory of the World Register.

“The UNESCO archive provides the perfect use-case for testing our approach,” Ceze said. “Thanks to Twist and the Montreux Jazz Festival, our team had a unique opportunity to apply cutting-edge digital storage research to preserving a sliver of cultural heritage for posterity.”

Read more about the Montreux Jazz Festival project in the Twist Bioscience press release here, and check out our past blog post on the record-breaking research of the MISL team.

Illustration: The lyrics of Deep Purple’s Smoke on the Water encoded into DNA. Each letter, space, and punctuation mark is represented by a unique triplet of the four bases (A, T, G, C), the building blocks of DNA. For example, “smoke” becomes GACCGACGTCAGAGC. Credit: Martin Krzywinski, courtesy of Twist Bioscience. Read more →

Franziska Roesner honored with Emerging Leader Award from UT Austin

Franziska Roesner holding her Emerging Leader Award

Allen School professor and Ph.D. alumna Franziska Roesner, co-director of the Privacy and Security Research Lab, received the 2017 Emerging Leader Award from the College of Natural Sciences at The University of Texas at Austin. Roesner, who earned her bachelor’s degree in 2008 from UT Austin before her arrival at the Allen School as a graduate student, was inducted into the college’s Hall of Honor at a ceremony last night.

Calling Roesner a “formidable force and leader in the world of computer security and privacy,” the college cited her work to identify the privacy risks to children of internet-connected toys, evaluate and address journalists’ security needs, and safeguard the privacy of web users as evidence of her growing leadership in the field. It also highlighted Roesner’s reputation as a leading voice on privacy and security related to emerging technologies such as augmented reality and the Internet of Things.

The Emerging Leader Award was created to recognize graduates of the college “who, in deed or action, reflect and recognize the importance of his or her education at The University.” Nominees are evaluated based on their contributions to their profession, recognition by their peers, and demonstrated ability, integrity, and stature. The winners are individuals whose recognition faculty, staff, students, and fellow alumni will “take pride in and be inspired by.”

We certainly are inspired by the many contributions she has made to the field of computer science and to the Allen School community — and as our friends in Austin note, “Roesner has only just begun to make her mark.” This is turning into a banner year for Roesner, who previously earned a TR35 Award and an NSF CAREER Award.

Read the full citation here.

Congratulations, Franzi! Read more →

“Geek of the Week” Alex Mariakakis sets his sights on long-term impact through mobile health research

Alex Mariakakis in color-calibration glasses

Ph.D. student Alex Mariakakis, who works with professor Shwetak Patel in the Allen School’s UbiComp Lab, has his eye on the prize in the latest edition of GeekWire’s “Geek of the Week.” Blue Devil-turned-Husky Mariakakis was a slam-dunk for the honor based on his work on mobile health apps that will one day allow anyone, anywhere to be screened for potentially life-threatening medical conditions using a smartphone.

“There are so many reasons why I work at the intersection of health and technology,” Mariakakis told GeekWire. “I like to work on projects that can be explained at a high level for a curious parent or student or at a deeper level for a senior faculty member. I like working on projects that I hope will have a lasting impact on society rather than just sit as a document on a website.”

“And sometimes,” he continued, “I just like to pretend to be a real doctor when I visit collaborators (I got to wear scrubs once!).”

Mariakakis is collaborating with Allen School and UW Medicine researchers on two projects that have the potential for significant impact: BiliScreen, which detects adult jaundice — an early indicator of pancreatic cancer and other serious medical conditions — before it is evident to the naked eye; and PupilScreen, a way to objectively assess athletes and others for traumatic brain injury. Despite the high profile of his research, Mariakakis makes sure to get out of the lab and share his enthusiasm for computer science through K-12 outreach events hosted by the Allen School and the College of Engineering.

“When most people think about computer science, they think about the traditional subfields like systems, architecture, and databases. Without the work they do, so many things wouldn’t be possible, but people should know there is so much more to computer science than just those areas,” Mariakakis said. “I totally understand that not everyone wants to be involved in STEM, but I think it’s important that students at least know what’s out there.”

Read the full article here. Also check out recent Allen School “Geek of the Week” honorees professors Ira Kemelmacher-Shlizerman and Shyam Gollakota and Ph.D. alumna Irene Zhang, and 2017 Geek of the Year Ed Lazowska. Read more →

UW’s Shwetak Patel, Matt Reynolds, and Julie Kientz earn Ubicomp 10-Year Impact Award


From left: Gregory Abowd, Julie Kientz, Shwetak Patel, and Award Chair Judy Kay. Not pictured: Matthew Reynolds and Thomas Robertson.

University of Washington professors Shwetak Patel, Matt Reynolds, and Julie Kientz have been recognized with the 10-Year Impact Award at Ubicomp 2017 for the paper, “At the Flick of a Switch: Detecting and Classifying Unique Electrical Events on the Residential Power Line.” The paper, which originally earned the Best Paper Award and Best Presentation Award at Ubicomp 2007, was singled out by this year’s conference organizers for having lasting impact a decade after its original presentation.

Patel and Reynolds hold joint appointments in the Allen School and Department of Electrical Engineering. Kientz is a faculty member in the Department of Human Centered Design & Engineering with an adjunct appointment in the Allen School. Patel and Kientz were Ph.D. students and Reynolds was a senior research scientist at Georgia Tech when they co-authored the original paper with research scientist Thomas Robertson and professor Gregory Abowd.

The paper presents a novel approach for detecting energy activity within the home using a single plug-in sensor. The researchers applied machine learning techniques to enable the system to accurately differentiate between different electrical events, such as turning on a specific light switch or operating certain appliances. This work paved the way for a new field of research in high-frequency energy disaggregation and infrastructure mediated sensing. It also led to the creation of Zensi, a startup spun out of Georgia Tech and UW that was acquired by Belkin in 2010. Many other companies focused on home energy monitoring and automation have been formed based on the techniques first described in the winning paper.
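As a loose illustration of the kind of classification the paper describes (a single sensor observing line noise, and a model distinguishing which device produced an event), here is a hedged sketch using scikit-learn. The feature vectors are synthetic, hypothetical stand-ins for real noise signatures; none of this is the authors’ code or data.

```python
# Illustrative sketch only: stand-in "noise signatures" are random synthetic
# feature vectors, not measurements from a real power line sensor.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def noise_signature(device_id: int, n_bins: int = 64) -> np.ndarray:
    """Hypothetical spectral feature vector for an electrical event.
    Each 'device' excites a different band, plus measurement noise."""
    base = np.zeros(n_bins)
    base[device_id * 8:device_id * 8 + 8] = 1.0
    return base + rng.normal(scale=0.1, size=n_bins)

# Toy dataset: 4 pretend devices (light switch, TV, microwave, fan), 50 events each.
X = np.array([noise_signature(d) for d in range(4) for _ in range(50)])
y = np.array([d for d in range(4) for _ in range(50)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```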

Matt Reynolds

This is the fifth year in a row that UW and Allen School researchers have been recognized at Ubicomp for the enduring influence of their contributions:

2016: The late professor Gaetano Borriello, UW EE Ph.D. alumnus Jonathan Lester, and collaborator Tanzeem Choudhury were recognized for their 2006 paper, “A Practical Approach to Recognizing Physical Activities.”

2015: A team that included Borriello, Ph.D. alumni Anthony LaMarca and Jeff Hightower, and Bachelor’s alumni James Howard, Jeff Hughes, and Fred Potter won for their 2005 paper, “Place Lab: Device Positioning Using Radio Beacons in the Wild.”

2014: Borriello and Hightower won for their 2004 paper, “Particle Filters for Location Estimation in Ubiquitous Computing: A Case Study.”

2013: Ph.D. alumni Don Patterson and Lin Liao, professor Dieter Fox, and then-professor Henry Kautz were recognized for their 2003 paper, “Inferring High-Level Behavior from Low-Level Sensors.”

Way to go, team! Read more →
