UW CSE undergraduate and graduate students captured four of the six awards given out during the ACM Student Research Competition at the ACM SIGSOFT International Symposium on the Foundations of Software Engineering (FSE 2016) in Seattle last month. The students, all of whom work with CSE professor Michael Ernst in our Programming Languages & Software Engineering (PLSE) group, took first and third place in both the graduate and undergraduate student research categories.
Top honors in the undergraduate competition went to first-year CSE Ph.D. student Martin Kellogg for “Combining Bug Detection and Test Case Generation.” The paper, which is based on work Kellogg began as an undergraduate at the University of Virginia, presents N-prog, a new tool for detecting software bugs. Automated bug-finding tools and test generators can waste developers’ time by producing false positives or relying on incorrect oracles. N-prog minimizes this problem by combining the two approaches to surface interesting, untested behavior while reducing wasted effort.
CSE undergraduate Christopher Mackie earned third place for “Preventing Signedness Errors in Numerical Computations in Java.” The paper presents a new verification tool, the Signedness Checker, which is built on a type system that keeps signed and unsigned integers separate. The system enables developers to detect signedness errors involving unsigned integers at compile time, rather than discovering them at run time.
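To give a flavor of the problem, the minimal Java sketch below (not taken from the paper; the field name and byte value are invented for illustration) shows the kind of mixed-signedness mistake such a checker aims to catch before the program ever runs:

```java
// A toy illustration of a signedness error in Java.
// Java has no unsigned integer types, so "unsigned" values are stored in
// signed types and misbehave if a signed interpretation slips in.
public class SignednessExample {
    public static void main(String[] args) {
        // Suppose a file format stores a length as an unsigned 8-bit value.
        byte rawLength = (byte) 0xC8;   // intended meaning: 200 (unsigned)

        // Signed interpretation: widening a byte sign-extends, giving -56.
        int wrong = rawLength;          // signedness error
        // Unsigned interpretation: mask to recover the intended value.
        int right = rawLength & 0xFF;   // 200

        System.out.println(wrong + " vs " + right);  // prints "-56 vs 200"
        // A later check such as (wrong > 100) silently goes down the wrong
        // branch at run time; a type system that separates signed from
        // unsigned values can reject the mixed use at compile time instead.
    }
}
```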
In the graduate competition, CSE Ph.D. student Calvin Loncaric captured first place with “Cozy: Synthesizing Collection Data Structures.” Cozy is a novel tool that synthesizes collection data structure implementations using counter-example guided inductive synthesis, offering an alternative to the tedious and error-prone process of writing them by hand. Loncaric and his colleagues evaluated Cozy’s synthesized implementations in four real-world programs, showing that their performance can match that of handwritten implementations while avoiding human error.
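For readers unfamiliar with the technique, here is a heavily simplified sketch of a counter-example guided inductive synthesis (CEGIS) loop. It is not Cozy’s implementation: the search space of single constants, the hidden specification, and all names are assumptions made purely for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Toy CEGIS loop: alternate between a synthesizer that proposes a candidate
// consistent with the counterexamples seen so far, and a verifier that
// searches for an input where the candidate disagrees with the specification.
public class CegisSketch {
    // Hidden specification the synthesizer must match (assumed for the demo).
    static int spec(int x) {
        return x ^ 0xAB;
    }

    // Candidate programs: all functions of the form f(x) = x ^ c, c in 0..255.
    static int candidate(int x, int c) {
        return x ^ c;
    }

    // "Verifier": a bounded search for a disagreeing input stands in for the
    // solver a real synthesis tool would use.
    static Integer findCounterexample(int c) {
        for (int x = -1024; x <= 1024; x++) {
            if (candidate(x, c) != spec(x)) {
                return x;
            }
        }
        return null; // no disagreement found: accept the candidate
    }

    public static void main(String[] args) {
        List<Integer> counterexamples = new ArrayList<>();
        for (int round = 1; round <= 256; round++) {
            // "Synthesizer": pick the first constant consistent with all
            // counterexamples collected so far.
            Integer chosen = null;
            for (int c = 0; c < 256 && chosen == null; c++) {
                boolean consistent = true;
                for (int x : counterexamples) {
                    if (candidate(x, c) != spec(x)) {
                        consistent = false;
                        break;
                    }
                }
                if (consistent) {
                    chosen = c;
                }
            }
            if (chosen == null) {
                System.out.println("No candidate fits the counterexamples.");
                return;
            }
            Integer cex = findCounterexample(chosen);
            if (cex == null) {
                System.out.printf("Synthesized c = 0x%02X after %d round(s)%n",
                        chosen, round);
                return;
            }
            counterexamples.add(cex); // refine and try again
        }
    }
}
```

Cozy uses the same alternation between a candidate-proposing synthesizer and a counterexample-producing verifier, but over a space of collection data structure implementations rather than a single constant.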
Last but not least, CSE Ph.D. student Spencer Pearson placed third in the graduate competition for “Evaluation of Fault Localization Techniques.” The paper presents the results of a study evaluating how well artificial faults identify the best real-world fault localization tools. Pearson demonstrated that a commonly held assumption — that the tools best at localizing artificial faults will also be best at localizing real-world faults — is false. Based on these results, he and his colleagues developed a set of new fault localization techniques, several of which outperform existing ones.
Read more about the winning projects in Ernst’s blog post. Congratulations to all!