Honored for advanced research that addresses some of the most fundamental theories and practical uses of machine learning, Cyrus Cousins will receive the Joukowsky Outstanding Dissertation Award at the Graduate School Commencement Ceremony on May 1. His dissertation in Computer Science is titled “Bounds and Applications of Concentration of Measure in Fair Machine Learning and Data Science.” The prize recognizes the superior research achievements of doctoral students.
“Cyrus’ thesis addresses some of the most fundamental questions in modern machine learning. He treats these problems with a rigorous mathematical approach, and has obtained significant, impactful results extending the state-of-the-art understanding of the mathematical foundations of machine learning, its practical applications, and its limitations,” says his advisor Eli Upfal, the Rush C. Hawkins Professor of Computer Science.
The first part of his thesis develops new upper bounds on the generalization error of learning algorithms. The key question is how much labeled data a learning algorithm needs in order to accurately approximate an unknown function. This question was originally raised between the 1920s and 1960s, with Vapnik-Chervonenkis theory and its predecessors, and the answer depends on the quality and diversity of the options an algorithm can select from for its output. The modern tool for addressing this question is a complexity measure called the “Rademacher average”, which is at the center of Cyrus’ work.
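For readers who want the formal statement, a standard textbook form of this machinery (not necessarily the exact variant sharpened in the dissertation) is the following. For a class $\mathcal{F}$ of functions with values in $[0,1]$ and a sample $x_1, \ldots, x_n$, the empirical Rademacher average is

$$\hat{\mathfrak{R}}_n(\mathcal{F}) = \mathbb{E}_{\sigma}\left[ \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \right],$$

where the $\sigma_i$ are independent uniform $\pm 1$ (“Rademacher”) variables. A classical result then guarantees that, with probability at least $1 - \delta$, every $f \in \mathcal{F}$ satisfies

$$\mathbb{E}[f] \le \frac{1}{n} \sum_{i=1}^{n} f(x_i) + 2\,\hat{\mathfrak{R}}_n(\mathcal{F}) + 3\sqrt{\frac{\ln(2/\delta)}{2n}},$$

so the richer the class of candidate functions, the larger the Rademacher term, and the more data is needed to learn reliably.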
One of the surprising results in this part is that unlabeled data can be more helpful than previously thought in obtaining a strong learning algorithm. Machine learning algorithms need labeled data, which can be very expensive and time-consuming to obtain; Cousins’ novel work shows that comparatively cheap unlabeled data can shoulder more of the burden than previously believed (intuitively, quantities like the Rademacher average above depend only on the inputs, not the labels, and so can be estimated from unlabeled data).
Cousins developed these ideas and completed the majority of the technical work. “This elegant result demonstrates Cyrus’ creative and original approach, and his outstanding mathematical insight and techniques,” says Upfal.
The second part of his thesis addresses the timely subject of algorithmic fairness, namely, how to guarantee that a learning algorithm treats all members of a population equitably. Learning algorithms can perpetuate any bias present in their training data: if the data reflect a systematic bias against a minority, a machine learning algorithm will produce results that confirm, and potentially exacerbate, that bias. Addressing this issue is especially important when an algorithm can influence public policy on matters such as zoning neighborhoods or redrawing electoral district boundaries. Cousins’ work here is technically challenging, but it will serve as a foundation for continued research and moves the entire field of study in the right direction.
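As a minimal illustration of what “equitable treatment” can mean computationally (a generic sketch, not the specific criterion developed in the dissertation), one can measure a model’s error separately on each population group and track the worst-group error:

```python
from collections import defaultdict

def worst_group_error(predictions, labels, groups):
    """Compute per-group error rates and return the worst one with the breakdown."""
    errors = defaultdict(int)
    counts = defaultdict(int)
    for pred, label, group in zip(predictions, labels, groups):
        errors[group] += int(pred != label)
        counts[group] += 1
    rates = {g: errors[g] / counts[g] for g in counts}
    return max(rates.values()), rates

# A model that is 80% accurate overall, yet wrong on every member of
# the minority group "b": the average hides what the worst group sees.
preds  = [1, 1, 1, 1, 1, 1, 1, 1, 0, 1]
labels = [1, 1, 1, 1, 1, 1, 1, 1, 1, 0]
groups = ["a", "a", "a", "a", "a", "a", "a", "a", "b", "b"]
print(worst_group_error(preds, labels, groups))  # (1.0, {'a': 0.0, 'b': 1.0})
```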
The third section of his dissertation studies the theoretical properties of algorithms that operate on non-independent data. Cousins obtained interesting results for database sampling (a common approach that speeds up data analytics by working with a small random sample) and for Markov chain problems, which formalize a large number of realistic situations. This work builds on concepts developed in the first section and yields very practical solutions.
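As an illustration of the database-sampling idea (a minimal sketch using a simple Hoeffding bound and assuming independent samples, rather than the sharper concentration tools developed in the dissertation), the following estimates an aggregate over a large table from a small random sample, together with a confidence radius:

```python
import math
import random

def estimate_mean(values, sample_size, delta=0.05, value_range=1.0):
    """Estimate the mean of `values` from a uniform random sample.

    Returns the sample mean and a Hoeffding-style radius epsilon: with
    probability at least 1 - delta, the true mean lies within epsilon of
    the estimate (assuming values span an interval of width `value_range`
    and samples are drawn independently).
    """
    sample = random.choices(values, k=sample_size)  # uniform, with replacement
    estimate = sum(sample) / sample_size
    epsilon = value_range * math.sqrt(math.log(2 / delta) / (2 * sample_size))
    return estimate, epsilon

# Approximate an aggregate over a ten-million-row "table" from 1,000 rows.
table = [random.random() for _ in range(10_000_000)]
est, eps = estimate_mean(table, sample_size=1000)
print(f"mean is approximately {est:.3f} +/- {eps:.3f} with 95% confidence")
```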
In addition to working on the research presented in his thesis, Cousins also contributed to three other research projects. During his time at Brown he collaborated with many faculty members, post-docs, and students. “Cyrus has also been an enthusiastic mentor to other graduate students, and in particular to undergraduate students working with my group,” shares Upfal.
About his work and process, Cousins openly shares that he has a debilitating joint condition that makes writing and other work extremely difficult. He credits friends and colleagues from around the globe with helping him further his ideas and research.
“The highlight of writing this thesis, and indeed my time at Brown, was the chance to learn and work closely with other young researchers on exciting problems. In particular, I treasured the chance to work with the brilliant Shahrzad Haddadan, whose collaboration formed the basis for chapter seven, as well as Enrique Areyan, with whom I've submitted countless manuscripts, and Alessio Mazzetto, our work together eventually making its way into chapter two,” says Cousins. Haddadan is a postdoctoral researcher; Areyan and Mazzetto are fellow PhD students in Computer Science.
Cousins’ work in chapter two was published in 2020 in Advances in Neural Information Processing Systems. The remaining chapters are awaiting publication. Learn more about his work and research.
He expounds on what this award means to him, noting that it is at once a recognition of the quality of his work, an appreciation of the contributions he has made across several areas, and a reflection of the inherent importance of the problems he addresses, rather than of their explicit solutions.
“I've become increasingly concerned with the catastrophic real-world consequences of the (mis)application of machine learning methods, and wanted to study these sorts of problems with the same academic rigor I had previously applied to statistical analysis. As an inherently interdisciplinary body, perhaps receiving the Joukowsky prize is a reflection of not just the scholarly, but also the human value of my work,” says Cousins.
Following graduation, Cousins will begin as a Dean's Faculty Fellow in the fall, teaching as a visiting assistant professor at Brown University.
Read more about the other Joukowsky Prize winners.