UW–Madison Computer Sciences PhD student Lisheng Ren, his advisor Sheldon B. Lubar Professor Ilias Diakonikolas, and UC-San Diego Professor Daniel Kane have received the Mark Fulk Best Student Paper Award at the 2025 Conference on Learning Theory (COLT). COLT is the premier international conference on the mathematical foundations of machine learning and artificial intelligence.
Their award-winning paper, “Faster Algorithms for Agnostically Learning Disjunctions and their Implications,” makes the first algorithmic breakthrough in two decades on a classical problem in learning theory: how to efficiently learn disjunctions—Boolean “OR” functions—from noisy or imperfect data.
“Learning disjunctions is one of the most basic problems in the theory of supervised learning,” explains Diakonikolas. “When data is perfectly consistent with a disjunction, we’ve had simple algorithms for decades. But when the data is only mostly consistent, the problem becomes much harder, and progress had stalled for twenty years. Our work shows that surprising new algorithms are possible, even in areas that seemed long stuck.”
“It is an honor for me and my collaborators to receive this award,” Ren says. “I see it as an encouragement to continue exploring this fascinating field.”
Why this research matters
Disjunctions (“OR” functions) are among the simplest logical rules we use to make decisions — for example, “admit a patient if they have symptom A or symptom B.” While such rules are easy to learn from perfectly clean data, real-world data is noisy. This work gives the first improvement in 20 years to algorithms that can reliably learn such rules even when the data is imperfect, advancing both the theory of machine learning and its ability to handle practical challenges.
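To make the setting concrete, here is a minimal illustrative sketch (not the paper’s algorithm) of what a disjunction is and what “mostly consistent” data looks like; the rule indices and the single flipped label are hypothetical:

```python
# A disjunction over Boolean features is an OR of a subset of them.
from typing import Sequence

def disjunction(relevant: Sequence[int], x: Sequence[bool]) -> bool:
    """OR of the features at the given indices, e.g. 'symptom A or symptom B'."""
    return any(x[i] for i in relevant)

# Clean setting: every label exactly matches some disjunction -> easy to learn.
rule = [0, 2]  # hypothetical target rule: feature 0 OR feature 2
clean = [((a, b, c), disjunction(rule, (a, b, c)))
         for a in (False, True) for b in (False, True) for c in (False, True)]

# Agnostic (noisy) setting: some labels disagree with *every* disjunction;
# the goal is a disjunction with nearly the fewest disagreements possible.
noisy = [(x, (not y) if i == 0 else y) for i, (x, y) in enumerate(clean)]
errors = sum(label != disjunction(rule, x) for x, label in noisy)
print(errors)  # the target rule now disagrees with one corrupted label
```

Finding the best-fitting disjunction under such corruptions, rather than an exactly consistent one, is what makes the agnostic problem computationally hard.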