Assistant Professor John Dickerson Part of Google Award to Promote Diversity and Fairness in AI Systems
Two faculty members, one from the University of Maryland and the other from Tulane University, have received funding from Google to promote diversity and fairness in pipeline and selection problems that increasingly rely on artificial intelligence (AI) algorithms, including hiring practices, graduate school admissions and customer acquisition.
John Dickerson, an assistant professor of computer science at the University of Maryland, and Nicholas Mattei, an assistant professor of computer science at Tulane, received a $60,000 award as part of the Google Research Scholar Program, an initiative that supports early-career professors pursuing research in fields relevant to Google.
The researchers say their work aims to operationalize responsible AI practices and techniques in real-world systems, informed by data from actual selection processes at their own universities. Their goal is to produce an open-source toolkit, preliminary studies and a whitepaper for policymakers to discuss.
The research grows out of a paper Dickerson and Mattei co-authored, “We Need Fairness and Explainability in Algorithmic Hiring,” presented virtually last spring at the 2020 International Conference on Autonomous Agents and Multiagent Systems (AAMAS).
Focusing specifically on graduate admissions, a form of academic hiring, Dickerson and Mattei will examine two key questions: how to allocate limited resources in the process, such as budget and interview slots, and how to explain the decisions made by their algorithm in a transparent and legally compliant way.
“Several reports related to algorithmic hiring, including one from the nonprofit Upturn, motivate us to focus on how to allocate additional human resources to these problems,” says Dickerson, who also has an appointment in the University of Maryland Institute for Advanced Computer Studies. “We feel that we must treat issues of bias and fairness as first-order concerns in any system that may have an impact on people.”
The researchers anticipate their work will directly address questions of transparency, constraints and fairness in complex, multistage decision-making problems that must conclude with a recommendation or selection.
This type of sequential decision-making problem is typically optimized using what are known as “multi-armed bandit algorithms,” in which a fixed, limited set of resources must be allocated among competing choices so as to maximize expected gain.
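To make the idea concrete, here is a minimal, illustrative sketch of one classic bandit strategy, epsilon-greedy, in Python. It is not the researchers' system; the applicant pools, their success rates and the interview budget below are all hypothetical.

```python
import random

def epsilon_greedy(arms, budget, epsilon=0.1):
    """Allocate a fixed budget of pulls among competing arms.

    With probability epsilon, explore a random arm; otherwise exploit
    the arm with the highest running mean reward so far.
    """
    counts = [0] * len(arms)    # pulls spent on each arm
    values = [0.0] * len(arms)  # running mean reward of each arm

    for _ in range(budget):
        if random.random() < epsilon:
            i = random.randrange(len(arms))                     # explore
        else:
            i = max(range(len(arms)), key=lambda j: values[j])  # exploit
        reward = arms[i]()
        counts[i] += 1
        values[i] += (reward - values[i]) / counts[i]  # incremental mean
    return counts, values

# Hypothetical example: three applicant pools whose (unknown) success
# probabilities differ; the bandit learns where to spend a limited
# budget of interview slots.
pools = [lambda p=p: float(random.random() < p) for p in (0.2, 0.5, 0.7)]
counts, values = epsilon_greedy(pools, budget=1000)
print(counts, [round(v, 2) for v in values])
```

Note that the objective here is expected gain alone; nothing in the loop encodes fairness or legal constraints, which is precisely the gap the researchers describe below.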
The concern, the researchers say, is that these algorithms may optimize for criteria they were never intended to target, or for criteria that may not be legal under certain hiring laws.
The researchers believe their work will support their thesis that data-driven approaches to measuring and promoting fairness at a single stage of the talent sourcing process can be extended beyond graduate admissions. This includes AI technologies applied to internal product ideation and review, academic proposal reviewing, advertising selection, or any setting that involves the collection of recommendations from experts.
Dickerson and Mattei have collaborated for many years, and the contrast between their institutions should prove an advantage in the research: UMD is a large public university in a wealthy region, while Tulane is a smaller private university in a lower-income part of the country.
“The graduate student application profiles at the two schools are very different and will lead to different concerns and distributions of data,” Mattei explains. “We believe this diversity strengthens the results that will arise from the data-driven validation of our model.”
This story was adapted from a news release authored by Barri Bronston that was published by Tulane University.
The Department welcomes comments, suggestions and corrections. Send email to editor [-at-] cs [dot] umd [dot] edu.