Jul 22, 2019 | Atlanta, GA
When Samira Samadi was in her third year of college, she took a class on the theory of computation where she learned about the Turing machine. The Church-Turing thesis, named after two renowned mathematicians, states that a simple abstract machine can solve any problem that is algorithmically solvable. The class was life-changing for the Sharif University of Technology mathematics student.
“It was super cool that such a simple structure can describe any algorithm or computation that a computer or a human could do,” she said. “It was a window for me that what I study could go far beyond the abstract. I could do great things with it.”
After the class, Samadi shifted her focus to theoretical computer science. Now a fourth-year Ph.D. student in the School of Computer Science, she has made it her mission to apply fundamental ideas in theoretical CS to practical problems, such as helping people generate secure passwords more easily.
Breaking theory down so it is simple for anyone to understand and benefit from has always been how Samadi approaches problems.
“When I start thinking about a problem, I have to first get rid of a lot of conditions and transform it into a basic question, and that is the point when I can think creatively and connect the problem to different concepts,” she said. “That is when the motivation and the excitement sparks, thinking that what I’m doing could be used by another person or in an influential application.”
One of those connecting points is the growing field of fairness in machine learning (ML). As more products and services use ML for everything from home loans to self-driving cars, ethical issues can arise because much of the data that models are trained on is historical and subject to bias.
Many researchers attribute bias to the model or the data itself, but as Samadi has been researching, it can often start as early as the data processing step. Last year, she offered a solution to remove bias from one of the first steps in the ML pipeline, principal component analysis (PCA). Her work reduced bias in low-dimensional representations of large datasets. This year, she tackled models to ensure group data was fairly represented.
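To see why a preprocessing step like PCA can introduce disparity, consider a minimal sketch (illustrative only, not Samadi's algorithm): when standard PCA is fit on pooled data from two groups with different structure, the learned low-dimensional representation can reconstruct one group much more accurately than the other. The synthetic groups and the error metric below are assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic groups with different covariance structure:
# group B's features are stretched unevenly relative to group A's.
group_a = rng.normal(size=(200, 10))
group_b = rng.normal(size=(50, 10)) @ np.diag(np.linspace(0.2, 2.0, 10))

# Fit standard PCA on the pooled, mean-centered data via SVD.
data = np.vstack([group_a, group_b])
centered = data - data.mean(axis=0)
k = 3
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:k]  # top-k principal directions, shape (k, 10)

def avg_reconstruction_error(x):
    """Mean squared error after projecting onto the top-k components."""
    projected = x @ components.T @ components
    return float(np.mean((x - projected) ** 2))

# Evaluate each group separately on the shared representation.
err_a = avg_reconstruction_error(centered[:200])
err_b = avg_reconstruction_error(centered[200:])
print(f"group A error: {err_a:.3f}, group B error: {err_b:.3f}")
```

The two per-group errors generally differ, meaning the compressed representation serves one group worse. Fair variants of PCA aim to balance this kind of disparity rather than minimize only the average error.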
Fairness is a burgeoning field, which enables Samadi to pick the problems that most interest her, but she also has a more personal motivation for this research as a woman in computer science.
“In fifth grade, I was the only girl competing for math olympiad in our state,” she said. “That was probably the first time I felt I would be in the minority if I continued in this field.”
The experience has shaped how she views gender in the field ever since.
“No matter how confident one is, working in a male-dominated environment can negatively affect a woman’s perception of her abilities,” she said. “If I was in a more diverse environment or if I had seen more women in my field, this harm could be prevented, so I want to really grab this opportunity now that people care about fairness, diversity, and discrimination.”