AI Needs You

Freya Gulamali reflects on Jeff Ward's Seminar, "Balancing Big Data: The Push and Pull of Competing Ethical Values"
“AI needs you.” These were the words that Jeff Ward, Director of the Duke Center for Law & Technology, stated as the core truth of his presentation. Between machine learning algorithms that can play games, write poetry, or identify objects, and recent controversy over whether Google’s AI chatbot is sentient, it is easy to forget how dependent these algorithms are on the data humans feed them. In the health AI field, where some fear a future in which doctors are replaced by machines and bedside manner deteriorates beyond what we can imagine, “AI needs you” is a defining truth. Currently, AI works largely through clinical decision “support” (CDS) tools, in which algorithms ally with humans to diagnose or treat patients, augmenting rather than replacing human capabilities.
This newfound alliance between human and machine is why Jeff Ward believes that these tools are more than just models or algorithms that take in an input and spit out an output—they are socio-technical systems. This phrase, socio-technical system, places AI in the reality of its context, requiring conscious human effort to design, interpret, monitor, and evaluate it. Sepsis Watch, a tool at Duke designed to predict a patient’s risk of developing sepsis, was touted as one of the few health AI models implemented in practice. Making it work required years of human labor in data collection, model design, integration into clinical workflows, and ongoing monitoring.
As Jeff Ward pointed out, without this extra effort, and because of the “black-box” nature of machine learning, these algorithms can be either useless or dangerous. He highlighted the findings of ProPublica’s piece, Machine Bias, as a prime example. That investigation revealed how a machine learning model assessed a person’s likelihood of committing a future crime differently depending on the color of their skin. Another astonishing example Jeff Ward spoke of was Target’s ability to predict that one of its customers was pregnant before she knew it herself. When machine learning models are capable of inferring protected information through proxies and of absorbing the biases present in our society, Jeff Ward claimed, the result is a “crisis of trust.”
This “crisis of trust” weighs even heavier following the recent overturning of Roe v. Wade. In a day and age when companies have enough data to track a woman from the time she becomes pregnant to when she visits a Planned Parenthood or orders abortion pills, data laws carry new significance. Jeff Ward urged us to believe that we can work through this crisis by building trust—trust that a future of fairness, privacy, and security is possible, one we can balance with the promises of machine learning. Despite everything, or rather because of it, he treasures the rising awareness of why privacy matters, and he challenges us to inspire the next Cambrian explosion of ideas that will manifest a better future.
Freya Gulamali, Huang Fellow ’25
Freya is a rising sophomore from Bellevue, Washington, planning to major in Computer Science on the pre-med track.