Ben Shepard

Duke News

Keep up with our core and affiliated faculty in the national and international news. Read their op-ed pieces, quotes and interviews, and cutting-edge research findings.

Mon, Mar 04

AI on Trial: Bot-Crossed Lovers

What happens when an AI chatbot takes part in a crime?

This is the first episode of a Stay Tuned miniseries, “AI on Trial,” featuring Preet Bharara in conversation with Nita Farahany, professor of law and philosophy at Duke University.

Preet and Nita discuss the hypothetical case of a nurse who is caught stealing medications and redistributing them to people living in poverty. As it turns out, an artificial intelligence chatbot with which the nurse is in love aided her crimes and coerced her into carrying them out. How might the AI’s assistance affect the nurse’s criminal liability and a potential prosecution? And how do we even begin to think about holding the AI itself accountable for the harm it causes?

Tue, Feb 27

Anne Crabill, Science and the Public, Class of 2023

“I came into Duke with a strong interest in healthcare, biology, and the government and struggled to reconcile my seemingly disparate interests. The Science and the Public cluster helped me broaden my interests and celebrate the interdisciplinary nature of the relationship between science and society…

Read More 

Wed, Feb 21

It’s HIPAA, not HIPPA

People often have the misconception that HIPAA is a health privacy law that protects all health data and gives them a right not to disclose their sensitive information. The unfortunate truth is that the “Health Insurance Portability and Accountability Act” is far from the privacy law most people believe it to be. HIPAA applies only to a select group of organizations, such as medical providers and insurers, called “covered entities,” along with associated entities called “business associates.” In the past few years, there has been an explosion of digital health websites and mobile apps that collect, store, use, and sell health data. However, the majority of these services don’t connect users with a medical provider or require health insurance, and they are therefore not covered by the data protections and regulations HIPAA imposes.

2024 Duke Data Privacy Panel Photo

To better understand the evolving environment of non-HIPAA-covered health data, I attended Duke’s Data Privacy Day, a two-panel conference at the Duke Law School on February 2nd. In the first panel, privacy expert Marc Groman and medical provider Dr. David Reitman discussed the data privacy concerns of mental health apps. David highlighted how health providers see mental health apps as an outside-the-box solution to the current shortage of mental health professionals, while Marc underscored how mental health apps turn patients into products by selling sensitive health information collected during services. Hearing from both a privacy and a medical perspective shed light on the tension between the duty of care and the right to privacy, and it offered a concrete example of a category of apps that collect health data yet are not subject to HIPAA regulation.

The second panel, composed of privacy experts with distinctive backgrounds, discussed potential solutions for protecting the growing quantity of non-HIPAA-covered health data. The panel generated interdisciplinary conversations about how to protect health data while maximizing its benefit to society. Each panelist brought a unique perspective to the table, grounded in the concerns, limitations, and arguments of their respective fields. Dr. Rachele Hendricks-Sturrup, a Duke health policy researcher in real-world evidence, was particularly attuned to how non-health data, like location data or credit card data, can be used to infer information about your health. Maneesha Mithal, a lawyer and former leader of the FTC’s Division of Privacy and Identity Protection, expanded on the FTC’s evolving role in protecting health data.

2024 Duke Data Privacy Group Photo

Health data privacy is a complex problem that requires a diverse set of perspectives and expertise to solve. Duke’s Data Privacy Day is just one example of the co-curricular events sponsored by Science & Society that focus on creating an environment where interdisciplinary conversations are welcomed and thrive. As an MA in Bioethics and Science Policy student, I frequently engage in classroom dialogue with fellow students and professors. However, the hallmark of my education thus far has been events like Data Privacy Day, where I can interact with and learn from experts from around the world, gaining insight into the most pertinent conversations in ethics and policy.


Liz Sparacino, Duke MA in Tech Ethics & Policy

Liz Sparacino enrolled in the Duke Master of Arts in Bioethics & Science Policy to better understand how to articulate and advocate for the bioethical issues that arise at the intersection of science, technology, and society. Throughout her career, she hopes to address these concerns before new genetic technology is implemented and to continue to advocate for people with disabilities.

DISCLAIMER: These reflections represent the views of the student and not necessarily the views of the Duke Initiative for Science & Society or the Bioethics & Science Policy Master’s Program. Our program represents myriad views and ideologies, and we welcome open discussion on potentially controversial subject matter as it relates to society.

Mon, Feb 05

Elon Musk Says Neuralink Has Implanted Its First Brain Chip In Human

Elon Musk, the billionaire founder of the neurotechnology company Neuralink, has said the first human received an implant from the brain-chip startup and is recovering well.

Read More

Wed, Dec 20

Rite Aid’s ‘Reckless’ Use of Facial Recognition Got It Banned From Using the Technology in Stores for Five Years

Rite Aid has agreed to a five-year ban from using facial recognition technology after the Federal Trade Commission found that the chain falsely accused customers of crimes and unfairly targeted people of color.

The FTC and Rite Aid reached a settlement Tuesday after a complaint accused the chain of using artificial intelligence-based software in hundreds of stores to identify people Rite Aid “deemed likely to engage in shoplifting or other criminal behavior” and kick them out of stores – or prevent them from coming inside.

But the imperfect technology led employees to act on false-positive alerts, which wrongly identified customers as criminals. In some cases, according to the FTC, Rite Aid employees publicly accused people of criminal activity in front of friends, family and strangers. Some customers were wrongly detained and subjected to searches, the FTC said.

Read More