Tuesday, November 28, 2023

ChatGPT Replicates Gender Bias in Recommendation Letters

November 28, 2023

In order to spit out human-sounding sentences, large language models must be trained on oodles of human writing. As a result, they're also trained on the inadvertent biases that we have encoded in our words throughout history. That's a problem when AI chatbots like ChatGPT are used to, say, write recommendation letters for employees. A new study found that bots were more likely to describe men with terms like "expert" and "integrity," and women with words like "warm" and "emotional." Using such AI tools in the office could entrench existing gender bias—and thus aggravate the discrimination women already face in the workplace.

Sophie Bushwick, Associate Editor, Technology

Inequality

ChatGPT Replicates Gender Bias in Recommendation Letters

A new study has found that the use of AI tools such as ChatGPT in the workplace entrenches biased language based on gender

By Chris Stokel-Walker

Policy

Firearm Forensics Has Proven Reliable in the Courtroom and in the Lab

Despite criticism, a slate of new scientific studies shows that forensic firearms analysis is a reliable scientific discipline that the criminal justice system should trust

By Raymond Valerio, Nelson Bunn

Artificial Intelligence

When It Comes to AI Models, Bigger Isn't Always Better

Artificial intelligence models are getting bigger, along with the data sets used to train them. But scaling down could solve some big AI problems

By Lauren Leffer

Climate Change

Air-Conditioning Discovery Eliminates Harmful Gases

Heat pumps are ubiquitous in the form of air conditioners. Scientists just invented one that avoids harmful refrigerant gases

By Davide Castelvecchi, Nature magazine

Behavior

It's Not All in Your Head--You Do Focus Differently on Zoom

Virtual meetings and video calls don't quite stack up to in-person interaction—and a new study proves it

By Lauren Leffer

Defense

What Would It Mean to 'Absorb' a Nuclear Attack?

The missiles on the Fort Berthold Reservation in North Dakota make it a potential target for a nuclear attack, and the word "absorb" doesn't come close to describing what the reality would be for those on the ground.

By Ella Weber | 19:05

Culture

The Science to Be Grateful for This Year

A year of exciting ideas and research has given us much to be grateful for

By Lori Youmshajekian

Robotics

Robotics 'Revives' a Long-Extinct Starfish Ancestor

Engineers and paleontologists teamed up to reconstruct an ancestor of starfish from the Paleozoic era and figure out how it moved

By Lauren Leffer

Policy

AI Needs Rules, but Who Will Get to Make Them?

Skirmishes at the U.K.'s AI Safety Summit expose tensions over how to regulate AI technology

By Chris Stokel-Walker

Defense

The Members of This Reservation Learned They Live with Nuclear Weapons. Can Their Reality Ever Be the Same?

The Mandan, Hidatsa and Arikara peoples are learning more about the missiles siloed on their lands, and that knowledge has put the preservation of their culture and heritage in even starker relief.

By Ella Weber | 14:55

Evolution

Our Evolutionary Past Can Teach Us about AI's Future

Evolutionary biology offers warnings, and tips, for surviving the advent of artificial intelligence

By Eliot Bush

QUOTE OF THE DAY

"The AI authors' writing often sounds like it was written by an alien; one Ortiz article, for instance, warns that volleyball 'can be a little tricky to get into, especially without an actual ball to practice with.'"

Maggie Harrison, Futurism

FROM THE ARCHIVE

Humans Absorb Bias from AI--And Keep It after They Stop Using the Algorithm

People may learn from and replicate the skewed perspective of an artificial intelligence algorithm, and they carry this bias beyond their interactions with the AI

