How Howard University Is Helping Tech Understand Black Speech
Howard and Google teamed up to improve voice recognition.
Lucretia Williams, a researcher at Howard University, is regularly interviewed about her work, and she’s noticed that her quotes sometimes aren’t quite right (the “-ed” might get dropped from a word, for example). She attributes this issue to transcription technology, which—like all voice-recognition software—can struggle with Black speech. That’s a problem that she and her team are now trying to address. Williams has spent much of the past two years leading an effort to help voice-recognition technology better understand Black voices.
Project Elevate Black Voices, a partnership between Howard and Google, is an effort to broaden the data sets that teach software to recognize human speech. Black speech hasn’t traditionally been well represented, leading the systems to struggle with the distinctive grammar, pronunciation, and vocabulary of African American English. The result is that Black speakers using voice-driven AI tools and other technologies often encounter responses like “I’m not sure I understand.” The idea for the project was originally hatched by another Howard professor, Gloria Washington, and Google researcher Courtney Heldreth. (Google is funding the project.)
Tools like Siri and Amazon’s Alexa interpret and respond to commands like “Play Beyoncé,” but a 2020 study found that top-tier voice-interpretation systems have higher error rates for Black users than for white ones—a 22 percent gap for Apple, 15 percent for Amazon, and 12 percent for Google. Williams says this forces many Black Americans to address these pieces of technology in standard English. “You shouldn’t have to code-switch when you talk to your personal devices,” she says.
To help solve that issue, Williams and her team recruited more than 530 African Americans across 32 US states to participate. “Oftentimes, when research is done like this in vulnerable communities, the researchers get more out of it than the actual participants,” Williams says. In this case, participants received up to $599 for three weeks of answering questions, and the team is being careful with how it uses the material it has collected. The data is currently available only to Google and to historically Black colleges and universities that apply for access for specific projects. Howard is maintaining ownership of the data to ensure it isn’t misused in ways that would compromise privacy or otherwise harm participants.
As a result of that collection effort, Project Elevate Black Voices has now created a data set of 600 hours of responses to questions like “What are your hobbies?” A Black-owned transcription company then processed all the recordings, and Google will use them to improve its offerings.
That’s just the first stage of the project, Williams says. The hope is to expand the data set to include additional dialects from across the African diaspora that are spoken in the United States. “When certain voices don’t get understood, it is a problem,” says Williams. “You shouldn’t have to feel excluded from the technologies that you use and pay for.”
This article appears in the August 2025 issue of Washingtonian.