Brooke Gladstone: As AI technologies advance, many critics observe that these tools replicate the prejudices of the data they train on.
News clip: One UC Berkeley professor was able to trick ChatGPT into writing a piece of code to check if someone would be a good scientist based on their race and gender. A good scientist, it found, was white and male.
Brooke Gladstone: In 2020, a prominent researcher named Timnit Gebru said that Google fired her after she highlighted harmful biases in the AI systems that support Google’s search engine. Today, she runs a research institute rooted in the belief that AI is not inevitable, its harms are preventable, and when it includes diverse perspectives, it can even be beneficial, but–
Timnit Gebru: We should have guardrails in place and we should make sure that the group of people involved in creating the technology resemble the people who are using the technology.
Brooke Gladstone: New players have joined the field to address that issue. In December, a startup called Latimer AI announced a licensing agreement with the largest and oldest Black newspaper in New York City, the New York Amsterdam News. The partnership began when Latimer’s founder, John Pasmore, approached an old friend, Elinor Tatum, the publisher and editor of the paper.
Elinor Tatum: To me, it was a no-brainer, because we know how our community can be so misrepresented in media in general, and because of that and the way large language models learn.
Brooke Gladstone: The biases are built into the models. They scrape the internet, and there’s a lot of real garbage out there.
Elinor Tatum: Garbage in, garbage out. If there are things that misrepresent our communities in what it’s learning from, the idea of being a part of something that is going to give a correct narrative, I thought, was very important.
Brooke Gladstone: Lewis Latimer, I understand, was a Black inventor whose legacy and scientific contributions were often overlooked. That’s who the company’s named for. Tell me about this company, Latimer.
Elinor Tatum: They’re working very hard to make sure that the information that is coming from sources that are Black is getting out there to the public. They are actually training the model partly on the archives of the Amsterdam News, going back to 1926. There may be some things that just weren’t covered in other media that were covered by the Amsterdam News. If we look at the Central Park jogger case, for instance, we will see very different coverage coming out of the Amsterdam News than we would’ve seen out of any of the other newspapers. We may see a difference in what Latimer would produce versus another AI search because there would be very different information, even in coverage of the Macy’s Thanksgiving Day Parade.
Brooke Gladstone: What do you have in mind there?
Elinor Tatum: Because it used to start in Harlem.
Brooke Gladstone: I understand you spent several months figuring out how to work together. Can you tell us anything about your arrangement?
Elinor Tatum: The actual agreement with Latimer is confidential, but I can say that what we have right now is not permanent, and we will be renegotiating our relationship as we get a better understanding of what the real value of the data is. This is all very new, especially in terms of Latimer, because they’re very much a startup.
Brooke Gladstone: When you talk about an evolving relationship, do you expect to ever make any money out of it?
Elinor Tatum: I certainly believe that we will. There’s definitely a number attached to it, and the model is going to be placed in places like HBCUs across the country as a starting point, and go from there. They’ve already got relationships set up with several HBCUs around the country.
Brooke Gladstone: Latimer said in its press release that it’s “constructing an LLM that represents the future of AI, where these models are built to better serve distinct audiences.” Clearly, in this case, the distinct audience includes the countless people who’ve been served for over a century by the Amsterdam News and the historically Black colleges and universities. That’s great. If you had the chance, Elinor, would you want to combat these built-in biases that your archive could help correct by training a much bigger platform intended to reach nearly everybody, like ChatGPT?
Elinor Tatum: Well, doesn’t everyone have to start somewhere?
Brooke Gladstone: Yes, but if you had a chance, you’d go as big as you could.
Elinor Tatum: Well, I would like to see Latimer be as large as or larger than ChatGPT or any of these, because I believe that it could be, with the right technology, with the right infrastructure, with the right information being put into it. You see, all of the world needs the diversity that Latimer is going to provide. I am hoping that Latimer gets into every HBCU in the country to start with, then into public libraries across the country, and then to the general public. General internet users are already signing up and using it. I’m hoping that it becomes another commonplace tool, just like ChatGPT.
Brooke Gladstone: It’s really refreshing to hear this perspective. It’s unique because it’s not based on, well, if you can’t beat them, join them. It’s not focused on running a more efficient operation with fancy AI tools, on making lots of money, or even on losing less money at this point. It really is about improving the media ecosystem.
Elinor Tatum: Absolutely. I really feel strongly about Latimer because if you don’t have the voices of the people that are being represented, you’re not going to have a correct representation of people. That’s why I feel it is so very important to have our voices included in all media, and that includes these large language models.
Brooke Gladstone: You have no fear of AI taking journalism down?
Elinor Tatum: I think everyone has some fears of it, but journalism is still very much needed, and I want to make sure that there is quality information out there that’s going to be added to it. Now, does AI need some help? Are there a lot of issues? Yes. AI has a lot more learning that needs to be done, and with every day, every week, every month, and every year, advances are made, and more advances need to be made. It’s an ever-evolving process, and I’m looking forward to seeing what comes next. I’m very excited to be a part of it.
Brooke Gladstone: What sort of a future are you hoping to build together?
Elinor Tatum: Well, one that is long and lucrative, but also one that is going to bring information to people that shows the true breadth, texture, and color of our communities, that tells the stories and brings out the information that has been so long overlooked by other keepers of history, so that when people ask questions, they get the answers that aren’t so easily found.
Brooke Gladstone: Elinor, thank you very much.
Elinor Tatum: Well, thank you for having me.
Brooke Gladstone: Elinor Tatum is the editor-in-chief of the New York Amsterdam News.
[music]
Micah Loewinger: Coming up: with AI, it’s easy and profitable to make highly trafficked and highly stupid conspiracy videos.
Brooke Gladstone: This is On The Media.
Copyright © 2024 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.