When artificial intelligence image generators such as DALL-E and Stable Diffusion are asked to generate photos of “attractive people,” the results are pages upon pages of fair-skinned women and men.
What about toys in Iraq?
The page reloads, and dozens of images pop up of toy soldiers wielding AK-47s and RPGs.
Humans are smart enough to know that not all attractive people are white and that Iraq has toys other than just soldiers, but these results show how racial bias in AI exposes shortcomings in tech development. Many of these models are developed by Western companies, so the data used to train these AI tools mirrors Western views and biases.
It’s time to re-examine how AI tools are made and what data leads them to these racist conclusions. As AI technology becomes a bigger part of our day-to-day lives, the far-reaching global impact of racism in AI development stands only to worsen. This underscores the importance of prioritizing diversity in AI software development and technology in general.
Examples like the ones above show how racial bias can affect the way AI tools function. The bias isn’t the fault of the AI tool itself; it’s a disturbing result of the tool’s development. A generator’s bias stems from the data it learns from, in this case race-related information, and when that data lacks diversity, the results will too, according to Angle Bush, founder of the technology and diversity organization Black Women in AI. To remedy this, we need more diverse data to train AI models.
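To see that mechanism in miniature, consider the toy sketch below. The “training set” and labels are hypothetical and exaggerated for clarity; this is not any real generator’s code or data, only an illustration of how a model that samples from skewed data reproduces that skew.

```python
# Toy illustration: skewed training data produces skewed outputs.
# The dataset below is hypothetical, not drawn from any real model.
import random
from collections import Counter

# Imagine a tiny "training set" of images labeled "attractive person"
# in which 90 of 100 examples depict fair-skinned subjects.
training_labels = ["fair-skinned"] * 90 + ["darker-skinned"] * 10

def generate(n):
    # Like many statistical models, this "generator" simply reproduces
    # the distribution of whatever it was trained on.
    return [random.choice(training_labels) for _ in range(n)]

print(Counter(generate(1000)))
# Typical output: roughly 900 "fair-skinned" to 100 "darker-skinned".
# The model isn't "choosing" to be biased; it mirrors its data.
```

The point of the sketch is simple: the fix has to happen in the data, because the model will faithfully echo whatever imbalance it is given.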
Incomplete or skewed data shows why diversity must be weighed heavily when building such models. Theoretically, biased AI could have devastating effects on people of color’s finances, education, housing and more.
It’s no longer theoretical. Right now, we’re witnessing the concerning effects of bias in AI development in the real world.
Housing discrimination is a prime example. It’s no secret that mortgage and rental applications have historically been skewed against people of color due to redlining and other racist housing practices.
The UC Berkeley Schools of Business and Law report that people of color applying to purchase or refinance homes are overcharged millions of dollars, and the American Civil Liberties Union reports that Black and Latino tenants are denied at much higher rates than white tenants. These disparities track with the AI models that large banks and property managers are beginning to employ to automate financing and application decisions, models whose outputs suggest anti-Black and anti-Latino bias.
Despite monumental strides to remedy this, like the passage of the Fair Housing Act, that progress risks being undone as old biases carry over into the housing industry through AI.
The essential question isn’t how AI will be implemented, but who it’ll benefit.
Since these problems aren’t the fault of the AI itself but of the developers and the data, diversifying data sets is a straightforward place to start. But when diversity isn’t among the first considerations as tech develops, we run into problems like these. It’s high time we made diversity paramount in the tech world.
Diversity should be a central consideration in all fields — including technology — instead of just an afterthought.
AI is a fascinating new tool that could mark the beginning of unimagined societal advancements, but it also has the potential to entrench minority groups deeper in the discrimination of the past. Let’s make sure we accomplish the former and minimize the latter.
Staying in contact with your representatives through email, phone calls and protest is essential to creating legislation that governs AI. Additionally, staying up to date on the news and making your voice heard are key to ensuring tools like AI maximize positive progress instead of setting people of color back.
AI has the power to move us into the future; let’s not let it leave us in the past.
The 2024-25 editorial board consists of Addie Moore, Avery Anderson, Larkin Brundige, Connor Vogel, Ada Lillie Worthington, Emmerson Winfrey, Sophia Brockmeier, Libby Marsh, Kai McPhail and Francesca Lorusso. The Harbinger is a student-run publication. Published editorials express the views of the Harbinger staff. Signed columns published in the Harbinger express the writer’s personal opinion. The content and opinions of the Harbinger do not represent the student body, faculty, administration or Shawnee Mission School District. The Harbinger will not share any unpublished content, but quoted material may be confirmed with the sources. The Harbinger encourages letters to the editors, but reserves the right to reject them for reasons including but not limited to lack of space, multiple letters on the same topic and personal attacks contained in the letter. The Harbinger will not edit content, though letters may be edited for clarity, length or mechanics. Letters should be sent to Room 400 or emailed to smeharbinger@gmail.com.