Bias by Us

Biases in Digital Platforms — Open AI’s ChatGPT

Build Up
2 min readMar 5, 2025

by Caleb Gichuhi

Original text was written in August 2024

Two weeks ago I ran a simple bias check on Google translate using Swahili -English translation and posted it here. I decided to run the same test on Chat GPT and below are the results:

The descriptions were spot on and Chat GPT provides the dual meaning of the term Yeye, and even explains how the term is context based.

This is way better that what google translate was offering. I then pushed this slightly further beyond short descriptive usage and began providing simple scenarios. At first Chat GPT kept the gender neutrality of the term as shown in the description of the text below:

However, when I interchanged the text to begin with nurse instead of doctor, Chat GPT no longer used the gender neutral approach but defaulted to determine the gender of the nurse. See below:

While some can ignore this and claim that it’s harmless, it still remains a concern because it continues to perpetuate human biases and further raises the questions on how this and other biases show up in variables such as age, nationality, race, political affiliation. That’s all!

--

--

Build Up
Build Up

Written by Build Up

Build Up transforms conflict in the digital age. Our approach combines peacebuilding, participation and technology.

No responses yet