The Learning Equilibrium

Understanding Reality and Bias in AI-Generated Content

By Sanjay Mukherjee | October 18, 2024

Image depicting a commercial airline pilot interacting with AI

Recently I was working with AI models to generate visuals of airline pilots. As I evaluated the outputs, I thought about bias, fact, and accuracy. Some observations, questions, and thoughts that came to mind:

  • **Observation 1:** After the initial 30 prompts and close to 100 images, I realised that all the images were of male pilots.

  • **Observation 2:** The platforms generated images of female pilots only when I included the term ‘female’ in the prompts.

  • **Observation 3:** When I reverted to the gender-neutral term ‘pilot’, the platforms again generated only male pilots.

  • **Observation 4:** On one platform, I have tuned an instance of the model that is tailored to my inputs and conversations. After the initial steps, this model generated only female pilots when I asked for images of pilots; it assumed that I wanted images of only female pilots. After I repeatedly asked for pilots instead of female pilots, the tailored model also reverted to generating only male pilots.

  • **Observation 5:** A subsequent 20 prompts across various platforms yielded the same results.

Does this constitute a bias?

If yes, what kind of bias? And what is bias, actually? Bias is a preference based on non-rational influences, or on information that is not wholly consistent with fact or statistical reality. In the case of airline pilots, the factual reality is that, globally, only about 4-6% of the pilot workforce is female. One could therefore argue that there is some rationale to the belief (and hence the visual representation) that pilots are male.

International Civil Aviation Organization (ICAO) data for 2021 indicates that the share of female licensed personnel in aviation, which includes pilots, air traffic controllers, and maintenance technicians, was around 5.1%. More specifically, at that point, globally, women comprised 4.7% of all pilots, 3.1% of aircraft maintenance engineers, and 21.1% of air traffic controllers.

IATA Gender in Aviation 2024

But there is another way to define bias in the evolving context of fairness: as a preference that deviates from neutral or diverse representation in the equal society we aspire to. If we keep showing pilots as exclusively male, we create societal barriers for women who might pursue such a career. On the other hand, if we show more women as pilots in visual contexts, we are likely to build societal support for more women taking up the related educational opportunities and career choices.

My personal opinion is that even if there were no female pilots in the world, it would still be a bias to depict only male pilots.

As a reality check: I am based in India, and I see female pilots quite often on the flights I take. That should not be surprising, since 14% of the pilot workforce in India is female (more than double the global average).

India ranks highest in terms of gender diversity in aviation, and women account for 14% of airline pilots in the country. This is indeed a success story, driven by a wide range of factors such as outreach programs to improve corporate policies, strong family support, company investments, and state and government subsidy programs.

IATA Gender in Aviation 2024

Given that I am using all these AI platforms from India, shouldn't the platforms factor in my location and give me at least one female pilot in every seven or eight images? But that is not how AI platforms work. The inherent bias comes from the selection of training data. My guess is that, in the case of airline pilot images, the training data comprised close to 100% male pilot images. Or end-users asking for pilot images overwhelmingly accepted male pilots in the outputs, without question.
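The "one in every seven or eight images" figure follows directly from the workforce shares quoted above. A minimal Python sketch of that arithmetic (the rates are the ones cited in the text; the function and variable names are my own illustration):

```python
def expected_female_images(total_images: int, female_share: float) -> float:
    """Expected number of female-pilot images if outputs mirrored the workforce share."""
    return total_images * female_share

GLOBAL_SHARE = 0.047  # ICAO 2021: 4.7% of pilots worldwide are women
INDIA_SHARE = 0.14    # IATA 2024: 14% of airline pilots in India are women

# Over the roughly 100 images generated in the experiment:
print(expected_female_images(100, GLOBAL_SHARE))  # about 4.7 images at the global rate
print(expected_female_images(100, INDIA_SHARE))   # about 14 images at India's rate

# India's rate implies one female pilot roughly every 7 images:
print(1 / INDIA_SHARE)  # ~7.14, i.e. one in every seven or eight
```

Instead, the observed count across those images was zero, which is the gap between statistical reality and the models' output.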

This is where training professionals working in specialised domains come in. Working with AI is not just about refining models to get technically high-quality content. It is also about participating in the ethics of responsible AI.

Corollary

I ran similar research for airline cabin crew images. Can you predict the result?

Across 5 prompts, 3 platforms, and 30 images, 96.6% of the images showed only female cabin crew. Only one image included a male cabin crew member, in the background of a group.
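As a quick sanity check on that percentage (the counts are the ones from the experiment above; 29 of 30 images works out to 96.67%, which truncates to the 96.6% quoted):

```python
# Corollary counts: 30 cabin-crew images across 3 platforms,
# only 1 of which included a male crew member.
total_images = 30
female_only_images = 29  # 30 - 1

female_only_share = female_only_images / total_images * 100
print(f"{female_only_share:.1f}%")  # prints "96.7%"; truncating instead of rounding gives 96.6%
```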