Check your (tech) biases
Why we are teaching technology our prejudices and what brands are doing about it
Picture yourself:
You’re cooking dinner and you need to set a timer, but your hands are covered in seasoning. What do you do? You speak to your smart device: “Set a timer for 10 minutes.” The device responds, “10 minutes, starting now.” Problem solved. Now, what did you hear?
Here’s another scenario:
A father and his son are involved in a horrific car crash and the man died at the scene. But when the child arrived at the hospital and was rushed into the operating theatre, the surgeon pulled away and said: “I can’t operate on this boy, he’s my son”.
How can this be?
Thinking about the first scenario, what voice came into your head - was it female-identifying?
And thinking about the surgeon scenario - did you come up with an answer? Did you realise the surgeon was the boy’s mother?
The Surgeon’s Dilemma demonstrates an unconscious bias many of us hold (that women tend not to fill important medical roles). And the smart device scenario demonstrates how we’ve built these biases into technology (that women tend to embody the qualities of an assistant).
From selling smart devices with default female-sounding voices to search-image results skewing towards white cisgender people, biases exist everywhere. Even when I Googled for an image of someone using Alexa whilst cooking, these were my results:
When we look at technology, artificial intelligence and algorithms, it is humans who are the engineers behind them. As Sinead Bovell, a futurist and tech influencer who I recommend following, puts it:
“AI systems in many ways are a reflection of the data they are trained on. And since that data usually comes from society, it will also contain all of society’s historical biases, which means the AI’s decisions and predictions will reflect those biases.”
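To make Bovell’s point concrete, here is a deliberately contrived toy sketch (entirely my own illustration, not any real AI system): a trivial co-occurrence “model” trained on biased example sentences faithfully reproduces that bias in its predictions.

```javascript
// Toy illustration only: a contrived, biased "training set".
const trainingData = [
  "she is the assistant",
  "she works as the assistant",
  "he is the surgeon",
  "he works as the surgeon",
];

// "Train" by counting which pronoun each role co-occurs with.
function train(sentences) {
  const counts = {};
  for (const s of sentences) {
    const words = s.split(" ");
    const pronoun = words[0];             // e.g. "she" / "he"
    const role = words[words.length - 1]; // e.g. "assistant" / "surgeon"
    counts[role] = counts[role] || {};
    counts[role][pronoun] = (counts[role][pronoun] || 0) + 1;
  }
  return counts;
}

// "Predict" the pronoun most frequently seen with a role.
function predictPronoun(model, role) {
  const c = model[role] || {};
  return Object.keys(c).sort((a, b) => c[b] - c[a])[0];
}

const model = train(trainingData);
console.log(predictPronoun(model, "assistant")); // "she" - the data's bias, echoed back
console.log(predictPronoun(model, "surgeon"));   // "he"
```

The model isn’t “prejudiced” in any deliberate sense; it simply mirrors the statistics of what it was fed, which is exactly the mechanism the quote describes, just at a vastly smaller scale.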
Did you know:
Colour blindness can affect people’s experience in gaming and online. For example, Wordle had to introduce a colour-blind option.
Closed captions, which are integral for including hard-of-hearing audiences, are not always included in video content. This year Tasha Ghouri became Love Island’s first-ever Deaf contestant, yet the show did not even offer subtitles (until it received backlash), thereby excluding hard-of-hearing people from viewing.
In Australia, a study found that ethnic minority applicants received, on average, about half as many positive responses to their job applications. Recruitment can be deeply discriminatory against people of colour, women and others. Amazon had to scrap a secret AI recruiting tool because it showed bias against women.
Recently I’ve seen some brands and companies respond to the various biases that exist in physical tech and ad tech. Daily advancements in Web3, NFTs and gaming mean these bias checks are more important than ever.
Spark + Outline: Beyond Binary Code
Spark (a NZ internet company) and Outline (an organisation providing mental health support to the NZ Rainbow community) partnered to make the internet more gender inclusive. They created a piece of code that can be added to any website and teaches companies when and how to ask for gender information.
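Spark and Outline’s actual snippet isn’t reproduced here, but as a purely illustrative sketch (the function name, wording and options below are my assumptions, not their code), a more inclusive gender question might look something like this:

```javascript
// Illustrative sketch only - NOT Spark/Outline's actual code.
// A gender question that is optional, self-describable, and only
// asked when there is a stated reason for collecting the data.
function buildGenderQuestion(purpose) {
  // First ask *when*: if there's no concrete purpose, don't ask at all.
  if (!purpose) return null;

  // Then ask *how*: optional, with self-description and opt-out.
  return {
    label: "How do you describe your gender?",
    purpose,               // tell users why you're asking
    required: false,       // always allow skipping
    options: [
      "Woman",
      "Man",
      "Non-binary",
      "Prefer to self-describe", // paired with a free-text field
      "Prefer not to say",
    ],
    allowFreeText: true,
  };
}

const question = buildGenderQuestion("to report on pay equity");
console.log(question.options.includes("Prefer not to say")); // true
console.log(buildGenderQuestion());                          // null - no purpose, no question
```

The design choice the campaign highlights is as much about the “when” as the “how”: gender data shouldn’t be collected by default, and when it is, the form shouldn’t force a binary choice.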
Google Pixel: Real Tone
Historically, cameras and camera technology haven’t accurately represented darker skin tones. Google partnered with photographers to accurately portray the skin tones of over 50 individuals from different backgrounds and ethnicities. The Pixel 6 with Real Tone is now designed to address image equity.
Dove + Getty Images: #ShowUs
Have you ever searched stock photos or Google Images for a photo of someone for your presentation? What tends to come up? Thin, white, cisgender, young? According to Dove, 70% of women and non-binary people don’t see themselves represented in media and advertising. So Dove created the world’s largest stock photo library to challenge biases in beauty.
In a similar vein, Olay sought to challenge image search biases through #DecodetheBias. The campaign was a pledge to double the number of women in STEM, and a goal to make search results for beautiful skin more inclusive.
Google + Canadian Down Syndrome Society: Project Understood
According to IAB, 1 in 4 Australians has a smart speaker at home, with ownership increasing 32% since last year. By 2023, it’s expected there will be 8 billion voice assistants in use around the world.
But voice assistants were not trained to understand the speech of people with Down Syndrome. The Canadian Down Syndrome Society and Google worked together to improve Google’s voice products, inviting over 700 people with Down Syndrome to record their voices and improve the technology’s speech recognition.
Final Thoughts
Tech has the opportunity to revolutionise our world. However, it’s limited by the biases we intentionally or unintentionally teach it. So what can we learn from this?
In most of the examples above, brands partnered with organisations that represent the relevant audience. Your company should reflect the diverse makeup of the world - and if it doesn’t, you should be consulting outside your business.
Businesses need to collectively address the inequity among tech workers. This has happened through scholarships, free training and more. Vogue Codes and Women in STEM do great work addressing female representation in tech, but there is more to do to support people of colour and Aboriginal Australians in tech and tech design.
Next time you’re Googling, using a voice assistant, using facial recognition technology, applying TikTok filters, filling out an online survey, or watching video content - ask yourself: is this inclusive? Does it make assumptions about me? And what needs to change?
I leave you with a quote from Abhijit Naskar, a neuroscientist, who sums this up:
“We have to place our attention on humanising artificial intelligence by removing the biases from algorithms rather than dehumanising it.”