There’s a bias issue we found in Google’s image AI while digging into search results data – skip ahead to the “Street Fashion” section if you want to get right to it. It was an important discovery and a reminder that we shouldn’t blindly use the outputs of AI analyses.
While we can get much further than ever before in terms of insights, we still need to gut check our results rather than blindly follow the data.
Image AI Hypothesis
We discovered the bias we’ll be discussing below during this hypothesis process:
Can AI help us pick better images for our clients?
I looked at the current state and asked: how do we pick images for e-commerce feeds and social ads today? Does our design team get requirements from the client, brand guidelines, etc., and then go to a site to pick them? What data are we using [or not using] to make these decisions?
The other side of my brain was asking: what data do I already have that could help my design team pick different images to test?
To find out “where we rank” on Google, we have to deploy tools that scrape the search engine results pages and tell us where we stand relative to our competitors. That scrape also returns a lot of other information, like “Does Google show images as an answer, and in what position?” – that’s where I focused.
Now I could see things like: where are my clients paying good money to target a keyword with text ads, while the search results show Google deems a set of images more likely to be the right answer to the user’s question?
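To make that cross-reference concrete, here’s a minimal sketch of one way to do it, assuming you’ve already exported rank-tracking data that flags image packs and a separate list of paid keywords. Every file and column name below is hypothetical – swap in whatever your own tooling produces.

```python
import pandas as pd

# Hypothetical exports -- swap in your own rank-tracking and paid-search data.
# serp_scrape.csv:   keyword, image_pack_shown (bool), image_pack_position (int)
# paid_keywords.csv: keyword, monthly_spend
serp = pd.read_csv("serp_scrape.csv")
paid = pd.read_csv("paid_keywords.csv")

# Keywords where we pay for text ads but Google surfaces an image pack near the top
merged = paid.merge(serp, on="keyword", how="inner")
image_answer_keywords = merged[
    merged["image_pack_shown"] & (merged["image_pack_position"] <= 3)
].sort_values("monthly_spend", ascending=False)

print(image_answer_keywords[["keyword", "monthly_spend", "image_pack_position"]].head(20))
```

The output is simply a prioritized list: the keywords where we’re spending the most on text ads while Google is answering with images.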
Then the magic happens
Once I find the keywords that trigger images, I could pull those images from Google Search, run them through the Google Vision API to break down what’s in each one, find what the top-ranking images have in common, and compare that commonality to what we’re showing.
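Here’s a rough sketch of that pipeline, assuming the images that appear for a keyword have already been downloaded to a local folder (the folder path and confidence threshold are placeholders). It uses the google-cloud-vision Python client’s label detection and tallies how often each label appears across the set.

```python
from collections import Counter
from pathlib import Path

from google.cloud import vision

client = vision.ImageAnnotatorClient()
label_counts = Counter()

# Hypothetical folder of images scraped from one keyword's image results
for path in Path("images/black_obgyn").glob("*.jpg"):
    image = vision.Image(content=path.read_bytes())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        if label.score >= 0.7:  # keep only reasonably confident labels
            label_counts[label.description] += 1

# The labels that show up most often describe what these images have in common;
# compare this list against the labels on the images we're currently running.
for description, count in label_counts.most_common(15):
    print(f"{description}: {count}")
```

Tallying labels across the whole image set is what surfaces the “commonality” mentioned above; any single image’s labels are too noisy on their own.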
It was within the process of working that hypothesis that I ran into this question…
Why is the Image API Showing “Street Fashion” for Black Doctors?
In preparing for a new healthcare client, I wanted to show them the power of using our data warehouse / infrastructure, so I picked an example in their space.
I’ve been seeing searches around Black doctors trending up for other healthcare clients. When you search for Black OB/GYN in Google Trends, you can see it:
In the Google search results, you can see that Google believes images are a good answer for the term Black OB/GYN, so it shows Black female doctors. Perfect – great match.
If you want to do this at scale you have to tap into Google’s Vision AI API, but for starters, to get just what we needed for this new healthcare client, we figured we’d upload these photos into the demo version:
The image below was the first to run through the demo API:
Here is what the AI gives back:
But it was the label a couple of results further down that was confusing:
Street Fashion? Maybe because her face is covered there’s a lot less for the image API to use? Trying that again ...
The results:
Street fashion again? This is when I started asking our Seer team for a sanity check. They started chiming in …
Nichole had a hypothesis and tested it:
Dana and Theresa came through with more tests:
What This Means for the Future of Image AI
As companies continue to move up the data maturity curve with AI and machine learning, it’s important to keep a tight watch on the outputs of our analyses. The danger of using data lies in trusting it blindly and creating a disconnect between you and your audience.
It’s worth noting this also isn’t to throw blame at Google’s Vision API -- Google clearly knows bias is an issue in its products and is actively working to solve some of those problems. While they work to reduce bias in pre-trained models, we will be prepared to gut check our inputs and outputs, and we suggest other marketers do the same as we all get our hands on more data.
Additional Resources
Learn more about the importance of building more inclusive products for everyone:
- Master Your Heart, Brain & Ego for Search Marketing
- The Importance (& Value) of a Purpose-Driven Business Model
- Why Consumer Connection is Important for Brands
Looking for an agency partner to help unleash the power of your data? Explore Seer’s digital marketing services!