Observation & Questions
Have you ever spent too much time crafting the perfect title tag and meta description, only to see that Google has decided to use its own version in search results? It can be frustrating to see your carefully chosen words replaced by what seems like a random snippet from your page. Is this good for your web pages? How can we write metadata that won’t get rewritten? Can ChatGPT help? Our account team recently asked those questions, which drove us to create a test to help get answers.
How and why does Google rewrite metadata?
Google uses several sources to automatically determine the title and description for your web page in search results. These include, but are not limited to, the descriptive information in the meta description tag and the <title> element, as well as content found elsewhere on the page.
During a Webmaster Central hangout in 2020, Google’s John Mueller offered a couple of reasons why Google might be rewriting your meta descriptions:
- A poorly written meta description, i.e., one that doesn't summarize the web page.
- To more accurately match the search query with the web page.
Another reason your metadata gets rewritten is character length: the maximum length for a title is roughly 50-60 characters, and for a meta description, roughly 155-160 characters.
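If you want to spot-check your own pages against those limits, a quick script can do it. Below is a minimal sketch (not a tool referenced in this test), assuming Python with the requests and BeautifulSoup libraries; the URL is a placeholder, and the thresholds are rough character counts since truncation in the SERP is based on display width rather than an exact character limit.

```python
# Minimal sketch: check a page's <title> and meta description lengths against
# the approximate limits mentioned above. The URL and thresholds are
# illustrative, not a tool used in the test.
import requests
from bs4 import BeautifulSoup

TITLE_MAX = 60         # ~50-60 characters
DESCRIPTION_MAX = 160  # ~155-160 characters

def audit_metadata(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "").strip() if desc_tag else ""

    return {
        "title": title,
        "title_length": len(title),
        "title_over_limit": len(title) > TITLE_MAX,
        "description": description,
        "description_length": len(description),
        "description_over_limit": len(description) > DESCRIPTION_MAX,
    }

if __name__ == "__main__":
    print(audit_metadata("https://www.example.com/"))
```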
Google also publishes its own resources on metadata best practices.
Test Questions
If Google is using its algorithm and machine learning to rewrite upwards of 70% of meta descriptions, that must be what is best for your web pages, right? We decided to run a test to help answer this broad question. First, we determined more specific and testable research questions:
1) Does Google rewriting your meta descriptions help your web pages?
2) If we use Supernova data and ChatGPT to help write meta descriptions, will they get rewritten less?
Hypotheses
We hypothesized that the answer to both questions was yes: Google's rewrites are good for performance, and we can help Google by creating better descriptions with ChatGPT in the first place. In hindsight, these two could be considered conflicting ideas, but more on that later.
Test
This entire test would not have been possible without Supernova. Seer’s innovation team created an “Is Google rewriting my metadata?” dashboard within Supernova for account teams to use. Data in this dashboard helps us identify how often Google is changing your metadata, understand how metadata changes when Google serves the same result for different search terms, and uncover trends about where on the page Google is pulling content from when it serves changed metadata. With this data ready, we conducted the following test:
Step 1: Align on goals and KPIs
First, we defined our objectives and the metrics we would measure to determine results.
- Improve organic search engine results page (SERP) performance. KPI = Click-through rate (CTR; see the calculation sketch after this step)
- Get Google to change our metadata less. KPI = Google rewrites
During this step, we also aligned on sample size and timing. We wanted the test to be realistic and agile, so we determined that a smaller sample size (~5 pages) and a shorter time frame (~4 weeks) would be best.
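For reference, KPI 1 is a simple ratio: CTR = clicks ÷ impressions. The sketch below shows one way a year-over-year CTR comparison could be pulled together, assuming a hypothetical CSV export with page, period, clicks, and impressions columns; the file name and column names are placeholders, not Supernova's actual schema.

```python
# Hypothetical YoY CTR comparison. Assumes a CSV with columns:
#   page, period ("current" or "previous"), clicks, impressions
# None of these names come from Supernova; they are placeholders.
import pandas as pd

def ctr_yoy(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    df["ctr"] = df["clicks"] / df["impressions"]  # CTR = clicks / impressions
    pivot = df.pivot(index="page", columns="period", values="ctr")
    pivot["yoy_change"] = pivot["current"] - pivot["previous"]
    return pivot

if __name__ == "__main__":
    print(ctr_yoy("gsc_export.csv"))
```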
Step 2: Gather data and make recommendations
Using the Supernova dashboard mentioned above (“Is Google rewriting my metadata?”), we set up alerts that notified us whenever a page’s description was changed more than 10 times per week. Once we were pinged, we determined which 4-5 pages would be best to test.
For each page, we gathered data on which keywords were triggering changes and what information Google was pulling into the SERPs in place of the given meta descriptions.
This information was compiled and fed to ChatGPT along with a prompt (see the sketch after this step).
The new meta descriptions were then gathered and sent to our client for review.
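To illustrate that ChatGPT step, here is a rough sketch of how keyword and SERP data could be combined into a prompt using the OpenAI Python client. The prompt wording, model name, and example inputs are hypothetical, not the exact prompt our team used, and the script assumes an OPENAI_API_KEY is set in your environment.

```python
# Illustrative sketch of generating a meta description draft with ChatGPT.
# The prompt, model name, and inputs are hypothetical examples.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_meta_description(page_url: str, keywords: list[str], serp_snippets: list[str]) -> str:
    prompt = (
        f"Write a meta description (under 155 characters) for {page_url}. "
        f"Summarize the page and match the intent behind these queries: {', '.join(keywords)}. "
        f"For context, Google has been replacing the current description with snippets like: "
        f"{' | '.join(serp_snippets)}."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(draft_meta_description(
        "https://www.example.com/widgets",
        ["blue widgets", "buy widgets online"],
        ["Shop our full range of widgets...", "Widgets ship free over $50..."],
    ))
```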
Step 3: Implementation
Our recommended meta descriptions were then reviewed by the client and implemented on each web page. We waited four weeks before gathering results.
Results
| Page | KPI 1: CTR (YoY) | KPI 2: Meta Description Changes (YoY) |
| --- | --- | --- |
| 1 | 2.6% (-1.1%) | 13 (-13) |
| 2 | 1.1% (-0.2%) | 12 (+6) |
| 3 | 0.4% (+0%) | 9 (-3) |
| 4 | 0.8% (+0.2%) | 3 (-10) |
In total, meta descriptions were changed 20 fewer times, while click-through rates stayed roughly flat.
During the same period, a control group of the same size, which had two or fewer meta description changes, saw CTRs slightly increase overall.
Noted variables:
- Sample size - this was purposely a smaller test to keep bandwidth low and deliver results faster.
- Other factors affecting CTR - we understand that many other factors contribute to CTR changes: ranking position, title, featured snippets, competing websites, user search intent, etc.
- Seasonality - period-over-period results were less important to us than year-over-year results because seasonal trends shift quickly for this client, making comparisons to the previous year more valuable.
Conclusions
ChatGPT will help your metadata get rewritten less, but you may not want it to.
For SEOs and website owners, a lack of control can be hard to accept, but we must remember Google’s goal and what it takes to achieve it: people coming back to search again and again. How does Google ensure this? By giving them helpful results. When Google rewrites your metadata, it is trying to make your result more helpful and more tailored to the search query. We saw this directly in the results of our test.
Sign up for Seer’s newsletter to see more of Supernova's capabilities and our latest tests and POVs on the state of search and AI.