
Are We There Yet? The State of the Web & Core Web Vitals [Part 1]


Guest Tom Capper



 

No, please, do read on. This is a post about what has gone wrong with Core Web Vitals and where we stand now, but also why you still need to care. I also have some data along the way, showing how many sites are hitting the minimum level, both now and back at the original intended launch date.

 

At the time of writing, it's nearly a year and a half since Google told us that they were once again going to pull their usual trick: tell us something is a ranking factor in advance, so that we improve the web. To be fair, it's quite a noble goal all told (albeit one they have a significant stake in). It's a well-trodden playbook at this point, too, most notably with "mobilegeddon" and HTTPS in recent years.

 

Both of those recent examples felt a little underwhelming when we hit zero-day, but the “Page Experience Update”, as Core Web Vitals’ rollout has been named, has felt not just underwhelming, but more than a little fumbled. This post is part of a 3-part series, where we’ll cover where we stand now, how to understand it, and what to do next.

 

Fumbled, you say?

 

 

Google was initially vague, telling us back in May 2020 that the update would be “in 2021”. Then, in November 2020, they told us it’d be in May 2021 — an unusually long total lead time, but so far, so good.

 

The surprise came in April, when we were told the update was delayed to June. And then in June, when it started rolling out “very slowly”. Finally, at the start of September, after some 16 months, we were told it was done.

 

So, why do I care? I think the delays (and the repeated clarifications and contradictions along the way) suggest that Google's play didn't quite work out this time. They told us that we should improve our websites' performance because it was going to be a ranking factor. But for whatever reason (perhaps we didn't improve them, or perhaps their data was a mess anyhow), Google was left to downplay their own update as a "tiebreaker". This is confusing and disorientating for businesses and brands, and detracts from the overall message that yes, come what may, they should work on their site performance.

 

As John Mueller said, "we really want to make sure that search remains useful after all". This is the underlying bluff in Google's pre-announced updates: they can't make changes that stop the websites people expect to see from ranking.

 

Y’all got any data?

 

 

Yes, of course. What do you think we do here?

 

You may be familiar with our lord and savior, Mozcast, Moz’s Google algorithm monitoring report. Mozcast is based on a corpus of 10,000 competitive keywords, and back in May I decided to look at every URL ranking top 20 for all of these keywords, on desktop or on mobile, as tracked from a random location in the suburban USA.

 

This was some 400,000 results, and (surprisingly, I felt) ~210,000 unique URLs.
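For illustration, here's a minimal sketch of that deduplication step in Python, assuming a hypothetical flat export of the rank-tracking results with one row per keyword/device/rank and a "url" column (the file name and column name are made up, not Mozcast's actual schema):

```python
import csv

# Sketch only: collapse ~400,000 ranking rows down to the set of unique URLs.
# "top20_results.csv" and the "url" column are hypothetical placeholders.
def unique_ranking_urls(path: str) -> set[str]:
    urls = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            urls.add(row["url"])
    return urls

urls = unique_ranking_urls("top20_results.csv")
print(f"{len(urls):,} unique URLs")  # ~210,000 in the crawl described above
```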

 

At the time, only 29% of these URLs had any CrUX data. This is data collected from real users in Google Chrome, and it's the basis of Core Web Vitals as a ranking factor. A URL can lack CrUX data because a certain sample size is needed before Google can work with it, and many lower-traffic URLs simply don't see enough Chrome traffic to fill out that sample. The 29% figure is depressingly low when you consider that these are, by definition, higher-traffic pages than most: they rank top 20 for competitive terms, after all.
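If you want to check data availability for your own URLs, here's a rough sketch against the public Chrome UX Report API; treat it as an illustration under assumptions (you'd supply your own API key, and the response handling is simplified). A 404 from the API means Chrome hasn't collected a large enough sample for that URL, which is the situation roughly 71% of these top-20 URLs were in.

```python
import requests  # third-party: pip install requests

# Sketch: does a URL have page-level CrUX data at all?
# The API key is a placeholder; a 404 means there's no record, i.e. not
# enough Chrome traffic on that URL to meet the sample-size threshold.
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def has_crux_data(url: str, api_key: str) -> bool:
    resp = requests.post(
        CRUX_ENDPOINT,
        params={"key": api_key},
        json={"url": url, "formFactor": "PHONE"},
        timeout=10,
    )
    if resp.status_code == 404:
        return False  # no record: sample too small
    resp.raise_for_status()  # surface auth/quota errors rather than guessing
    return True
```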

 


 

Google has made various equivocations around generalizing/guesstimating results based on page similarity for pages that don't have CrUX data, and I can imagine this working for large, templated sites with long tails, but less so for smaller sites. In any case, in my experience working on large, templated sites, two pages on the same template often had vastly different performance, particularly if one was more heavily trafficked, and therefore more thoroughly cached.

 

Anyhow, leaving that rabbit hole to one side for a moment, you might be wondering what the Core Web Vitals outlook actually was for this 29% of URLs.

 


 

Some of these stats are quite impressive, but the real issue here is that "all 3" category. Again, Google has gone and contradicted itself back and forth on whether you need to pass the threshold for all three metrics to get a performance boost, or indeed whether you need to pass any threshold at all. Still, what they have told us concretely is that we should try to meet these thresholds, and what we haven't done is hit that bar.
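For concreteness, here's a minimal sketch of what "passing all 3" means in practice, using Google's published "good" thresholds for the 2021 metric set (LCP, FID, CLS) applied to 75th-percentile field values; the input shape is illustrative, not the raw CrUX API response.

```python
# "Good" thresholds for the three 2021 Core Web Vitals, applied at the 75th
# percentile of field data. The p75 dict shape here is illustrative.
GOOD_THRESHOLDS = {
    "lcp_ms": 2500,  # Largest Contentful Paint
    "fid_ms": 100,   # First Input Delay
    "cls": 0.1,      # Cumulative Layout Shift
}

def passes_all_three(p75: dict) -> bool:
    """True only if every metric clears its 'good' threshold."""
    return all(p75[metric] <= limit for metric, limit in GOOD_THRESHOLDS.items())

# Good LCP and FID, but CLS just over the line -> fails the "all 3" bar.
print(passes_all_three({"lcp_ms": 2100, "fid_ms": 40, "cls": 0.12}))  # False
```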

 

Of the 29% of URLs that even had data in the first place, 30.75% passed all three thresholds. 30.75% of 29% is roughly 9%, so only about 9% of URLs can concretely be said to be doing alright. Applying any significant ranking boost to 9% of URLs probably isn't good news for the quality of Google's results, especially as household-name brands are very, very likely to be rife among the 91% left out.

 

So this was the situation in May, which (I hypothesize) led Google to postpone the update. What about August, when they finally rolled it out?

 


 

CrUX data availability increased from 29% to 38% between May and August 2021.

 


 

The rate of URLs with CrUX data passing all three CWV thresholds increased from 30.75% to 36.3% between May and August 2021.

 

So, the new multiplication (36.3% of 38%) leaves us at roughly 14%, a marked increase over the previous 9%, driven partly by Google collecting more data and partly by websites getting their stuff together. Presumably this trend will only continue, and Google will be able to turn up the dial on Core Web Vitals as a ranking factor, right?
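The back-of-envelope math behind both figures, for anyone who wants to check it:

```python
# Share of all top-20 URLs that demonstrably pass all three Core Web Vitals
# = (share with CrUX data) x (pass rate among URLs that have data).
may = 0.29 * 0.3075     # ~0.089 -> roughly 9%
august = 0.38 * 0.363   # ~0.138 -> roughly 14%
print(f"May: {may:.1%}, August: {august:.1%}")
```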

 

More on that in parts 2 and 3 :)

 

In the meantime, if you're curious about where your site stands against the CWV thresholds, Moz has a tool for it, currently in beta, with the official launch coming in mid-to-late October.

 

 

Appendix

 

 

And if you really want to nerd out, see how you score against the industry at large on these distribution charts from the August data:

 

[Distribution charts from the August data]
