
Getting Smarter with SERPs - Whiteboard Friday


Posted by rjonesx.

Modern SERPs require modern understanding. National SERPs are a myth — these days, everything is local. And when we're basing important decisions on SERPs and ranking, using the highest quality data is key. Russ Jones explores the problem with SERPs, data quality, and existing solutions in this edition of Whiteboard Friday.


http://d2v4zi8pl64nxt.cloudfront.net/getting-smarter-with-serps/5e696e14352321.56433750.png

 

Click on the whiteboard image above to open a high resolution version in a new tab!

 

Video Transcription


Hey, folks, this is Russ Jones here again with another exciting edition of Whiteboard Friday. Exciting might be an exaggeration, but it really is important to me because today we're going to talk about data quality. I know I harp on this a whole lot.

 

It's just that, as a data scientist, quality is really important to me. Here at Moz, we've made it a priority over the last several years: improving the quality of our Domain Authority score, improving Spam Score, and completely changing the way we identify search volume for particular keywords. Quality is just part of our culture here.

 

Today I want to talk about a quality issue with probably the most important metric in search engine optimization: search rankings. Now I know there's a contingent of SEOs who say you shouldn't look at your search rankings; you should just focus on building better content and doing better outreach and let it happen.

 

But for the vast majority of us, we look at our rankings for the purposes of determining how we're performing, and we make decisions based on those rankings. If a site stops performing as well for a very important keyword, well, then we might spend some money to improve the content on that page or to do more outreach for it.

 

We make important decisions, budgetary decisions, based on what the SERPs say. But we've known for a while that there's a pretty big problem with the SERPs, and that's personalization. There just is no national search anymore, and there hasn't been for a long time. We've known this, and we've tried different ways to fix it.

 

Today I want to talk about a way that Moz is going about this that I think is really exceptional and is frankly going to revolutionize the way in which all SERPs are collected in the future.

 

What's wrong with SERPs?

 

1. Geography is king


Let's just take a step back and talk a little bit about what's wrong with SERPs. Several years back I was a consultant and I was helping out a nonprofit organization that wanted to rank for the keyword "entrepreneurship."

 

They offered grants and training and all sorts of stuff. They really deserved to rank for the term. Then one day I searched for the term, as SEOs do: even with rank tracking in place, we still check for ourselves. I noticed that several universities local to where I live, the University of North Carolina at Chapel Hill and Duke, had popped up in the search results, because they were now offering entrepreneurship programs and Google had geolocated me to the Durham area.

 

Well, this wasn't represented at all in the rank tracking that we were doing. You see, the nationalized search at that time was not picking up any kind of local signals because there weren't any colleges or universities around the data center which we were using to collect the search results.

 

That was a big problem, because one day Google rolled out some sort of update that improved geolocation, and it ultimately took away a lot of traffic for that primary keyword as local sites started to rank all across the country. So as SEOs we decided to fight back, and the strategy we used is what I call centroid search.

 

2. Centroid search sucks


The idea is pretty simple. You take a town, a city, a state, or even a country. You find the latitude and longitude of the dead center of that location, and then you feed that location to Google in the UULE parameter, so that you get the search result you would see if you were standing at that specific latitude and longitude and performed the search.
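To make the UULE trick concrete, here is a minimal sketch of how SEO tools typically build the parameter. A caveat: the format is reverse-engineered and undocumented (a tiny length-prefixed protobuf behind a "w+" tag), so Google could change it at any time, and the canonical location names are assumed to come from Google's published geotargets list. This simple encoder only handles names under 64 bytes.

```python
import base64

def uule_for(canonical_name: str) -> str:
    """Build a Google `uule` query parameter for a canonical location name.

    The payload is a small protobuf: two fixed varint fields, then the
    canonical name as a length-prefixed string, all URL-safe base64 encoded
    behind a "w+" tag. This format is reverse-engineered, not documented.
    """
    name = canonical_name.encode("utf-8")
    if len(name) > 63:
        raise ValueError("canonical name too long for this simple encoder")
    payload = b"\x08\x02\x10\x20\x22" + bytes([len(name)]) + name
    return "w+" + base64.urlsafe_b64encode(payload).decode("ascii").rstrip("=")
```

The result gets appended to a search URL as `&uule=...`, which makes Google answer as if you were standing in that location, e.g. `uule_for("South Bend,Indiana,United States")`.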

 

Well, we know that's not really a good idea, and the reason is pretty clear. Let me give an example: a local business trying to perform well inside a small city or mid-sized town. What I've drawn here, poorly, are the locations of several Italian restaurants in South Bend, Indiana.

 

So as you can see, each little red dot marks a different Italian restaurant, and the centroid of the city is this little green star right here. Well, there's a problem: if you were to collect a SERP this way, you would be influenced dramatically by the handful of Italian restaurants right there in the center of the city.

 

The blue circles I've drawn represent areas of increased population density. You see, most cities have a populous downtown, but they also have suburban areas around the outside that are just as population-dense, or close to it.

 

Yet those suburbs don't get represented, because they're not in the middle of the city. So what do we do? How do we get a better representation of what the average person in that city would see?

 

3. Sampled search succeeds


Well, the answer is what we call sampled search. There are lots of ways to go about it.

 

Right now, the way we're doing it in particular is looking at the centroids of clusters of zip codes that are overlapping inside a particular city.

 

http://d1avok0lzls2w.cloudfront.net/uploads/blog/screen-shot-2020-03-19-at-4-213618.jpg
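The zip-code clustering described above can be sketched with a tiny k-means pass over zip-code centroids. This is an illustrative stand-in under stated assumptions, not Moz's actual pipeline: it treats (lat, lng) pairs as plain Euclidean points, which is a reasonable approximation at city scale, and the cluster centers it returns would serve as the sample points for SERP collection.

```python
import math
import random

def kmeans(points, k, iters=50, seed=42):
    """Tiny Lloyd's-algorithm k-means over (lat, lng) zip-code centroids.

    Returns k cluster centers to use as SERP sample points. Euclidean
    distance on raw coordinates is fine at city scale; a production
    system would use a proper geodesic distance.
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each zip centroid to its nearest cluster center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[nearest].append(p)
        # Recompute each center as the mean of its assigned points.
        centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers
```

Feeding in the zip-code centroids of a metro area with, say, k=5 would yield five geographically spread sample points instead of one city-center point.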

 

As an example, although not exactly what would happen inside Local Market Analytics, each of these purple stars represents a different latitude and longitude that we would select in order to grab a search engine result. We then blend those results together, weighting by things like population density and proximity, and give back a result that is much more like what the average searcher would see than what one person standing in the center of the city would see.

 

http://d1avok0lzls2w.cloudfront.net/uploads/blog/screen-shot-2020-03-19-at-4-168616.jpg
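One simple way to do the blending step is a population-weighted Borda count: each URL earns points for its position in each sampled SERP, scaled by that location's weight. This is a hedged sketch of the general idea, not Moz's actual blending formula; the restaurant domains are made up for illustration.

```python
from collections import defaultdict

def blend_serps(serps, weights, top_n=10):
    """Blend per-location SERPs into one "average searcher" result.

    serps:   list of ranked URL lists, one per sample point
    weights: one weight per sample point (e.g. share of population)

    A URL earns (list length - rank) points per SERP, scaled by that
    location's weight; URLs absent from a SERP earn nothing there.
    """
    scores = defaultdict(float)
    for serp, weight in zip(serps, weights):
        for rank, url in enumerate(serp):
            scores[url] += weight * (len(serp) - rank)
    # Sort by descending score; break ties alphabetically for stability.
    ranked = sorted(scores, key=lambda u: (-scores[u], u))
    return ranked[:top_n]

# Downtown's SERP counts for less when most people live in the suburbs:
downtown = ["italian-a.com", "italian-b.com", "chain.com"]
suburb_1 = ["chain.com", "italian-c.com", "italian-a.com"]
suburb_2 = ["chain.com", "italian-a.com", "italian-c.com"]
blended = blend_serps([downtown, suburb_1, suburb_2], [0.2, 0.4, 0.4])
```

With those weights the suburban favorite outranks the downtown favorite, which is exactly the corrective effect sampling is after.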

 

We know this works better because it correlates with local search traffic more strongly than centroid search does. Of course, there are other ways we could go about it. For example, instead of using geography alone, we could use population density specifically and do a much better job of identifying exactly what the average searcher would see.

 

But this isn't just a local problem, and it isn't just for companies in cities. It's for any website that wants to rank anywhere in the United States, including those that just want to rank generically across the entire country. You see, right now, national SERPs tend to be collected by adding a UULE for the dead center of the United States of America.

 

Now I think pretty much everybody here can understand why that's a very poor representation of what the average person in the United States would see. But if we must get into it, as you can imagine, the center part of the United States is not population-dense.

 

For the most part, the big population centers are along the coastlines. It would make much better sense to sample search results from all sorts of different locations, both rural and urban, in order to identify what the average person in the United States would see.

 

http://d1avok0lzls2w.cloudfront.net/uploads/blog/screen-shot-2020-03-19-at-4-206232.jpg
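At national scale, one way to sketch this is to draw sample points with probability proportional to population rather than querying from the geographic center. The city list and populations below are illustrative placeholders; a real implementation would draw from a full census or zip-code table.

```python
import random

# Illustrative sample points: (city, lat, lng, population in millions).
POINTS = [
    ("New York, NY",    40.7128,  -74.0060, 8.4),
    ("Los Angeles, CA", 34.0522, -118.2437, 3.9),
    ("Chicago, IL",     41.8781,  -87.6298, 2.7),
    ("Houston, TX",     29.7604,  -95.3698, 2.3),
    ("Lebanon, KS",     39.8097,  -98.5556, 0.0002),  # near the US centroid
]

def sample_locations(k, rng=random):
    """Draw k sample points with probability proportional to population,
    so the blended "national" SERP reflects where searchers actually are."""
    weights = [pop for _, _, _, pop in POINTS]
    return rng.choices(POINTS, weights=weights, k=k)
```

Under this scheme the town nearest the geographic centroid almost never gets sampled, while the coastal metros dominate, mirroring where searches actually come from.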

 

Centroid search delivers a myopic view of one very specific area, whereas sampled search gives you a blended model that is much more like what the average searcher in a country, county, city, or even neighborhood would see. So I actually think this is the model that SERP collection in general will be moving to in the future.

 

The future of SERPs


If we continue to rely on the centroid method, we're going to continue to deliver results to our customers that just aren't accurate and simply aren't valuable. But by using the sampled model, we'll be able to deliver a much higher-quality experience: a SERP blended in a way that represents the traffic they're actually going to get. In doing so, we'll finally solve, at least to a certain degree, this problem of personalization.

 

Now, I look forward to Moz implementing this across the board; right now you can get it in Local Market Analytics. I hope other organizations follow suit, because this kind of improvement in SERP collection is the type of quality demanded of an industry that uses technology to improve businesses' performance. Without quality, we might as well not be doing it at all.

 

Thanks for hearing me out. I'd like to hear what you have to say in the comments, and in the SERPs as well, and hopefully we'll be able to talk through some more ideas on quality. Looking forward to it. Thanks again.

 

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

 
