

The February 7, 2017 Google Algorithm Update – Analysis and Findings From A Significant Core Ranking Update

February 15, 2017


The fall of 2016 was an extremely volatile time from a Google algorithm update standpoint. Actually, I believe it was the most volatile fall I have ever seen. We saw numerous updates from Google, including partial rollbacks. For example, we had Penguin 4 in late September and early October, and then massive volatility in November (with some rollbacks), which then rolled into December. It was a crazy few months for sure.

And since I heavily focus on algorithm updates, I felt like I was riding a Google roller coaster!


Once the updates calmed down (after the December 18, 2016 update), I knew the volatility could return in early 2017. And I was right. We saw an early January core update around 1/4/17 and then a strange update on 2/1 that seemed to target PBNs and links in general (which may warrant a separate post). But little did I know a major update was brewing. And that rolled out on February 7, 2017. And it was big. Barry Schwartz was the first to pick up chatter about the update and it wasn’t long before I started seeing major volatility.

Examples of Impact:

If you’ve read my posts about algorithm updates, then you know the deal already. But if this is your first time reading my algo update posts, then here’s a quick rundown. I have access to a lot of data from websites that have experienced quality problems in the past. That includes Panda, Phantom, and other core ranking updates. That enables me to see movement across sites, verticals, and countries when algorithm updates roll out.

In addition, a number of companies reach out to me for help after seeing negative impact (or to let me know about positive impact). So, I’m able to see fresh hits and recoveries as well. And then I dig into each vertical to find more impact. Between all of that, I get to see a lot of movement, especially with algorithm updates as large as the 2/7/17 update.

So what have I seen over the past week? Here are some screenshots of search visibility movement based on the February 7 update:

Positive impact:


Negative impact:


Big Ranking Swings

When an update like this hits, it’s amazing to see the ranking changes for sites impacted. You typically see big swings in rankings for many keywords. For example, jumping ten to twenty spots, or even more. Here are some screenshots of big ranking increases from sites impacted by the 2/7 update:


Google’s Core Ranking Updates (and connections to previous updates):

In May of 2015 I uncovered a major algorithm update that I called Phantom 2. It was a core ranking update that Google ultimately confirmed. They said it was a change in how they assessed “quality”. It was a huge update, just like this one. For example, here’s a screenshot of a site that got hammered:


Since then, there have been numerous core ranking updates that seem to target very similar things. I’ll explain more about that below, but think low quality content, thin content, user experience barriers, ad deception, etc.

It seems the algorithm(s) that are part of the core ranking updates need to be refreshed, and when they are, all hell can break loose. It reminds me of old-school Panda updates. In addition, and this makes sense, you can see many connections to previous core ranking updates among the sites that have been impacted. That doesn’t mean you have to be impacted by a previous update to see movement, but sites that make changes (or fall out of the gray area) can see movement during subsequent core updates. I’ll cover that more below.

For example, here are a few screenshots showing the connection between previous major core ranking updates and the February 7 update:


The Connection to Google’s Quality Rater Guidelines

I presented at the Search Engine Journal summit this past fall (in the middle of all the crazy algo updates). In that presentation, I explained something I called “quality user engagement”. Since I have seen plenty of evidence of UX barriers and ad problems causing core ranking issues during these updates, I used that phrase to describe them.

Basically, don’t just look at content quality. There’s more to it. Understand the barriers you are presenting to users and address those. For example, aggressive ad placement, deception for monetization purposes, broken UX elements, autoplay video and audio, aggressive popups and interstitials, and more.

And when referring to “quality user engagement”, I often point people to Google’s Quality Rater Guidelines (QRG). The QRG is packed with amazing information about how human raters should test, and then rate, pages and sites from a quality standpoint.

The amazing thing about the Quality Rater Guidelines is that there’s a serious connection to what I’m seeing in the field while analyzing sites getting hit by core ranking updates. For example, check out this quote below about ad deception. I can’t tell you how many times aggressive monetization leads to an algorithm hit.


So go read the QRG… it’s an eye-opening experience. And the other reason I’m bringing this up is because of an interesting tweet from Google’s Gary Illyes on Sunday (right after the algo update). As part of his “Did You Know” series of tweets, he explained that the QRG was updated and contains great information right from Google. It was interesting timing, to say the least.

Now, the guidelines were updated in March of 2016, almost a year ago. So why tweet about them now? It might just be a coincidence, but maybe it’s due to the major update on 2/7? Only Gary knows. Regardless, I highly recommend reading the QRG.

Here’s Gary’s tweet:


Negative Impact

Since 2/7, I’ve dug into many sites that experienced negative movement. And some saw significant drops across keywords. Basically, Google’s algos are saying the site isn’t as high quality as they thought, so rankings get adjusted, and that adjustment is site-wide. Sure, some keywords might still rank well, but many dropped in position, and some dropped significantly.

I can’t cover all of the factors I saw, or this post would be excessively long (and it’s already long!). But I’ll definitely cover several things I came across during my travels. And don’t forget what I said above about the Quality Rater Guidelines. Go read that today if you haven’t already. It’s packed with great information that directly ties to many situations I’ve seen while analyzing sites impacted by core ranking updates.

Increase in Relevancy & The Connection To Google Panda

The first thing that jumped out at me was the noticeable increase in relevancy for many queries I was checking. For example, on sites that saw negative impact, I checked a number of queries where the site dropped significantly in rankings.

The relevancy of those search results had greatly increased. Actually, there were times I couldn’t believe the sites in question had ever ranked for those queries (but they did prior to 2/7). Sure, the content was tangentially related to the query, but it did not address the subject matter or questions directly.

And if you’ve been paying attention to Google Panda over the past few years, then your bamboo radar should be active. In January of 2016 we learned that Panda became part of Google’s core ranking algorithm. That was huge news, but we had seen major changes in Panda over time. We knew something was going on with our cute, black-and-white friend.

I saw big changes in Panda from 4.0 to 4.2, which led me to believe the algorithm was being refined, adjusted, etc. You can read my post about Panda 4.2 to learn more. Well, after the announcement about Panda being part of Google’s core ranking algorithm, Gary Illyes kept explaining Panda in an interesting way. I covered this in a post last year when covering the March 2016 Google algorithm update.

Gary explained that Google did not view Panda as a penalty. Instead, Panda looked to adjust rankings for sites that did not meet user expectations from a content standpoint, dialing back sites that were more prominent in the search results than their content warranted. That’s a huge change from the Panda of the past (where sites could drop by 65%+ overnight).

Here’s the video clip (check 7:47 in the video):


The reason I’m bringing this up is because I saw a lot of “relevancy movement” during my analysis of the 2/7 update. Now, if that was all I saw, I would say this was Panda at work. But that’s not all I saw. Instead, I saw a mix of problems causing drops, including low-quality user engagement. So if Panda was part of this update somehow, then the update combined Panda with other quality algorithms that were refreshed. Hard to say for sure, but I saw the relevancy increase often enough that I wanted to bring it up in this post.

Low-Quality User Engagement

Just like with other core ranking updates, low-quality user engagement reared its ugly head. I saw this often when checking sites that saw big drops. For example, broken user interfaces, menus that didn’t work (either on purpose or by mistake), extremely bulky and confusing navigation, ad deception and crazy ad placement (more on that soon), excessive pagination for monetization purposes, and other UX barriers.

This is consistent with previous core ranking updates dating back to Phantom 2 in May of 2015. To learn more about these issues, I recommend reading all of my previous posts about Google’s quality updates. I have covered many different situations in those posts. In addition, read the Quality Rater Guidelines. There is plenty of information about what should be rated “low quality”.

For example, I’ve seen sites forcing users through 38 pages of pagination to view an article. Yes, 38.


Thin Content and Low-Quality Content

Similar to UX barriers, low-quality content and thin content have been on my list of problems with regard to Google’s core ranking updates from the start. I often checked pages that dropped significantly and found problematic content.

The sites I checked were once ranking for keywords leading to that content, but either chose to keep thin content there or weren’t aware of it (given the size of their sites). It underscores the importance of continually analyzing your website, rooting out content problems, enhancing content where needed, removing UX barriers, etc. If you don’t do that, problems can slip in. And when those problems keep growing, you can get hit hard during Google’s core ranking updates.


Mobile Usability Problems

I’ve seen this with previous core ranking updates as well. We know Google is testing its mobile-first index, but it hasn’t rolled out yet. But even before “mobile-first” was announced, I saw serious mobile problems on some sites that were negatively impacted by core ranking updates.

During my latest travels, I saw certain sites that were nearly unusable on mobile. They were ranking well for many queries, but when you hit the site on a mobile device, the menus didn’t work well, you could scroll horizontally, UX elements were out of place, etc.

And in case you’re wondering, this had nothing to do with Google’s mobile popup algorithm. I’ve been heavily tracking that update too. You can read more about my findings (or lack of findings) in my post and in my Search Engine Land column.

So make sure you check your site on various mobile devices to ensure users can do what they need to do. If they are running into problems, then that can send horrible signals to Google that they didn’t find what they wanted (or were having a hard time doing so). Beware.

An example of a mobile UX breaking:


Ad Deception

Ah, we meet again, my old friend. Just like with previous core ranking updates and old-school Panda updates, ad deception can cause big problems. For example, weaving ads into your content can drive users insane (especially when those ads match your own content styling-wise). They might mistakenly click those ads and be sent downstream to a third-party advertiser site when they thought they were staying on your own site.

Think about the horrible signals you could be sending Google by doing that to users. Add aggressive third-party sites with malware and malicious downloads and you have a recipe for disaster. For example, shocking users as they get taken off your site, and then angering them even more as they get hit by malware. Not good.

By the way, ad deception is specifically mentioned in Google’s Quality Rater Guidelines. Don’t do this. You are playing Russian Roulette with your rankings. I’ve seen this a thousand times over the years (with an uptick since May of 2015). Again, beware.


Side Note: Rich Snippets Algo Refreshed

This is not the first time that rich snippets have either appeared or disappeared for sites during core ranking updates. It’s pretty clear that Google’s rich snippets algo was refreshed either during, or right around, the 2/7 update. And since core ranking updates are focused on quality, that makes complete sense.

For example, there’s a quality threshold for receiving rich snippets. If your site falls below the threshold, then they can be removed. If it’s deemed “high quality”, then you can receive them. Well, I saw several examples of rich snippets either showing up or being removed during the 2/7 update. So if you’ve been impacted from a rich snippets standpoint, work hard to improve quality. They can definitely return, but it will take another core update and refresh of the rich snippets algo for that to happen. Based on what I’ve seen, that’s happening every few months.


And Jim Stewart reached out to me on Twitter about what he saw with rich snippets:


If you are interested in learning more about Google’s site-level quality signals and how they can impact rich snippets, then definitely read my post covering the topic. I provide a lot of information about how that works and provide examples of that happening during major algorithm updates.

Positive Movement on 2/7

So that’s a sampling of problems I saw on sites that were negatively impacted by the 2/7/17 update. But what about sites that improved? Here are some of the changes made by sites that were hit by a previous core ranking update and then saw recovery (or partial recovery) during subsequent updates:

Improved “Quality Indexation”:

I’ve brought up what I call “quality indexation” quite a bit in the past. That’s the idea that you only want your highest quality pages indexed. Don’t shoot for quantity… Instead, shoot for quality. When helping companies that have been negatively impacted by core ranking updates, it’s not unusual to find massive amounts of thin or low quality content on the site.

One of the first things I do is surface those URLs and work with clients to understand their options. First, did you know this content was indexed? If so, is there any way to enhance it? And if that’s too big of a job, then work on removing those URLs from Google’s index. That last option is typically for larger-scale sites with hundreds of thousands, or millions, of pages indexed.

Google’s quality algorithms will use the URLs indexed to “score” the site. So if those low-quality URLs are either enhanced or removed, that can be a good thing. This was a factor I saw on several sites that surged during the 2/7 update.
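
If you want a quick way to start surfacing candidates, here’s a minimal sketch in Python. It only illustrates the idea: the urls.txt filename, the use of requests and BeautifulSoup, and the 300-word threshold are all assumptions for the example, and flagged pages still need a manual review before you decide to enhance, noindex, or remove them.

```python
# Minimal sketch: flag potentially thin, indexable pages from a list of URLs.
# Assumptions: urls.txt holds one URL per line (from a crawl export or your
# sitemaps), requests/BeautifulSoup are installed, and the 300-word threshold
# is an arbitrary starting point, not a magic number.
import requests
from bs4 import BeautifulSoup

WORD_THRESHOLD = 300  # tune per site and vertical

def audit_page(url):
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    # Drop script/style so the word count reflects visible text only.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    word_count = len(soup.get_text(separator=" ").split())

    # Note whether the page is already noindexed via meta robots.
    meta = soup.find("meta", attrs={"name": "robots"})
    noindexed = bool(meta and "noindex" in (meta.get("content") or "").lower())

    return word_count, noindexed

if __name__ == "__main__":
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        words, noindexed = audit_page(url)
        if words < WORD_THRESHOLD and not noindexed:
            # Candidate to enhance, noindex, or remove after a manual review.
            print("%s\t%d words, indexable" % (url, words))
```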

Cut down on ad aggressiveness:

There are times companies reach out to me after getting hit and they have no idea why they were negatively impacted. They focus on their content quality and links, but don’t take “quality user engagement” into account.

When analyzing negative hits, it’s not unusual to see URLs with way too many ads, ads that push the content down the page, or ads that are so annoying that it’s hard to focus on the primary content.

And then you have excessive pagination for monetization purposes, which fits into the same major category. For example, forcing users through 38 pages of pagination to read an article (as I mentioned earlier). And of course, you have ads in between every component page. Think about your users. How do they feel about that? And what signals does that send to Google about user happiness (or lack thereof)?


Improved UX on both desktop and mobile:

I mentioned mobile UX problems earlier on sites that were negatively impacted. Well, companies that enhanced their mobile experience saw gains. Now, I’m not saying that was the only change… since companies typically have to make substantial changes to quality overall in order to see positive movement. But it’s worth noting.

For example, moving to a responsive design, enhancing content on mobile URLs, and creating a stronger navigation/internal linking structure for mobile. And with some of my clients having 60-70% of visits coming from mobile devices, it’s critically important to ensure those users are happy, can find what they need, and can navigate through the site seamlessly.

And from a desktop standpoint, fixing usability barriers like broken elements, confusing navigation, and aggressive ad placement all contributed to a stronger user experience. It wasn’t hard to surface these problems, by the way. You just need to objectively traverse your site and rate the experience, identifying pitfalls, aggressive site behavior, and broken UX components.

For example, how important do you think a mobile UX is when 55% of your traffic is from mobile devices?


Fixed technical SEO problems that could be causing quality problems:

I’ve always said that technical SEO problems can cause quality problems. For example, glitches that cause thin content (or no content) to be produced across a large-scale site. Or massive canonicalization problems impacting many pages across a site. Or meta robots issues, or robots.txt problems, that cause pages to disappear from the index (so they can’t help you quality-wise). And so on.

Sites I’ve helped that saw gains also fixed problems like this along the way. It’s more about site health than specific quality problems, but it’s also worth noting. Don’t throw barriers in the way of users and Googlebot.
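
As a quick illustration (and only as an illustration), here’s a minimal Python sketch that spot-checks a few of the technical signals mentioned above: robots.txt blocking, HTTP status, meta robots, and canonical tags. The urls.txt filename and the use of requests and BeautifulSoup are assumptions for the sketch; a full crawl with a dedicated tool goes much deeper.

```python
# Minimal sketch: spot-check a few technical signals per URL. Assumptions:
# urls.txt lists the URLs to check (one per line), and requests/BeautifulSoup
# are installed.
from urllib import robotparser
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

robots_cache = {}  # one parsed robots.txt per host

def robots_blocked(url):
    host = "{0.scheme}://{0.netloc}".format(urlparse(url))
    if host not in robots_cache:
        parser = robotparser.RobotFileParser(host + "/robots.txt")
        parser.read()
        robots_cache[host] = parser
    return not robots_cache[host].can_fetch("Googlebot", url)

def check_url(url):
    issues = []
    if robots_blocked(url):
        issues.append("blocked by robots.txt")

    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        issues.append("HTTP %d" % resp.status_code)

    soup = BeautifulSoup(resp.text, "html.parser")

    meta = soup.find("meta", attrs={"name": "robots"})
    if meta and "noindex" in (meta.get("content") or "").lower():
        issues.append("meta robots noindex")

    canonical = soup.find("link", rel="canonical")
    href = canonical.get("href") if canonical else None
    if href and href.rstrip("/") != url.rstrip("/"):
        issues.append("canonical points elsewhere: " + href)

    return issues

if __name__ == "__main__":
    with open("urls.txt") as f:
        for url in (line.strip() for line in f if line.strip()):
            problems = check_url(url)
            if problems:
                print(url, "->", "; ".join(problems))
```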

Here’s a video of Google’s John Mueller explaining that sites need to increase quality substantially in order to see impact (at 39:33 in the video):


Moving Forward:

I’m going to sound like a broken record here, but this is similar advice I have given for a long time when it comes to Google’s core ranking updates. Here we go:

  • You must know your site inside and out. And you must objectively analyze your site to weed out low quality content, low quality user engagement, ad aggressiveness, over-monetization, UX barriers, etc.
  • I recommend performing a crawl analysis and audit of your site to surface potential quality problems. And make sure you crawl the site as both Googlebot and Googlebot for Smartphones in order to understand both the desktop and mobile situation (see the crawl sketch after this list).
  • Run a Panda report (which can be completed for any algo update or traffic loss) to understand the URLs seeing the most volatility. Then dig into those URLs. You might just find serious issues with the content and/or user experience on URLs that were receiving the most impressions and clicks PRIOR to the algorithm update. A minimal version of this report is sketched after this list.
  • Move fast to make changes. The quicker you can surface problems and make changes, the better chance you have of recovering during the next major core ranking update. I don’t mean you should rush changes through (which can be flawed and cause more problems). But move fast to identify big quality problems, and then fix them quickly. Remember to check those changes in staging and then again once they go live. Don’t cause bigger problems by rolling out flawed updates.
  • Understand the time to recovery. As I explained earlier, it can take months for sites to see recovery after getting hit by a core ranking update. So don’t have a knee-jerk reaction just a few weeks after implementing changes. Keep the right changes in place for the long term and keep increasing quality over time. I’ve seen sites roll out the right changes, and then reverse them before the next major algo update. You can drive yourself mad if you do that.
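
On the crawling point above, here’s a minimal Python sketch of the idea behind checking a page as both desktop and smartphone Googlebot: fetch the same URL with each user agent and compare what comes back. The user-agent strings approximate Google’s published ones and the example URL is hypothetical; a dedicated crawler is the right tool for a full audit, but even a simple comparison like this can reveal pages that behave very differently for mobile.

```python
# Minimal sketch: fetch the same URL as desktop and smartphone Googlebot and
# compare what comes back. The user-agent strings below approximate Google's
# published ones, and the URL is a hypothetical example.
import requests

GOOGLEBOT_DESKTOP = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
GOOGLEBOT_SMARTPHONE = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
    "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def fetch_as(url, user_agent):
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return resp.status_code, len(resp.text)

if __name__ == "__main__":
    url = "https://www.example.com/some-page/"  # hypothetical example URL
    for label, ua in [("desktop", GOOGLEBOT_DESKTOP),
                      ("smartphone", GOOGLEBOT_SMARTPHONE)]:
        status, size = fetch_as(url, ua)
        # Big gaps in status code or response size between the two fetches
        # are worth a manual look on real devices.
        print("%s: HTTP %d, %d characters of HTML" % (label, status, size))
```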
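
And on the Panda report point, here’s a minimal sketch of the before/after comparison, assuming you’ve exported Google organic landing-page data to a CSV with date, url, and clicks columns covering comparable windows on each side of 2/7. The filename and column names are assumptions about your export, not a fixed format.

```python
# Minimal sketch of a before/after landing-page report around the update date.
# Assumptions: organic_landing_pages.csv holds Google organic data with
# 'date', 'url', and 'clicks' columns over comparable windows before/after.
import pandas as pd

UPDATE_DATE = "2017-02-07"

df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])

before = df[df["date"] < UPDATE_DATE].groupby("url")["clicks"].sum()
after = df[df["date"] >= UPDATE_DATE].groupby("url")["clicks"].sum()

report = pd.DataFrame({"before": before, "after": after}).fillna(0)
report["change"] = report["after"] - report["before"]
# Percent change is left undefined (NaN) where there were no clicks before.
report["pct_change"] = report["change"] / report["before"].where(report["before"] > 0)

# The URLs with the biggest drops are the ones to dig into first.
print(report.sort_values("change").head(25))
```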

Summary – The February 7, 2017 Update Was Substantial

The 2/7/17 update was significant and many sites saw substantial movement (either up or down). If you’ve been negatively impacted by the update, go back through my post, and my other posts about Google’s core ranking updates, to better understand what could be causing problems. Then work hard to analyze your site objectively and weed out the problems. Maintain a long-term view for improving your site quality-wise. Don’t put band-aids on the situation. Make significant changes to your site to improve quality overall. That’s how you win. Good luck.