Go to the Moz Google Algorithm Update page (http://moz.com/google-algorithm-change) and see if you can find the date of your traffic drop listed there.
Barracuda Digital also offers a tool called the “Panguin Tool” (http://www.barracuda-digital.co.uk/panguin-tool/) that overlays the Google algorithm updates on your Google Analytics data to make this comparison analysis easier for you.
Top Heavy Update
This update targets sites that have too many ads above the fold, even if the ads are AdSense ads. To avoid this penalty, avoid placing lots of ads near the top of your web pages. A banner ad is probably fine, as is something in the right rail of your site, but anything more than that starts to run into trouble with this algorithm.
DMCA Penalty (a.k.a. Pirate)
This update applies penalties to sites that receive too many valid DMCA takedown requests. DMCA stands for Digital Millennium Copyright Act. You can file a DMCA takedown request if someone has stolen your content and published it as their own. Of course, to avoid this penalty, you should publish only your own original content. If you hire people to write for you, ensure that they are writing original content as well.
Exact-Match Domains (EMD)
With the release of this update, Google started to change the way it values exact-match domains. An example of an exact-match domain might be http://www.blue-widgets.com (note that this is not currently a real site, and is in fact a parked page), which historically would have had an advantage in ranking for the search phrase blue widgets. The purpose of this update was to reduce or eliminate that advantage.
Payday Loan (spammy sites)
This algorithm targeted sites that it considered to be in particularly spammy market spaces. Google specifically mentioned payday loans, casinos, and pornography as targeted markets.
Panda
This update is designed to reduce rankings for low-quality sites: sites that add little value for users, copy content from other websites, or are simply not very useful. At the same time, it provides better rankings for high-quality sites, meaning sites with original content and information such as research, in-depth reports, thoughtful analysis, and so on. Google also made use of data from the Chrome Blocklist Extension.
Panda’s target areas
Thin content
As you might expect, this is defined as pages with very little content. Examples might be user profile pages on forum sites with very little information filled in, or an ecommerce site with millions of products but very little information provided about each one.
Scraped or duplicate content
These may be scraped pages, or pages that are only slightly rewritten, and Google can detect them relatively easily. Sites with even a small number of these types of pages can be impacted by Panda.
“Me too” content
Even if you create all original articles, this may not be enough. If every page on your site covers topics that have been written about by others hundreds or thousands of times before, then you really have nothing new to add to the Web with your site. Consider, for example, the number of articles in the Google index about making French toast. There are 30,000 pages on the Web that include the phrase “how to make french toast” in their title. From Google’s perspective, it doesn’t need another web page on that topic.
Poor-quality content
This is content that is inaccurate or poorly assembled. In many cases this may be hard to detect, but one indicator is content that includes poor grammar or a lot of spelling mistakes. Google could also potentially use fact checking as another way to determine poor-quality content.
Curated content
Sites that have large numbers of pages with lists of curated links do get hit by Panda. Content curation is not inherently bad, but if you are going to do it, it’s important to incorporate a significant amount of thoughtful commentary and analysis. Pages that simply include lots of links will not do well, nor will pages that include links and only a small amount of unique text. Content curation is explored in depth in “Content Curation & SEO: A Bad Match?” (https://www.searchenginewatch.com/2013/08/25/content-curation-seo-a-bad-match/).
Many pages on the same topic
This was believed to be one of the original triggers for the Panda algorithm, as it was a popular tactic for content farms. Imagine you wanted to publish content on the topic of schools with nursing programs. Content farm sites would publish many articles on the same topic, with titles such as “nursing schools,” “nursing school,” “nursing colleges,” “nursing universities,” “nursing education,” and so forth. There is no need for all of those different articles, which prompted Google to target this practice with Panda.
Database-generated content
The practice of using a database to generate web pages is not inherently bad, but many companies were doing it at an excessive scale. This led to lots of thin-content or poor-quality pages, so many of these types of sites were hit by Panda.
Impact of Any Weak Content on Rankings
Weak content in even one section of a larger site can cause Panda to lower the rankings for the whole site. This is true even if the content in question makes up less than 20% of the site’s pages. When you are putting together a plan to recover from Panda, it is important to take this into account.
Path to Recovery
Panda releases come out roughly once per month.
As you consider these tough choices, it can be helpful to look at competitors that did not get hit. Understand, however, that you may see instances of thin content, weak content, “me too” content, and other poor-quality pages on competitors’ sites that look just as bad as the content penalized on your site, and they may not appear to have been impacted by Panda. Don’t let this type of analysis deter you from making the hard choices. There are so many factors that Google uses in its ranking algorithms that you will never really know why your site was hit by Panda and your competitor’s site was not.
To rebuild your traffic, it’s best to dig deep and take on hard questions about how you can build a site full of fantastic content that gets lots of user interaction and engagement. While it is not believed that social media engagement is a factor in Panda, there is likely a strong correlation between high numbers of social shares and what Google considers to be good content.
Improve the content. This may involve rewriting the content on the page, and making it more compelling to users who visit.
Add the noindex meta tag to the page. This will tell Google not to include these pages in its index, and thus take them out of the Panda equation.
Delete the pages altogether, and 301-redirect visitors to other pages on your site. Use this option only if there are quality pages that are relevant to the deleted ones.
Delete the pages and return a 410 HTTP status code when someone tries to visit the deleted page. This tells the search engine that the pages have been removed from your site.
Use the URL removal tool (https://www.google.com/webmasters/tools/removals?pli=1) to take the page out of Google’s index. This should be done with great care. You don’t want to accidentally remove other quality pages from the Google index!
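The server-side options above can be implemented in your web server configuration. The following is a minimal sketch for an Apache .htaccess file; the paths are hypothetical examples, and the noindex option instead goes in the page’s HTML head as <meta name="robots" content="noindex">:

```apache
# Permanently redirect a deleted thin page to a closely related quality page
Redirect 301 /old-thin-page.html /related-quality-page.html

# Return a 410 (Gone) status to tell search engines the page was removed for good
Redirect gone /deleted-page.html
```

If your site runs on nginx or another server, the equivalent directives differ, but the same two status codes (301 and 410) apply.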
Penguin (bad links)
Target Areas of Penguin
Penguin releases happen roughly twice per year.
Article directories
From the very first release of Penguin, Google targeted sites that obtained links from article directories. While these links may not always reflect manipulative intent by the publisher, Google found that people who leveraged article directories tended to operate lower-quality sites.
Low-quality web directories
These were also targeted in the very first release of Penguin. There are a few directories that are genuinely high quality, such as the Yahoo! Directory, DMOZ, Business.com, Best of the Web, and perhaps a few others specific to your industry vertical. Stay away from the rest.
Excessive rich anchor text
Excessive use of rich anchor text was also a part of the initial release of Penguin. Specifically, Penguin targeted too many instances of the same anchor text pointing to any of the URLs on your site. Google does not expect, or want, all links to say “click here,” but it sees it as a signal of spammy behavior when the exact same keyword-rich anchor text is used repeatedly.
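To get a rough sense of whether your anchor text distribution looks unnatural, you can tally the anchors from a link export and flag any phrase that accounts for an outsized share of your links. This is only a sketch: the 10% threshold is an illustrative assumption, not a number Google has published.

```python
from collections import Counter

def flag_overused_anchors(anchors, threshold=0.10):
    """Return anchor texts that make up more than `threshold` of all links.

    `anchors` is a list of anchor-text strings pulled from a link export;
    the 10% default threshold is an assumption for illustration only.
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()
            if n / total > threshold}
```

An anchor phrase that dominates the distribution (other than your brand name) is a candidate for link cleanup.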
Low-relevance international links
While it is not confirmed that this target area is a part of any Penguin release, anecdotal evidence suggests it might be. Consider any links from countries where you don’t sell your products or services as a potential problem, unless they come from truly high-quality sites.
Comment spam
Excessively implementing links in comments on other people’s blog posts and forums is also a problem for Penguin.
There are two types of penalties, algorithmic and manual. Algorithmic penalties do not involve any human component, whereas manual penalties do.
Potential triggers for manual penalties
Spam reports
Any user (including your competitors) can file a spam report with Google (https://www.google.com/webmasters/tools/spamreport?pli=1). Google receives large volumes of these reports every day. Google evaluates each report, and if it finds one credible (it may run some type of algorithmic verifier to determine that), it conducts a manual review.
Algorithmically triggered review
While this approach has never been verified by Google, it’s likely that Google uses algorithms to trigger a manual review of a website. The premise is that Google uses algorithms like Panda, Penguin, and others that identify large quantities of sites whose behavior is bad, but not bad enough for Google to algorithmically penalize them, so these sites would be queued for manual review. Google could also implement custom algorithms designed to flag sites for review.
Regular search results reviews
Google maintains a large team of people who perform manual reviews of search results to evaluate their quality. This effort is primarily intended to provide input to the search quality team at Google that they can use to help them improve their algorithms. However, it is quite possible that this process could also be used to identify individual sites for further scrutiny.
Cloaking and/or sneaky redirects (https://support.google.com/webmasters/answer/9044175#cloaking)
You can get this message if Google believes you are showing Googlebot different versions of pages than you show to users. To diagnose this, use the “Fetch and Render as Google” tool in Search Console to retrieve the page. Then load the same page in another browser window and compare the two pages.
If you see differences, invest the time and effort to figure out how to remove the differing content. You should also check for URLs that redirect and send people to pages that are not in line with what they expected to see – for example, if they click on anchor text to read an article about a topic of interest but instead find themselves on a spammy page trying to sell them something.
Another potential source of this problem is conditional redirects, where users coming from Google search, or a specific range of IP addresses, are redirected to different pages than other users.
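If you want a quick, scriptable first pass at this comparison, you can fetch the same URL with two different user agents and compare what comes back. This is only a rough sketch: the user-agent strings are illustrative, and real cloaking may key on Googlebot’s IP ranges rather than the user agent, so a match here does not fully rule cloaking out.

```python
import urllib.request

# Illustrative user-agent strings; a cloaking site may also check IP ranges.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given user agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(html_a, html_b):
    """Crude word-overlap score between two HTML payloads (0.0 to 1.0)."""
    a, b = set(html_a.split()), set(html_b.split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)
```

A similarity score well below 1.0 for the same URL fetched both ways is worth investigating manually.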
Hidden text and/or keyword stuffing (https://support.google.com/webmasters/answer/9044175#hidden-text)
This message is generated if Google believes you are stuffing keywords into your pages for the purpose of manipulating search results – for example, if you put content on a page with a white background using a white font, so it’s invisible to users but search engines can still see it.
Another way to generate this message is to simply repeat your main keyword for a page over and over again in hopes of influencing search results.
User-generated spam
This type of penalty is applied to sites allowing user-generated content that are perceived not to be doing a good job of quality control on that content. It’s very common for sites with user-generated content to become targets for spammers uploading low-quality content with links back to their own sites.
The short-term fix for this is to identify and remove the spammy pages. The longer-term fix is to implement a process for reviewing and screening out spammy content to prevent it from getting onto your site in the first place.
Unnatural links from your site (https://support.google.com/webmasters/answer/9044175)
This is an indication that Google believes you are selling links to third parties, or participating in link schemes, for the purpose of passing PageRank. The fix is simple: remove the links on your site that look like paid links, or add a nofollow attribute to those links. You do not need to worry about links that are already marked as nofollow.
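Adding the nofollow attribute is a one-line change to the link markup. A minimal example (the URL and anchor text here are placeholders):

```html
<!-- rel="nofollow" tells Google not to pass PageRank through this link -->
<a href="http://www.example.com/" rel="nofollow">Example sponsor</a>
```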
Hacked site
Google will communicate this penalty by sending you a message in Search Console and/or by showing indications that your site has been hacked (and is dangerous to visit) in the search results. The most common cause of this penalty is failing to keep up with updates to your content management system (CMS).
Spammers take advantage of vulnerabilities in the CMS to modify your web pages, most often for the purpose of inserting links to their own sites, but sometimes for more nefarious purposes such as accessing credit card data or other personally identifiable information.
To resolve this problem, you will need to determine how your site was hacked. If you don’t have technical staff working for you, you may need to get help to detect and repair the problem. To minimize your exposure going forward, always keep your CMS on the latest version possible.
Pure spam
Google will give you this message in Search Console if it believes that your site is using very aggressive spam techniques. This can include things such as automatically generated gibberish or other tactics that appear to have little to do with trying to add value for users.
If you get this message, there is a strong chance that you should simply shut down the site and start with a new one.
Spammy free hosts
Even if your site is clean as a whistle, if a large percentage of the sites using your hosting company are spamming, Google may take action against all of the sites hosted there. Take care to make sure you are working with a highly reputable hosting company!
There is nothing inherently wrong with a link in the footer of someone’s web page, but as these links are less likely to be clicked on or viewed by users, Google may discount their value. For more, see http://www.seobythesea.com/2010/05/googles-reasonable-surfer-how-the-value-of-a-link-may-differ-based-upon-link-and-document-features-and-user-data/.
Google provides a list of external links in the Search Console account for your site, under Search Traffic > Links to Your Site. We recommend that you also pull links from several other sources, including Open Site Explorer (www.opensiteexplorer.org), Majestic SEO (www.majesticseo.com), Ahrefs (https://ahrefs.com), and LinkResearchTools (www.linkresearchtools.com). The combination of the data from all of these tools will give you a more complete list of links.
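Because each tool exports an overlapping but different set of links, it helps to merge the lists and deduplicate them before reviewing. A minimal sketch, assuming each export has already been reduced to a plain list of URLs; the normalization (lowercasing, trailing-slash removal) is a simplifying assumption you may want to adjust:

```python
def merge_link_lists(*link_lists):
    """Combine link exports from several tools, dropping duplicate URLs.

    URLs are compared case-insensitively and without a trailing slash,
    which is a simplifying assumption for illustration.
    """
    seen = set()
    merged = []
    for links in link_lists:
        for url in links:
            key = url.strip().lower().rstrip("/")
            if key and key not in seen:
                seen.add(key)
                merged.append(url.strip())
    return merged
```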
Tools such as Remove’em (www.removeem.com) and Link Detox (www.linkdetox.com) help speed up link removal by automating the process of identifying bad links. Use your own judgment, and don’t just rely on the tools to decide for you what is good or bad.
Google provides a tool to allow you to disavow links (https://www.google.com/webmasters/tools/disavow-links-main?pli=1). The Disavow Links tool tells Google that you no longer wish to receive any PageRank (or other benefit) from certain links. This gives you a method for eliminating the negative impact of bad links pointing to your site. For more information, see https://support.google.com/webmasters/answer/2648487?hl=en.
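The disavow tool takes a plain text file with one entry per line: a `domain:` entry disavows every link from that site, while a bare URL disavows just that one page, and lines starting with `#` are comments. The domains and date below are placeholders:

```text
# Links we asked to have removed, with no response from the site owners
domain:spammy-directory.example.com
http://low-quality-blog.example.com/some-comment-page.html
```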
There is no need to send link removal requests to everyone in sight. For example, don’t send them to sites where the link to you is already marked with nofollow.
Links from sites with very low PageRank for their home page probably are adding little to no value to your site.
Links from very low-relevance pages are not likely to be adding much value either.
Filing reconsideration requests
The best path is to be brief and to the point:
1. Briefly define the nature of the problem. Include some statistics if possible.
2. Explain what went wrong. For example, if you were ignorant of the rules, just say so, and tell them that you now understand. Or, if you had a rogue SEO firm do bad work for you, say that.
3. Explain what you did to clean it up:
If you had a link penalty, let them know how many links you were able to get removed.
If you did something extraordinary, such as removing and/or disavowing all of your links from the past year, tell them that. Dramatic actions such as this can have a strong impact and improve your chances of success.
4. Clearly state that you intend to abide by the Webmaster Guidelines going forward.