If your website's search rankings decrease, don't panic. It's possible to fix them by correcting your website basics, tracking the right keywords, following Google's best practices, and conducting a website audit.
Businesses only want their organic search rankings to go in one direction: up. That’s why it can induce panic when rankings start to go down.
Search engine positions fluctuate in the short term, but what if your rankings show a long-term downward trend?
Even then, it’s possible to recover with some simple actions.
As the co-founder of seoplus+, a full-service digital marketing company, I help hundreds of clients, from local plumbers to multinational enterprises, recover from a drop in search rankings. Together, we plan and execute strategies to get their sites trending up.
Ask yourself four big-picture questions about your website, and you may find why your rankings dropped.
1. Correct Your Website Basics
Check site necessities like title, H1 headers, and HTTP status code first.
Does your homepage title appear properly in search results?
The title tag tells search bots what a page is about.
If the title tag doesn’t pull the right information or doesn’t make sense, you’re missing a valuable ranking opportunity.
Here’s an example of Amazon’s title tag:
Amazon’s title appears correctly, making the search bots’ job easy.
For static sites (that is, sites not generated by a content management system like WordPress), you can edit the title tag within the head section of each page’s HTML.
Is the title generic or repetitive, or does it inaccurately represent site content? If so, go back to the drawing board and keep it simple. Best practice is to have your company’s name and target keyword in the homepage title.
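If you want to spot-check what the bots see, Python’s standard-library html.parser can pull the title straight out of a page’s HTML. This is a minimal sketch; the markup and business name below are invented for illustration:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        # Only capture the first title; extra titles are ignored.
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = ("<html><head>"
        "<title>Acme Plumbing | Emergency Plumber Ottawa</title>"
        "</head><body></body></html>")
parser = TitleParser()
parser.feed(html)
print(parser.title)  # the homepage title search bots will see
```

Run this against your homepage’s source and confirm the result contains your company name and target keyword.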
Does Your Homepage Have a Single H1 Tag?
Each page should have a single main heading in an H1 tag.
Like the title tag, the H1 heading tells search bots the purpose of the page, and it’s a great opportunity to include target keywords.
Here’s an example of Clutch’s H1 tag:
There should be only one H1 on the page: multiple H1 tags can dilute the tag’s value.
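A quick way to audit this is to count the H1 opening tags in a page’s source. A minimal sketch with Python’s standard-library html.parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> opening tags in a document."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

page = ("<body><h1>Top B2B Ratings</h1>"
        "<h2>Find Agencies</h2>"
        "<h1>Another Heading</h1></body>")
counter = H1Counter()
counter.feed(page)
if counter.h1_count != 1:
    print(f"Warning: found {counter.h1_count} H1 tags; expected exactly 1")
```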
Is Your Site Crawlable?
Check your site’s robots.txt file in Google’s handy Robots Testing Tool to confirm that bots can crawl your site, and make sure the file’s instructions allow bots to crawl key pages.
The robotstxt.org site, which documents the standard, shows example files that exclude all bots (a Disallow: / rule under User-agent: *) and allow all bots (an empty Disallow: line). If your robots.txt blocks crawlers from your site, change it to the allow-all form.
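As a rough local check, Python’s standard-library urllib.robotparser applies the same allow/disallow rules crawlers honor. The rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks the /admin/ area but allows everything else.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://example.com/services/"))   # True
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```

If a key page comes back False, fix the robots.txt before worrying about anything else on this list.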
Is Your Site Returning the Correct HTTP Status Code?
If your site does not return the proper HTTP status code, Google might de-index it – that is, remove it from showing up in search results.
Test key URLs via HTTP Status Code Checker. The “200 OK” status code is the standard response for successful HTTP requests.
This screenshot shows that Clutch’s site is returning a successful status code.
If your site returns an error status, correcting that error could save your ranking. Try this troubleshooting guide for more in-depth explanations.
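If you are checking many URLs with a script, it helps to bucket the codes a checker returns. A rough sketch; the groupings follow the standard HTTP status classes, not any particular tool:

```python
def describe_status(code: int) -> str:
    """Rough classification of HTTP status codes for an SEO check."""
    if 200 <= code < 300:
        return "success"       # e.g. 200 OK: page is reachable
    if 300 <= code < 400:
        return "redirect"      # e.g. 301/302: confirm the target is correct
    if 400 <= code < 500:
        return "client error"  # e.g. 404: page missing; consider a 301 redirect
    if 500 <= code < 600:
        return "server error"  # e.g. 502/503: the server is struggling
    return "unknown"

for code in (200, 301, 404, 503):
    print(code, describe_status(code))
```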
Are Your Local Google Citations Correct?
Local citations are listings on sites like Google My Business, Yellow Pages, and Yelp that confirm you are who/where/what you say you are.
If your business name, address, and phone number (NAP) don’t match the contact info on your website, this could affect your local rankings in Google Maps and the local Knowledge Graph.
Here’s an example of Clutch’s Google My Business NAP info pulling correctly:
To determine if your business’s NAP information is correct in Google My Business, perform a few local searches or run a citation audit through a service like Loganix or Whitespark. If you spot any inconsistencies, clean them up manually or hire a clean-up service to take care of it.
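Cosmetic differences (capitalization, extra spaces, phone formatting) can make identical NAP data look inconsistent. A small sketch of how you might normalize listings before comparing them; the business details are invented:

```python
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    """Lowercase, collapse whitespace, and strip phone formatting
    so listings can be compared despite cosmetic differences."""
    def clean(s):
        return re.sub(r"\s+", " ", s.strip().lower())
    digits = re.sub(r"\D", "", phone)  # keep only the phone digits
    return (clean(name), clean(address), digits)

website = normalize_nap("Acme Plumbing", "123 Main St, Ottawa", "(613) 555-0199")
listing = normalize_nap("ACME Plumbing", "123  Main St,  Ottawa", "613-555-0199")
print(website == listing)  # consistent NAP despite different formatting
```

A mismatch after normalization is a real inconsistency worth cleaning up.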
2. Track the Right Keywords
Keeping an eye on your most useful keywords can make or break your search placement.
Have you done your keyword research?
Ask yourself some questions about the keywords you track:
- Did you intentionally identify the most searched keywords?
- Are these keywords likely to result in a conversion?
- If you’ve done keyword research already, have you updated it routinely?
If you can’t say yes to all of the above, it’s time to take a closer look.
Are You Using Multiple Tracking Resources?
Using reporting software to track keywords is a first step, but the keywords’ initial rankings may not give the whole picture.
Rather than relying on your reporting software alone, check other sources like SEMrush to get a full view of the keywords you rank for.
Discover ranking successes you didn’t see before or high-value, low-competition keywords you may have overlooked at first.
Then take them to the next level by updating titles and H1s or writing targeted content.
Are You Tracking Keywords that Don’t Rank?
If so, there’s an easy fix: delete the keywords from your tracking software. This simple move gives a more accurate picture of your rankings going forward.
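In script form, the cleanup is just a filter over your tracked keywords; the keywords and positions below are invented:

```python
# Tracked keywords mapped to their current ranking (None = not ranking at all).
tracked = {
    "emergency plumber ottawa": 4,
    "plumber near me": 12,
    "underwater basket weaving": None,  # never ranked; just noise in reports
}

# Keep only keywords that actually rank.
ranking = {kw: pos for kw, pos in tracked.items() if pos is not None}
print(sorted(ranking))
```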
3. Follow Google’s Best Practices
Google is the Internet’s top search engine, so optimize your site according to Google’s guidelines.
Is Your Site Returning Crawl Errors?
A crawl error is anything that interferes with Googlebot’s ability to interact with your website. There are two kinds of crawl errors:
- Site error: Googlebot is unable to access your site
- URL error: Googlebot is unable to access a specific page on the site. (It may not exist, or the URL may have changed.)
First, check your site in Google Search Console for error messages.
Then, if you see 404 errors, redirect the missing page to the most relevant live page (often the homepage).
If you see consistent server errors, like 502 or 503, your server may not have enough memory or resources to handle the requests. Reconfigure your server settings or upgrade to a new server.
The screenshot below shows instances of URL errors, all with a 404 response code.
Googlebot tried to crawl the [sitename]/testimonials page but was unable to find it, so set up a 301 redirect to send traffic from [sitename]/testimonials to [sitename]/about/testimonials.
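If you maintain redirects in application code rather than server configuration, the mapping can be as simple as a lookup table. A sketch with hypothetical paths:

```python
# Old paths mapped to their new homes; unmatched 404s fall back to the homepage.
redirects = {
    "/testimonials": "/about/testimonials",
    "/old-services": "/services",
}

def resolve_404(path: str) -> str:
    """Return the 301 target for a missing URL."""
    return redirects.get(path, "/")

print(resolve_404("/testimonials"))  # /about/testimonials
print(resolve_404("/no-such-page"))  # /
```

In practice the same mapping usually lives in server config (e.g. web-server rewrite rules), but the logic is identical.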
Are Your URLs Indexed Correctly?
Sometimes there’s a discrepancy between URLs submitted to Search Console and URLs indexed.
Note that not every file on your site should be indexed – for example, you wouldn’t want an administrative login page to show up in Google.
However, ensure that important pages do get indexed.
The screenshot below shows a discrepancy between the number of URLs submitted and the total number indexed by Google:
If you see a similar discrepancy, crawl your site with software like Screaming Frog to make sure important pages aren’t blocked.
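The submitted-vs-indexed comparison is a set difference: any URL you submitted that Google has not indexed deserves a look. A sketch with invented URLs:

```python
# URLs submitted in the sitemap vs. URLs reported as indexed.
submitted = {"/", "/services", "/about", "/contact", "/blog"}
indexed = {"/", "/services", "/blog"}

missing = sorted(submitted - indexed)
print(missing)  # pages to investigate for crawl blocks or noindex tags
```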
4. Conduct a Website Audit
A website audit should consider both on-page factors, like title tags, meta descriptions, and H1 titles, and off-page factors, like links and social shares.
The following on-page and off-page components are critical for searchability.
Are You Checking Your Backlink Profile?
Backlinks are essential to SEO because they prove your site’s value on the web.
Here’s an example of a site’s backlink profile on Majestic:
If you don’t have backlinks, or the sites that link to you are low-quality, there are a few tactics you can use to grow your backlink profile:
- Run a competitive analysis to see which sites link to your competitors
- Reach out to sites to offer a guest post, infographic, or webinar in exchange for a link
- Find unlinked mentions using a tool like Rank Tank
Here’s an example of a Rank Tank search that identifies unlinked brand mentions:
Contact sites that mention you but don’t link to you. Request a link or offer value in exchange.
There’s no get-rich-quick scheme with staying power here; just put in the work and you’ll likely reap rewards.
Has there been a drop-off in site traffic?
Check Google Analytics to see if site traffic is low. If so, something on your site may not be working.
Cross-reference the start of the dip with changes you have made to see if you can spot where things went wrong.
Is your content original, in-depth, and valuable?
Quality content is important for bots and humans alike.
Search engines are meant to satisfy human readers, so Google programs search bots to prioritize quality content based on attributes like word count, author expertise, and length of page visit.
If your content is duplicate, thin, or provides little value to the reader, Google will penalize you accordingly.
Next, check content length: 400 words is often cited as the minimum needed to provide value to users, but according to a 2016 study by Backlinko founder and SEO consultant Brian Dean, the average first-page Google result is 1,890 words long.
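Flagging thin pages can be automated with a simple word count. A sketch using the 400-word floor mentioned above; the URLs and page contents are placeholders:

```python
pages = {
    "/blog/long-guide": "word " * 1900,  # near the first-page average length
    "/blog/stub": "word " * 120,         # thin content
}

MIN_WORDS = 400  # the minimum word count suggested above

thin = [url for url, text in pages.items() if len(text.split()) < MIN_WORDS]
print(thin)  # pages to expand or consolidate
```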
Look for pages in Google Analytics that do not deliver value – that is, they have low time on page and/or a high bounce rate.
Here’s an example of a concerning content scan.
Content is broken into three categories:
- Duplicate Content is content that’s found on more than one page of your site. (We recommend less than 15% per page.)
- Common Content includes elements like menus, sidebars, and the footer. Search bots don’t punish your site for common content.
- Unique Content can’t be found elsewhere on your site.
Because the unique content level is low, content developers should write new text to set the site’s pages apart.
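One rough way to estimate a page’s duplicate share is to count how many of its sentences also appear elsewhere on the site. This naive sentence-level sketch is an illustration, not how any particular scanner works:

```python
def duplicate_share(page: str, other_pages: list) -> float:
    """Fraction of a page's sentences that also appear on other pages."""
    sentences = [s.strip() for s in page.split(".") if s.strip()]
    others = set()
    for p in other_pages:
        others.update(s.strip() for s in p.split(".") if s.strip())
    if not sentences:
        return 0.0
    dupes = sum(1 for s in sentences if s in others)
    return dupes / len(sentences)

page_a = "We fix leaks. Call us today. Serving Ottawa since 1995."
page_b = "We install water heaters. Call us today. Serving Ottawa since 1995."
share = duplicate_share(page_a, [page_b])
print(round(share * 100))  # percent duplicate; keep this under the 15% guideline
```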
Is your site modern?
A modern website not only boosts SEO but also serves the user.
Your site may be out-of-date if it has:
- Unsupported functionality
- Security vulnerabilities
- A CMS that can’t be customized
If these characteristics apply to your site, or you’re still using SEO practices from before 2012, it’s time for an upgrade.
Is your site fast?
If your site takes longer than three seconds to load, it is below average.
According to a 2016 Kissmetrics study, 49% of users will abandon a site if a page takes longer than 6-10 seconds to load.
Run your site through GTMetrix, a free site speed tester.
Here’s Wikipedia’s scan:
It took the Wikipedia homepage a second to load, which earned it an A from PageSpeed.
If your site takes longer than three seconds to load, follow GTMetrix’s recommendations.
Is Your Site Mobile-Friendly?
If your site doesn’t load properly or intuitively on mobile, you will lose customers. With the shift to mobile-first indexing expected to happen later this year, time is running out to make the switch.
Run your site through Google’s Mobile-Friendly Test and follow the recommendations if your site does not yet make the cut.
You Can Fix a Decline in Site Rankings
If your site’s rankings decrease, don’t panic.
Ask yourself these questions about website basics, keywords, Google best practices, and site auditing.
The answers will help you zero in on problems and give you the information you need to correct them early.
Be proactive, flexible, and continue to monitor overall trends, and you’ll likely see ranking success going forward.
About the Author
Brock Murray is a full-stack marketing specialist at seoplus+. He is actively involved in all aspects of digital marketing but specializes in SEO. In just 5 years he has built a 7-figure agency using the practices he preaches.