Has Your Website’s Traffic Dropped? It May Not Be a Penalty.
Posted on 13th March, 2017 by Joe Balestrino
The first word that pops into many website owners’ heads when they see a drop in organic traffic is… “penalty”. While that may very well be the case, you first need to determine whether the issue is in fact technical. In my experience, eight out of ten times a site loses organic traffic, it’s due to a technical issue.
Technical issues can be scary because the solution isn’t always clear. You’ll need to do some investigating to figure out where the problem lies. If you aren’t very technical, this may seem like an impossible task. It doesn’t need to be.
I’ve compiled some of the more common, as well as some unique, issues I’ve seen over the last 13 years as an SEO.
Check Your Robots.txt File
When traffic suddenly declines, the first place I look is the robots.txt file. It’s not uncommon for a developer to block the search engines while working on a website, then forget to unblock the site when it goes live. It’s an easy fix that can have your site reindexed within an hour or less.
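You can check whether a given robots.txt actually blocks Googlebot with Python’s standard-library `urllib.robotparser`. This is just a sketch: the rules and the URL below are hypothetical examples, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt a developer might leave in place after staging work
# (hypothetical example rules).
staging_rules = """
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(staging_rules.splitlines())

# A blanket "Disallow: /" blocks every crawler, Googlebot included.
print(parser.can_fetch("Googlebot", "https://example.com/products/"))  # False
```

If this prints `False` for pages you want indexed, the robots.txt is the problem.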
On rare occasions a site can be blocked by a robots.txt file that is not your own. If your robots.txt file isn’t blocking your site but you see the message “A description for this result is not available because of this site’s robots.txt” in Google, the block could be coming from another source.
One place that you can look to as a possible source could be the CDN service you use. I’ve seen instances when a CDN accidentally blocked Google via their own robots.txt file and that rule transferred over to the client’s website.
Spike In Pages Crawled
Check Search Console for an increase in crawled pages. If your site only has 100 pages but Search Console is reporting 1,200, there is an issue, especially if the spike happens all at once and traffic from Google declines around the same time as the spike in pages being indexed.
In many cases, Google has started crawling pages it shouldn’t. Typically, these are search results from your own site’s search function, which Google shouldn’t crawl. Also check to see whether Google has gained access to the back end of your site.
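If that’s what’s happening, a few robots.txt rules will keep crawlers out of internal search results and the admin area. The paths below are assumptions (a WordPress-style setup with search results under `/search` or a `?s=` query parameter), so substitute your own URLs:

```
User-agent: *
# Internal site-search results
Disallow: /search
Disallow: /*?s=
# Back end / admin area
Disallow: /wp-admin/
```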
Soft 404s can become problematic in large numbers, especially if the redirects all point to the home page. If you’ve removed a section of your site or gone through a recent redesign, don’t generate massive bulk redirects to the homepage.
A soft 404 error in search console means that Google feels the redirects aren’t relevant. Don’t get lazy. It’s also okay to have 404 pages if there aren’t any relevant pages to redirect them to.
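Before shipping a bulk redirect map, it’s worth flagging entries that just dump visitors on the homepage. A minimal sketch, assuming a simple old-URL-to-new-URL mapping; all the paths below are hypothetical:

```python
# Hypothetical redirect map: old path -> where it now redirects.
redirect_map = {
    "/old-shoes/red-sneakers": "/shoes/red-sneakers",  # relevant target: fine
    "/old-shoes/blue-boots": "/",                      # homepage: soft-404 risk
    "/discontinued/widget": "/",                       # homepage: soft-404 risk
}

def soft_404_risks(mapping, homepage="/"):
    """Return old paths whose redirect points straight at the homepage."""
    return [old for old, new in mapping.items() if new == homepage]

print(soft_404_risks(redirect_map))
# ['/old-shoes/blue-boots', '/discontinued/widget']
```

Anything this flags should either get a genuinely relevant redirect target or be allowed to return a real 404.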
Blocking scripts is another culprit that many site owners are unaware of. If you’re serving users scripts that are required to render a webpage and you block Google from seeing those scripts it can impact your search visibility.
You can use the “Fetch as Google” tool within Search Console. Google will show you how it sees your web page versus what users see, and it will highlight any scripts that are blocked. Not all script blocking is bad, but blocking scripts that change how a page renders is.
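If Fetch as Google shows blocked scripts, explicit `Allow` rules for your asset paths are one way to open them up. The directory names below are assumptions; match them to wherever your site actually serves its JavaScript and CSS:

```
User-agent: Googlebot
# Hypothetical asset paths - adjust to your site's structure
Allow: /assets/js/
Allow: /assets/css/
Allow: /*.js$
Allow: /*.css$
```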
Check Your Site’s Navigation
When a site goes through a redesign, pages are sometimes removed from the main navigation, either intentionally or by oversight. This can cause rankings for those pages to drop. Placing a page in the main navigation signals its importance; removing it from the navigation decreases that page’s value. Pages that aren’t tied to a navigation or sub-navigation can become “orphaned”.
Now you may be saying to yourself…”I’ll just add those pages to my xml sitemap. Problem solved”.
Think again. Just because a page is indexed doesn’t mean it will rank in Google. As I stated above, pages that aren’t tied to a navigation won’t rank as well. Years ago you could place pages in a sitemap and they would rank. Today, they won’t rank at all. Creating an xml sitemap and uploading it to search console is not a guarantee those pages will rank. Indexing and ranking are two totally different things.
Maybe you’ll add them to the footer? Wrong again. Google puts less weight on pages linked from the footer, as it knows that links there have little real SEO value. You get the idea.
Hopefully, this article gives you some ideas of where to look if your site has dropped in indexing or search visibility. If you have any of your own personal experiences, I’d love to hear about them.