Nothing is as frustrating as a drop in site ranking on Google. The top-ranked website on Google gets 32% of all click-through traffic, meaning that a drop in ranking will hurt your bottom line massively.
Almost every website loses rank on SERPs at some point, but you can implement measures to reduce the frequency of such drops. Some reasons for losing rank are obvious; others are less direct.
What Causes a Drop in Site Ranking on Google?
Here are a few reasons why your website might lose its rank on search results.
Low-quality Backlinks
A low-quality backlink is a link that you purchase or create to manipulate Google’s algorithms so that you rank higher on SERPs. If you resort to such tactics, you risk getting a penalty or getting your site de-indexed from the search engine.
Examples of sources of such links include low-quality article directories, low-quality guest posts, low-quality web directories, and spam comments. Google maintains a complete list of link schemes that go against its best practices.
If your site has low-quality links, the best remedy is to remove them. You should then notify Google of the adjustments made to your website. If you received a penalty, ask for a reconsideration of your site.
Bad Redirects
A redirect refers to an instance where a user clicks on one URL but is instead taken to a different one.
For example, if a specific page on your site ranks top on Google.com.sg but instead directs users to your home page when clicked, that is a bad redirect. Since it returns information that is not relevant to the searcher, you will lose rankings on search results. Links that generate 404 errors also count as bad redirects.
The quickest solution to this problem is creating an alternative URL for every new redirect. The new link should direct users to a page with similar or even better content than the original page. If you have not yet created the new pages, do not implement the redirects.
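As a sketch, assuming an Apache server and hypothetical page paths, a 301 redirect from a removed page to its replacement can be declared in the site's .htaccess file:

```apache
# .htaccess - permanently redirect the removed page to its replacement,
# so both visitors and search engine bots land on relevant content
Redirect 301 /old-page.html /new-page.html
```

A 301 tells search engines the move is permanent, so ranking signals are consolidated on the new URL rather than split between the two.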
Duplicate Content
According to Google, duplication occurs when two different pieces of content on your website match each other, either partially or entirely. If your site has duplicate content, it can hurt your rank on SERPs.
The Semrush Site Audit tool is an excellent resource for identifying duplicate content on your website. It flags all pages whose content is 80% similar or more, as well as all pages with too little content.
The solution to this problem is using 301 redirects so that neither users nor search engine bots can access the old content. Alternatively, you can use rel=canonical tags so that search engines can differentiate the original content from the duplicates.
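For instance, a rel=canonical tag placed in the head of the duplicate page tells search engines which version is the original. The domain below is a hypothetical placeholder:

```html
<!-- On the duplicate page: point search engines at the original version -->
<link rel="canonical" href="https://www.example.com/original-page/" />
```

Unlike a 301 redirect, the canonical tag leaves the duplicate page accessible to visitors while consolidating ranking signals on the original.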
Updating Title and Meta Tags
The title tag is essential in site ranking because it tells Google about the contents of your webpage. A slight change in the structure of the title tag can impact your ranking. For instance, if your previous title was ‘the best digital marketer in Singapore’ and you change it to ‘digital agency in Singapore’, you can lose ranking on Google.
Google Analytics enables you to compare the performance of the two titles by checking the traffic generated by each title. However, you need to have a backup of your website so that you can access the old page with the original title tag.
When you are migrating your site, always keep the title and meta tags that were generating lots of traffic by ranking high on search results. Otherwise, you will experience a drop in site ranking on Google.
Changing Your Content
Another cause for a drop in site ranking on Google is changing your content. Content is one of the main ranking factors on Google, meaning that the smallest adjustments have a significant impact on your SERP rank.
Google Analytics is useful in comparing the performance of old content to new content. If you notice a difference, you can rectify it by including the keyword phrases from the old content that are not in the new version. Also, you can use the old content as a reference when rewriting new content.
Incorrect Use of Noindex Tags, Nofollow Attributes and the Robots.txt File
The noindex tag, the nofollow attribute, and the robots.txt file are technical aspects of a webpage that have an enormous impact on your site ranking.
The noindex tag tells Google not to index a specific page on your website, which means the page cannot appear in search results. Blocking a page in robots.txt has a similar effect: it prevents the page from ranking on Google.com.sg. A nofollow attribute can also hurt your ranking if applied indiscriminately across your website.
You can check for the presence of noindex tags or nofollow attributes by right-clicking your webpages and viewing the source code. Also, make sure that you assess the robots.txt file to see if it blocks search engines from crawling some pages on your website.
If you find stray noindex tags or nofollow attributes, remove them immediately. Similarly, edit the robots.txt file so that search engine bots can crawl all the pages you want indexed.
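When reviewing the source code, these are the patterns to look for (the URL below is a placeholder):

```html
<!-- A noindex tag in the <head> keeps the page out of Google's index -->
<meta name="robots" content="noindex">

<!-- A nofollow attribute tells Google not to pass ranking signals
     through this link -->
<a href="https://www.example.com/some-page/" rel="nofollow">Anchor text</a>
```

If either appears on a page you want ranked, removing it is usually enough; Google will pick up the change the next time the page is crawled.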
Google Algorithm Updates
Google continuously changes its algorithm in a bid to deliver the best possible results to searchers. More often than not, web admins are caught unaware, resulting in a significant drop in ranking that happens overnight.
If you notice your website has dropped 10-20 positions for several keywords, there is a high likelihood that you were penalised. Note that there is a difference between algorithmic and manual penalties.
The algorithmic penalties happen automatically following Google algorithm updates. A Google employee executes manual penalties after carefully analysing your website.
You can tell whether your site has been penalised if the drop is swift and you are still ranking well on other search engines such as Bing and Yahoo.
The first thing you should do to recover from a Google penalty, be it manual or algorithmic, is check your Google Search Central account. Here you will find notifications regarding Google’s actions and the reasons behind them. The warnings are listed in the site messages menu.
While there, visit the Manual Actions section for details of any manual penalties imposed. Do not respond defensively; instead, accept the decision and devise ways of amicably resolving the issues highlighted.
Knowing what caused the penalty will also help you prevent it from recurring. For instance, if it’s duplicate content, scan your website and delete it. Spammy or unnatural links can also result in a manual penalty.
Once you have resolved all the issues highlighted, submit a re-evaluation request to the support team. Keep in mind that it may take several days or weeks to recapture your previous position on the search engine results page.
Moving forward, create a schedule to monitor inbound and outbound links and other risks using advanced website audit tools. They will alert you whenever your rankings start to slip.
Drop Due to a Google Flux
As SEO experts, we have an extensive understanding of how Google works and how to rank for target keywords. Unfortunately, one thing we have no power or control over is Google flux, which is both volatile and unpredictable.
In some instances, Google flux is mistaken for a negative SEO attack. That said, it is harder to spot than the other causes of ranking drops discussed here. Here are some of the signs:
- No on-page issues
- No linking issues
- No issues related to competitors
- No known algorithm updates
If everything looks OK, you don’t need to do anything. Just sit back and wait for Google’s systems to resolve the issue. There is simply no way of preventing a Google flux, so don’t waste your time and resources looking for underlying reasons. No website is immune to this problem, and you can expect your site to return to its previous position within a few days.
Copyright Infringement
You can experience a drop in site ranking on Google if you continually receive valid copyright removal notices. According to Google, this ensures that searchers gain access to legitimate and quality sources of content. This particularly applies to websites that deal with music, videos and other forms of streaming content.
You can see the copyright violation requests filed against your website by searching your URL on the Google Transparency Report. Once you see the report, make sure that you remove all the pages that infringe the copyrights. You should also implement a strict policy against the uploading of copyrighted content to your site.
Getting Outranked by a Competitor
Competition for top positions in SERPs is cutthroat across all niches. Getting outranked is part of the game, so don’t fret about it too much. It shows that your competitors are noticing what you are doing to rank and creating better content to knock you off.
Unlike with a Google penalty, the drop is usually small, and you will notice that other sites that ranked either below or above you remain untouched. You can quickly tell if the change in ranking is the work of a competitor by constantly monitoring your ranking and comparing it with theirs.
The only way to avoid being knocked off your current position is to keep tabs on their websites and activities on social media platforms. Monitor their content marketing and link-building strategies and come up with better ones.
Please don’t make the mistake of copying what they are doing. Be creative, diverse, and unpredictable to avoid unnecessary competition. For example, if they are ranking for general articles, create cornerstone or evergreen content. Write plenty of FAQ-based articles, drawing inspiration from the “People Also Ask” section on the Google results page.
Non-responsive Website
Google emphasises responsiveness, which is why they rank websites based on the mobile version of their site instead of the desktop platform.
The increasing use of mobile devices has led to a corresponding increase in voice search. If your platform is non-responsive, you should expect a drop in site ranking on Google.
Google has a free tool for measuring the mobile-friendliness of your website. It also gives suggestions on how to make your site responsive to all screen sizes. You should also remember to optimise for voice search.
Poor Security Features
You can experience a drop in site ranking on Google as a result of poor security. Incidents such as malware infection, hacking and other security breaches can prompt Google to push your site down the search results.
Besides, the search engine will notify any user trying to access the website that it poses a security risk. This can reduce your organic traffic significantly.
In most cases, Google sends messages to your Search Console when a security breach occurs. You can also use the Safe Browsing tool to see if you are compromised.
The good thing is that Google suggests ways through which you can improve security on your website. If you are unable to do it, it is advisable to upload a backup and request a review from experts.
Changing Internal Links
Changing the structure of your internal links can cause a drop in site ranking on Google. The Search Console is an excellent platform for checking if you have multiple internal links pointing to crucial pages on your website. If you notice any broken links, you should fix them by restoring them to their previous state.
Loss of Valuable Inbound Links
By now, you probably already know that backlinks are one of the most vital website ranking factors. Google perceives websites with a healthy backlink profile and plenty of links from high-domain-authority sites as more relevant than those with fewer links.
Your site will be demoted if you lose many of the links from sites that previously linked back to you. In your link-monitoring tool, you will notice colossal link velocity spikes, and some links will be missing from your backlink profile.
Check the links manually to find out whether the respective web admins deleted them intentionally. Google has already noticed the change, so it is not wise to contact the admins to get the links back; if you do, you may still be penalised for the same links.
Luckily, there are advanced tools that you can use to track links to avoid responding to the issue too late. Delegate the role of link building and monitoring to one of the SEO experts in your team or outsource to MediaOne Marketing to remain at the top.
Rank Drop Due to On-Page Issues
We have written multiple articles on on-page SEO and how to ensure that your pages function optimally. Failing to monitor them will often result in a drop in ranking and, consequently, lost revenue and organic traffic.
The drop can be small or big depending on the magnitude of the issues. It is best to spring into action as soon as your rank starts to fall behind other websites. Before we proceed, note that rank drops due to on-page issues affect virtually all websites, no matter how robustly their content and link-building strategies are implemented.
One of the surest ways of pinpointing the on-page issues that caused the drop is checking your Google Search Central account. Here, you will see all the detected errors that need to be resolved.
The errors can be anything from poor internal linking to broken links and tag issues. Leverage the HTML Improvements feature to identify tag issues, and count on Search Console to provide accurate information on the exact on-page problems that caused the drop in ranking.
Before you sign out, change the settings in your account to receive notifications via email. That way, you will be able to respond to issues promptly.
Bonus Tips on What to Do After a Website Ranking Drop
The internet is filled with website owners and web admins who promise clients a top spot on SERPs within a given timeframe. The truth is that winning a top spot in SERPs is not easy and takes time. That’s why you should do everything possible to regain your position after a ranking drop.
Here are additional tips on what to do after a website ranking drop.
Confirm the Basics
Most rank drops can be resolved by checking the basics – not all are caused by complex technical issues. One essential step is checking keyword performance: you may be ranking for keywords that your target customers no longer use to find you.
Which status code are your pages returning? A 200 status code means the page’s HTTP requests are successful. Use an HTTP status code checker to confirm this before looking at more sophisticated aspects such as link velocity.
You can also troubleshoot other error codes such as 404 or 410. A 404 error means that the page cannot be found, while a 410 error means that the page has been deleted or removed permanently.
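If you prefer to check a batch of pages yourself rather than use an online checker, the short Python sketch below interprets the common codes and fetches a URL's status. The URL argument is whatever page you want to audit; `check_url` needs network access:

```python
from urllib import request, error

# Common HTTP status codes encountered when auditing a site.
STATUS_MEANINGS = {
    200: "OK - the HTTP request succeeded",
    301: "Moved Permanently - the page redirects to a new URL",
    404: "Not Found - the page cannot be found",
    410: "Gone - the page has been deleted or removed permanently",
}

def describe_status(code: int) -> str:
    """Return a plain-English meaning for an HTTP status code."""
    return STATUS_MEANINGS.get(code, f"Status {code} - consult the HTTP specification")

def check_url(url: str) -> int:
    """Fetch a URL and return its HTTP status code (requires network access)."""
    try:
        with request.urlopen(url) as response:
            return response.status
    except error.HTTPError as exc:  # urllib raises for 4xx/5xx responses
        return exc.code
```

For example, `describe_status(check_url("https://www.example.com/"))` tells you at a glance whether the page is healthy, redirecting, or gone.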
The next basic thing to do is confirm that Google bots can crawl your website. Robots.txt is an important text file stored on the web server whose role is to guide bots as they crawl your website. You can set exclusions and inclusions by changing the file. For example, you can disallow the bots from crawling duplicate pages or the dev site.
Be careful when setting restrictions, though: if they are too strict, the search bots won’t be able to crawl the main pages, and the site will end up ranked below competitors. In short, check the robots.txt file to be sure the restrictions are OK. If not, upload a new, more permissive file to the web server.
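A permissive robots.txt that blocks only a hypothetical dev area while leaving the main pages crawlable might look like this (the domain is a placeholder):

```text
# robots.txt - allow all crawlers, excluding only non-public areas
User-agent: *
Disallow: /dev/

Sitemap: https://www.example.com/sitemap.xml
```

A lone `Disallow:` line with no path would allow everything; the danger sign to look for is a blanket `Disallow: /`, which blocks the entire site.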
Check Google Search Console
Google Search Console is a free tool by Google to help web admins monitor, maintain and optimise websites. The reports it generates can help enhance the SEO and visibility of your website online.
Check them regularly to see if there are errors on your site that need to be resolved, such as crawling issues. If the bots are unable to crawl certain pages, you will start noticing a drop in ranking and organic traffic. Other errors you should look out for are:
- URL errors
- Server errors
- DNS Errors
The errors are usually listed in the index coverage report.
You can also use Google Search Console to submit an XML sitemap informing Google of your website’s architecture. After the sitemap loads, check for any discrepancy between the number of URLs submitted and the number indexed.
If there is a discrepancy, some of the page URLs are inaccessible to the search bots. Screaming Frog is another invaluable tool that you can use to scan your website for this kind of issue.
However, if you are on a tight budget, you can use Google Search Console to inspect the URLs and fix the errors. We recommend enabling email notifications to get timely alerts of any changes on your website.
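The sitemap you submit follows the sitemaps.org protocol; a minimal example with a placeholder domain and date looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

Each `<url>` entry lists one page; `<lastmod>` is optional but helps Google prioritise recently updated pages.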
Evaluate Your Content
Quality content will get you to the top of SERPs and convert regular website visitors into customers. Unlike in the past, you cannot get away with duplicate, low value, or thin content. It should be unique, comprehensive, and in line with the information that the target customers are looking for when they land on your website.
Your site probably lost its position in SERPs because the competitors have better quality content. Check the kind of content they are publishing to know how to beat them at their own game. Writing SEO content is an art and a science. It is prudent to hire a professional content marketing agency in Singapore to create concise content for your brand.
Conclusion
The factors mentioned above are some of the leading causes of a drop in site ranking on Google. You need to avoid them if you do not want your SEO investments to go to waste.
That said, you can maintain and improve your site ranking by creating valuable and relevant content, providing an optimal user experience, and, most importantly, adhering to Google’s best practices.
Get in touch with us for more tips on how to do SEO in Singapore.