When browsing the internet, you have probably run into a page error at some point. There are many errors that can occur on a webpage. A famous example is the HTTP 404 page-not-found error, which happens when you follow a link to something, such as an image, that is no longer there. This is called a “broken link”. As a website owner, you don’t want your visitors to go through that. Not only is it bad for your visitors’ experience, it is even worse for your SEO, as it can lower your rank on Google.
Here are our two main suggestions on how to fix and prevent errors on your website.
1. Use tools to find errors
There are many free tools out there to help you find errors. The following are two of our favorites.
Google Search Console
Google Search Console is a free service offered by Google that helps you monitor and maintain your site’s presence in Google Search results. You don’t have to sign up for Search Console for your site to be included in Google’s search results, but doing so can help you understand how Google views your site and optimize its performance in search results.
Here are some helpful Google Search Console features to check for site errors.
Crawl errors
This feature generates a report that provides details about the site URLs that Google could not successfully crawl or that returned an HTTP error code.
The report has two main sections:
- Site errors: This section of the report shows the main issues for the past 90 days that prevented Googlebot from accessing your entire site.
- URL errors: This section lists specific errors Google encountered when trying to crawl specific pages from a desktop or phone browser. Each main section in the URL errors report corresponds to a different crawling mechanism Google uses to access your pages, and the errors listed are specific to those kinds of pages. A small script can run a similar spot-check on your most important pages, as shown below.
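If you want to check a few key URLs yourself between report refreshes, a short script can do it. Below is a minimal Python sketch; it assumes the third-party requests library is installed, and the URL list is only a placeholder you would replace with your own pages:

```python
# Spot-check a handful of URLs for HTTP errors (4xx/5xx).
# Assumes the third-party "requests" library; the URL list is a placeholder.
import requests

URLS_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/old-pricing",
]

for url in URLS_TO_CHECK:
    try:
        response = requests.get(url, timeout=10, allow_redirects=True)
        if response.status_code >= 400:
            print(f"ERROR {response.status_code}: {url}")
        else:
            print(f"OK    {response.status_code}: {url}")
    except requests.RequestException as exc:
        print(f"FAILED: {url} ({exc})")
```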
robots.txt Tester
This tool lets you validate your robots.txt file and check whether it blocks Google’s web crawlers from specific URLs on your site. For example, you can use it to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search.
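If you prefer to run this kind of check from your own machine, Python’s standard library includes a robots.txt parser. The sketch below is an illustration only; the domain and image URL are placeholders, and the tester in Search Console remains the authoritative check for how Googlebot interprets your file:

```python
# Check whether a specific crawler (e.g. Googlebot-Image) may fetch a URL,
# based on your live robots.txt. Standard library only; URLs are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

image_url = "https://www.example.com/images/private-photo.jpg"
allowed = parser.can_fetch("Googlebot-Image", image_url)
print(f"Googlebot-Image allowed to crawl {image_url}: {allowed}")
```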
Sitemaps
A sitemap is a file you create for web crawlers, such as Googlebot, that gives them a list of web pages to crawl on your site. Although most web crawlers can explore and discover all the files on your site, a sitemap helps the crawler and can also provide metadata, for example details about content that is hard for a search engine to parse, such as video or image file descriptions. You can view, add, and test sitemaps using the Sitemaps report in Search Console.
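For a small site, you can even generate a basic sitemap yourself. The following is a minimal Python sketch (standard library only; the page list and output filename are placeholders) that writes a bare-bones sitemap.xml:

```python
# Generate a minimal XML sitemap from a list of page URLs.
# Standard library only; the URLs and output filename are placeholders.
import xml.etree.ElementTree as ET

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/new-pricing",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```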
Twinword Free Site SEO Audit Tool
We also offer a free, easy tool to help you audit your site in 3 minutes. With this SEO Audit Tool you can check your website’s overall SEO health and get a detailed report identifying any issues we find.
All you need to do is type in your website’s URL and email address and you are good to go.
2. Set up redirects for pages that no longer exist
Page redirection, or page forwarding, is when a page is redirected to a different URL. For example, www.example.com/old-pricing is automatically redirected to www.example.com/new-pricing.
There are two main kinds of redirects: 301 and 302.
301 Permanent Redirect
A 301 redirect means that the page has permanently moved to a new location. You should use a 301 to signify to the crawlers that your content has moved permanently.
302 Temporary Redirect
A 302 redirect means that the move is only temporary. It still gets the user to a working page for the time being, so that you aren’t showing them a broken link, a 404 page-not-found, or an error page.
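To make the two kinds concrete, here is a minimal sketch of a 301 and a 302 redirect using Flask, which is simply an assumed example framework; if your site runs on Apache, Nginx, or a CMS, you would configure redirects in its own settings instead. The routes mirror the pricing example above:

```python
# Minimal sketch of 301 and 302 redirects using Flask (assumed framework);
# the routes mirror the /old-pricing -> /new-pricing example above.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-pricing")
def old_pricing():
    # 301: the pricing page has moved permanently.
    return redirect("/new-pricing", code=301)

@app.route("/summer-sale")
def summer_sale():
    # 302: temporarily send visitors elsewhere while this page is unavailable.
    return redirect("/current-promotions", code=302)

if __name__ == "__main__":
    app.run()
```

Most hosting platforms and CMSs expose the same choice of status code in their redirect settings, so the 301 vs 302 decision carries over however your site is built.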
301 vs 302 and their effects on SEO
In the past, SEO experts advised people to always use a 301, regardless of whether the redirection was temporary or permanent. They believed that a 302 redirect doesn’t carry or pass the “link juice” (or “PageRank”) to the new location. However, this is no longer true; Google has announced that both 301 and 302 redirects pass PageRank. This means you are free to choose a 301 or a 302 depending on whether the move is permanent or temporary.
For those of you who are interested in reading more about redirects, an experiment was done after Google’s announcement. It found that even though PageRank was passed, topical relevance may not be passed via a 301.
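If you are unsure which status code your existing redirects actually return, you can inspect the redirect chain yourself. This is a minimal sketch assuming the third-party requests library; the URL is a placeholder:

```python
# Inspect which status code (301 or 302) a redirect chain actually returns.
# Assumes the third-party "requests" library; the URL is a placeholder.
import requests

response = requests.get("https://www.example.com/old-pricing", allow_redirects=True)

for hop in response.history:
    print(f"{hop.status_code} redirect: {hop.url} -> {hop.headers.get('Location')}")
print(f"Final URL: {response.url} ({response.status_code})")
```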
Make it a habit
Auditing is not a one-time thing; make it a habit to audit your website every once in a while. You never know when a 404 error will pop up.