Website navigation is one of the most important factors in both SEO and user experience (UX). A well-organised, well-designed website with straightforward navigation will not only improve your site’s UX, but will also help search engine bots crawl the site by making all of its content easier to find and its crawl budget more efficiently spent.
Factors to consider when assessing and improving website navigation include:
- Site depth
- Site architecture
- URL structure
- Breadcrumb navigation
- Internal search
- Including a sitemap and using robots.txt
The basis of a well-organised website is providing the optimal UX and creating a logical, hierarchical structure for content.
There are two basic types of structure: deep and shallow. On deep-structured websites, content is far from the homepage and users need to click multiple links to get from the home page to the page they are searching for.
This can make it confusing for users to reach pages and revisit them. Deep content is also less likely to be found and indexed during a search engine’s crawl.
A shallow site structure allows users access to the majority of the website’s content within two or three clicks. This makes the website succinct and easy to use, and also makes the job of the search spider easy, as they are not searching through every nook and cranny for elusive pages.
Even if your site hosts a great deal of content, there is almost always a simple method of laying it out. However, thought should be given to the most efficient way to structure your website. Shallower is always better.
Site architecture refers to the way a website’s content is organised. There is some evidence to suggest Google prefers a site where content is arranged by topic areas. This is sometimes called “siloing” and can provide a more logical system for users (and search engine spiders) to explore the site with. Google is quick to reward topic authority, so the more relevant content you can include under the same silo, the better.
This could take the form of, for instance, adding new FAQ pages within a silo on a related topic rather than creating a separate FAQ section on the site where all FAQs across different topics are stored. This is also particularly beneficial if your site’s content is so broad that a deep site navigation system is all but unavoidable.
It is also essential that all of your URLs are logically named. Many content management systems automatically suggest web addresses, and some generate them in a numerical format based on how many pages came before. Making your URL relevant to the content of the page will improve its rankings, and improve UX by making it obvious to users what they will be reading when they click the link.
Choosing appropriate names for the URLs of your content also enables you to include keywords which are relevant to the page. These can be used as anchor text when other sites (internal or external) link to your page, as well as appearing in search engine results pages (SERPs). Keeping your page’s URL short is also important—omitting words like “a”, “the”, and “but” can actually make the address more readable.
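As a sketch of this idea, here is a small, hypothetical Python helper (not part of any particular CMS) that turns a page title into a short, keyword-focused slug by dropping stop words such as “a”, “the”, and “but”:

```python
import re

# Illustrative stop-word list; a real site might tune this.
STOP_WORDS = {"a", "an", "the", "but", "and", "or", "of", "to"}

def slugify(title):
    """Build a short, keyword-rich URL slug from a page title."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in STOP_WORDS]
    return "-".join(kept)

print(slugify("A Guide to the Basics of SEO Pricing"))  # guide-basics-seo-pricing
```

The resulting slug keeps the page’s keywords visible in the address while staying short and readable.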
Breadcrumb navigation is the most effective method of helping the user know where they are on a site, breaking down the path taken to reach the current page. Breadcrumbs can be organised either by following the URL path, or by the architecture of your site, as outlined above. For example, if a user has reached our ‘SEO Pricing’ page, the breadcrumb would look as follows:
Homepage – Web Marketing – SEO – SEO Pricing
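In markup, a breadcrumb trail like this is typically an ordered list of links. The sketch below is illustrative only, and the URL paths are assumptions rather than real addresses:

```html
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Homepage</a></li>
    <li><a href="/web-marketing/">Web Marketing</a></li>
    <li><a href="/web-marketing/seo/">SEO</a></li>
    <li aria-current="page">SEO Pricing</li> <!-- current page: no link -->
  </ol>
</nav>
```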
Many websites use specialist internal search forms, which index all of the pages of a specific website and give the user the option to search for any of them. These can be extremely useful for content-rich websites, where users want to quickly find the answer or page they need without having to work out which navigation path leads to it.
Internal search is convenient, particularly when your site has a number of “orphaned” pages with no relevant hierarchy leading towards them. However, leaving pages isolated in this manner is not recommended. If you have chosen to use a silo site structure as detailed above, all content will be much easier to access, as it will have been organised in a logical manner. Internal search engines are there for convenience, not as a replacement for a solid internal link structure.
Include a sitemap and robots.txt
A sitemap is a list of all of the pages on your website, usually in either .html or .xml format. The former is a comprehensive navigation system for users, though if your site is shallow, it is not as essential as an .xml sitemap. XML sitemaps are submitted to Google Search Console and act as the starting point for a search engine spider’s crawl of your site; your sitemap should also omit pages which you do not want spiders to crawl.
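A minimal .xml sitemap following the sitemaps.org protocol looks like the following; the domain and paths here are placeholders, not real addresses:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/web-marketing/seo/</loc>
  </url>
</urlset>
```

Each page you want crawled gets its own <url> entry; pages you want spiders to skip are simply left out.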
Robots.txt is a file uploaded to the root of a website which tells crawlers how to crawl the site. It highlights the location of the sitemap, and can also instruct crawlers to avoid certain parts of the site, such as image or video directories, backing up the exclusions from your sitemap.
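A simple robots.txt along these lines might look as follows; the disallowed paths and sitemap URL are placeholders for illustration:

```
User-agent: *
Disallow: /admin/
Disallow: /images/
Sitemap: https://www.example.com/sitemap.xml
```

The User-agent line applies the rules to all crawlers, each Disallow line marks a path they should not crawl, and the Sitemap line points them at your .xml sitemap.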