The purpose of this report is to assess how well your website is optimised for search engines and your users. Our goal is to help you identify opportunities for ranking improvements and to increase the number of people who visit your site and become customers.
To accomplish this, we will review the key elements of your site that search engines evaluate when ranking pages for a user's query. We will highlight the issues we identify that may be reducing traffic to your website, and we will recommend viable ways to fix them.
We divided the report into eight main chapters, each focused on a different set of elements that have a substantial impact on your website's visibility in search. These elements are organised into sections, each with a thorough description, our recommendations, and supporting data.
We hope that the information in this report will help you and your team better understand how specific elements on and off your site play a crucial role in ensuring the excellent visibility of your website in search results. It aims to show you how you can draw more prospective customers to your website from search engines such as Google and Bing.
- Site-Level Factors
1.1. Manual Penalty
1.2. Algorithmic Demotion
1.3. Domain Extension
1.4. Subdomain
1.5. Internal Links
1.6. Keywords in Internal Links
1.7. Orphan Pages
1.8. User Generated Content
1.9. Preferred Domain
1.10. Site Depth
1.11. Subdirectories
1.12. Accessible Subdirectories
1.13. Outbound Links
1.14. Website Sitelinks
1.15. Descriptive Media File Names
- Page-Level Factors
2.1. Missing Title Tag
2.2. Keywords in Titles
2.3. Duplicate Titles
2.4. Title Length
2.5. Headings
2.6. Keywords in Headings
2.7. Keywords in Main Content
2.8. Missing Meta Descriptions
2.9. Keywords in Meta Descriptions
2.10. Meta Descriptions Length
2.11. Duplicate Meta Descriptions
2.12. Keywords in URLs
2.13. Separating Words in URLs
2.14. Descriptive URLs
2.15. URL Length
2.16. Dates in URL
2.17. Unsafe Characters in URLs
2.18. Missing Alternative Text for Multimedia
2.19. Keywords in Alternative Text
2.20. Meta Keywords
2.21. Number of Links on a Page
2.22. Important Content in the Source Code
- Technical
3.1. Page Indexation
3.2. 4XX Client Error
3.3. 5XX Server Error
3.4. Page Render
3.5. Pages Restricted From Indexing
3.6. Canonicalisation
3.7. 301 Redirects
3.8. 302 Redirects
3.9. Redirect Chains
3.10. Meta Refresh
3.11. Sitemap for Robots
3.12. Language Markup
3.13. Schema Markup
3.14. Rich Cards
3.15. Soft 404 Client Error
3.16. Frames
3.17. URL Parameters
3.18. Redirects to Preferred Domain
3.19. Server Up-time
3.20. Crawlable Resources
3.21. Hypertext Transfer Protocol Secure (HTTPS)
3.22. Google Publisher Markup
3.23. Preventing Directory Snippets
- User Experience
4.1. Mobile Friendly Test
4.2. PageSpeed Insights
4.3. Accelerated Mobile Pages (AMP)
4.4. Response Time
4.5. Pop-ups
4.6. Hidden Content
4.7. Text in Images
4.8. Breadcrumb Navigation
4.9. Ads
4.10. Sliders
4.11. Pagination
4.12. Flash
4.13. Sitemap for Humans
4.14. Image Pages
4.15. 404 Page
4.16. Search
4.17. Code Validation
- Content
5.1. Thin Content
5.2. Internal Duplicate Content
5.3. External Duplicate Content
5.4. Keyword Cannibalization
5.5. Related Keywords
5.6. Hints & Tips or How-To Content
5.7. Fresh Content
5.8. Content Pruning
5.9. Content Ideas
5.10. Internal Search Queries
5.11. Queries for Which Your Images Rank
- Local
6.1. Country Targeting
6.2. Geo-Focused Keywords
6.3. Business Name, Address, Phone (NAP)
6.4. Google My Business
6.5. Local Citations
6.6. Bing Places for Business
6.7. Apple Maps
6.8. Reviews
6.9. Responses to Reviews
6.10. Server Location
6.11. Physical Web
- Off-Site
7.1. Referring Domains
7.2. Referring Pages (aka Backlinks)
7.3. Link Targeting
7.4. Broken Backlinks
7.5. Anchor Texts
7.6. Online Mentions
7.7. Disavow File
7.8. Branded Search Query Results
7.9. Search Autocomplete
7.10. Mostly Shared Pages
7.11. Site Neighbourhood
- Competitor Analysis
8.1. Search Visibility
8.2. Referring Domains
8.3. Referring Pages (aka Backlinks)
8.4. Type of Content
8.5. Keyword in Title
8.6. Content Length
8.7. Search Ads History
8.8. Search Ads Data
8.9. Engagement
In this section, we will look at your website's structure and its less technical elements, which affect the search visibility of your entire website rather than individual pages.
Furthermore, we will look at some recent major Google search updates and their impact on your site's traffic. On average, there is a major change in the search engine industry almost every month.
Search engines can apply a manual penalty to your site after a human reviewer determines that some pages do not comply with Google's webmaster quality guidelines.
Sites are manually reviewed after being reported by someone via the search engine report form or when the search engines' algorithms flag them as suspicious.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Search engines use algorithms which are sets of rules followed by their software to grade and rank pages found on the internet. They can automatically demote your site if it does not meet set requirements.
We track Google search algorithm updates, and we are aware of those with the biggest impact on shifts in search results. In addition, Google shares information on some search updates via its main blog, its blog for webmasters, and spokespeople such as Matt Cutts (former head of Google's Web Spam team), Gary Illyes (Google Webmaster Trends Analyst), John Mueller (Google Webmaster Trends Analyst), and Danny Sullivan (Google Search Liaison). Bing shares important information about its search with webmasters via its own blog.
We use the collected information to compare your website's organic traffic losses and gains over time with Google algorithm updates. This way we can identify the updates that have affected your website's visibility and better understand the areas of your site that need special attention. Once the issues are fixed, your site should recover from the algorithmic demotion automatically the next time Google crawls the pages that used to be problematic.
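The comparison described above can be sketched in a few lines of code. This is a simplified illustration, not our actual tooling; the update name, dates, and session figures below are made up for the example, and the 20% drop threshold is an arbitrary choice.

```python
# Line up weekly organic-session figures with known algorithm-update dates and
# flag sharp drops that happened close to an update. All data is hypothetical.
from datetime import date

updates = {date(2019, 3, 12): "March 2019 Core Update"}  # made-up example update
weekly_sessions = {
    date(2019, 3, 4): 10_000,
    date(2019, 3, 11): 6_500,
    date(2019, 3, 18): 6_300,
}

def drops_near_updates(sessions, updates, threshold=0.2, window_days=7):
    """Yield (update_name, week, relative_drop) for drops within the window."""
    weeks = sorted(sessions)
    for prev, curr in zip(weeks, weeks[1:]):
        drop = (sessions[prev] - sessions[curr]) / sessions[prev]
        if drop >= threshold:  # sessions fell by at least the threshold
            for when, name in updates.items():
                if abs((when - curr).days) <= window_days:
                    yield name, curr, round(drop, 2)

print(list(drops_near_updates(weekly_sessions, updates)))
# [('March 2019 Core Update', datetime.date(2019, 3, 11), 0.35)]
```

A 35% week-on-week drop within a day of the (hypothetical) update date is flagged, which is exactly the kind of coincidence that warrants closer investigation.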
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Domain extensions are categories of Internet domain names, which are used by search engines to determine in which countries your content is most relevant for users. There are two main types: Generic Top-Level Domains (gTLD) and Country-Code Top-Level Domains (ccTLD).
Examples of gTLD extensions include .com, .io, and .net. These extensions suggest to users and search engines that this business could be located anywhere in the world. These domains are best suited to international businesses that serve customers in multiple countries.
Country-Code Top-Level Domains (ccTLD) extensions end with a country code which indicates where the company is operating. Examples of ccTLD extensions include .ie, .co.uk and .nl. These extensions are best suited to local businesses as they send a strong signal to users and search engines that the company serves customers in a specific country.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
A subdomain is a smaller part of a larger domain. For example, the subdomain blog.example.ie is part of the domain example.ie. Subdomains are used to separate pages and their content from the main domain.
Search engines treat subdomains similarly to new domains, and they may not inherit the ranking signals the main domain has accumulated over time.
This may make it difficult for pages published under the subdomain to rank highly at first, a hurdle they would not face if they had been published under the main domain.
If your company does not have sufficient resources to dedicate to building the topical relevance of a subdomain, we recommend publishing all your content on your main domain under a subdirectory (e.g. example.ie/blog/).
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Internal links are vital as they allow users and search engines to navigate through content on your site quickly. In addition to this, they improve your website's ranking by passing ranking signals from one page to another.
Your most vital pages should be linked to throughout your site more often than those that are less important, as this increases the chances of visitors finding those pages when they are browsing your website. Each link also passes a portion of the linking page's reputation and ranking signals, and a high number of incoming links indicates to search engines that a page is an important one on your site.
However, having many links on a page decreases the amount of rank value that each of them passes to the page it points to. Therefore, it is good practice to limit the number of links to pages that are not helpful to users and not crucial to your business. It is better to have a few carefully chosen links on each page, relevant and useful to your visitors, than many links with no specific purpose.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Descriptive, clickable words in a link (aka anchor text) are a useful signal that helps search engines and users better anticipate what content they will find on that page.
It is good practice to have at least a few of these links include important keywords, like ‘new HP laptops', and some include relevant keywords, like ‘laptops from 2019'. This will make the page the link points to more relevant for user queries related to ‘new HP laptops'.
The next time a user searches for these products, your page will have a higher chance of appearing in the search results and being clicked.
All your links should include either anchor text or, for images, an alt attribute that is relevant to the target page.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Orphan pages do not have any links pointing to them from other pages on your website, making it difficult for users browsing your site to find them. They are often created for marketing or advertising campaigns, but in many cases they should be an integral part of your site.
These pages may still be visible in search engines because they appear in sitemaps, were linked to in the past, or have links from external websites. However, because orphan pages are disconnected from the rest of the site, search engines do not treat them as an important part of your website. As a result, they tend to rank lower in search results, even if they receive a lot of external links.
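The basic orphan-page check is a set difference: pages known to exist (for example via the sitemap) that no crawled page links to. A minimal sketch, with hypothetical URLs:

```python
# Orphan detection as a set difference: sitemap pages minus internally linked pages.
def find_orphan_pages(sitemap_urls: set, internally_linked_urls: set) -> set:
    """Pages known to exist (via the sitemap) but unreachable via internal links."""
    return sitemap_urls - internally_linked_urls

# Hypothetical crawl results for illustration.
sitemap = {"/", "/services/", "/blog/", "/landing/spring-sale/"}
linked = {"/", "/services/", "/blog/"}
print(find_orphan_pages(sitemap, linked))  # {'/landing/spring-sale/'}
```

In practice the `linked` set would come from a full crawl of the site's anchor tags, but the comparison itself stays this simple.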
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
User-generated content is anything that a user adds to your website and that appears on its pages: a comment, forum thread, post, image, etc. This content provides a strong signal to users and search engines that there is a living community around the business, helping the website rank higher. Gary Illyes, Webmaster Trends Analyst at Google, tweeted in his ‘Did You Know' series that ‘quality comments can be a signal of a healthy website'.
Google treats user-generated content as part of your site's content. When users add helpful information to your pages, this can increase the value of your site, improving its ranking for targeted and related user queries. However, if your pages are full of low-quality user-generated content that is irrelevant to the topic, this can have the opposite effect, leading to a lower ranking of your content in search results.
Furthermore, spam-like user-generated content with links to questionable sites may undo the hard work of building your company's brand. Google does not favour websites that support spammers by linking to their pages, even if the links were added by users rather than the site owner. Such pages receive the lowest Page Quality rating, reducing their ranking.
You should moderate what your users add to your site and mark all links they add as ‘nofollow'. You can also blacklist obvious spam terms and use automated systems like Akismet to defend your site. This will prevent your website from endorsing those unknown sites in Google's eyes.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Users can often access and link to your pages using different versions of the site's URL, for example:
- http://www.example.ie (HTTP + WWW)
- http://example.ie (HTTP + NON-WWW)
- https://www.example.ie (HTTPS + WWW)
- https://example.ie (HTTPS + NON-WWW)
The preferred domain is the one you want the search engines to display in the search results. Not having a specified preferred domain can cause search engines to index some pages with one version of your domain, and other pages with a different version. Multiple versions of your website's URL displayed in the search results may be confusing for your customers.
On top of that, search engines may treat the ‘www' and ‘non-www' versions of your domain as separate websites. This may prevent your site from gaining a higher ranking, as ranking signals could end up spread across different versions of your site.
As a result, it is important to inform the search engines of the preferred version of your domain. You can do this by uploading a sitemap of your website to Bing Webmaster Tools and by selecting the correct version of your domain in Google Search Console under ‘Site Settings'.
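Alongside the webmaster-tools settings, the usual fix is a single 301 rewrite on the server that maps every variant onto the preferred version. The sketch below shows only the rewrite logic, not an actual server rule; choosing HTTPS + WWW as the preferred version here is purely an assumption for illustration.

```python
# Sketch of the logic a single 301 rewrite rule should implement: collapse all
# scheme/host variants onto one preferred version (assumed: HTTPS + WWW).
from urllib.parse import urlsplit, urlunsplit

def preferred_url(url: str) -> str:
    """Rewrite any variant (http/https, www/non-www) to the preferred form."""
    parts = urlsplit(url)
    host = parts.netloc
    if not host.startswith("www."):
        host = "www." + host
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))

for variant in ("http://www.example.ie/about",
                "http://example.ie/about",
                "https://www.example.ie/about",
                "https://example.ie/about"):
    print(preferred_url(variant))  # each prints https://www.example.ie/about
```

Whatever version you choose, the important part is that all four variants end up at exactly one canonical URL.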
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Site depth indicates how many clicks away certain pages are from your homepage. Fewer clicks mean the page is easier to find, allowing visitors to reach their destination quickly. More clicks away from the homepage may cause your users to struggle to find their way to the content they are searching for, and will ultimately result in search engines treating that page as less important.
This hurts the discoverability of the content you have put so much effort into, as well as its ranking in search engines. Ideally, key pages should be only one click away from the homepage, less important content a maximum of two clicks away, and everything else a maximum of three clicks away.
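Click depth can be measured with a breadth-first search over the site's internal-link graph, counting the minimum number of clicks from the homepage to each page. A minimal sketch, using a hypothetical link graph:

```python
# Measure site depth via breadth-first search over an internal-link graph.
from collections import deque

def click_depths(links: dict, home: str = "/") -> dict:
    """Return the minimum number of clicks from the homepage to each reachable page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit is the shortest path in BFS
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure for illustration.
site = {
    "/": ["/services/", "/blog/"],
    "/services/": ["/services/seo/"],
    "/blog/": ["/blog/post-1/"],
}
print(click_depths(site))
# {'/': 0, '/services/': 1, '/blog/': 1, '/services/seo/': 2, '/blog/post-1/': 2}
```

Any sitemap page missing from the result is also an orphan page, so the same crawl data serves both checks.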
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Subdirectories indicate that a page belongs to a set of pages with a similar purpose or topic (e.g. example.ie/services/search-engine-optimisation/). They also help users and the search engines to understand the structure of your website better.
However, too many directories may make the URL of your page lengthy, thus adding unnecessary complexity. Therefore, it is best to limit the number of directories in the URLs to 1 or 2, using them only when it is logical to do so.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Each of the subdirectories (aka folders) used in the URL should be accessible, implying that each of them should have its own dedicated page. Users and search engines will expect that they can visit each of the folders in the URL to find pages with similar content.
It is essential to avoid showing error messages when those pages are accessed. When a 404 (Page Not Found) error is shown on a folder page, it interrupts the user's journey and has a negative impact on their experience.
Make sure each of your important subdirectories looks professional and is easy to navigate, as this will help users find the content they desire.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Outbound links are links on your pages that point to other websites. When you link to quality, relevant sources on the Internet, it helps to build the credibility of your site. Searchers will appreciate that you recommend trustworthy websites with valuable content that may be of great interest to them.
Furthermore, when you back up the statements on your site with links to authority websites that include the data you mention, it assures users and search engines of their accuracy. This makes your brand look more credible to users, Google, and Bing, associating you with quality sites and rewarding you with a ranking boost in search results.
However, when those links point to spam or potentially malicious sites, they can damage your company's brand in the eyes of your users and decrease your credibility with search engines. If you must link to a suspicious website, add a ‘nofollow' attribute to the link. This tells search engines that you don't want to be associated with the site and that your online authority should not help it rank higher.
Furthermore, according to Google's guidelines, any paid or referral outbound links should be ‘nofollow' too. Otherwise, your site may be seen as one that helps others manipulate search engine rankings. Ignoring these guidelines could cause your website to suffer a drop in ranking.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Sitelinks are a search-listing format: when users search for your brand, some important internal pages are displayed below the main result, organised in two columns of up to twelve sub-results. This helps users navigate to popular pages on your site quickly.
You cannot force search engines to show this feature for your website, as it largely depends on the popularity of your brand. The links used in this search feature are usually selected based on the number of views and in-links they receive, as well as the search engines' opinion of how helpful they are to users.
However, these links are sometimes not optimised for search, irrelevant, or not helpful enough to the user. You can enhance them by rewriting their titles and descriptions, or de-index them if they should not appear in search results, to make room for more important pages.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Each media file on your website, like a video, an image, or a PDF, should have a descriptive name (e.g. casino-interior.jpg). It provides search engines with additional information about what is in the file, it can improve the topical relevance of any page it is placed on, and it helps these media files rank higher in search in their own right.
Including your primary or secondary keywords in your media file names helps further, but it is important not to overdo it. If you rename an image, you should set up a ‘301 redirect' pointing the old image URL to the new one. This will help users and search engines find the new address of the file, and none of its ranking signals will be lost in the renaming.
Words in URLs should be separated by special characters to improve readability. Hyphens are the most intuitive and popular way of separating words in a URL. Users are very familiar with this practice, and the major search engines recommend using hyphens, which are treated like a space character, instead of underscores, which may join words together. This will help your pages rank for multi-word queries, so make sure that words in your URLs are always separated by hyphens.
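The recommendation above is what most ‘slugify' helpers in content management systems implement. A minimal sketch (the exact rules vary by CMS; this version simply lowercases and collapses everything that is not a letter or digit into single hyphens):

```python
# Minimal slug helper: lowercase words separated by single hyphens,
# with runs of any other characters collapsed and stripped.
import re

def slugify(title: str) -> str:
    """Turn a page title into a hyphen-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # any run of other characters -> one hyphen
    return slug.strip("-")

print(slugify("New HP Laptops (2019 Models)"))  # new-hp-laptops-2019-models
```

The result is readable to users and keeps each word separately matchable for multi-word queries.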
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Overview
This section reviews the elements on your pages that increase search engines' understanding of how relevant your website is to searchers' needs. These elements determine whether your website should appear in search results for a user's query, and how confident Google and Bing are that the information on your page is what the user is looking for.
Title tags are HTML elements that contain the title of your page and sit at the top of your page's source code. They don't appear on the page itself; instead, they show up in search results above the page URL and in the browser tab.
The information you include in the title tag indicates to users and search engines what the page is about. Titles are an important ranking factor and can help you rank higher. Therefore, each page should have a title tag present in its code, and it should never be left empty.
When a page has an empty title tag, search engines will generate text to show in its place in search results, which may not be as effective. You want to control what people see on Google and Bing when they find your page, to maximise the number of people visiting your website.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Using important words or phrases (aka keywords) in the title tag helps users and search engines define the topic of the page and its relevance to a user's query. This improves ranking and assures the users that they will find what they are looking for within your page, increasing the percentage of people that click through to your site in search results (click-through rate).
We recommend beginning your title with a primary keyword, because users don't read everything we show them; they scan text. Google knows this and assigns more importance to the first words in the title. You can also add secondary keywords after your primary one if you can do so naturally. This increases the likelihood that your page appears high in search results for both keywords.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
It's important to create a unique title for each page on your website. Pages with duplicate titles make it more difficult for users and search engines to tell the pages apart.
This could lower your pages' rankings, cause the wrong page to appear in search results for important keywords, and even make your pages look like spam.
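Duplicate titles are easy to surface once you have a crawl of your pages: group URLs by title and report any title shared by more than one URL. A minimal sketch, with hypothetical page data:

```python
# Group crawled pages by title and report titles shared by more than one URL.
from collections import defaultdict

def find_duplicate_titles(pages: dict) -> dict:
    """Map each duplicated title to the list of URLs sharing it."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical url -> title pairs from a site crawl.
pages = {
    "/laptops/hp/": "Laptops | Example Store",
    "/laptops/dell/": "Laptops | Example Store",
    "/about/": "About Us | Example Store",
}
print(find_duplicate_titles(pages))
# {'Laptops | Example Store': ['/laptops/hp/', '/laptops/dell/']}
```

Each group in the output is a set of pages that needs a distinguishing title rewrite.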
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Words in the title tag give both users and search engines a clue about what your page covers. When you use only 1-5 words, you may miss the opportunity to convince a user to click through to your site. You will also reduce the chances of your page ranking for long-tail keywords, which are matched, for example, by the combination of words that occur in the title.
On the other hand, Google says that you should avoid ‘using extremely lengthy titles that are unhelpful to users'. There is a set amount of space for a title within the search engine results page. This means that if the title is too long, search engines will show only a part of it. This may make it difficult for users to understand what the page is about.
It is essential that you take advantage of all available space in search results by creating a compelling page title that is 50-65 characters long, which is around 10-13 words. Having shorter page titles than this will provide less information to users and search engines and may lead to receiving less traffic to your pages.
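The 50-65 character guideline above is straightforward to check in bulk. A minimal helper (the range bounds are parameters, since display limits shift as search engines redesign their results pages):

```python
# Check a page title against the recommended character range (default 50-65).
def title_length_ok(title: str, low: int = 50, high: int = 65) -> bool:
    """True if the title fits the recommended character range."""
    return low <= len(title) <= high

# Hypothetical example titles.
print(title_length_ok("Buy New HP Laptops Online | Free Delivery in Ireland | Example"))  # True (62 chars)
print(title_length_ok("HP Laptops"))  # False (10 chars, far too short)
```

Run over a full crawl, this flags every page whose title wastes or overflows the available search-result space.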
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Headings play a vital role in drawing users' attention to important text on your pages. They divide your content into smaller and easier to grasp chunks of categorised text, which allow users to quickly scan your content and find the information they are looking for.
It's vital that the size of your headings reflects their importance, as search engines may assign value to them based on how prominent they are compared to the rest of the headings and text on the page. Therefore, your main heading should have the biggest font size, with secondary headings proportionally smaller.
It's good practice to use one main heading (H1) on each of your pages that summarises what the page is about, and to add secondary headings in logical order (H1, H2, H3, etc.). Your heading tags should contain only text, never images or logos.
You should use them often throughout the page when it makes sense to do so. This way you will help users and search engines understand the hierarchy of information on your page and improve the readability of your content.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Headings help emphasise the important text on your page to users and search engines. They are used as a ranking factor and they highlight the topical relevance of your content to the user's query.
Including your primary keyword in the main heading and relevant keywords in secondary headings indicates to search engines that there is a high probability that a user will find on this page what he or she is looking for. In addition, this will help improve ranking for related keywords.
We recommend placing the keyword at the beginning of your headings, but it's more important that you make them sound interesting. You should use headings throughout your page as they highlight the key categories of content to users and search engines and associate words with the text below.
Furthermore, it is good practice to keep your main heading consistent with the title of the page, as this is something Google looks for. When they are different, it may be unclear to search bots what your page is about.
[Detected issues described and supported by screenshots and code snippets, explaining why they occur so that the client can reproduce what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets showing the fix, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools you might be using.]
Mentioning your primary and related keywords in the copy of the page helps to increase its relevance and assure search engines that it contains information that a user may be interested in.
Mentioning them repeatedly in your content is beneficial the first few times. After that, the returns diminish and, if overused, keywords may damage your ranking. Therefore, using synonyms and variations of your focus keyword (e.g. dog, doggy, man's best friend, etc.) is recommended.
A good practice is to avoid using your main keyword more than once every 100 words, and not more than a dozen times in the main content of your page. To make sure you haven't overused keywords, read your content aloud and ask yourself if it sounds natural. It is also good to ask someone else to read it and tell you if anything sounds strange.
Make sure that the main keyword appears at the beginning of the content, as this strengthens the relevance of your page. The concluding sentence of your page is an opportunity to solidify the relevance of your page in the eyes of the user. Therefore, you should remember to include your keyword here, too.
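The rules of thumb above (at most once every 100 words, and no more than a dozen mentions overall) can be checked automatically. This is a minimal sketch; the thresholds are the ones quoted in this section, and the function names are our own:

```python
import re

def keyword_density(text, keyword):
    """Return (occurrences, total words, occurrences per 100 words)."""
    words = re.findall(r"[\w'-]+", text.lower())
    # Count whole-phrase occurrences so multi-word keywords work too.
    occurrences = len(re.findall(r"\b" + re.escape(keyword.lower()) + r"\b",
                                 text.lower()))
    per_100 = 100 * occurrences / len(words) if words else 0.0
    return occurrences, len(words), per_100

def overused(text, keyword, max_per_100=1.0, max_total=12):
    """True if the keyword breaks either rule of thumb above."""
    occurrences, _, per_100 = keyword_density(text, keyword)
    return per_100 > max_per_100 or occurrences > max_total
```

A flagged page is only a candidate for review; reading the content aloud, as recommended above, remains the final test of whether it sounds natural.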
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
The meta description is the largest section of text displayed for your page in search results, below its title and URL. It should be placed in the <head> at the top of your page's code, but it does not appear on the website itself.
When the meta description tag is missing or empty, search engines will decide which text to show as the description of your page in search results. They may display a snippet of your page content or text from the Open Directory Project as the description.
For searchers, this generated text may not be as convincing as an optimised description to visit your website. You should ensure your important pages have a specified custom meta description.
However, for long-form content which ranks for a variety of user queries, it is difficult to write one meta description that is relevant enough for all the different keywords. In this situation, it is worth testing the click-through rate to your page with and without a specified description.
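To find pages without a custom description, you can scan each page's HTML for the tag. A minimal sketch using Python's standard library (the helper names are ours):

```python
from html.parser import HTMLParser

class MetaDescription(HTMLParser):
    """Extracts the content of <meta name="description"> from a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def has_custom_description(html):
    """True only if the page specifies a non-empty meta description."""
    parser = MetaDescription()
    parser.feed(html)
    return bool(parser.description and parser.description.strip())
```

Run this over each important URL's HTML to build the list of pages that need a description written.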
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
It is good practice for your meta description to contain keywords, ensuring the important aspects of your page are highlighted. Search engines present these keywords in bold when they match the user's query, reassuring the user that your page contains what they are looking for.
It is also beneficial to include other variations of your primary keyword within your meta description. However, do not mention each keyword more than once, as this can look like spam and discourage people from clicking on your link. Ensuring the description sounds natural and interesting is paramount.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Text in the meta description helps convince users to choose your page over your competitors'. Short descriptions squander this opportunity, and in extreme cases may cause a user to consider your page incapable of solving their problem.
On the other hand, search engines have limited space in search results to display the description of your page. If the text is too long, the excess words are replaced with ‘…', and users may have a hard time understanding what you wanted to convey.
You should aim for 140-150 characters, which is around 24-26 words, to describe what users can expect from the content of your page. Any description longer than 150 characters is at greater risk of being cut off, while short meta descriptions are those with fewer than 70 characters (12-14 words), approximately half of the available space.
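These character thresholds are easy to check in bulk. A small sketch, using the 70- and 150-character limits quoted in this section:

```python
def description_length_check(description, short=70, ideal_max=150):
    """Classify a meta description against the length guideline above."""
    n = len(description.strip())
    if n < short:
        return "too short"
    if n > ideal_max:
        return "risks truncation"
    return "ok"
```

Applying this to every description found on the site yields the spreadsheet of pages whose snippets need rewriting.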
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
As meta descriptions appear below your page title in search results, you should avoid using the same one for a number of pages. Each page is unique and addresses a particular user need; a meta description optimised for a specific page won't be as effective in driving traffic to another one.
Furthermore, when users notice duplicates, they may skip over your page, as duplication often looks like spam. You can prevent this by ensuring your important pages each have a unique meta description.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Keywords are important words or phrases users type into the search bar to find websites that match what they are looking for. By strategically placing keywords in URLs, you assure the users and search engines that this page has high topical relevance and will most likely contain what the user is looking for.
Including keywords in the URL also helps your page rank higher in search results, as Google's former Head of Web Spam mentioned in one of his posts: ‘Having keywords from the post title in the URL also can help search engines judge the quality of a page.'
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Words in URLs should be separated by special characters to improve readability. However, picking the wrong character for this task could confuse users or cause the search engines to join words, thus changing their meaning.
Hyphens are the most intuitive and popular way of separating words in a URL. Users are very familiar with this practice, and major search engines recommend using hyphens, which are treated just like a space character, instead of underscores, which may join words together. This will help your pages rank for multi-word queries, so make sure the words in your URLs are always separated by hyphens.
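When generating URLs programmatically, spaces and underscores can be normalised to hyphens. A minimal sketch (the function name is our own):

```python
import re

def to_hyphenated_slug(text):
    """Lower-case, replace spaces and underscores with hyphens,
    and collapse any resulting runs of hyphens."""
    slug = text.strip().lower()
    slug = re.sub(r"[\s_]+", "-", slug)   # spaces and underscores become hyphens
    slug = re.sub(r"-{2,}", "-", slug)    # no runs of hyphens
    return slug.strip("-")
```

A CMS hook or migration script can apply this to existing underscore-separated slugs before setting up the redirects.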
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Descriptive URLs are those made of words (e.g. example.ie/accounting-services/) instead of special characters and numbers (e.g. example.ie/?p=123). They are easier to understand and remember for users and give them a general idea of what to expect to find on the page that this URL leads to.
In addition, Google recommends using descriptive URLs and uses the words in them as one of its ranking signals.
The type of URL your website shows usually depends on the system it uses and how the web developer has set it up. A URL made up only of special characters and numbers tends to be seen as confusing and creates less trust in the brand.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Short URLs are easier for users to remember and type into a browser, while long URLs add unnecessary complexity. When Google has two pages with identical or very similar content and authority, they prefer to show the page with the shorter URL.
You should make your URLs descriptive but short. Ideally, a URL should not exceed 80 characters, which is around 7-8 words; the fewer words in the URL, the better. In fact, Matt Cutts (Google's former Head of the Web Spam Team) said: ‘If you have got three, four or five words in your URL, that can be perfectly normal. As it gets a little longer, then it starts to look a little worse.'
Content Management Systems often automatically create a URL from the title of the page. Unfortunately, this can make the URL very long by including all the ‘stop words' like ‘at', ‘in', ‘under' and ‘the', which are not needed there. Search engines ignore these words when they scan the address of your page, and you should avoid using them in the addresses of your pages.
Moreover, when you try to create a new page or post and use the same URL that another page on your site already has, your system may automatically add a version number to the end of that page URL (e.g. example.ie/contact-2). This may not look professional and confuse some users, especially if it occurs on important business pages. You should also avoid this.
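A slug builder can drop stop words and trim the result to a target length. The sketch below is illustrative; the stop-word list is a small sample of our own, not an official one:

```python
# Illustrative sample of English stop words; real lists are much longer.
STOP_WORDS = {"a", "an", "and", "at", "in", "of", "on", "the", "to", "under"}

def shorten_slug(title, max_chars=80):
    """Build a short, descriptive slug: drop stop words and keep the
    result under max_chars without cutting mid-word."""
    words = [w for w in title.lower().split() if w not in STOP_WORDS]
    slug = "-".join(words)
    # Trim whole words from the end rather than truncating a word.
    while len(slug) > max_chars and "-" in slug:
        slug = slug.rsplit("-", 1)[0]
    return slug
```

The 80-character default matches the guideline above; whichever limit you choose, review the output by hand so the slug still reads naturally.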
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Content management systems like WordPress often add a date to the URL of each post, identifying when the page was published. This allows users and search engines to quickly determine how fresh the content on that page is.
This could have a positive effect immediately after the URL is published, as it will signal to users that a page contains the most up-to-date information. However, after a year or even a few months, an old date could cause a drop in traffic, as it suggests to users and search engines that the information on that page could be outdated.
Even if you update the content itself, the URL will still contain the old date. Therefore, you should avoid having dates in your site's URLs, as this can discourage searchers from clicking on your pages and cause search engines to associate your content with a previous year, making it less relevant over time. Both of these effects can reduce your rankings.
If your page contains a date in the URL, you should consider rewriting and redirecting the old URLs to the new ones that do not contain any date.
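If your CMS embeds `/YYYY/MM/` or `/YYYY/MM/DD/` segments, the dateless target for each redirect can be derived automatically. An illustrative sketch (the pattern assumes WordPress-style date permalinks):

```python
import re

# Matches /2023/05/ or /2023/05/14/ style date segments in a URL path.
DATE_SEGMENT = re.compile(r"/\d{4}/\d{2}(?:/\d{2})?(?=/)")

def strip_date_from_url(url):
    """Return the URL with any date segment removed; use the result
    as the redirect target for the old, dated URL."""
    return DATE_SEGMENT.sub("", url)
```

Pairing each original URL with `strip_date_from_url(url)` produces the redirect map your developer needs to implement.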
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Some of the special characters used in URLs may create issues for systems and search engine crawlers, also known as spiders. Furthermore, special characters may also confuse some of your users and make it more difficult for them to understand the content of your page.
You should avoid adding the following characters to the URLs of your pages, unless it is necessary:
- Unsafe characters: blank/empty space, quotation marks ( " ` ), and < > # % { } | \ ^ ~ [ ]
- Non-ASCII characters
- Reserved characters: ; / ? : @ = &
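A simple scan can flag URL paths containing any of the unsafe characters listed above, or any non-ASCII character. A minimal sketch (the reserved characters are deliberately excluded, since they are legitimate in their structural roles):

```python
# Unsafe characters from the list above; backtick and quote included.
UNSAFE = set(" \"'<>#%{}|\\^~[]`")

def unsafe_characters(url_path):
    """Return the set of unsafe or non-ASCII characters in a URL path."""
    return {c for c in url_path if c in UNSAFE or ord(c) > 127}
```

Any URL for which this returns a non-empty set should be rewritten (and the old address redirected).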
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Alternative text (aka alt text) can provide search engines and visually impaired users with additional information about the multimedia on the page, such as images, videos, and animated objects via canvas. This alt text helps them understand what is included in your media files, how they are connected to your main content and what value they add to it.
This will improve your site's accessibility and create a better user experience. Visually impaired users, hearing-impaired users and search engines will all appreciate the extra effort.
In addition, this will help your media files better rank on their own in search and improve the ranking of pages which include them.
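Images with missing or empty alt text can be found with a short script. A sketch using Python's standard-library `html.parser` (the class and function names are ours):

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Collects image sources that have a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.missing_alt.append(attrs.get("src", "?"))

def images_missing_alt(html):
    parser = AltTextAudit()
    parser.feed(html)
    return parser.missing_alt
```

The returned list of image sources maps directly onto the spreadsheet of media files that need alternative text written.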
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
The alternative text of your images, videos, and other media files should accurately describe what is included in them. It is good practice to use a keyword relevant to the content of the page or its synonym, but only if the keyword is also related to the media.
This will help you increase the topical relevance of your page, improving search engines' confidence that the page contains plenty of information on the subject. Ideally, the alt attribute on the first image of the page should always be filled in and contain the target keyword.
However, beware of over optimising the alt text by adding a list of keywords to it or the same keyword in each media file on the page. This could be seen by search engines as spam.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
The meta keywords tag was used as an easy way to inform search engines of the content of a page. It is placed at the top of the page code but does not appear on the page itself.
Meta keyword tags have previously been heavily spammed. As a result, in 2009 Google decided not to use information from meta keyword tags anymore. Other main search engines, such as Bing and Yahoo, followed this practice soon after.
It is not recommended to include keywords in this meta tag. They will not improve your ranking position. Moreover, having dozens of those words and phrases there may damage your ranking, as this may be seen as a sign of a spam website. It is advisable to include any keywords within the page itself.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
An excessive number of links on a page can overwhelm and confuse your users. It may also look spammy to searchers and search engines, especially if the purpose is simply to manipulate Google's rankings. In extreme cases, this could lead to a manual penalty from a search engine's spam team, which would decrease the visibility of your site in search results.
Another reason for refraining from listing many links on one page is that each additional link reduces the PageRank (aka authority) that the page passes on. So, when you have 300 links on one page, each of them passes only 1/300th of the authority to its destination.
In the past, Google recommended avoiding more than 100 links on the same page, for various reasons. This has since been relaxed, and the current Google guidelines say: ‘Limit the number of links on a page to a reasonable number (a few thousand at most).' We recommend keeping this number below 300.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Important Content in the Source Code
A lot of websites that depend heavily on JavaScript do not include important content in the initial page (the source code) that they send to visitors. Instead, they add content to the page through JavaScript after the browser downloads it.
Unfortunately, this makes it more difficult for search engines to discover the content and the links that point to other pages on your site. Some search engines only look at the initial page they download from your website, because in most cases it takes just 0.5-1 second to get it. Downloading multiple JavaScript files and waiting to see whether they embed any additional content can take several or even tens of extra seconds.
It might not seem like a lot, but when you take into account the billions of pages on the Internet, it significantly decreases how fast the search engines can discover updated and new pages.
Other search engines, like Google, look at the source code on the initial visit to the page but come back after a few days or weeks to check whether any additional content is visible after downloading the JavaScript files and rendering the page.
Loading important content with JavaScript might therefore cause it to be missed by search engines, or to be seen only after a long delay. By then, your announcement, news article or promotion might no longer be relevant.
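A quick way to approximate what a non-rendering crawler sees is to check whether a key phrase appears in the initial HTML the server returns, before any JavaScript runs. A deliberately simple sketch:

```python
def visible_in_initial_html(initial_html, phrase):
    """Crude check: is the phrase present in the HTML the server sends,
    before any JavaScript runs? A crawler that does not render JS only
    sees this initial document."""
    return phrase.lower() in initial_html.lower()
```

If a page's headline or announcement fails this check, that content is being injected by JavaScript and may be indexed late, or not at all, by some search engines.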
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
This section covers more technical aspects of Search Engine Optimisation, which unlike On-Site and On-Page SEO, will almost certainly require your developer's support.
The main purpose of this section is to ensure that nothing prevents your important pages from appearing in search results, while also ensuring that pages which should not be displayed in search results do not appear there.
We will determine if search engines can see your content and whether they have access to all the files that make up the pages. We will also establish if search engines can flawlessly crawl your website to discover new and updated pages quickly.
Page indexation allows us to check whether there are any indexing issues present on your site. Such issues can cause search engines to maintain an incorrect index of your site, so that the wrong number of pages appears in search results.
We then compare this number with the pages we are able to find on your website. This way we are able to establish if people can find all of your content in major search engines.
If search engines have indexed substantially fewer pages than we were able to discover on your site, this could mean search crawlers have a problem with discovering your content or are prevented from showing them in their results.
On the other hand, if more pages are indexed than you have on your site, this may indicate that you have an issue with duplicate or auto-generated content, which could be hindering your ranking.
It is crucial that all your important pages appear in the search results, as every indexed quality page is an additional point of entry for users to your site.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Users and search engines encounter 4XX errors (e.g. 404 Error: Page Not Found) when trying to visit a page that cannot be found. This may be caused by a broken link to a page or a file that was removed, but also when a URL of a page was changed or misspelt.
In this situation, the user and search engines see a ‘Page Not Found' error instead of the page they were looking for. This can frustrate users and have a negative impact on their perception of your website and your brand.
Furthermore, it prevents some of the ranking signals accumulated by the previous version of the page (like links from other sites pointing to it) from benefiting your website, as confirmed on Twitter by Google Webmaster Trends Analyst Gary Illyes. It is always beneficial to fix those links.
If the page no longer exists, you should set up a redirect on your server to send users, search engines and ranking signals to the next most relevant page on the topic. However, avoid redirecting all URLs that cause a 4XX error to the homepage. This may confuse users, and search engines will treat it as a soft 404 (more on this in the next section) if it does not provide the answer they were looking for. As a result, the passed ranking signals will stop benefiting the page you pointed to.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Errors that begin with the digit ‘5' indicate cases in which the server is aware it has encountered an issue or is otherwise incapable of performing the request. These errors should be monitored, and their causes investigated, as they may indicate a more serious problem.
When these errors occur often, they can have a negative impact, as your site's reliability is diminished in the eyes of the search engines. They may be reluctant to send users to pages that are not available.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Occasionally, an issue may cause search engines to not see your pages in the same way as you or your users can. This may cause these pages to have lower search visibility than they should, as Google or Bing cannot rank your pages for queries they cannot find in your content.
It is crucial to ensure all of your content is visible to search engine crawlers on desktops and especially on mobile phones.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
There are situations when you want to block search engines from crawling and indexing specific pages on your site. This allows you to prevent duplicate content and sensitive information from being displayed in search results. Furthermore, you should restrict Google and Bing from indexing your ‘Thank you' pages, as they can cause conversion-tracking issues.
However, sites often have the wrong pages, or even whole sections of the website, blocked. This may prevent search engines from crawling your important pages, eliminating them from search results. It also stops these pages from passing any accumulated ranking signals to other pages on your website, as Google cannot access them. This can result in lower performance of your site in search.
Therefore, it is a good practice to periodically check which pages your site restricts search engines from crawling and indexing.
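Python's standard-library `urllib.robotparser` can be used for this periodic check, verifying which paths your robots.txt blocks for a given crawler. A minimal sketch:

```python
from urllib import robotparser

def blocked_paths(robots_txt, paths, agent="Googlebot"):
    """Return which of the given paths the robots.txt disallows
    for the named user agent."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [p for p in paths if not parser.can_fetch(agent, p)]
```

Feeding in the URLs of your important pages confirms none of them are accidentally blocked; feeding in ‘Thank you' and admin pages confirms they are.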
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
Often people may use various URLs to load or link to your page. For example:
- http://example.ie
- http://example.ie/index.php
- http://Example.ie
- http://example.ie/Index.php
This creates a problem, as these URLs act as duplicates of the same page. When people link to those pages, the ranking benefit is divided amongst multiple URLs rather than being awarded to the original one.
This means that the duplicates weaken the popularity of the main page. These duplicates compete with the original page for room in Google and Bing searches, instead of improving its ranking position.
You can combat this issue by adding rel="canonical" to your pages, which directs search engines to the main URL. When a search engine knows which version is canonical, it can count the ranking signals accumulated by the multiple URLs and assign them to the original page. Using this tag will help improve the ranking of your original pages.
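The duplicate forms listed above can be normalised to a single canonical URL, which is then placed in the rel="canonical" link. A sketch covering just the host-case and index-file variants shown in the example list (the helper names are ours):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Normalise common duplicate forms to one canonical URL:
    lower-case the host and drop a trailing index file."""
    parts = urlsplit(url)
    path = parts.path
    if path.lower().endswith(("/index.php", "/index.html")):
        path = path[: path.lower().rfind("/index.")] + "/"
    elif path == "":
        path = "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(), path,
                       parts.query, parts.fragment))

def canonical_link_tag(url):
    """Emit the <link> element to place in the page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'
```

Real sites usually need more rules (trailing slashes, tracking parameters, http vs https), so treat this as a starting point rather than a complete canonicaliser.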
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
‘301 redirects' are used to indicate to the search engines that the old page won't be used any longer, presenting the new page to them instead. This helps move users and accumulated ranking signals from an old page to the new one. Also, browsers will update any bookmarks that are linked to the page that is being redirected once they encounter the permanent redirect.
However, using ‘301 redirects' can reduce the speed of the page loading and may discount some of the transferred ranking signals. Therefore, we recommend using a direct link whenever possible.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
‘302 redirects' behave in much the same way as ‘301 redirects', automatically moving visitors and search engines from one URL to another. However, there is one small difference: ‘302 redirects' indicate that the change is temporary and that the old page will be back soon, so they are not intended to pass on ranking signals.
Recently, Google's Search Team announced that 301 and 302 redirects do not cause a loss of any PageRank, which is one of the stronger ranking factors. However, some sites still see ranking improvements when ‘302 redirects' are converted to 301s or when they are converted into direct links.
It's best to avoid using ‘302 redirects' and instead use the permanent ‘301 redirects' or, even better, a direct link wherever possible.
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
A redirect chain is when a visitor is redirected several times in a row before they land on a final page. This usually occurs when a page's URL is changed frequently, with a new redirect created each time rather than updating the old redirect.
A redirect loop is the worst example of a redirect chain, as it creates a loop of never-ending redirects, which results in the user not being able to view the page. This usually occurs when redirects have been incorrectly set up on a website.
Redirect chains cause the page to take longer to load and may decrease the value of the ranking signals. Moreover, Google indicated in 2011 that their search crawlers may give up following more than 4 or 5 redirects in a row. This means that your final page may not be seen by search engines and may not appear in the search results.
All redirect chains should be fixed by changing them to a direct link or a single redirect.
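Given a map of source URL to redirect target, chains and loops can be detected and flattened into single redirects. An illustrative sketch, using the 5-redirect limit mentioned above:

```python
def follow_redirects(redirects, url, limit=5):
    """Walk a redirect map from url, returning the chain and a status:
    'ok', 'loop', or 'too long' (crawlers may give up past the limit)."""
    chain = [url]
    seen = {url}
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        if nxt in seen:
            return chain, "loop"
        chain.append(nxt)
        seen.add(nxt)
        if len(chain) - 1 > limit:
            return chain, "too long"
    return chain, "ok"

def flatten(redirects):
    """Fix chains by pointing every source directly at its final target."""
    flat = {}
    for src in redirects:
        chain, status = follow_redirects(redirects, src)
        if status == "ok":
            flat[src] = chain[-1]
    return flat
```

Loops are deliberately left out of the flattened map, since they must be untangled by hand before any redirect target makes sense.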
[Description of detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can recreate what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with URLs where all these issues occur, or direct links to the tools you might be using.]
A meta refresh is a client-side (as opposed to server-side) redirect, which Google Webmaster Trends Analyst John Mueller strongly discourages using. It was heavily abused in the past by spammers, who inserted it into a page's metadata to automatically redirect visitors to pages with unrelated or malicious content.
This confuses users and puts them at risk, so most search engines do not wish to see meta refreshes on the sites they crawl. Therefore, whenever you want to redirect users from one page to another, a ‘301 redirect' should be used instead of a meta refresh.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
A sitemap for robots (XML sitemap) is a list of all the pages you would like to have indexed by the search engines. It provides the search engines with an easy way to identify all of them in one place, ensuring that no pages vital to your business are missed. It also helps search bots crawl more intelligently when they visit your website.
A sitemap for robots can provide crawl bots with information such as when a page was last updated, how often it tends to change, how important it is to your website, and more. It should be located in your site's root directory (e.g. https://example.ie/sitemap.xml).
To keep your sitemap organised and your pages prioritised for the search crawlers, it is recommended to create a separate sitemap for each type of content on your website:
- Pages
- Products
- Posts
- Videos
- PDFs
The XML sitemap should be updated each time you add a new page or make changes to an existing one, and it should include the date when each page was last updated. When the crawl bots visit your site, they can then easily identify new or updated pages that have a higher priority to be crawled.
You should not include pages with redirects or pages that should not be indexed, as this wastes the limited time search crawlers have to crawl your pages and lowers their trust in your sitemap's accuracy.
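A minimal sitemap entry might look like the sketch below (the URL, date and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.ie/products/red-shoes</loc>
    <lastmod>2018-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```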
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
If you use several languages on the same page, you prevent search engines from determining the primary language of that page and the audience it is suited for. Moreover, you risk the page being categorised by search engines under only one of those languages, and not necessarily your primary one.
This could result in less traffic to your website and in your pages appearing in countries where your business does not operate, rather than in your primary market. Therefore, it is important to mark up languages with ‘hreflang' tags whenever your site has content in more than one language.
These tags clearly label different language versions of your content and link them together. This helps search engines to understand which language your content is available in, allowing them to show the most relevant pages to your users who may speak a different language.
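As a sketch, assuming an English and a German version of the same page (the URLs are placeholders), the hreflang annotations in the page's head could look like:

```html
<link rel="alternate" hreflang="en-ie" href="https://example.ie/services/" />
<link rel="alternate" hreflang="de-de" href="https://example.ie/de/services/" />
<link rel="alternate" hreflang="x-default" href="https://example.ie/services/" />
```

Each language version should carry the full set of annotations, including a reference to itself.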
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Schema markup provides search engines with specific information regarding your content, so they no longer need to guess or interpret what they find on your website. This helps search engines understand what specific sections of your pages are for and how they can be valuable to users.
Examples of information you can markup on your site are other profiles and sites owned by your business, articles, contact details, products, reviews, courses, recipes, events, and many more.
There are two common methods of implementing Schema markup on your website: microdata and JSON-LD. JSON-LD is the search engines' preferred choice. It adds the marked-up information to the head section of your pages and, like a title tag and meta description, is only visible to the search engines.
Currently, Schema markup is not a ranking factor, although Google has announced that it is required to feature your content in voice search. They state that more than 20% of all mobile searches are performed with voice commands. This creates an opportunity for your business to gain an advantage over your competitors and potentially dominate voice search results in your industry.
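As an illustration, a minimal JSON-LD block for a local business might look like this (all the business details here are invented placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Restaurant",
  "telephone": "+353 1 234 5678",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Dublin"
  }
}
</script>
```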
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Rich cards are search features that use information you have marked up with Schema to enhance the appearance of your pages in search results. As they help you stand out from your competition, they can significantly increase the click-through rate to your site.
For a restaurant, examples could include local business reviews with an average star rating and the number of reviews, a list of events with their dates, or a recipe with a featured image, an average star rating and the preparation time.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
When search engines crawl missing pages and do not receive a proper 404 status code from your server, they mark such pages in their systems as soft 404s. This can happen due to an error in the server configuration. Pages marked as soft 404 are treated the same as normal 404 pages, and the ranking signals they have accumulated are ignored.
Search engines also apply soft 404s in other situations, for example when a site redirects 404 pages that have backlinks to the homepage. In that case the backlinks are ignored, because the final page does not contain the content that was on the page that is gone.
It is important to monitor when search engines apply soft 404s to your pages, and to fix them by configuring the server properly and by redirecting users to pages with very similar content whenever you decide to remove a page from your site.
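On an Apache server, for example, one common cause of soft 404s is an `ErrorDocument` directive pointing at a full URL, which makes the server issue a redirect instead of returning a 404 status code. A sketch of the correct setup (the file path is a placeholder):

```apache
# Serve the custom error page with a genuine 404 status code.
# Use a relative path: a full URL here would trigger a redirect
# instead, producing a soft 404.
ErrorDocument 404 /not-found.html
```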
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
A frame is an HTML element that embeds the content of an external page into your own. It is often used to pull advertisements and widgets into your website.
Although a frame may look nice, it essentially loads another page within yours. The search engines may not associate this content with your page, as it belongs to another website, and when they want to send a user to the information within a frame, they will often link to the frame's source directly rather than to your page. Furthermore, because frames load additional pages within yours, they extend the time needed to display all the elements on your site.
Therefore, it is always best to avoid using frames on your website whenever possible. If you must use a frame, ensure you provide alternative text in a ‘NOFRAMES' tag describing the content within the frame. This will improve the experience for visually impaired users and for those who have frames disabled on their devices.
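For modern iframes, the fallback works slightly differently: any content placed between the opening and closing tags is shown to browsers that cannot render the frame. A sketch (the widget URL is a placeholder):

```html
<iframe src="https://widgets.example.com/weather" title="Weather widget">
  <!-- Fallback for users and crawlers that cannot render the frame -->
  <p>The weather forecast is available at
     <a href="https://widgets.example.com/weather">widgets.example.com</a>.</p>
</iframe>
```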
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
A URL parameter is a string that some websites append to the end of a URL to change the content of the page. For example, ?sort=price_ascending would sort the products on the page by price. However, pages with URL parameters visible in search results often look confusing and less trustworthy to users, as they are not always easy to understand.
Such pages may also be seen by search engines as duplicate content if not managed properly. By default they are eligible to appear in search results, yet they add little or no value over the original content.
Furthermore, URL parameters could lead to ranking signals being divided among several pages, instead of strengthening the ranking position of the original page.
Finally, with sites that have a large number of pages, URL parameters waste search crawlers' resources, which could be used elsewhere to discover new or updated content on your site.
To prevent this, you can inform Google via their Search Console and Bing via their Webmaster Tools that pages with specific URL parameters are duplicates and should therefore be de-indexed, or assure them that the pages are unique and should appear in search results.
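A common complementary fix, widely used alongside those tools, is a canonical tag on the parameterised page pointing at the original URL, which consolidates the ranking signals in one place (the URLs are placeholders):

```html
<!-- Placed in the head of https://example.ie/shoes?sort=price_ascending -->
<link rel="canonical" href="https://example.ie/shoes" />
```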
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Users and search engines may be able to access your pages by using different versions of your site's URL:
- HTTP + www (e.g. http://www.example.ie)
- HTTP + non-www (e.g. http://example.ie)
- HTTPS + www (e.g. https://www.example.ie)
- HTTPS + non-www (e.g. https://example.ie)
Each website should help its visitors navigate to a page with the correct version of the domain. This can be resolved by setting up a directive on your server, which will automatically redirect users to the correct URL. This way, your users will not see a ‘Not Secure' label when landing on an unsecured version of your domain. Furthermore, your ranking signals will not be split when linked to the www version of your domain instead of non-www, and vice-versa.
It is important to pick one main version of your domain and set up redirects from the other ones to avoid any issues in the future.
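On an Apache server, for example, a sketch of such redirect directives might look like the following; the domain is a placeholder, and your hosting setup may differ:

```apache
# Send every request to the preferred https://www version in one 301 hop
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.ie/$1 [L,R=301]
```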
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
A server which hosts your website can sometimes be down due to a variety of reasons. Examples of these include software updates, system and hardware errors, maintenance and various other factors. When this happens, customers and search engines may not be able to view your site for a significant amount of time.
Unfortunately, no hosting provider can guarantee 100% uptime for its servers. Big service providers such as Amazon promise a monthly uptime of at least 99.95%, while other companies state your website will be available 99-99.9% of the time. However, this number is often lower in practice.
It is vital to monitor your server's uptime in order to identify any problems and to see how often search engines and users are able to access your website. If your server is frequently down, search engines may consider your site unreliable and stop suggesting it to their users. Poor server uptime may also damage your brand image in the eyes of your customers.
When your hosting provider fails to deliver a reliable service, it is best to contact them to address the issue immediately. If the situation does not improve, you should consider switching to a different hosting provider.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Search engines want to crawl all of the files that make up your pages. This allows them to discover all of your content and to check whether the website uses any forbidden practices that could mislead them into granting a higher ranking position.
Sites that block crawlers from accessing JavaScript, CSS and image files may therefore automatically look suspect, may not be considered mobile friendly, and some of their content may not be seen at all. As a result, their pages rank lower. Your site should be transparent and allow the search engines to crawl all files used to create your website's pages.
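As a sketch, your robots.txt should not disallow the directories that hold these files; the paths below are placeholders for wherever your site keeps its assets:

```
# robots.txt - do not block the files that render your pages.
# Remove rules like:  Disallow: /assets/js/
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
```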
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
HTTPS encrypts the connection between a user and your website, securing any sensitive data that the user may have submitted on your site, such as personal details, password, and credit cards. It ensures that you really are talking to the server you think you're connected to and that no one can listen in on the conversation. Pages that are served over an HTTPS connection also get a small ranking boost in Google.
Without encryption, a user is exposed to a ‘man-in-the-middle' attack, where a hacker can capture sensitive information while it is being sent from the user to your system. Personal information exposed in this way can lead to serious problems for your customers.
Therefore, from January 2017, Google began showing a ‘Not Secure' warning in its Chrome browser on sites that ask users to input personal information, passwords or credit card details without providing a secure connection. In the long term, Google plans to label all pages without HTTPS with this message and a red triangle warning, to be more transparent with searchers.
This has the potential to seriously impact the number of conversions on your site and to damage your brand image. All websites should use HTTPS.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
The Publisher markup is a piece of code you can add to your website, which links it to the information displayed on your business's Google+ account. This gives the search engines a better understanding of your brand. Even if you do not have a Google My Business listing, Google can still display your company's name, logo and other available information next to your website's pages when a user searches for your brand.
To do this, a rel=‘publisher' markup should be added to your site's business pages. You should also have a link on your Google+ page pointing to your site, as this confirms that the two entities belong to the same business. Through this, you can strengthen your brand's visibility in search results.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
The description of your pages that you would like to display in search results may be replaced by Google or Bing with information from third-party directories. The Open Directory Project (ODP) is one of these sources. Search engines may display text taken from ODP as the ‘meta description' of your page when they believe it benefits the user.
This could result in a loss of control over what text is shown below your pages in search results, which could lead to fewer people clicking through to your site. You can prevent the search engines from using the ODP information by adding ‘noodp' to the robots meta tag on your pages.
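The tag itself is a one-liner in the page's head:

```html
<meta name="robots" content="noodp">
```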
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
User experience places importance on making the process of obtaining information from your website as fast and easy as possible for users. Additionally, ensuring your users have a positive experience when visiting your site will improve your rank, help to build a positive brand image and increase customer loyalty.
With regard to the search engines, it is extremely important for your website to provide their searchers with a great user experience. When searchers are happy with the results provided by Google or Bing, they will want to return to these search engines more frequently, which fuels the continued growth of these companies.
This section evaluates how well your website adapts when needing to fit onto the small screens of mobile devices. Ensuring your site is adapted effectively will provide users with an enjoyable experience and make it easier for them to find what they are searching for.
A mobile friendly test will check the size of your clickable elements and fonts, as well as ensuring your content fits onto the screens of mobiles.
Up to 80% of your website's monthly visitors can come from mobile phones. Because these users are on the go, they will tend to be less patient when loading your pages. If your website is difficult to navigate or to read on a small screen, users will more than likely head to your competitors. Needless to say, this can have a significant impact on your business revenue.
Your website should always aim to meet all of Google's mobile friendly requirements, otherwise it may not even appear in the search results when a user is searching for your products and services on a mobile.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Google developed the PageSpeed Insights tool to help test how well your site is optimised for fast page loading. The tool checks the parts of your site that are known to slow pages down and then suggests how you can fix them.
Although it doesn't cover every aspect you could optimise to speed up your site, it offers a good starting point to base your changes on. Furthermore, some of the elements checked by this tool are used to rank your page in Google.
After performing this test, your website receives a score between 0 and 100. The higher the number, the more optimised your website is in the eyes of Google. A score below 65 points is deemed to be poor and will hurt your ranking. A score of 65-85 is deemed to be acceptable and any score above 85 is treated as good.
It is essential to invest some time optimising your site for speed, as Google's patent ‘Using resource load times in ranking search results‘ argues that ‘given two resources that are of similar relevance to a search query, a typical user may prefer to visit the resource having the shorter load time.'
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Accelerated Mobile Pages are a new initiative from Google. If your site is slow to load on mobile phones, coding your mobile pages using AMP could be the best solution as these enable your pages to load almost instantly.
AMP does not replace your current mobile site. AMP will create a simpler version of your pages, which contains less code and fewer features that usually make pages load slower. These will then be displayed in the search results on smartphones rather than your current pages.
Unfortunately, AMP still isn't used by search engines as a ranking signal and therefore won't directly improve the ranking position of your page. However, Google strongly promotes AMP pages in its search results through the following search features:
- Distinguishing them from the normal pages by labelling them with an AMP icon
- Making AMP more prominent in Google's Top Stories than normal pages
- Showing an AMP multi-source carousel at the top of search results
- Displaying a single-source carousel further down the search results page
Accelerated Mobile Pages are ideal for news and content-rich sites. However, they can also help other websites gain extra visibility in a search. Therefore, every website should consider creating AMP versions of its pages.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Response time is the time taken from requesting your page in a browser to receiving the raw HTML file with its content for the browser or search engines to see. It is affected by the amount of traffic your website receives, the distance between a user and the server from which your pages are sent, and how well your server and your site are optimised.
The longer search bots have to wait for your pages, the more time is wasted that could be spent discovering new or updated content elsewhere. More importantly, your users will also need to wait longer for your content to appear in their browsers.
As a result, search engines use response time as one of the ranking signals that help them determine how good an experience your site provides to users, which eventually affects your rank in search results. You should keep this time as short as possible - ideally not exceeding 200 milliseconds.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Google has started lowering the ranking of pages that show intrusive interstitials to visitors on mobile devices, or even removing them from its search results. This is because pop-ups and ads that push the top part of the content further down the page make it more difficult, and even frustrating, for users to access the information they are looking for on your site.
To add to this, the search engines may interpret pop-ups that appear after the page loads or after a scroll action as the main content of that page if they cover the majority of the screen. In this situation, the original content could be discounted, decreasing its ranking position.
Login, cookie and age verification interstitials will continue to be acceptable, as will top bar ad banners, since they do not take up a substantial amount of space on the screen. Furthermore, exit pop-ups - those that appear when a user wants to leave a page - are still allowed.
It is recommended to limit the usage of the intrusive interstitials on desktop and mobile to a minimum. Customers hate them, and in order to protect your ranking and your brand, it's better if they do not encounter them on your site.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
4.6. Hidden Content
Hidden content is text or images that are not initially visible on your page but are present in the page's source code. Usually, such content is hidden behind a tab, an accordion or a button and requires some action from the user to be seen.
Because it may rarely be seen by users, and because it is hidden in the first place, the search engines do not consider it a vital part of your page. Therefore, Google reduces the weight of hidden content in search.
On mobile phones, due to limited space, Google states that it is acceptable to place your content behind tabs and accordions. However, it is still recommended to not hide any important content from users on desktops.
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Some websites style their headings and other sections by adding images that contain text. Unfortunately, even though Google has an approved patent from 2008 titled ‘Recognizing text in images', Google Webmaster Trends Analyst Gary Illyes still advises against including text in images. Search engines probably still cannot be 100% confident in reading and understanding the text within these images.
Ignoring this advice may make it harder for your pages to rank for keywords included in these images. Therefore, it is always best practice to ensure important text is not embedded in a picture. Use CSS styles to overlay the words on an image in a given location instead.
Furthermore, text in an image appears very small when scaled down from a desktop to a smartphone screen. This makes it difficult to read and results in a bad user experience, not to mention that this text won't be accessible to the screen readers used by visually impaired searchers.
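A minimal CSS sketch of the overlay approach (the class names are placeholders): the heading remains real, crawlable text positioned over the image rather than baked into the image file.

```css
/* Container for the banner image */
.hero {
  position: relative;
}
/* Real text overlaid on the image, readable by crawlers and screen readers */
.hero__caption {
  position: absolute;
  bottom: 1rem;
  left: 1rem;
  color: #fff;
}
```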
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
A breadcrumb navigation is a secondary navigation that helps visitors quickly understand where they are currently located on your site. It also gives visitors additional information about how your website is structured, helping them easily move to a different level of your site's hierarchy.
The breadcrumb trail can also be picked up by the search engines and displayed under your page in search results in place of the standard URL, which makes it easier for users to read.
Google recommends using breadcrumbs on your site, as users find them helpful. The easier it is for searchers to find the information they need on your website, the higher the chance of it ranking closer to the top of search results.
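To make the breadcrumb trail eligible for display in search results, it can also be described with BreadcrumbList markup; a sketch with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.ie/" },
    { "@type": "ListItem", "position": 2, "name": "Shoes",
      "item": "https://example.ie/shoes" }
  ]
}
</script>
```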
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Another ranking factor Google added to its algorithm back in January 2012 was the page layout algorithm improvement. This demotes pages where ads take up most of the visible space, especially above the fold. Google knows users do not want to visit these types of websites, so it will avoid showing them in its search results.
You should always consider being consistent with where you show ads. Ideally, they should be at secondary locations within your page or in the margins. This will help you to retain the key areas of the page for quality content, yet still provide a balanced mix of content and advertisements. If you decide to place them within the main body of your page, Google recommends not having more than two ads there.
Moreover, the search engines require all links in all ads to be tagged as ‘nofollow' which will prevent ranking signals from being sent to the advertiser through this paid placement.
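As a sketch (the advertiser URL is a placeholder), a compliant ad link looks like:

```html
<a href="https://advertiser.example.com/offer" rel="nofollow">Sponsored offer</a>
```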
[Detected issues described and supported by screenshots and code snippets, explaining why they are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to tools that you might be using.]
Sliders were heavily used on websites in the past, as they offered a visually appealing way of displaying information and allowed sites to fit more content into the same space. However, most users nowadays are in a hurry and do not stay in one part of your site long enough to see the information on all of your slides.
Because only one slide is visible at a time, the rest of the content stays hidden to the user. Therefore, the search engines will devalue this content as it has a smaller chance of being seen.
In addition to this, as these slides tend to look similar to banner ads, the human eye has learned to ignore them. Very often, people will skip them entirely, eventually scrolling down or navigating to a different page.
Lastly, most sliders are very heavy and will also significantly slow your site's loading times if not optimised correctly.
For these reasons, it is best to avoid sliders or, at the very least, keep important information out of them.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Many websites paginate their post or product category pages to reduce the time needed to load the content. This is because a browser takes less time to display 10 posts than 50.
However, Google's research found that users don't mind waiting a little longer to see more results, rather than clicking through many paginated pages. If, on average, your users need to see 3 listings before finding what they want (as on Google's own search results pages), displaying only 10 at once is perfectly fine. On the other hand, if a user goes through 30 products before buying something, your page should show 30 items on each component page.
Furthermore, when you use pagination, you need to make sure it is set up correctly. Otherwise, ranking signals (also known as indexing properties) may end up being divided across all of the component pages instead of accumulating in one central place, consequently decreasing the chance of your category page ranking well in search.
Therefore, you should add rel="next" and rel="prev" markup to your pages, which tells search engines to treat all of these component pages as one series, so that ranking signals earned by any of them benefit them all. Doing this will help your paginated pages appear higher in search results.
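As a sketch, the markup for one component page can be derived from its position in the series. The ?page=N URL pattern below is a placeholder; adapt it to your own pagination scheme.

```python
def pagination_links(base_url: str, page: int, last_page: int) -> list:
    """Return the rel="prev"/rel="next" <link> tags for one component page.

    Assumes URLs of the form <base_url>?page=N (an illustrative pattern,
    not a requirement). The first page gets no "prev" tag and the last
    page gets no "next" tag.
    """
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

# Tags for page 2 of a 5-page category listing.
for tag in pagination_links("https://example.com/products", 2, 5):
    print(tag)
```

These tags belong in the `<head>` of each component page.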
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Flash was developed by Adobe and allows web designers to create image and animation rich web pages, widgets, and games that can be opened in the browser.
However, due to its heavy impact on a device's battery life, mobile browsers block Flash entirely. In 2016, Google announced it would start blocking Flash on the desktop because it slows down page loading. Furthermore, Flash has a history of security vulnerabilities, putting your users at risk, and it creates usability issues that limit how your users can interact with it.
While Google crawls text in Flash, it doesn't recommend including your content in it, as there is no guarantee that other search engines can discover it too. Furthermore, content displayed in Flash won't be accessible to your visually impaired visitors unless you also provide alternative text for these users.
For the above reasons, you should avoid using Flash on your website.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
In February 2016, Google updated its Webmaster Guidelines and added a recommendation for you, as a site owner, to include a human-readable sitemap on your website. This should list all of the important pages on your site, providing an alternative means for visitors to find what they're looking for. Having this on your site may help you improve your ranking in Google.
The sitemap should list pages grouped together in a logical way. When the number of links is higher than 100, it is recommended to break them down into multiple pages. The link to that sitemap should be easily accessible, and we recommend placing it in your website's footer.
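The split into pages of at most 100 links can be sketched as a simple chunking step; the /page-N URLs below are placeholders.

```python
def split_sitemap(links, per_page=100):
    """Split a flat list of sitemap links into pages of at most
    `per_page` entries, as recommended once a human-readable sitemap
    grows beyond roughly 100 links."""
    return [links[i:i + per_page] for i in range(0, len(links), per_page)]

# 250 placeholder URLs end up on 3 sitemap pages: 100 + 100 + 50.
pages = split_sitemap([f"/page-{n}" for n in range(250)])
print(len(pages), [len(p) for p in pages])
```

Each chunk would then be rendered as its own sitemap page, grouped logically and linked from the footer.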
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
To help your unique images rank higher in image search, each of them should have a dedicated page on your site displaying an image in full resolution. This page should also include an image title and description to provide users with additional relevant information regarding the image.
Ensuring your images are easy to find in searches will increase the number of people who use them in articles on topics related to your industry. When they link back to your website to give you credit for the image, it strengthens your site's authority and topical relevance.
Visuals and images created by your company are an important part of your business. These pages will make it easier for other people to locate and use your beautiful images, resulting in an improved website ranking position.
Content Management Systems such as WordPress can create these pages for you automatically, meaning you only need to add a unique title and description when you upload each image to your site.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
This page appears when a user or a search engine wants to see a page or file that cannot be found due to it being moved or removed from your site. It is important to be as helpful as possible in this situation in order to avoid having the user leave your site due to a bad experience.
A 404 page (also known as a Page Not Found page) is an opportunity for your business to retain these users by providing them with an alternative way of finding what they are looking for. A well-designed 404 page can mitigate the bad experience of not finding the answer a user is looking for.
To improve the user experience significantly, you should design a custom 404 page with an eye-catching image and a list of pages relevant to the one the user was looking for. This page should also clearly state that the page they were looking for couldn't be found. Other helpful elements include a search box, the most popular pages on your site, a suggestion that the URL might be misspelt, and a quick way to report a broken page. Additionally, the 404 page should share a consistent design with the rest of your website, including your business logo, site navigation and footer.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
A search box allows users to quickly find what they need on your site. A small text entry field suggests that people should enter shorter queries. However, if people need to enter longer queries to find the content they want from your site, make sure you provide them with a sufficiently long text entry field.
There are two ways to improve the search functionality on your site. Firstly, you should ensure the search term is repeated on the results page. This allows users to see whether what they searched for is spelled correctly. By leaving the term in the search field, you enable users to edit and refine their search if your site returns too many or irrelevant results.
Secondly, you should list the search results in order of relevance. Each result should follow the standard format of a thumbnail, the title of the page, a description and the URL. Thumbnails are extremely useful for guiding users towards the right content and can provide them with extra information. The first line of text, linked to the thumbnail, should contain the title of the page; the description text should be a summary of the page.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Your pages may occasionally contain invalid code, which may not be visible at first. However, this may cause trouble for users viewing your content on certain browsers and could make it harder for search engine spiders to understand the information included on your site.
Although not strictly necessary, you should ensure that each of your important pages does not contain invalid code. This minimises the chance of errors occurring when search crawlers extract the content from the code on a particular page. Search engines, such as Google, have openly suggested adhering to W3C standards to ensure the code is easy for them to interpret.
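As a rough first check before running the W3C validator, a script can flag obviously unbalanced tags. The sketch below uses Python's built-in html.parser and is deliberately simplistic; it is not a substitute for a full validator.

```python
from html.parser import HTMLParser

# Void elements are self-closing in HTML and never get a closing tag.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Very rough validity probe: reports tags closed out of order or
    never closed at all."""
    def __init__(self):
        super().__init__()
        self.stack, self.errors = [], []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in VOID:
            return
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def check(self, html):
        self.feed(html)
        self.errors.extend(f"unclosed <{t}>" for t in self.stack)
        return self.errors

# The <p> here is never closed, so the </div> arrives out of order.
print(TagBalanceChecker().check("<div><p>Hello</div>"))
```

Anything the checker flags is worth confirming with the official validator before editing the templates.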
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Users find and visit your pages because of the content on your site and the answers it provides to their questions. A Google employee has confirmed that the content on your pages is one of the two most important ranking factors. However, poor-quality content provides less value than the pages that already rank for the keywords you target, meaning you will struggle to outrank them.
In this section, we look at the keywords your site is currently displayed under in the search results and discuss ways of ensuring they rank higher. In addition to this, we will suggest content you could create to provide your customers with more value and attract new users.
The search engines appreciate you making extra effort by creating in-depth and relevant content to help their users.
Thin content is defined as content that provides little or no value to a user and is exemplified by pages with a low word count. These pages reduce the overall quality of your site, making it less attractive to users and search engines. If you answer a user's questions with 'thin' information, search engines will not consider you proficient in the topic and will not rank your site at the top of the search results.
On the other hand, when your content is in-depth, it is much more valuable to searchers, as it does not only answer one question that they have, but potentially many more. This may make them more interested in your brand and what you have to offer, as your information is helpful to them.
In addition, more words on the page will help your content rank for more keywords, bringing additional users to your site. However, don't sacrifice the quality of your content by adding meaningless text just to increase the word count. This reduces the quality of your pages, making searchers leave your site.
Generally, pages with fewer than 500 words are at risk of being seen as thin content. These pages may damage your ranking, as they decrease the overall value of your site.
This is because they reduce the chance of users finding quality content when they arrive at your website. In simple terms, if only 4 out of 10 pages on your site contain helpful and interesting information, a user has a 40% chance of landing on one of them and having a great experience. When every page on your website is high quality, whichever page a user clicks on, he or she is very likely to find something engaging and valuable.
You can fix the thin content issue in one of the following ways:
- Expand on them by adding more content that will be of value to your users
- Combine them with other related pages to create more helpful content
- Remove them from your site if they do not provide value to your users
- If you cannot remove them, you should consider using rel=canonical, which will tell search engines where they can find a more in-depth page on this topic on your site
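A first pass at finding the pages to treat this way can be automated with a simple word count. The sketch below assumes you have already extracted each page's main body text; the 500-word threshold follows the rule of thumb above, and flagged pages should still be reviewed manually, since word count alone says nothing about quality.

```python
def flag_thin_pages(pages, min_words=500):
    """Flag pages whose main content falls under a word-count threshold.

    `pages` maps URL -> main body text. The default threshold follows
    the ~500-word rule of thumb; treat the output as candidates for
    manual review, not an automatic verdict.
    """
    return [url for url, text in pages.items()
            if len(text.split()) < min_words]

# Hypothetical pages: one in-depth guide, one likely-thin stub.
pages = {
    "/guide": "word " * 800,
    "/stub": "word " * 120,
}
print(flag_thin_pages(pages))  # ['/stub']
```

Each flagged URL can then be expanded, merged, removed, or canonicalised as described above.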
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
We talk about internal duplicate content when identical or very similar content appears on more than one page of your site. This may be caused by an improperly configured content management system, which can automatically create duplicate pages, or occur when you reuse the same text on multiple pages. This repetition makes your pages less interesting to read, wasting users' time as they go through the same information in multiple places on your site.
Furthermore, pages with duplicate content often compete to appear in search results for the same query, as they contain the same text and keywords. For search engines, it is important to show diverse results, which increase users' chances of finding the answer to their question. Showing 4 or 5 pages with identical or similar content is not beneficial as it extends the time the user needs to find necessary information and results in a bad experience for them.
When Google encounters pages with duplicate or very similar content, it groups them together, shows the most relevant and authoritative copy, and filters out the duplicates. This does not necessarily mean that it will keep the page you want to appear in search results. Therefore, it is vital that you remove all duplicate content on your site, or use rel=canonical to point to the version that you want indexed, to prevent search engines from guessing which page should be displayed.
In addition, Google may decrease the ranking of your site, or remove it entirely from search results, if the duplicate pages are perceived as an attempt to manipulate its rankings. For this and the above reasons, you should have unique content on all of your pages.
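Exact internal duplicates can be surfaced by hashing each page's normalised text, as sketched below. This catches identical copies only; near-duplicates need fuzzier techniques such as shingling. The example URLs and texts are hypothetical.

```python
import hashlib

def find_duplicates(pages):
    """Group URLs whose normalised body text hashes identically.

    `pages` maps URL -> body text. Normalisation lowercases the text
    and collapses whitespace, so only exact (modulo case/spacing)
    duplicates are caught.
    """
    seen = {}
    for url, text in pages.items():
        normalised = " ".join(text.lower().split())
        key = hashlib.sha256(normalised.encode()).hexdigest()
        seen.setdefault(key, []).append(url)
    return [urls for urls in seen.values() if len(urls) > 1]

pages = {
    "/red-shoes": "Stylish red shoes for every occasion.",
    "/shoes-red": "Stylish  RED shoes for every occasion.",
    "/blue-hats": "Warm blue hats for winter.",
}
print(find_duplicates(pages))  # [['/red-shoes', '/shoes-red']]
```

Each reported group is a candidate for consolidation or a rel=canonical pointing at the preferred version.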
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
You may have a problem with external duplicate content when search engines see the same or very similar text on your site and on other sites, for example when you syndicate your content elsewhere or copy product descriptions from a manufacturer's site.
When search engines are not sure which version is the original, they will display the one that is the most relevant and authoritative in the search results, filtering out the duplicates.
Occasionally, the competition or a scraper site (one that copies content from other websites using bots) may steal text from your website. This can cause confusion amongst the search engines and even lead to these sites outranking you by using your own content on their pages.
Search engines provide tools that let you report such stolen content, and they can remove the offending sites from their search results. Therefore, it is worth scanning the web every few months to combat these bad practices in your industry.
You should also not copy third-party content onto your own site, as this is against Google's Webmaster Guidelines.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Keyword cannibalization happens when two or more pages are competing for the same keyword. This can confuse the search engines and force them to show only one of your pages for the searcher's query.
This typically occurs when the same terms or phrases you would like to rank for appear in the title or the main heading of many of your site's pages. Usually this is unintentional, but it can result in several or even dozens of pages competing for the same keyword.
This also splits ranking signals, such as internal links and external backlinks, across two or more pages, decreasing your chances of outranking the competition. You should avoid keyword cannibalization on your site to give your pages a higher chance of ranking well in searches.
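A quick way to spot cannibalization is to map each page to its primary target keyword (for example, taken from the title or main heading) and report any keyword claimed by more than one page. The URLs and keywords below are hypothetical.

```python
def find_cannibalized_keywords(page_keywords):
    """Report keywords targeted by more than one page.

    `page_keywords` maps URL -> primary target keyword. Matching is
    case-insensitive; a real audit would also normalise plurals and
    close variants.
    """
    by_keyword = {}
    for url, keyword in page_keywords.items():
        by_keyword.setdefault(keyword.lower(), []).append(url)
    return {kw: urls for kw, urls in by_keyword.items() if len(urls) > 1}

targets = {
    "/blog/seo-tips": "seo tips",
    "/guides/seo-tips-2018": "SEO tips",
    "/about": "about us",
}
print(find_cannibalized_keywords(targets))
# {'seo tips': ['/blog/seo-tips', '/guides/seo-tips-2018']}
```

For each reported keyword, decide which page should own it and retarget or consolidate the others.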
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Both Google Search Console and Bing Webmaster Tools provide you with information about the queries your pages already rank for. This is a great source of content ideas, as it tells you which keywords the search engines already associate with your brand. You should use this information to further optimise your current pages, helping them rank higher for these queries by ensuring the keywords appear in important places on each page.
Alternatively, you can go beyond this and create a dedicated page for each keyword theme. A theme is a group of very similar keyword phrases that aim to answer a similar user question, for example: 'social media', 'what is social media' and 'what should I know about social media'. This will make your page even more relevant for those keywords, increasing its chance of ranking higher and providing a searcher with more information than a page that only mentions 'social media'.
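Grouping the exported queries into themes can be sketched as a simple substring match, as below; real keyword clustering usually also considers stemming and search-result overlap, and the example queries are hypothetical.

```python
def group_queries_by_theme(queries, themes):
    """Assign each search query to the first theme keyword it contains.

    Crude grouping sketch: a query matching no theme lands in the
    'other' bucket for manual triage.
    """
    grouped = {theme: [] for theme in themes}
    grouped["other"] = []
    for q in queries:
        for theme in themes:
            if theme in q.lower():
                grouped[theme].append(q)
                break
        else:
            grouped["other"].append(q)
    return grouped

queries = ["what is social media", "social media tips",
           "email marketing basics"]
print(group_queries_by_theme(queries, ["social media"]))
```

Each non-empty theme bucket is then a candidate for its own dedicated page.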
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Hints & Tips and How-To Content is a great chance for your business to demonstrate its knowledge. For instance, if you are a law firm, you can tell people which clauses to be particularly aware of when signing a contract. Or, if you are an estate agent, you can inform visitors to your site of the key aspects to getting the best deal when buying or selling a house.
Most people do not have the time or knowledge that a company like yours does, meaning they can see the benefit in paying you to do the legwork for them. You can also add complementary content highlighting your expertise within your industry.
However, it is not necessary to provide all the answers yourself, as linking to helpful resources elsewhere on the Internet can be just as effective. A strong list of links can be bookmarked by users and used as a starting point for their research. Such links could be to trade group sites, suppliers or forums that provide relevant, useful information.
Ultimately, Hints & Tips content should be just that - useful information that allows visitors to have a point of reference, helping them to resolve their queries. Having this type of content will ensure users view your site as a reliable source of information within your field.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
As a healthy company grows with time, your website should also have ever-growing content providing users with fresh and up-to-date information. This signals to users and search engines that your business is thriving and here to stay.
You can do this by constantly updating and improving the quality of the content on your pages, adding more facts, data and helpful information that your users may need. This way, you can increase the loyalty of your current customers by providing them with free value.
Furthermore, you should be regularly creating informative blog posts, guides or videos on topics relevant to what your business offers. This will ensure your brand ranks for new related queries, bringing more searchers to your site. In the end, more relevant traffic leads to more conversions, and more conversions lead to more customers.
Growing your content not only provides more value to your current users and attracts new ones, it also helps your site accumulate more ranking signals, which will improve the search position of your important pages.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Content Pruning is the process of reducing the number of indexed, low-quality pages that visitors are not interested in seeing, and which do not add value to your site. In a broad sense, both users and the search engines measure the quality of your site by the quality of each of your pages.
You should update or de-index old content that is no longer relevant or accurate. Search engines are increasingly able to assess whether the facts stated on your site are correct; if they are not, it may reduce trust in your brand and cause your content to rank lower in the search results.
You can identify these pages by checking which ones receive the lowest number of clicks from the search results. You should keep indexed only those pages that you are interested in ranking; otherwise, you may end up damaging your site's rank.
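Selecting pruning candidates from click data can be sketched as below. The click threshold is an assumption to tune per site, the URLs are hypothetical, and the noindex tag shown is one common way to de-index a page you cannot remove.

```python
def pages_to_prune(click_data, max_clicks=10):
    """List indexed pages at or below a click threshold as pruning
    candidates.

    `click_data` maps URL -> clicks from search over the review period
    (e.g. exported from Search Console). The default threshold is an
    illustrative assumption, not a universal rule.
    """
    return sorted(url for url, clicks in click_data.items()
                  if clicks <= max_clicks)

# One way to de-index a kept-but-pruned page is a robots meta tag.
NOINDEX_TAG = '<meta name="robots" content="noindex">'

clicks = {"/old-news-2012": 2, "/services": 940, "/expired-offer": 0}
print(pages_to_prune(clicks))  # ['/expired-offer', '/old-news-2012']
```

Candidates should still be checked for backlinks and conversions before being de-indexed or removed.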
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Continued keyword research helps you identify untapped opportunities and trends that emerge within your industry, which you may not be aware of. By capitalising on these, your business could gain an advantage over your competitors and gain more customers.
Furthermore, knowing the language searchers use when they look for a business such as yours helps you to adjust how you speak to them on your site. By having the words within your content similar to what your customers would use, you increase the chances of resonating with them. This can have a positive impact on the growth of your business.
In addition to this, the closer the keywords on your pages are to users' queries, the higher the chance that your pages will appear in their search results.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Internal search queries highlight the words users type into your site's search box when they are unable to find what they are looking for. This provides you with information on what pages should be more prominent on your site to make it easier for your users to navigate to them.
Additionally, you can use this information to identify important content that your site is missing. By adding it, you have a chance of removing friction that may be stopping a user from becoming a customer.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Your images are visual content, which users may look for on Google and Bing Image Search. Thousands of users are looking for images related to your industry every single day. This is another way for your business to attract searchers, who may become future customers.
In addition to this, interesting and unique images also create an opportunity to link back to your site as a source when referenced in an article on another website. This provides your site with new ranking signals, increasing your website's authority and helping all of your pages to rank higher in the search results.
Reviewing queries where your images are shown can help you further optimise them. You can do this by adding partially matched or closely matched keywords in the name of the image, the image Alt text and on the dedicated image page in its title and description. This should lead to a higher-ranking position, more visitors and more future backlinks, which will translate to a higher overall ranking of your pages.
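Turning a target keyword into a descriptive, hyphen-separated image file name can be automated with a small slug function; the .jpg extension below is only an example.

```python
import re

def image_slug(keyword: str) -> str:
    """Turn a target keyword into a descriptive image file name:
    lowercase, with runs of non-alphanumeric characters collapsed
    into single hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", keyword.lower()).strip("-")
    return f"{slug}.jpg"

print(image_slug("Red Running Shoes (2018)"))  # red-running-shoes-2018.jpg
```

The same slug can usually double as the basis for the image's Alt text and its dedicated page URL.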
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
This section focuses on helping you target users within the area in which your business operates. We will discuss signals, which will highlight the relevance of your company to searchers and search engines within a specific geographic area.
In addition to this, we will look at other popular places online where customers are searching for local businesses. We will also check whether these sites possess current contact details for your website. If the contact details they have for your business are out of date, it will make calling or finding your company a frustrating experience.
When your domain uses a Country-Code Top-Level Domain (ccTLD), it receives a small ranking boost in that specific geographic location.
Generic Top-Level Domains (gTLDs), such as .com, .net, .io, etc., have no geo-targeting by default. For this reason, Google and Bing developed tools (International Targeting and Geo-Targeting respectively) that allow you to specify the country you would like to target with your website and provide this information to the search engine.
You can also geo-target different countries with a subdomain or even a subdirectory when your site uses a Generic Top-Level Domain. However, Country-Code Top-Level Domains have a better chance of ranking well in their specified country.
Geo-targeting will improve your ranking in a selected country, but it may make it more difficult for your website to be listed in search results in other countries. Therefore, you should only use it when a page on your site targets a single country. Not choosing anything will leave the decision to search engines and their best judgement. Selecting ‘Unlisted' will prevent them from associating your website with any country.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
When a local presence is important to your business you need to highlight the relevance of your company and content, within a specific region, to the users and search engines. You can do this by using geo-focused keywords (e.g. car dealer in Dublin or construction company in Leinster) at significant places on your pages. These could be the:
- Page title
- Meta description
- Tagline
- URL
- Heading
- First paragraph
- Alt tag of first image
- Internal links anchor text
It is not necessary to have a geo-location in all of these places. However, you do need to find a healthy balance, ensuring content on each of your pages sounds natural and is interesting to read. Therefore, it is recommended to display them at a few of these locations, as this will help you rank higher in local searches.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Contact information for your business, such as your company name, address, and phone number, should be visible on every page of your website. This makes it easier for your users to find your contact details and work out where your business is based when they arrive at your site.
Furthermore, this strengthens search engines' confidence that your company is highly relevant to people living in that geographic area, making them more willing to display your business in search results when users look for your products or services.
We also recommend showing your opening hours alongside the contact information, so your customers know the best time to reach your business. An ideal place for this information is the footer, at the bottom of each of your pages. Your contact details should also be marked up with Schema.org structured data to make them easier for search engines to process.
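One common way to mark up the contact details is a schema.org LocalBusiness JSON-LD block in the footer. The sketch below generates such a block from placeholder business details; the exact properties to include depend on your business type.

```python
import json

def local_business_jsonld(name, phone, street, locality, opening_hours):
    """Build a minimal schema.org LocalBusiness JSON-LD snippet for
    embedding in the site footer. All argument values shown in the
    example call are placeholders."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": locality,
        },
        "openingHours": opening_hours,
    }
    return ('<script type="application/ld+json">'
            + json.dumps(data, indent=2)
            + '</script>')

snippet = local_business_jsonld("Example Motors", "+353-1-555-0100",
                                "1 Main Street", "Dublin",
                                "Mo-Fr 09:00-17:30")
print(snippet)
```

The output can be pasted into the footer template and verified with a structured-data testing tool.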
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the fixed issue, and mock-ups of how it should look.]
[Spreadsheets with the URLs where these issues occur, or direct links to the tools that you might be using.]
Google My Business is a tool that allows you to provide Google with the most important information about your company. This information is then displayed directly in search for local and branded queries, making it easier and quicker for users to access. Such information may include: phone number, store location, area of service, nature of business and opening hours. Users can then call or text your business directly from the search results by tapping the number provided when viewing your listing on a mobile phone.
In addition to this, your business will then be eligible to appear in the ‘Local Pack', which is a search feature that Google displays above normal links (organic results). This feature lists local businesses with their location highlighted on the map, which are relevant to the user's query.
Having a Google My Business profile will also allow users to find your business when using Google Maps, both on desktop and mobile. This would usually occur when the user is exploring the area in which your business is located, or when they are searching for your type of business within the ‘Maps' application. Google Maps will then show them directions and even navigate them to your business via the fastest route. Therefore, it is important to ensure Google has the correct location of your business.
To help your Google My Business profile rank well in the search results, it is essential that you verify it, fill out all the details about your business, add photos of your store, link to your site's homepage or a store location-specific page, and gather at least a few reviews from your customers.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Local citations are mentions of your business name, address, phone number and website on third-party sites. It is important for your company to display correct details in these citations, as searchers often use them to find businesses similar to yours. Incorrect company information could diminish user experience by directing them to an old location or prompting them to call a dead number.
Furthermore, Google views correct details in local citations as a reason to trust your website. Therefore, if your basic business information (name, address and phone number) matches the details provided on your Google My Business page, Google will have more confidence in your site and boost your ranking.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Bing also provides a tool allowing you to create an online business profile on its platform, in the same way as Google. As Bing has over a 20% share of the global search market, you should add your contact details, opening hours, services provided and photos to this platform too. This information will then be displayed directly in the search results, giving your users faster access to it.
Furthermore, the location of your business will be displayed to Bing Maps users when they are looking for a business such as yours in the same area. Users will then be able to easily call you or receive directions directly to the front door of your store.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Apple Maps provides directions and information about local businesses to iOS users. Most of the time, this information is pulled from Yelp listings. However, Apple has created a tool allowing you to submit your company's details directly to them, significantly improving the accuracy of this information. Additionally, you will be able to connect your social media accounts to your Apple listing.
Placing your business on Apple Maps could deliver additional customers to your doorstep. Therefore, it is important for your company to ensure Apple has accurate information.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Reviews left on local listings and business directories help potential customers and search engines to form an opinion on how satisfied customers were after engaging with you. In other words, what your customers write in their reviews influences your ranking. Each of your company's listings should have at least a few reviews, as this will help bring new visitors to your store.
You can increase the number of reviews on each listing by asking your happy customers if they could share their experience with others on a specific site. You can also send your customers an email with a link to the site on which you would like them to leave you a review. The majority of customers are happy to do this when asked.
Regular new reviews indicate genuine feedback and show that your business is consistently delivering great service.
In addition, it is not essential for every review to receive a 5-star rating. It is normal for any business to get low-scoring reviews from time to time. Provided your average score is above 4 stars, you are doing great.
If your company only receives 5-star reviews, this often looks too good to be true. It also raises customer expectations very high, making them more difficult to satisfy. This could result in negative reviews or lost customers when their experiences do not match their expectations.
Customer feedback also provides you with valuable information on which areas your business should improve in order to increase the number of repeat customers. This should be monitored and used by your company.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Responding to reviews is a great way to build strong relationships with your customers. This shows that their feedback is important to you and that you appreciate their time in writing a review.
Furthermore, it gives you an opportunity to increase the relevance of your online profiles by phrasing your response so that it mentions your primary keyword, product or service. It is important not to overdo this, as it could make your reply sound unnatural.
Your responses are incredibly vital in terms of damage control when a customer leaves a bad review about their experience with your company. This presents you with an opportunity to transform unhappy customers to happy ones, whilst showing potential customers that you value all feedback.
You should also thank each customer who leaves you a 4- or 5-star review and engage in a conversation by relating directly to their feedback. Respond to 2- and 3-star reviews immediately, apologising for any inconvenience and providing a plan of action for how you will prevent a repeat experience in the future. For 1-star reviews, take extra time to investigate what caused the feedback.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
The geographic location of the server you use to host your website can increase your site's relevance in the country in which it is located, because search engines use it as one of the signals to determine for which users your content is most helpful.
Hosting your website in the country where the majority of your customers live can give you a slightly higher chance of reaching those users. If your site is translated into a different language, you could host that translation on a separate server located in the country where that language is spoken. However, this is more of a nice-to-have than a critical factor for having your pages appear in the search results of those countries.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
The Physical Web enables your customers to discover web pages associated with your business's objects and locations. It is powered by Bluetooth Low Energy (BLE) beacons, or by Android devices running the Beacon Toy app, which broadcast URLs to nearby phones.
Chrome 49 and above display these URLs via Nearby Notifications, and other mobile browsers are working on doing so in the near future. Customers can already receive your notifications when they are near your company's beacons, provided they have an active data connection and both Bluetooth and Location turned on.
These notifications could provide your users with additional information regarding your products and services when they are inside or near your business. Furthermore, they could be used to prompt your customers to sign up to your newsletter, take a survey, create a membership account on your site, or inform them of your latest deals.
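For illustration, Physical Web beacons broadcast the page address as a compressed Eddystone-URL frame. The sketch below encodes a URL into that frame format; the byte values follow the public Eddystone-URL specification, while the TX power byte and the URL are placeholder examples.

```python
# Sketch of Eddystone-URL encoding, the frame format Physical Web beacons
# broadcast over Bluetooth Low Energy. Byte values follow the public
# Eddystone-URL specification; the TX power default is a placeholder.

SCHEMES = ["http://www.", "https://www.", "http://", "https://"]
EXPANSIONS = [".com/", ".org/", ".edu/", ".net/", ".info/", ".biz/", ".gov/",
              ".com", ".org", ".edu", ".net", ".info", ".biz", ".gov"]

def encode_eddystone_url(url: str, tx_power: int = 0xEB) -> bytes:
    """Return an Eddystone-URL frame: frame type 0x10, TX power, scheme
    prefix code, then the URL with common endings compressed to one byte."""
    for scheme_code, scheme in enumerate(SCHEMES):
        if url.startswith(scheme):
            rest = url[len(scheme):]
            break
    else:
        raise ValueError("URL must start with http:// or https://")
    body = bytearray()
    while rest:
        for exp_code, exp in enumerate(EXPANSIONS):
            if rest.startswith(exp):
                body.append(exp_code)   # one byte replaces a common ending
                rest = rest[len(exp):]
                break
        else:
            body.append(ord(rest[0]))   # ordinary character, copied as-is
            rest = rest[1:]
    frame = bytes([0x10, tx_power, scheme_code]) + bytes(body)
    if len(frame) > 20:
        raise ValueError("encoded URL does not fit in a single frame")
    return frame

frame = encode_eddystone_url("https://example.com")
print(frame.hex())  # 11 bytes: 0x10, TX power, scheme 0x03, 'example', 0x07
```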
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
This section primarily focuses on identifying and evaluating other sites that link to you, which bring you additional visitors and vital ranking signals. A Google employee has confirmed that links to your site are one of the two most important signals that can help you rank higher in search results.
The number of quality links pointing to your pages earns your brand a reputation in the eyes of users and search engines. This can lead not only to higher rankings of individual pages for specific keywords, but also to higher visibility of your whole domain in search, resulting in more visitors from Google and Bing, i.e. your organic traffic. Having no links pointing to your website makes it more difficult for search engines to trust your site, and therefore harder to rank for competitive queries.
The websites that link to you also have an audience visiting their pages and reading their articles. When these people come across a link to your site, they may decide to click on it to learn more about what you do. This is called referral traffic, as other sites refer their users to you.
When an external site links to yours it signals to the search engines that this site trusts your business enough to send its users to you. It also indicates that you have helpful, relevant information for that audience. The more sites that link to you, the more legitimate and respected your site will appear.
In addition to this, when a larger, more established site sends users to you, it is a more powerful message to the search engines than if a smaller site had done so. Therefore, it is more rewarding to build relationships and links from other more popular and authoritative sites. This will also lead to your page receiving more referral traffic.
On the other hand, links from spam or low-quality sites may be devalued or not counted at all. Moreover, having a high number of these links pointing to your website may also lead to penalties. Therefore, it is important to review the sites linking to you every few months, and distance yourself from those with which you do not want to be associated by disavowing them. You can use Google's and Bing's Disavow tools for this.
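Google's Disavow tool accepts a plain-text file with one entry per line: either a full page URL or a whole domain prefixed with `domain:`, with `#` starting a comment. A minimal sketch (the domains below are invented examples):

```text
# Links we were unable to get removed manually
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a link from a single page:
https://low-quality-blog.example/comments/page-12.html
```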
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
The number of backlinks from each domain also has a significant impact on your website. However, each additional link after the first one will not have as great an impact on your site's ranking. You should always welcome genuine links from respected websites, but avoid having backlinks on a high number of low-quality pages, forum threads or blog comments on the same site, as this may raise a red flag to Google.
Additionally, paying for or exchanging site-wide ‘dofollow' links, which pass authority and help you rank higher, is also frowned upon as this indicates to the search engines that you are trying to manipulate your ranking. Search engines could then review your site and all backlinks to determine whether forbidden practices are being used to provide you with an unfair advantage. This can lead to manual penalties being applied to your site or some of your pages.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
In order to have a greater impact on your ranking within the search results, you should ensure that backlinks from other sites point to important pages you wish to improve the ranking for. This sends a direct signal to Google and Bing that these pages are vital to your business, and that users should find helpful information there.
A safe ratio of links pointing to your keyword-focused pages versus other pages on your site (including your homepage) is around 30/70, as other websites link much more often to interesting articles or guides than to a services page. The opposite ratio may make the search engines suspicious and may lead to a manual review of your site.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Moving your website to another domain or changing the URLs of your pages during a redesign may cause a broken backlink issue for any sites linking to you. This is because some of these sites may be linking to previous content of yours, which no longer exists.
Therefore, it is vital to keep track of any URLs that have changed and always set up redirects from the old URL to the new one to avoid losing your pages' rankings, as the ranking value of backlinks pointing to a 404 page tends not to benefit your site. This also prevents your users and the search engines from arriving at a blank or 404 page (aka Page Not Found) after clicking on one of these broken backlinks.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Anchor text is the text used in a link pointing to your page, informing users and search engines about the page's content. It is used as a ranking signal and helps your site show up in the search results for related queries.
It is vital that some of your backlinks contain your primary or secondary keywords, also called ‘exact-match anchors'. However, having only such links could indicate to Google and Bing that you are trying to manipulate the search results. The search engines know that sites use a variety of anchor texts when linking to other sites, such as URLs, the business name, the text ‘click here' or ‘website', and many more.
Therefore, if you run a link building campaign, remember to diversify your anchor text. The majority of them should be ‘branded anchors', which are links containing your brand name. This could be your company name or a product name.
If your exact-match anchors outweigh your branded anchors, this may send a spam signal. Generally, your target keyword should not appear in more than 10% of your links. When this percentage is higher, you have two solutions: add more backlinks with branded anchors, or change some anchor texts from your keyword to your company name.
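The 10% guideline above can be checked with a short script. This is a minimal sketch; the anchor texts and keyword are hypothetical stand-ins for an export from your backlink tool.

```python
# Sketch: estimate the share of backlink anchors that exactly match a
# target keyword, to compare against the ~10% guideline. The anchor
# texts and keyword below are hypothetical.

def exact_match_share(anchors, keyword: str) -> float:
    """Fraction of anchors equal to the keyword (case-insensitive)."""
    if not anchors:
        return 0.0
    matches = sum(1 for a in anchors if a.strip().lower() == keyword.lower())
    return matches / len(anchors)

anchors = [
    "Example Motors",         # branded anchor
    "car dealer in Dublin",   # exact-match anchor
    "www.example.ie",         # naked URL anchor
    "click here",             # generic anchor
    "Example Motors",         # branded anchor
]
print(f"{exact_match_share(anchors, 'car dealer in Dublin'):.0%}")  # 20%
```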
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Online mentions are references on other sites to your company name, or to a branded product or service name, without an actual link to your website. These mentions may carry some ranking value for your business, but a backlink has a much higher impact in improving your ranking and helping users find your site.
It is important to regularly check if other domains have provided a backlink to your site when they write about your company. These could be local newspapers, blogs or industry-focused sites, which could help further improve the ranking of your content in the major search engines.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Over time, your site may accumulate backlinks from spam-like or untrustworthy sites. Having many of these can signal to Google that you could be involved in spam activity in order to gain a higher ranking. This may lead to a manual penalty from Google, causing some of your pages, or even the whole site, to rank lower in search results or disappear altogether.
Google and Bing have created tools that let you flag spam, untrusted and unnatural backlinks to them when you are unable to have those links removed manually. This prevents them from hurting your site. However, these tools should always be used with caution, as disavowing good links may reduce your ranking positions.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Often, people search on Google or Bing for your business name, branded services or products to learn more about them, and to see what other people who did business with you have to say. It is vital that the first page of search results displays only pages that are within your control and show your brand in a good light.
It is not uncommon that when users search for a specific company, their competitors are shown on the same page, as well as forum threads with complaints or negative reviews for the business. This may cause your company to lose potential customers to your competitors.
Monitoring which pages appear when users search for your brand, product or service name is vital, because the results change over time and may even vary based on a person's location. They help potential new customers form an opinion on your business and decide whether they want to use your products or services.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Search Autocomplete helps users to quickly complete a query by clicking on suggested options while they type words in the search box. These suggestions are chosen based on their popularity in Google and Bing's search engines.
Monitoring Search Autocomplete may help you prevent damage to your brand when it suggests queries that show your company in a negative light. This could happen if the search engines show an autocomplete option of ‘company X complaints', ‘company X scandal' or similar. In these cases, you can try to push such suggestions out by increasing the volume of searches performed for other queries containing your company name.
It also helps you to identify the most popular content your searchers are interested in finding on your site. You should be ranking #1 for all of these queries. You need to ensure such content exists on your website, and these pages are easy to find from your homepage.
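One way to monitor these suggestions over time is Google's `suggestqueries` endpoint. Note that this endpoint is unofficial and undocumented (an assumption that may change or be rate-limited), so the sketch below only shows the URL construction and response parsing, with a mocked response instead of a live fetch:

```python
# Sketch: monitor Google's autocomplete suggestions for a brand query.
# The "suggestqueries" endpoint used here is unofficial (an assumption);
# the actual HTTP fetch is left out and parsing is shown on a mock.
import json
from urllib.parse import urlencode

def suggest_url(query: str) -> str:
    """Build the request URL; client=firefox returns plain JSON."""
    return ("https://suggestqueries.google.com/complete/search?"
            + urlencode({"client": "firefox", "q": query}))

def parse_suggestions(response_body: str) -> list:
    """Response shape: ["query", ["suggestion 1", "suggestion 2", ...]]."""
    _query, suggestions = json.loads(response_body)[:2]
    return suggestions

sample = '["company x", ["company x reviews", "company x complaints"]]'
print(parse_suggestions(sample))  # ['company x reviews', 'company x complaints']
```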
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
When your pages are shared on social media, it helps search engines to discover them sooner and keep them updated. Although often positive, the impact on your rankings when your pages are shared on social media is a heavily debated topic in the SEO industry.
Social media offers an alternative method of drawing users to your site, as you have more control over how and what information is shown to them. It is also easier for you to convert these users to customers, or even remarket to them later through the use of Remarketing Campaigns.
Therefore, you should promote your product or service pages, event pages and content on various social networks to increase the amount of traffic to your website.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Most websites are hosted on servers alongside sites that belong to other people or companies. Occasionally, these websites may be less secure than yours, potentially exposing you to hacker attacks. To add to this, when a significant majority of the websites on one server are spam, the search engines may devalue all sites on that server, causing them to stop ranking for their targeted keywords.
It is always important to know what type of neighbourhood your domain lives in. This way, you can move to a safer server when necessary, preventing bad neighbours from holding your site back.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
In this section we will check which sites also rank for keywords that are important to you. This will help you to better understand why the search engines may be choosing your competitors over you in the higher positions of their search results.
We will analyse the key areas that we know are strong for ranking signals, and we will compare your competitors' pages that appear in the search results alongside yours. This way, we will be able to identify where your site may be lacking relevance, quality or authority.
Search Visibility measures the prominence of your site and your competitors' sites in the search results for keywords important to your business. The higher a site's position for a given keyword, the higher its visibility score.
By looking at this metric you can identify who the key players are in your industry. These companies are providing the search engines with what they need for their users. When you compare the competition's pages and site with yours it can provide you with ideas on where you should improve in order to reach a similar position or surpass their ranking.
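As a minimal sketch of how such a score can be computed, the snippet below weights each ranking by an estimated click-through rate for its position. The CTR weights are illustrative assumptions, not published figures, and the keywords and positions are hypothetical:

```python
# Sketch: a simple position-weighted search-visibility score. Each ranking
# contributes an estimated click-through rate for its position; the CTR
# weights, keywords and positions below are illustrative assumptions.

CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def visibility_score(rankings: dict) -> float:
    """rankings maps keyword -> position; a higher score means the site
    is more visible across the tracked keyword set."""
    return sum(CTR_BY_POSITION.get(position, 0.0)
               for position in rankings.values())

ours   = {"car dealer dublin": 3, "used cars dublin": 8}
theirs = {"car dealer dublin": 1, "used cars dublin": 5}
print(visibility_score(ours) < visibility_score(theirs))  # True
```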
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
The number of referring domains to a page is one of the strongest ranking signals. By checking how many websites link to pages ranking for the keyword you are interested in, you can determine how difficult it may be for you to outrank them.
You should try to earn backlinks from more domains than your competition has. However, this number is a constantly moving target, as pages tend to earn additional backlinks with time.
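Counting referring domains, rather than raw backlinks, can be sketched as below; the backlink URLs are hypothetical stand-ins for an export from your backlink tool:

```python
# Sketch: count unique referring domains in a backlink export, since many
# links from one domain add less value than links from many different
# domains. The backlink URLs below are hypothetical.
from urllib.parse import urlparse

def referring_domains(backlink_urls) -> set:
    """Distinct hosts linking to you, with any leading 'www.' stripped."""
    hosts = set()
    for url in backlink_urls:
        host = urlparse(url).netloc.lower()
        hosts.add(host.removeprefix("www."))
    return hosts

backlinks = [
    "https://www.blog.example/post-1",
    "https://blog.example/post-2",
    "https://news.example/article",
]
print(len(referring_domains(backlinks)))  # 3 backlinks, 2 referring domains
```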
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
The overall number of backlinks also helps your ranking on Google and Bing. However, each additional backlink from the same domain provides decreasing value. This metric also allows you to better understand how difficult it may be to rank for your targeted keyword.
Remember to put more focus on earning backlinks from new domains, rather than gaining a lot of them from the same domain; the latter may be seen by the search engines as a spam-like practice and in extreme cases may lead to a penalty.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Depending on how users interact with pages displayed in their search results, the search engines can understand what kind of intent they have when entering a specific query. Google and Bing analyse which results are clicked more often and how fast a person returns to their search results after visiting a page to identify which type of content is most helpful for a specific query.
Therefore, some queries may show different types of pages. For example, a query for ‘games' mainly displays pages on which you can play a variety of online games. Whereas a query for ‘how to build games' mainly displays pages containing informational content on how to create a game.
By looking at what type of pages rank in the top 10 search results for a certain keyword, you are in a better position to optimise your content and provide users and the search engines with what they want. This will then increase your chances of ranking well for your targeted keyword.
You won't rank a sales page for educational queries. People want to understand something, not to be sold to, and the search engines know it.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
Titles provide a strong indication to both users and the search engines of what the page contains. When a page repeats the user's query in the title, it strengthens the relevance of that phrase and increases its chances of ranking higher.
In addition to this, a strong title also helps convince searchers to navigate to your site rather than your competitors'. Titles used by your competition can serve as a source of ideas for optimising yours.
[Description of the detected issues, supported by screenshots and code snippets explaining why these things are happening, so that the client can reproduce what we see.]
[Explicit instructions on how to fix the issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools you might be using.]
The more useful information your pages contain, the more valuable they are to users and search engines. By checking how in-depth the pages ranking for your targeted keyword are, you can understand how long yours should be.
You can also review the type of information they contain, which you may have missed on your own pages. It is important for your pages to rank well for your important keywords, and you should always be pushing yourself to make your content even stronger. Aim to find a unique angle on the topic, as the search engines want to show diverse results and duplicate content will be filtered out.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that a client can recreate what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools that you might be using.]
Search Ads allow businesses to show their pages above their competition in search results. This way, users may not even look at pages that rank organically, as they are pushed down the page. On mobile phones they may not even be seen on the initial page load (above the fold) if there are many advertisers bidding for the same keyword and taking up all available space on the screen.
This tactic does not require waiting months for your pages to be displayed for important keywords. You are able to start showing your message above everyone else immediately if you have a marketing budget that allows you to invest in this.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that a client can recreate what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools that you might be using.]
It is important for ads to be optimised and compelling to an audience, as this will encourage more users to click on them. As Google, Bing and Facebook want to show ads that people like and engage with, they will reward advertisers who go the extra mile to ensure their ads look and sound great by reducing the cost of running them.
Ads that run for a long time are usually optimised to achieve the highest engagement rate possible and the lowest cost-per-click. By analysing the copy they use, you can improve and test new titles and meta descriptions for your pages, increasing the number of visitors you receive from the same ranking position.
Additionally, you can use ads your competition has previously run as inspiration for your own. This will help you come up with creative ideas in a fraction of the time it would normally take.
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that a client can recreate what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools that you might be using.]
Engagement metrics help you and search engines understand how interested users are in your content compared with your competitors'. Google gives preferential treatment to sites with the best usability, so continuously improving the quality of your content is vital for both search and business purposes.
We compare your competitors' bounce rate, daily pageviews per visitor and daily time on site per visitor over the last three months, using data provided by Alexa.com. Alexa, owned by Amazon, provides commercial web traffic data and analysis.
Its traffic data is generated from real people's browsing behaviour. Companies use this tool to gain insight into their competitors' web traffic and make better estimates. The data comes from a panel of millions of people using more than 25,000 different browser extensions.
Bounce Rate shows the percentage of visits in which a user left your site after seeing only one page; it should be kept as low as possible. Daily Pageviews per Visitor describes the average number of pages a searcher sees on your site, while Daily Time on Site per Visitor shows the average time a person spends on your website. Both of these values should be as high as possible, indicating that your content is interesting to your users. We also use SimilarWeb, whose data comes from four main sources:
- A panel of monitored devices, currently the largest in the industry;
- Local internet service providers (ISPs) located in many different countries;
- SimilarWeb web crawlers that scan every public website to create a highly accurate map of the digital world;
- Hundreds of thousands of direct measurement sources from websites and apps that are connected to SimilarWeb directly.
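The three metrics described above can also be computed from your own analytics data. A minimal sketch, assuming each session is recorded as a time-ordered list of (page, unix_timestamp) hits — the data shape here is our assumption, not a SimilarWeb or Alexa format:

```python
def engagement_metrics(sessions):
    """sessions: list of sessions, each a time-ordered list of
    (page, unix_timestamp) hits.

    Returns (bounce_rate, pages_per_visit, avg_time_on_site_seconds)."""
    n = len(sessions)
    bounces = sum(1 for s in sessions if len(s) == 1)   # single-page visits
    total_pages = sum(len(s) for s in sessions)
    total_time = sum(s[-1][1] - s[0][1] for s in sessions)  # last hit minus first
    return bounces / n, total_pages / n, total_time / n
```

Note that time on site measured this way undercounts single-page visits (there is no second hit to measure against), which is a known limitation of most analytics tools as well.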
[Description of the detected issues, supported by screenshots and code snippets, explaining why these things are happening so that a client can recreate what we see.]
[Explicit instructions on how to fix each issue, supported by screenshots, code snippets with the issue fixed, and mock-ups of how it should look.]
[Spreadsheets listing the URLs where these issues occur, or direct links to the tools that you might be using.]