SEO Basics - Can Your Site Be Crawled and Indexed?
Two basic components of SEO are crawlability and indexability of your website. If a bot is unable to crawl and index a page, it will not be ranked by Google. If your site cannot be ranked, you will not get any organic search traffic. Growing any business without organic search is both expensive and unlikely.
Marketing, SEO, On-Page SEO, Search Engine Optimization, Crawlability, Indexability
Bill Arnold
1/17/2024 · 6 min read
There are certain aspects of marketing that are simply foundational. You MUST do them and do them RIGHT. If you engage in digital marketing, one of those is on-page search engine optimization (SEO). There are entire agencies whose sole purpose is to enhance an aspect of your on-page SEO. We are going to provide you with an understanding of its importance, which elements you need to consider, what makes a good score, and how to identify and fix any on-page SEO issues you may find.
Getting a perfect on-page SEO score is akin to Ahab’s quest for Moby Dick. It will be elusive and difficult to achieve, and at times you may seem maniacal to others. Is that quest worth it? It is, and it is absolutely necessary for a website that will produce marketing and sales success.
There are over 50 billion web pages from over 1.93 billion websites (Search Engine Journal). If you want a chance to be found through an internet search, you need to follow all the rules. As they say, the best way to hide a body is to put it on page two of Google Search.
What is Search Engine Optimization?
Search engine optimization involves the activities you undertake to please the algorithms (primarily Google’s) to obtain higher rankings and more prominent placements. Typically, these are incremental optimizations that, when combined, have a noticeable impact on your site’s user experience and performance in organic search results. It is a game of inches that can ultimately bring high-quality traffic to your website.
There are a number of factors that are part of these algorithms. The good news is that Google is fairly upfront about how you need to improve your site and content to achieve higher scores. All of these requirements will fall into one of three categories: on-page SEO, technical SEO, and off-page SEO (HubSpot). At Prevail Marketing, we like to keep things simple and more descriptive, so we combine some of these categories and speak about On-Page SEO (combines on-page and technical SEO) and Performance SEO (off-page SEO).
We will be sharing several blogs that provide you with a blueprint on how to optimize both On-Page SEO and Performance SEO.
How Does SEO Work?
On-page SEO is critical to getting Google and other search engines to send you organic traffic. It is also vitally important for user experience and conversions once someone does find your website. Organic traffic is, obviously, not the only source of traffic.
If you get traffic by other means (e.g., direct traffic, backlinks, paid campaigns) and visitors find the site difficult to understand or navigate (UX/UI), they will bounce, and you have lost a potential customer. We will explore each area as part of this blog series.
The two areas we will address in today’s blog are crawlability and indexability of your website. We are focusing on these two areas because if a bot is unable to crawl and index a page, it will not be ranked by Google or any other search engine (yes, more exist). If your site cannot be ranked, you will not get any organic search traffic. Growing any business without organic search is both expensive and unlikely.
Crawlability
Crawlability refers to whether an SEO spider can access, scan, and index your website. The first step in taking advantage of on-page SEO is to ensure your website is crawlable by Google’s bots and other search engine spiders. Many companies that produce copious amounts of high-value content are surprised to find that none of it is benefiting them because search engine bots cannot discover it.
How important is the crawlability of a website? Consider what Google has to say about it.
“We [Google] don’t do a good enough job getting people to focus on the important things. Like CREATING A DAMN CRAWLABLE SITE” — Gary Illyes (Google’s Chief of Sunshine and Happiness & Trends Analyst) / Feb. 08 2019 Reddit AMA on r/TechSEO
Folks, if Google says it is important… it is damn important. The SEO bots must be able to discover that a page exists through external and internal links and be able to “read” the page’s contents.
The most common reasons that an SEO bot is unable to crawl a website are:
Nofollow Links – A rel="nofollow" attribute on a link tells search engines not to follow it to where it leads. Internal and external links help users find additional information about a topic, and they help both users and the bots better understand your topic and niche; if your internal links are marked nofollow, the bots may never discover the pages behind them (a quick audit sketch follows below). We will have an entire article in the next few days about internal and external linking best practices.
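If you want to audit a page for nofollow links, a few lines of Python will do it. Here is a minimal sketch using only the standard library (the URL is a placeholder; swap in a page from your own site):

from html.parser import HTMLParser
from urllib.request import urlopen

class NofollowAuditor(HTMLParser):
    # Collects the href of every <a> tag whose rel attribute contains "nofollow".
    def __init__(self):
        super().__init__()
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if "nofollow" in rel:
            self.nofollow_links.append(attrs.get("href"))

# Placeholder URL -- replace with a page from your own site.
html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
auditor = NofollowAuditor()
auditor.feed(html)
print("Nofollow links found:", auditor.nofollow_links)

If internal URLs show up in that list, the bots may be unable to discover the pages they point to.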
Redirect Loops – These are broken redirects that point back to a URL already in the chain (e.g., page A redirects to page B, which redirects back to page A), so the SEO bot can never resolve a final page. A simple way to detect one is sketched below.
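You can detect a loop by following each redirect manually and watching for a URL you have already visited. A minimal sketch using the popular requests library (the starting URL is a placeholder):

import requests

def find_redirect_loop(url, max_hops=10):
    # Follow redirects by hand; return the looping URL if one repeats, else None.
    seen = set()
    for _ in range(max_hops):
        if url in seen:
            return url  # we have been here before: a redirect loop
        seen.add(url)
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 303, 307, 308):
            return None  # reached a final page
        # Resolve a relative Location header against the current URL
        url = requests.compat.urljoin(url, resp.headers["Location"])
    return url  # too many hops is effectively a loop to a crawler

print(find_redirect_loop("https://example.com/old-page"))  # placeholder URL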
Bad Site Structure – This can include a number of issues, such as poor internal linking strategies and incorrect canonical tags (the tags that tell the search engine which version of a page to scan and index).
Robots.txt – Problems here include a robots.txt file that is missing from the root directory or one that contains instructions blocking pages you want crawled.
Here’s a sample robots.txt file:
User-agent: *
Allow: /blog/
Disallow: /blog/admin/
That simple set of instructions tells every bot (User-agent: *) that it may crawl the blog directory but not the blog administrative area.
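You do not have to guess how a bot will read those rules; Python’s standard library includes a robots.txt parser. A minimal sketch (the URLs are placeholders) that applies the sample rules above:

from urllib.robotparser import RobotFileParser

# Placeholder site -- point this at your own robots.txt.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the file

# can_fetch(user_agent, url) applies the Allow/Disallow rules.
print(rp.can_fetch("*", "https://example.com/blog/my-post/"))  # True under the sample rules
print(rp.can_fetch("*", "https://example.com/blog/admin/"))    # False under the sample rules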
No XML Sitemap – Either there is no sitemap or it is not accessible. An XML sitemap tells the SEO bots which pages on your website you want them to crawl and index. It is, essentially, a roadmap for them to follow. Make sure you utilize a script that automatically updates your sitemap when new pages are added. A quick way to verify your sitemap is sketched below.
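To confirm your sitemap is reachable and see exactly which URLs it offers to the bots, here is a minimal standard-library sketch (the sitemap URL is a placeholder):

import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Placeholder URL -- replace with your own sitemap location.
with urlopen("https://example.com/sitemap.xml") as resp:
    tree = ET.parse(resp)

# Each <loc> entry is a page you are asking the bots to crawl.
urls = [loc.text for loc in tree.iter(SITEMAP_NS + "loc")]
print(len(urls), "URLs listed")
for u in urls[:5]:
    print(u)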
Test Your Website – If you don’t know how well the SEO bots can crawl your website, you can use Google Search Console or the SEMrush Site Audit. While there are many other tools that generate the same information, we prefer these two sources for the following reasons:
They are both free.
Google is the most definitive source.
SEMrush provides actionable items to improve your score and will address other search engine issues.
Here is an example of a good SEMrush scorecard.
Indexability
Indexability measures the search engine’s ability to analyze your website and add each page to its index. If a site is not crawlable, it cannot be indexed, so sort out all crawlability issues first.
Poor Content – If a site is being crawled by a search engine but not being indexed, it usually means that Google (we all know we are speaking of Google) has decided the information is not worthy. In other words, “You have been weighed, measured, and found wanting” (A Knight’s Tale).
Quality content matters when Google determines whether to index a page. Make sure your website does not suffer any of the following content issues:
Limited Content – Too little information on a page.
Poorly Written Content – Your English teacher was right: grammar and spelling mistakes matter.
Lack of Unique Content – Copying and pasting does not work. Create original content and be rewarded for your efforts.
Lack of Relevance – If a link to your website says it is about a particular topic, the bots better find that information relevant to that topic.
Test Your Site – The best way to test whether your site is being indexed by a search engine is to type site:yoururl.com into its search bar. For example, go to Bing.com and type site:prevail.marketing, and you will see the number of pages from the Prevail Marketing site that Bing has indexed. The same holds true for Google.
Alternatively, for Google, you can go to Google Search Console, and it will tell you how many pages of your website Google is indexing. If fewer than 90% of your web pages are being indexed, you have a serious problem.
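If pages are being crawled but still not indexed, rule out an explicit noindex directive before blaming content quality. A minimal sketch using the requests library (the URL is a placeholder, and the regex is a rough check, since attribute order in the meta tag can vary):

import re
import requests

def has_noindex(url):
    # Return True if the page carries a noindex directive in headers or HTML.
    resp = requests.get(url, timeout=10)
    # 1. HTTP header: X-Robots-Tag: noindex
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    # 2. Meta tag: <meta name="robots" content="noindex, ...">
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, resp.text, re.IGNORECASE) is not None

print(has_noindex("https://example.com/some-page"))  # placeholder URL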
Conclusion
On-page and technical SEO play a vital role in driving organic traffic to a website. These two components are crucial for ensuring that search engines can crawl and index the site effectively. When a website is optimized for on-page SEO, its content is structured in a way that search engines can easily understand and categorize. Without proper crawlability and indexability, search engines cannot discover and rank a website's pages, and organic traffic never arrives. Therefore, website owners and digital marketers should prioritize on-page and technical SEO strategies to ensure maximum visibility and organic traffic.
We will discuss how to fix these issues in upcoming blogs. If you can't wait for the next blog to learn how to fix your site, we can be reached by clicking the button below.
Contacts:
prevailer@prevail.marketing
(424) 484-9955