The foundation of any website. This is where it all begins. If you don’t have a site that runs well, most of your other SEO factors won’t hold up. We take this provable SEO concept seriously enough to test it as needed. Let’s check a number of items that affect your Technical SEO. This is the most important aspect, because without it you don’t have a functioning website for traffic to reach. Without hesitation, here we go!
1. Make sure your website is live and functions correctly.
The UI and UX elements of a website are important because you want your website working and ready for people to visit. A short and simple request, but a must-have if you are serious about a website working for you.
2. Check or submit your website for indexing.
Google Analytics and Search Console are two “must-haves” if you are in it to win. Once your website is up and running, submit it within Google Search Console (GSC) to ensure that search engines can find your site and pages. You can also do a quick check with a Google search operator to see whether your site has already been indexed: type “site:yourdomainnamehere.com” into your target search engine.
This should produce results and display the number of pages the search engine has found. If there’s a large gap between the number of pages indexed and the number you expected, you’ll need to review your pages and figure out what is going on with your on-site SEO. Which brings us to the next point.
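To make that gap concrete, here’s a minimal sketch. The `index_coverage` helper and its 80% threshold are illustrative assumptions, not an official metric: it just compares the page count a `site:` search returns against the number of pages you expect to be indexed.

```python
def index_coverage(indexed_count, expected_count):
    """Hypothetical helper: compare pages found via a site: search
    to the number of pages you expect search engines to have indexed."""
    ratio = indexed_count / expected_count if expected_count else 0.0
    return {
        "ratio": ratio,
        "gap": expected_count - indexed_count,
        # 80% is an illustrative cutoff, not a Google rule
        "needs_review": ratio < 0.8,
    }
```

If `needs_review` comes back true, that’s your cue to start digging into the on-site SEO review described above.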
3. Make sure all resources are crawlable that you want indexed.
To check your website’s crawlability, you may have heard of looking in your robots.txt file; but that check is as incomplete as it is simple. Robots.txt is just one of several mechanisms for restricting pages or content from indexing, so you may want to use a free SEO crawling tool to get a list of any blocked media content or pages on your site, regardless of what the robots.txt file says.
If you want to audit your site outside of Google, you can try free tools like SEO PowerSuite’s WebSite Auditor or something like Screaming Frog.
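You can also spot-check robots.txt rules yourself with Python’s built-in `urllib.robotparser`. This sketch parses a robots.txt file (the rules shown are example content, not a recommendation for your site) and asks whether specific URLs are crawlable:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content -- swap in your own site's rules
ROBOTS_TXT = """\
User-agent: *
Disallow: /privacy-policy
Disallow: /expired-promos/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# can_fetch() reports whether a crawler obeying robots.txt may visit a URL
print(parser.can_fetch("*", "https://example.com/services"))        # allowed
print(parser.can_fetch("*", "https://example.com/privacy-policy"))  # blocked
```

Remember, though: this only tells you what robots.txt says, which is exactly why a full crawling tool is still worth running.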
4. Optimize crawl attempts.
Crawl attempts (often called crawl budget) is the anticipated number of pages search engines will attempt to crawl on your site during any given period of time. You can get an idea of your crawl numbers in GSC.
Unfortunately, GSC won’t give you a play-by-play (page-by-page) breakdown of the crawl stats and outcomes.
After seeing how often your site is crawled, you can increase that rate in different ways. Want to guess how? Well, there is this thing called SEO, tada! Search engine optimizers aren’t sure exactly how Google decides when to crawl sites, but after doing a lot of research, two major factors stand out as likely keys: the number of internal links to different pages within your website, AND the number of backlinks pointing to your site from other websites related to your niche. (Pronounced nitch, NOT neesh… sorry, just had to throw that out there.)
We constantly strive to test these types of practices in order to provide a Proof of Concept in SEO. We often scour the backlink profiles of sites that point to ours, checking their own on-site SEO and internal linking along the way.
Most companies don’t realize how big a part SEO plays in getting your website found, in more ways than one. But one shouldn’t rely on backlinks alone. No one (in their right mind, or with a good SEO conscience) can grow a backlink profile overnight (even though this was a big practice back in the day: quantity over quality). If that is tempting, let’s get you started off on the right foot.
• Delete duplicate pages and content. Google isn’t so much lazy as it is efficient; the less time it has to spend on your site before moving on to other sites, the better. So don’t make it work twice as hard. You show it some love and it will show you some love.
• No SEO value = do not index. Contest rules, privacy policies, terms and conditions, and promotions for expired goods are ideal candidates for a Disallow rule in robots.txt, to ensure they don’t become indexed and potentially cause problems.
• If it’s broken, fix it, and fast. Preventative maintenance is crucial to any SEO campaign. If you have a bunch of pages that throw 4XX/5XX errors, that’s one less chance for Google to find what it’s looking for and present it to your customers and clients.
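To find those error pages, a small script can do a first pass. This is a sketch, not a full crawler: `check_url` is a hypothetical helper that fetches one URL with Python’s standard library, and `is_broken` flags the 4XX/5XX status codes that waste crawl attempts:

```python
from urllib import request, error

def check_url(url, timeout=10):
    """Hypothetical helper: return the HTTP status code for a URL."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        # urllib raises on 4XX/5XX; the status code is on the exception
        return exc.code

def is_broken(status_code):
    """4XX client errors and 5XX server errors both burn crawl budget."""
    return 400 <= status_code < 600
```

Running `is_broken(check_url(u))` over your page list gives you a fix-it queue before Google finds the same dead ends.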
5. Internal link auditing.
A shallow, logical, easy-to-navigate site structure is ultimately the best practice for great user experience and crawlability; internal linking helps pass ranking power around to more pages effectively. Altogether, this strengthens the Proof of Concept that your SEO is working, and working the right way.
Need a bit more help when doing your internal link auditing process?
• Click depth. Try not to put any page more than three clicks from the home page. We stick to three because it’s easy to follow; when we tell people where to go on our site, it’s easy to say, “go to our website, then here, then here” instead of “here and then here, then over there, and click this, not that, ok, we’re halfway there…”. Not cool.
• Broken links. As mentioned before, find them and fix them. Remember how it feels when you think you’ve found what you needed and then BAM, 404. Yeah, that doesn’t help your customer or your site’s SEO.
• Redirected links. Some companies out there will redirect a page two or three times before landing on the right page. It should be a single redirect. Plan your SEO in such a way that, most of the time, you’ll never need to redirect a page, and if you do, make it a single leap.
• Orphan pages. These are pages that no other page on your site links to, leaving them disconnected from your structure. If a page isn’t connected, there should be a good reason for it.
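Click depth and orphan pages both fall out of the same graph traversal. Here’s a minimal sketch (the page names and link map are made up for illustration) that breadth-first searches your internal links from the home page:

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS over an internal-link graph {page: [pages it links to]}.
    Returns each reachable page's click depth from the home page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

def orphan_pages(all_pages, links, home="/"):
    """Pages that no chain of internal links ever reaches."""
    return set(all_pages) - set(click_depths(links, home))

# Illustrative site: /deep sits three clicks down; /orphan isn't linked at all
site = {"/": ["/services", "/blog"], "/blog": ["/post"], "/post": ["/deep"]}
```

Any page with a depth over three is a candidate for better internal linking, and anything `orphan_pages` returns needs either a link or a good reason to stay disconnected.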
6. Schedule Technical SEO sitemap check ups.
Hopefully, by this point, you know how important sitemaps are to your business website. They share your site structure with search engines and let them discover new content faster. Here are things to watch for when checking your sitemaps (this can also be automated if you are using a CMS (content management system) like WordPress):
• Freshness. XML sitemaps should be updated as soon as new content is added to your site. The sooner the better.
• Cleanness. Keep it simple and easy to read, for both search engines and users, and free of useless pages and information (4XX pages, non-canonical pages, redirected URLs, and pages blocked from indexing). If you don’t, you run the risk of the search engines ignoring the sitemap completely, and your business will suffer. Remember to check it weekly, or monthly at the longest. GSC has a great way to help you with this.
• Size. Google and other search engines limit sitemap crawling to roughly 50,000 URLs per sitemap file. Ideally, you should keep yours much shorter than that so your important converting pages get crawled more frequently. Sites that trim the number of URLs in their sitemaps tend to see the remaining pages crawled more often.
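A quick way to run these sitemap checks yourself is to parse the XML with Python’s standard `xml.etree`. In this sketch the sitemap content is a made-up example; the helper counts URLs, flags files near the 50,000-URL limit, and surfaces the newest lastmod date as a freshness check:

```python
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2019-06-01</lastmod></url>
  <url><loc>https://example.com/services</loc><lastmod>2019-05-20</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_stats(xml_text, url_limit=50000):
    """Count sitemap URLs and surface the most recent lastmod date."""
    root = ET.fromstring(xml_text)
    entries = root.findall("sm:url", NS)
    lastmods = [e.findtext("sm:lastmod", namespaces=NS) for e in entries]
    return {
        "url_count": len(entries),
        "over_limit": len(entries) > url_limit,
        "newest_lastmod": max(filter(None, lastmods), default=None),
    }
```

A stale `newest_lastmod` or a `url_count` creeping toward the limit both mean it’s time to prune.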
7. Page speed and hosting.
Page speed is just one of Google’s many priorities for websites in 2019; it’s also a ranking factor and signal. There are a lot of tools out there to help you test your pages’ load times. Google’s own PageSpeed Insights tool is great, and so is something like GTmetrix.com. It can take some time to enter all of your pages’ URLs to check for speed, so be patient or look for other alternatives.
If your site doesn’t get green lights on some areas of the test, Google will provide detailed “how to fix” recommendations, which is super helpful. You can use a download link with a condensed version of your images if they’re too heavy. If you need to compress your images, check out this tool, it’s amazing for image compression.
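For a rough do-it-yourself timing check between PageSpeed runs, you can time a full page fetch with the standard library. This is a crude sketch: it measures total download time, not a full browser render, and the thresholds in `speed_grade` are illustrative assumptions, not Google’s official cutoffs:

```python
import time
from urllib import request

def measure_load_time(url):
    """Time a full HTML download; ignores rendering, so treat it as a floor."""
    start = time.perf_counter()
    with request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

def speed_grade(seconds):
    # Illustrative buckets only -- use PageSpeed Insights for real scoring
    if seconds <= 1.0:
        return "fast"
    if seconds <= 3.0:
        return "ok"
    return "slow"
```

Looping `speed_grade(measure_load_time(u))` over your URL list gives you a first-pass triage of which pages to feed into the real tools.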
Also, don’t forget about hosting. Hosting can make or break your website and its load times, so if you are using any of the hosting companies on this list, change now. Your Technical SEO depends on it.
8. Get mobile and get results.
Years ago, Google announced mobile-first indexing of the web, meaning that it indexes the mobile version of your website instead of the desktop version. Mobile comes first because most searches are done on mobile. Have you started on voice search for 2019 and 2020? I know, it just keeps on going.
Just a small reminder: here are a few things to keep in mind when getting your site indexed, and what to pay attention to so your website gets some indexing love.
• Test your pages for mobile-friendliness with Google’s Mobile-Friendly Test tool, and test desktop load speed separately.
• Run comprehensive audits on your mobile site, just like you do with any other website. Take the time to do it; it will be well worth it. Keep track of what is happening, and if something breaks, fix it right away.
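One check you can script into those audits is a crude mobile-readiness proxy: responsive pages almost always declare a viewport meta tag. This sketch is a heuristic, not a substitute for Google’s Mobile-Friendly Test, and uses only the standard-library HTML parser:

```python
from html.parser import HTMLParser

class ViewportFinder(HTMLParser):
    """Flags whether a page declares a <meta name="viewport"> tag."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name") == "viewport":
            self.has_viewport = True

def declares_viewport(html):
    finder = ViewportFinder()
    finder.feed(html)
    return finder.has_viewport
```

A page that fails this check almost certainly won’t pass Google’s real test either, which makes it a cheap early-warning filter across a big site.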
What are your thoughts on technical SEO in this day and age? Which practices have you found most effective for your industry and niche?