Video Transcription

Technical SEO

Hey there. Welcome back to Life Science Marketing Lab.

In this video, we’re gonna be discussing some technical aspects of SEO. This is the stuff that has nothing to do with keywords or on-page optimization, getting your keywords into your content or anything like that. This has more to do with the development of your website. It’s what your developer should be dealing with, but it’s very good to have a basic understanding of this kind of stuff.

I split this up into two sections:

1. Indexing – Ensuring that search engines are finding the content that you want them to.

  • Sitemap
  • Robots.txt file
  • Google Webmaster Tools

You can use these files and tools to ensure that search engines are indexing the correct pages. When a search engine indexes those pages, it can then use them in the search results. It’s also important for you to tell the search engines which pages you don’t want them to show in the search results. By using these three things, among others, you can do that.

2. Site Functionality and UX (User Experience)

  • Page Speed
  • Broken Links

There’s a reason why your site functionality and page speed are important. If a user comes to your website and has to wait minutes and minutes for a webpage to load, they’re gonna bounce to another site. So this is an aspect of your site functionality that’s really important for SEO. If your page speed is slow, and users are bouncing to other websites for that reason, that’s gonna impact your user behavior data, which Google picks up on. It’s gonna know that, for some reason, you have a bad site, and that could be due to page speed. In fact, Google has a page speed tool that it uses to measure your page speed, and it uses page speed as a direct ranking factor.

The other thing is broken links. Broken links are links, from external websites or from internal pages on your own site, that point to pages on your website that don’t exist anymore. That causes a bad user experience. If users are constantly landing on pages that don’t exist, again, it’s a bad indicator. Google picks up on this stuff: your users are gonna go to different websites because they had a bad user experience, and your user behavior metrics are going to be affected.
Let’s have a look at the computer, and I’m gonna show you some of this stuff in a bit more detail.

Here we have our website, supremeoptimization.com. First, we’re gonna have a look at the sitemap, what it is and how it helps Google to index pages. This is the sitemap for our website, and you’ll always find it at the root of the website. From here, there are sub-sitemaps. If you click on the first one, the post-sitemap, it lists all the different posts that we have published on our blog. What a sitemap basically does is list all the pages on your website that you want Google to index. You want to make sure that every single page that you want displayed in the search results is on here. When you have a sitemap, it’s very important for Google to be able to find it, which is why you always put it at the root of your site.
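If you’ve never looked inside one, a minimal sitemap is just an XML file listing URLs in the standard sitemaps.org format. The entry below is purely illustrative (placeholder URL and date), not a real entry from our sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per page you want indexed -->
  <url>
    <loc>https://www.example.com/blog/sample-post/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>
```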

There are two other tools to ensure that Google finds your sitemap. The first is your robots.txt file. Again, this is always at the root of your site, and the URL is always in this format (robots.txt) so that Google’s (and other search engines’) web crawlers can find it. What you do in here is actually reference where your sitemap is. So, this entry (https://www.supremeoptimization.com/sitemap_index.xml) references where the sitemap is.

What’s also really good about your robots.txt file is that you can tell Google which files and pages on your website you don’t want it to index. For example, we have the /wp-admin/ subdirectory. These are files used by the WordPress admin users that nobody else needs to see. In fact, you probably can’t even see these pages, so it’s good to tell Google not to go and crawl them. The reason that’s important is that when a web crawler comes to your site, it only uses up so much energy, so much crawl budget, before it bounces to a new site to index that. If it’s wasting time crawling pages that you don’t want it to crawl, it may miss out on, or not index, the pages that you actually wanted it to. So it’s good to disallow any files or pages that you don’t want to be crawled.
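To make that concrete, here’s a sketch of a robots.txt along those lines. The Sitemap line uses the URL mentioned above; the admin-ajax.php exception is a common WordPress convention I’m adding for illustration, not something shown in this video:

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of the WordPress admin area
Disallow: /wp-admin/
# Commonly re-allowed so front-end features that rely on it still work
Allow: /wp-admin/admin-ajax.php

# Tell crawlers where the sitemap lives
Sitemap: https://www.supremeoptimization.com/sitemap_index.xml
```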

The next tool is Google Webmaster Tools, also known as Search Console. It is a free tool provided by Google that lets you tell Google how exactly you want it to view your site. There are many, many tools in here that we could talk about, but the main one we’re gonna talk about is submitting your sitemap. It’s very simple: you go to Crawl, then you go to Sitemaps, you click the red button (ADD/TEST SITEMAP), you add the URL of your sitemap, and it is submitted. This is just a way to tell Google where your sitemap is, so it can go and crawl it, crawl the pages, and then they will be indexed. So, that’s indexing.
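Before you submit, it can be worth sanity-checking that the sitemap is reachable and lists what you expect. Here’s a minimal Python sketch (this isn’t part of Webmaster Tools; it just assumes your sitemap is publicly accessible at the URL shown):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Fetch a sitemap and list its entries before submitting it in
# Search Console. Uses the sitemap URL from this video; swap in yours.
SITEMAP_URL = "https://www.supremeoptimization.com/sitemap_index.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

# A sitemap index lists child sitemaps in <sitemap><loc> elements;
# a regular sitemap lists pages in <url><loc> elements. Either way,
# every entry has a <loc>.
locs = [el.text for el in root.findall(".//sm:loc", NS)]
print(f"{len(locs)} entries found:")
for loc in locs:
    print(" ", loc)
```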

The next thing we’re gonna talk about is your site functionality and your user experience. As I was mentioning, your page speed is very important. It’s very good for your user experience, which Google sees as a very significant ranking factor. There’s a really good tool that Google provides, called PageSpeed Insights. You can type any website in here, click Analyze, and it’s gonna analyze your page speed. I analyzed BD Biosciences here to see what its page speed is. It’s pretty low. For such a big brand in life sciences, it’s very low: 33 out of 100. This actually gives you recommendations on what you can do to fix your page speed. We want to be looking to get up into the 90s; we want this to go from red to orange to green. We want it in the green category, which means it’s very fast. One of the things it suggests is to optimize your images. BD Biosciences obviously has a lot of heavy images here that they can reduce the size of. Reducing the size of the images will reduce the length of time it takes to load this webpage. There are other things like leveraging browser caching, compressing and minifying your JavaScript files, things like that. There will be similar recommendations for your mobile site as well. So that’s your page speed, and fixing it should significantly improve your user experience and your site functionality.
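As a side note, PageSpeed Insights also has a public API, so you can pull scores from a script instead of the web UI. Here’s a minimal sketch against the v5 (Lighthouse-based) API, which is newer than the version shown in this video; for anything beyond light use you’d add an API key. The target URL is just BD Biosciences as a stand-in:

```python
import json
import urllib.parse
import urllib.request

# Query the PageSpeed Insights v5 API for a performance score.
target = "https://www.bdbiosciences.com/"
endpoint = (
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
    + urllib.parse.urlencode({"url": target, "strategy": "desktop"})
)

with urllib.request.urlopen(endpoint) as resp:
    data = json.load(resp)

# Lighthouse reports performance as a 0-1 score; multiplying by 100
# gives the familiar 0-100 number from the PageSpeed Insights UI.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score for {target}: {score * 100:.0f}/100")
```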

Next is broken links. Broken links are links from external websites, or links on your internal web pages, that point to pages that no longer exist. If people click on those links, they land on pages that don’t exist anymore, which is really bad for your user experience, and therefore it’s gonna impact your SEO and your rankings. This is the Google Search Console for Supreme Optimization. Google actually references all the broken links that exist on your site. Here it is: 81 broken links, also called “not found” pages. You just go to Crawl, then Crawl Errors, click on Not Found, and we can see all the pages that are referenced but no longer exist. These have actually been fixed since, but I left them on here to give you some good examples. We can use this one as an example, the rollout consultation page. This is what somebody would see if they clicked on a link to this page that no longer exists. At some point this page did exist, and we deleted it. But what we forgot to do was remove the links to it from some of our internal web pages. We’ve since dealt with that, and this has been fixed. This is a really good tool for finding your broken links and getting them fixed. If it turns out that some of these links come from external websites, you may not have the control to go and remove those links. In fact, those external links could be very good for your SEO; they’re very good for your link profile. You don’t wanna waste those. What you want to do is redirect this page to another internal page that’s relevant to that link. In that case, you are saving that external link and the ranking power that it gives your website.
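If you want to re-check a batch of URLs yourself, outside of Search Console, a quick script can do it. This is a rough sketch: the URL list below is made up, and in practice you’d paste in the URLs from the Crawl Errors report. Anything that comes back 404 is a candidate for fixing the link or setting up a redirect:

```python
import urllib.error
import urllib.request

# Hypothetical list of URLs to re-check; in practice, export these
# from the Crawl Errors > Not Found report in Search Console.
urls_to_check = [
    "https://www.supremeoptimization.com/",
    "https://www.supremeoptimization.com/deleted-page-example/",  # made up
]

for url in urls_to_check:
    req = urllib.request.Request(url, method="HEAD")  # headers only, no body
    try:
        with urllib.request.urlopen(req) as resp:
            print(f"{resp.status}  {url}")
    except urllib.error.HTTPError as err:
        # 404s raise HTTPError: fix the internal link, or 301-redirect
        # the old URL to a relevant page to preserve external links.
        print(f"{err.code}  {url}")
```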

That’s it. Here, we discussed indexing, site functionality and user experience for your SEO – some more technical aspects that can improve your rankings.

Thanks again and see you soon.