80% of Secure URLs Aren’t Appearing in Searches [Report]


It’s no secret that online privacy is a big deal these days. It’s such a big deal that last summer, Google announced it would begin giving a slight ranking boost to sites that use strong HTTPS encryption by default.

Few webmasters missed the announcement, and many rushed to secure their sites in hopes of improving rankings, capturing more leads, or driving more sales. Unfortunately, as many as 80% of them didn’t take the extra steps needed to make sure those improved rankings actually materialized.

Google’s Gary Illyes recently reported that 80% of eligible HTTPS URLs aren’t being displayed as such in Google’s search engine results pages (SERPs). It’s an unfortunate number, but it’s not a surprising one. If your site uses HTTPS URLs but you’re not seeing them in SERPs, here are some of the most likely reasons why, and what to do about them:

URLs have crawl issues


If this is your issue, you’re in luck, because it’s an easy fix. Google can’t access the HTTPS URL, either because it’s password-protected, blocked by the server, or the page no longer exists. To fix this, have your webmaster first confirm that the page exists (meaning it doesn’t return a 404 error), then make sure your servers aren’t blocking it and that it doesn’t require a password to be accessed.
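As a rough illustration of that triage (this is a sketch, not an official Google tool; the function name and messages are my own), a webmaster could map the HTTP status code a page returns to the likely problem described above:

```python
def diagnose_status(status_code: int) -> str:
    """Map an HTTP status code to the likely crawl issue (illustrative only)."""
    if status_code == 404:
        return "page missing: restore it or redirect to a live HTTPS URL"
    if status_code in (401, 403):
        return "access blocked: remove password protection or server rules"
    if 500 <= status_code < 600:
        return "server error: Google can't fetch the page reliably"
    if 200 <= status_code < 300:
        return "page fetches fine: check noindex, canonical, or sitemap issues"
    return "unexpected status: check redirects and server configuration"
```

In practice you’d fetch each URL (for example with `urllib.request`) and pass the response’s status code into a helper like this.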

URLs have “noindex” designation


Often, when pages are under development, webmasters will add a “noindex” directive to make sure Google doesn’t index an incomplete page. If an HTTPS page still carries a “noindex” tag once development is finished, Google may crawl it but won’t index it, so it won’t show up in SERPs. The fix on this one is easy too: just have your webmasters remove any noindex tags from HTTPS pages that are ready to be viewed by the public.
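A simple way to spot leftover noindex tags is to scan a page’s HTML for the robots meta tag. The sketch below (my own helper, and regex-based, so it assumes the `name` attribute appears before `content`, which is the common ordering) flags pages that would be excluded:

```python
import re

# Assumes name="robots" appears before content="..."; a full HTML parser
# would be more robust, but this covers the common case.
_NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with noindex."""
    return bool(_NOINDEX_RE.search(html))
```

Run this over the HTML of each public HTTPS page; any page where it returns `True` still has a noindex tag to remove.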

Rel=“canonical” tags use HTTP variant


Webmasters often use canonical tags to tell search engines which page to index when a site has several very similar pages. In this case, it’s likely that the rel=“canonical” tags on the HTTPS URLs point at the HTTP versions. Again, this is an easy fix: just have your webmasters update the tags so that the canonical tags on both the HTTP and HTTPS pages point at the HTTPS versions.
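To find pages with this problem, you can check whether a page’s canonical link points at an HTTP URL. This is a hedged sketch (my own function, regex-based, assuming `rel` appears before `href` in the tag), not a definitive auditing tool:

```python
import re

# Assumes rel="canonical" appears before href="..." in the link tag.
_CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def canonical_points_to_http(html: str) -> bool:
    """Return True if the page's canonical tag targets an http:// URL."""
    match = _CANONICAL_RE.search(html)
    return bool(match) and match.group(1).lower().startswith("http://")
```

Any page where this returns `True` is telling search engines to index the insecure variant instead of the HTTPS one.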

XML sitemap contains HTTP variant


XML sitemaps are the roadmaps (remember those?) of your site for search engines. Many content platforms refresh the sitemap automatically at a set interval (sometimes days, sometimes weeks), but not all of them do, so it falls on the webmaster to make sure search engines can find the most recent URL structure. This fix may be a bit more involved, especially if your site doesn’t have a plugin that updates the sitemap, but it should still be a straightforward task for an accomplished webmaster.
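Before resubmitting, it’s worth scanning the sitemap for any leftover HTTP URLs. A minimal sketch (my own helper, assuming a standard sitemap using the sitemaps.org namespace):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace per the sitemaps.org protocol.
_SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def find_http_urls(sitemap_xml: str) -> list:
    """Return the <loc> entries in a sitemap that still use http://."""
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip() for loc in root.findall(".//sm:loc", _SITEMAP_NS)]
    return [url for url in urls if url.startswith("http://")]
```

If this returns any URLs, the sitemap still lists insecure variants and should be regenerated before you resubmit it.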

Once you can cross all these items off your to-do list, re-submit the sitemap to Google via Webmaster Tools, and it shouldn’t be long until you see your HTTPS URLs ranking in SERPs!

Ryan Durling

Ryan Durling is Digital Project Manager at 451!
