This article was published on June 17, 2018

7 detailed tips on how to improve your website in 2018


It’s no secret that Google loves technically excellent sites. If you want your sites to rank as well as possible, there’s no substitute for ensuring that you’re nailing the basics and doing everything you can beyond that.

While some aspects of technical excellence have always been important, in this article I want to focus on seven reasons that your site isn’t performing as well as it could be in 2018. If you can fix these issues, your site will be in the best position possible for this year and the foreseeable future.

1. Optimizing for mobile

Mobile site visits currently account for over 52 percent of web traffic worldwide. In countries like the US and UK, this number is closer to 60 percent. Even before 2018, having a good mobile site was essential if you wanted your business to retain and convert as many users as possible.

Now, Google has announced its mobile-first index, which means the mobile version of a site, rather than the desktop version, will be used as the baseline for determining how well it should rank. That makes a good mobile site essential for search engine visibility as well as conversion.

If you want to make your site mobile-friendly or update an existing mobile site, it’s important to do it the right way. There are three common solutions for mobile sites:

  • m.example.com — the original solution; involves creating a separate site on an “m.” subdomain and redirecting mobile users to it
  • Dynamic serving — separate site designs are created for each device type; the server detects the user’s device and serves the appropriate version
  • Responsive — only one set of HTML code is created and maintained, but it’s rendered differently based on screen size

While all three solutions are viable, Google has explicitly named responsive as its preferred method.

Responsive design has multiple benefits. From Google’s perspective, it’s preferable because there are no redirects involved.

From the point of view of developers and web managers, it means you only need to deal with one set of HTML, which makes building, maintaining, and tracking the site much easier.
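
To make that concrete, here’s a minimal sketch of a responsive setup (the class name is purely illustrative): a viewport meta tag in the head plus a CSS media query, so the same HTML simply reflows at different screen sizes.

  <!-- One HTML document served to every device -->
  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    /* Mobile-first base styles */
    .product-grid { display: block; }

    /* Switch to a multi-column layout on wider screens */
    @media (min-width: 768px) {
      .product-grid { display: grid; grid-template-columns: repeat(3, 1fr); }
    }
  </style>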

If you’re dealing with multiple sites through m. and dynamic serving, monitoring the customer journey becomes much harder, and we know that many conversion funnels involve visits from different devices.

In 2018 and beyond, it’s hard to think of situations where a responsive design isn’t the best mobile setup. While Google doesn’t actively punish non-responsive mobile sites, the fact that it has named responsive as its preferred mobile solution makes it reasonable to infer that building your site this way carries some benefit.

2. Considering AMP for content and ads

This is probably the most controversial of the seven points I’ll raise in this post. Many digital marketers, myself included, have raised warning flags over AMP (accelerated mobile pages) in the past, encouraging webmasters and fellow marketers to assess the value that AMP can actually add to their business before spending time implementing it.

While this is still the case, AMP matters to enough businesses that it’s worth including in this article, and I believe its importance will increase in the future. Regular news publishers and ambitious ecommerce store owners, in particular, should pay close attention to this point.

As a quick recap, AMP is a way of serving content at lightning speed to mobile searchers. Google caches an AMP version of a page (separate from the original page), which it can serve to searchers without them needing to wait for your server to deliver a full page (Google’s cache servers are much faster than yours). Accelerated mobile pages are commonly seen in Google’s news carousel and the article feed found in the Google app, but are starting to creep into ecommerce as well.
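
As a rough sketch of how the two versions relate (the URLs here are placeholders): the standard page declares its AMP counterpart, and the AMP page points back to the original, which is how Google finds and caches the AMP version.

  <!-- In the head of the standard article page -->
  <link rel="amphtml" href="https://example.com/article/amp/">

  <!-- In the head of the AMP version of the same article -->
  <link rel="canonical" href="https://example.com/article/">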

Because AMP requires the creation and management of new versions of your content, it’s essential to consider whether or not implementing it is actually going to benefit your site. If it doesn’t, general site speed could be a more beneficial focus, given that the whole point of AMP is to speed up mobile browsing.

That said, AMP’s prevalence is on the rise, and I could see it becoming a new web standard that all sites need to achieve, in the same way that SSL certificates became the standard. It’s possible to build new sites with AMP in mind from the beginning, so it could become more widespread as websites naturally evolve.

With this in mind, my team and I are looking into the possibilities of AMP for ecommerce sites and Google Ads. It’s currently possible — though not widespread — to serve AMP content through Google’s text ads, where it will load on Chrome and on Android devices. To do so, simply enter the landing page’s AMP URL as the ad’s mobile URL.

In addition to the news carousels, AMP pages are also eligible to replace the standard URL in mobile search results, where they’re marked with the AMP lightning bolt icon.

I’m still not sure to what extent this will catch on as a fully-adopted standard, but what’s certain right now is that site speed is essential if you want to see good results in both organic and paid channels. How a fast load time is achieved is less important than it being achieved in the first place, as we’ll explore more next.

3. Site speed

Whereas the speed of AMP content is controlled by Google, there’s no reason why you can’t achieve a really fast website through your own actions. Your server architecture, site build, and caching can all be set up in such a way that your site loads quickly on all devices.

If your site is slow, you will be losing out on both organic visibility and conversions. Speed is a ranking factor in its own right, and slow sites also struggle to retain users throughout their visit.

There are multiple ways to optimize site speed. We can look at a few of them by breaking the page-loading process into four general (simplified) parts (based on this breakdown):

  • Request, where website data is requested from the server
  • Response, when the server brings together the files needed to create the webpage
  • Build, when the browser turns the data from the server into an HTML and CSS DOM (document object model)
  • Render, when the browser adds resources from other sources, such as a style sheet or JavaScript file

There are ways to optimize the speed of each step. When all the stages are optimized, your site will be significantly faster than it could be otherwise.

Request — using a CDN and fewer files

CDNs sit between the server and the browser to bring assets like images, CSS, and JavaScript closer to the user. In some cases, you can even put a whole site onto a CDN. My team’s favorite content delivery network is Amazon CloudFront; a solution like this cuts a user’s request time from any given location.
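
In practice, that often just means referencing static assets from the CDN’s hostname rather than your own origin server; the CloudFront distribution domain below is made up for illustration.

  <!-- Asset served from the origin server -->
  <img src="https://example.com/images/hero.jpg" alt="Hero">

  <!-- The same asset served from a (hypothetical) CloudFront distribution -->
  <img src="https://d111111abcdef8.cloudfront.net/images/hero.jpg" alt="Hero">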

In addition, I recommend carrying out an audit of the files requested for each page of your site. Many themes and scripts will try to load big files in the background of every page on your site, even if those files aren’t needed for that page. Cutting out these unneeded requests cuts down your load time.

Response — caching and HTTP/2

Most dynamic, content-managed websites consist of content stored in a database, styling templates, and some kind of software package that pulls it all together. To save the server from assembling the page for every user, we can cache a ready-made version of the site.

Caching can be implemented within a site’s CMS, but more powerful caching configurations can also be set up at the server level. Caching and CDNs also combine well: you can cache whole pages on a CDN so the request is handled before it ever hits the server.

You can also make use of the latest server technology, like HTTP/2, to speed up the process. HTTP/2 allows data to be ‘streamed,’ which means the server handles multiple requests from the browser at the same time, rather than dealing with them one after the other.

The key takeaway from this section is to avoid cheap, shared or ‘no frills’ hosting. These options might seem attractive, but the lower price will cost you valuable speed and performance, which could lose you more money in the long run. For more information, take a look at this blog post on performance.

Build and render — optimizing the critical path and loading files asynchronously

The build process will be sped up if you employ some of the techniques I’ve mentioned in the past two sections, but there’s more you can do to cut the time it takes to render the page in the user’s browser.

The key at this final stage is to optimize the critical rendering path, which means cutting down the number of resources that could block the first part of the page that the user would see from rendering.

You can prevent render blocking, thus speeding up the whole rendering process, by loading files asynchronously or de-prioritizing some of them so that they load last. Most content management systems actually come out of the box with some optimization for render blocking, but if you continue to see poor in-browser performance you should look more closely at this issue.

If you’re looking for ways to speed up the process yourself, it’s essential to ensure that big JavaScript and CSS files don’t need to be loaded in the document head. Where possible, JavaScript should be deferred to the footer, CSS should be loaded only where it’s needed, and external resources should be loaded asynchronously.
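
A minimal sketch of what that can look like in the document head (the file names are placeholders): critical CSS loads normally, while scripts are deferred or loaded asynchronously so they don’t block rendering.

  <head>
    <!-- Critical CSS loads normally so above-the-fold content can render -->
    <link rel="stylesheet" href="/css/critical.css">

    <!-- defer: download in parallel, execute only after the document is parsed -->
    <script defer src="/js/app.js"></script>

    <!-- async: suitable for independent third-party scripts such as analytics -->
    <script async src="https://example.com/analytics.js"></script>
  </head>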

Site speed optimization is a very technical activity, but it’s well worth doing if you want your site to perform well in both organic and paid channels.

4. Understanding how Google crawls (or doesn’t crawl) JavaScript

JavaScript frameworks are becoming a popular tool in modern web development, but Google has historically struggled to crawl pages which are rendered this way. In 2018, Google’s crawlers are better than they’ve ever been at crawling and rendering JavaScript, but they’re not all the way there yet.

JavaScript frameworks basically allow pages to be rendered from JS, rather than loading a large quantity of HTML from a server. Sites built this way only require server requests for product- or page-specific data; everything else is built dynamically.

Dozens of popular JavaScript frameworks now exist: Vue, React, and Angular, to name a few. All of these render content in a similar way. Proponents of this development method argue that it leads to a better user experience, and they may be right.  

However, Google isn’t completely on board. JS frameworks are known to produce issues for search engine marketers working in both organic and paid search. We know that Google renders JavaScript in its crawls some of the time, but not always. Search Console now allows us to see what a page looks like when rendered by Googlebot, but we also know that rendering doesn’t happen every time Google crawls your page.

A page without JavaScript rendering just looks like a page of mostly empty HTML tags to Google, which means it can’t see any of the content it needs to ascertain that page’s topic and quality, including both text and images. This has long been an issue for pages where certain text, graphics and links were hidden behind or loaded with JavaScript, but could now be an issue for entire sites if they rely on JS rendering.
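
For example, a client-rendered page often ships as little more than an empty mount point and a script reference; if the JavaScript isn’t rendered, this shell is essentially all Google sees (the names below are illustrative).

  <body>
    <!-- Without JavaScript rendering, there is no visible content and no links here -->
    <div id="app"></div>
    <script src="/js/bundle.js"></script>
  </body>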

We do know that this is something Google is working on. A year from now, maybe they’ll be all the way there. As of right now, though, high-performance sites should avoid relying on JS frameworks for key content and navigation items.

It’s going to take some time to iron out the best practices for search visibility, and it’s probably going to be worth waiting until Google’s crawlers are better at rendering JavaScript before you commit to it entirely.

5. Setting up international sites properly

In the natural course of our work, we see both good and bad examples of internationalization on a weekly basis. Various agencies and brands have been building websites for different territories for some time now, often without any consideration for search optimization.

In 2018, nearly every business is getting online, and even some of the smallest are looking into options for internationalization. The need to have the right set up is more important than ever — if you do it wrong, you’ll lose out in an increasingly competitive global market. The more you can do to get your site right across multiple territories and languages, the more you stand to gain.

Internationalization is implemented correctly when a user lands on the version of a site that’s in their language or targeted at their country. Google is getting a lot better at serving different users the right content for their country, but there’s still a lot that webmasters need to do to ensure that their users’ experiences are as good as possible.

Google has helped make it as easy as possible for organic search marketers to get internationalization right by supporting the ‘hreflang’ attribute. It sits on a link element in a page’s HTML head and specifies both a language and the URL of the page that users who speak that language should see.

Multiple language versions can be specified for the same page, one link element per language. Google has a support page on the topic that gives concrete examples of how to implement it. Plenty of content management systems now support hreflang out of the box, so there’s really no reason not to implement it if you’re targeting international audiences.

Wherever multiple language options for a page are specified in hreflang tags, I would also recommend setting an ‘x-default’ target so that it’s clear to Google which page users in unspecified territories or languages should see.
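
A minimal sketch, using placeholder URLs, for a page with English, German, and French versions plus an ‘x-default’ fallback for everyone else (these link elements go in the head of every version of the page):

  <link rel="alternate" hreflang="en" href="https://example.com/en/page/">
  <link rel="alternate" hreflang="de" href="https://example.com/de/page/">
  <link rel="alternate" hreflang="fr" href="https://example.com/fr/page/">
  <link rel="alternate" hreflang="x-default" href="https://example.com/en/page/">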

This gives you as much control over different users’ journeys through the site as possible, making it less likely that Google serves the wrong content. As Google gets better in this regard, the risk should naturally decrease anyway.

6. Putting content into silos

Putting content into silos means creating a site architecture in which key informational pages are nested in the same folders as your main service or category pages; this has been known to help improve organic rankings for core keywords for some time. Now that we’re in 2018, the strategy has matured, making silos more user-friendly and more effective than ever before.

There are at least two main benefits to a well-made content silo: improved organic performance for related keywords and improved user experience. Topical content nested in a single folder helps the other pages in that folder to rank by improving the quality of related content on the site and allowing for a clearer internal linking structure.

Google is always looking to rank sites with a good level of information about their targeted topics more highly, even if that information is not all contained in one page. In addition, the higher number of internal links through breadcrumbs and contextual links makes pages easier for Google to find and index, and reinforces the topical relation of a group of pages.
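
To illustrate (the URLs and anchor text are made up), a sizing guide nested inside its parent category’s folder, with a breadcrumb that reinforces the relationship, might look like this:

  <!-- Category page:     https://example.com/running-shoes/              -->
  <!-- Guide in the silo: https://example.com/running-shoes/sizing-guide/ -->

  <nav aria-label="Breadcrumb">
    <a href="https://example.com/">Home</a> &gt;
    <a href="https://example.com/running-shoes/">Running shoes</a> &gt;
    <span>Sizing guide</span>
  </nav>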

User experience can also improve as a result of well-made silos. For example, an FAQ for a high-value category in the same folder as the category pages allows users to find more information quickly and easily, which could be the difference between them staying on your site or looking elsewhere.

If all your informational content is hidden in a blog or guides section, it can be very tricky for users to dig it out, especially if you don’t have an effective internal site search feature in place.

Many content management systems will allow you to set up silos straight away, but some may need plugins installed first. When planning out the structure, you need to ensure that the URLs stay clean and are easy for users to navigate; otherwise you could end up weakening some of the benefits I described above.

If your CMS doesn’t allow you to silo content out of the box, you may need to speak to the site’s developers about the best way to implement it.

7. Reviewing your old content and strategies

While it’s important to look ahead, sometimes the way to prepare best for the future is to optimize what you already have. There are a number of different ways to do this, but we only have space to touch on a handful.

One method is to look at your current rankings for ‘quick win’ opportunities. These exist where your pages are ranking just off page one, or perhaps just outside the top three results. By building links to these pages, updating the content or adding relevant media, you could make them more successful with relatively little resource.

Another key strategy is to keep revisiting your high-performing informational and commercial pages to ensure that they’re presenting the most up-to-date information. Doing so can help pages that are already receiving high levels of traffic to convert more of those visitors, translating into more revenue from your website.

This is especially important where you’ve employed an evergreen or cornerstone content strategy. If you want your top level content to remain relevant, it needs to be looked at periodically and updated if necessary.

Whatever content already exists on your site, it should be factored into the plans you have for your site in the future. It’s all well and good talking about this year and beyond, but you’ve invested time and money into your current content that shouldn’t be wasted.

Consider what you already have when you’re thinking about how to improve site speed, which content silos to create, or what the mobile experience of your site is like. For these forward-thinking strategies to be at their most effective, they should work alongside and improve what you already have.
