Improving your search engine ranking (SEO) with Fastly

SEO is a murky science, with search engines striving to present the best possible result, and everyone else trying to figure out what "best" means in practice. What we do know is that reliability, speed, and security make a significant difference, and Fastly can help get you closer to number one.

Without question, the most important factor in getting to the top of a search result listing is the content of your page. Fastly generally leaves the content of your site to you, but let’s first cover some of the more common content optimizations you should be doing:

  • Include the key terms that describe your content in the form people might use when searching for it. Include terms that consider the way people might pose a query to a voice assistant, such as Siri or Alexa. The humble FAQ is ideal for this (as long as the questions really are "frequently asked"!).

  • Add microdata to your page, marking up all relevant entities such as people, companies, recipes, and so on. Check out schema.org for details (one of the most interesting recent additions is ClaimReview, for fact checking). You may also want to include Open Graph markup for social sharing, or the proprietary microdata formats for networks that haven’t adopted Open Graph yet.

  • Make "snippet bait." Search engines will sometimes snag a small section of your content and display it directly on the result UI if they think it might directly answer the user's query. Short paragraphs are the most common snippet bait, but lists and tables can also be extracted if properly formatted. Although it might seem counterproductive to allow a search engine to "steal" your key content, it does seem to increase click-through rates.

  • Write longer pages. Apparently, search-ranking algorithms like content that is longer and more detailed, with original text that the crawler hasn't seen anywhere else (so don't just mirror another site).

  • Include LSI keywords, the words that power a search engine's autocomplete or "related searches" feature. Search for some of your site's key terms, and look at the suggestions for related searches. Put those terms on your page too, if they are relevant.

  • Design your site to be responsive. Search crawlers will often load your site on both a desktop and mobile device and will expect to see the same content.

  • Name the author and provide a date of publication. Traceable provenance increases reputability.


For more detail on these techniques, consult an SEO expert or one of the many excellent online guides that are… strangely discoverable! Most of the advice above is covered by Backlinko's guide, which is great.

Other factors are important in determining your search ranking but can't be affected directly:

  • Dwell time: Google's RankBrain system will downrank you if a user clicks into your site and then comes back to the search result page too quickly. This suggests you didn't inspire them to stay. Research suggests you should be aiming for multiple minutes to get a slot at the top of the search ranking.

  • Age: Sites that have been around since the dawn of the internet get kudos for that (something that kept a namesake of mine ahead of me on Google for many years, simply because he registered his domain in 1994!).

  • Referring domains: The original principle of Google’s PageRank system still contributes a huge amount of value to your rank, and it's simply how many other unique domains link to your page.

This brings us to things Fastly can actually help with, and the main focus in this area is security and speed. Faster, more secure sites get higher ranking. This is the challenge we were made for! Let’s break down all the ways we can help:

Make your site more secure and your connection more stable

Just by putting Fastly in front of your site, you’ll get a number of immediate SEO benefits. We use modern crypto for our TLS connections and offer a secure, fast, optimized version of HTTP. Even if your connection to origin is not secure (which is a bug!), you won’t be penalized in search rankings while you fix that problem.

Fastly also offers the benefit of Dynamic Site Acceleration (DSA). If you use Shielding, requests are routed through two Fastly edge locations (one near your origin server, one near the end user), and we can take advantage of permanently open connections across the majority of the route.

Finally, out of the box, Fastly will shield you from any traffic on non-HTTP ports and any invalid HTTP traffic. That will stymie the majority of DDoS attacks, making your site more resilient. The worst possible search-crawler experience is finding that your site is down!

Compress data using client hints, IO, gzip, and brotli

Anything that can be compressed should be compressed, to reduce the amount of data transferred on the wire and make your page load faster, especially on bandwidth-constrained devices. Fastly offers gzip and brotli compression on the fly, which is great for text-based formats like CSS, JavaScript, and HTML, as well as our Image Optimization service, which is better for images (which don't benefit from general-purpose compression such as gzip).

Images, in particular, often need to scale to different sizes depending on the characteristics of the user’s device, and that’s no different for search crawlers. Using client hints can help you make a smart decision about what size of image to send, or you can use a responsive image definition in your page source, such as the picture element or the srcset and sizes attributes.

A reasonable search-engine special here could be to increase the compression ratio for crawlers, since no human eyes will see the image.

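As a sketch of that idea in VCL, assuming the Fastly Image Optimizer is enabled for your service and honors a quality query parameter (the User-Agent patterns and the value of 40 are illustrative assumptions, not production-grade bot detection):

```vcl
sub vcl_recv {
  # Rough, illustrative crawler detection by User-Agent substring
  if (req.http.User-Agent ~ "(?i)(googlebot|bingbot|duckduckbot|yandexbot)") {
    # For image requests, ask the optimizer for a much lower quality:
    # no human eyes will see the result, but the bytes still count
    if (req.url.ext ~ "(?i)^(jpe?g|png|webp)$") {
      set req.url = querystring.set(req.url, "quality", "40");
    }
  }
}
```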

Serve a stale response first

A common feature many of our customers use is the ability to serve a stale response if your origin server is down. This significantly increases the external reliability of your website and can helpfully paper over the odd period of origin downtime.

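A minimal version of this in VCL might look like the following (the TTL values are illustrative):

```vcl
sub vcl_fetch {
  # If origin errors out, keep serving the stale copy for up to a day
  set beresp.stale_if_error = 86400s;
  # Serve stale for up to a minute past TTL while revalidating in the background
  set beresp.stale_while_revalidate = 60s;
}
```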

For search crawlers, we could take this one step further. Search engines tend to explore the darkest nooks and crannies of your site in a way that normal users do not, so search crawlers are more likely to encounter uncached pages or pages that are in cache but stale. Since the search engine may not need the very latest version of the page, you could consider serving stale to search crawlers proactively, without waiting for the origin to refresh the content.

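One way to sketch that in VCL, again with an illustrative User-Agent match and staleness windows:

```vcl
sub vcl_recv {
  if (req.http.User-Agent ~ "(?i)(googlebot|bingbot)") {
    # Let crawlers receive objects up to a day stale; revalidation
    # happens in the background, so the bot never waits on origin
    set req.max_stale_while_revalidate = 86400s;
    # And tolerate even older content if the origin is erroring
    set req.max_stale_if_error = 7d;
  }
}
```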

Normalize requests as much as possible

Serving a stale version is all very well, but we need to have something in cache to start with. Another angle of attack to make your site faster is to normalize requests as much as possible.

We can do this in a number of ways internally:

  • Remove query strings from paths that don’t support them; alphabetize the parameters on paths that do

  • Force all path segments and query param names to lowercase

  • Remove or normalize very granular headers such as User-Agent

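A sketch of those internal normalizations in VCL (the /static/ prefix is a hypothetical example of a path tree that ignores query strings, and lowercasing assumes your origin treats paths case-insensitively):

```vcl
sub vcl_recv {
  # Alphabetize query parameters so ?b=2&a=1 and ?a=1&b=2 share a cache entry
  set req.url = querystring.sort(req.url);

  # Lowercase the path only; query parameter values may be case-sensitive
  set req.url = std.tolower(req.url.path) + if(req.url.qs != "", "?" + req.url.qs, "");

  # Hypothetical: assets under /static/ ignore query strings entirely
  if (req.url.path ~ "^/static/") {
    set req.url = req.url.path;
  }

  # Avoid fragmenting the cache on granular headers
  # (only safe if your origin doesn't Vary on User-Agent)
  unset req.http.User-Agent;
}
```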

And some that are better done externally, via a redirect:

  • HTTP to HTTPS

  • Adding a / to bare directory paths

  • Consolidating on a canonical hostname, e.g. www.example.com to example.com
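These external redirects might be sketched in VCL like this, using Fastly's built-in 801 error for TLS upgrades and a custom synthetic status for the rest (example.com and status 618 are arbitrary choices for illustration):

```vcl
sub vcl_recv {
  # HTTP -> HTTPS: error 801 triggers Fastly's built-in 301 to https://
  if (!req.http.Fastly-SSL) {
    error 801 "Force TLS";
  }

  # Consolidate on a canonical hostname
  if (req.http.Host == "www.example.com") {
    set req.http.Location = "https://example.com" + req.url;
    error 618 "redirect";
  }

  # Add a trailing slash to bare directory-like paths (no file extension)
  if (req.url.path !~ "/$" && req.url.path !~ "(?i)\.[a-z0-9]+$") {
    set req.http.Location = "https://" + req.http.Host + req.url.path + "/" +
      if(req.url.qs != "", "?" + req.url.qs, "");
    error 618 "redirect";
  }
}

sub vcl_error {
  if (obj.status == 618 && obj.response == "redirect") {
    set obj.status = 301;
    set obj.http.Location = req.http.Location;
    return(deliver);
  }
}
```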

Combining all these techniques will prevent a search crawler from seeing multiple copies of the same content on different URLs and will also increase the probability of a cache hit.

Add policy headers

Showing that your site has a narrower attack surface and respect for user privacy may offer only a small ranking benefit today, but it may become a major one in future when combined with technologies like Web Packaging.

Here’s a quick (and by no means comprehensive) checklist of the key ones:

  • Strict-Transport-Security: commits your site to HTTPS for future visits

  • Content-Security-Policy: restricts where scripts, styles, and other subresources can be loaded from

  • X-Content-Type-Options: prevents MIME-type sniffing

  • X-Frame-Options: controls whether your pages can be embedded in frames

  • Referrer-Policy: limits the referrer information leaked to other sites

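Putting those together in VCL is straightforward (the specific values, especially the Content-Security-Policy, are placeholders you'd tailor to your site):

```vcl
sub vcl_deliver {
  set resp.http.Strict-Transport-Security = "max-age=31536000";
  set resp.http.X-Content-Type-Options = "nosniff";
  set resp.http.X-Frame-Options = "SAMEORIGIN";
  set resp.http.Referrer-Policy = "origin-when-cross-origin";
  # A deliberately minimal placeholder policy; real CSPs are site-specific
  set resp.http.Content-Security-Policy = "default-src 'self'";
}
```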

Use server-timing to monitor critical path cacheability

The new HTTP Server-Timing header allows a server to send useful metadata to the browser, which can be captured by tools like Google Analytics. While at a per-request level it’s hard for Fastly to know whether a particular request is critical to rendering the document, that is something the page itself can work out in the browser.

Using server-timing, we can annotate responses to show whether they were cacheable (it doesn’t really matter for this purpose whether or not they were actually served from cache).

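A sketch of this in VCL, recording cacheability at fetch time (when the TTL is known) and surfacing it at delivery — X-Cacheable is an arbitrary internal header name:

```vcl
sub vcl_fetch {
  # Record the cacheability decision while we still know the TTL
  if (beresp.ttl > 0s) {
    set beresp.http.X-Cacheable = "YES";
  } else {
    set beresp.http.X-Cacheable = "NO";
  }
}

sub vcl_deliver {
  # Expose it to the browser's performance timeline via Server-Timing
  set resp.http.Server-Timing = "cacheable;desc=" + resp.http.X-Cacheable;
  unset resp.http.X-Cacheable;
}
```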

If you were so inclined, with some client-side JavaScript you could capture the start and end times of all resource loads, identify the critical ones (perhaps those that occur before first render, or you could create your own user-centric timing metric), and check whether they were cacheable at the edge, sending a beacon formatted using the Reporting API standard whenever you find one that isn't. I love that the web now has such a rich set of tools for evaluating this kind of performance, and standards for expressing the data.

Perf your way to the top!

These changes might not offer quite the same impact on your search performance as content modifications, but don’t discount the effectiveness of security and performance best practices in getting to that prized number one spot.

Andrew Betts
Principal Developer Advocate
Published

Andrew Betts is the Principal Developer Advocate for Fastly, where he works with developers across the world to help make the web faster, more secure, more reliable, and easier to work with. He founded a web consultancy which was ultimately acquired by the Financial Times, led the team that created the FT’s pioneering HTML5 web app, and founded the FT’s Labs division. He is also an elected member of the W3C Technical Architecture Group, a committee of nine people who guide the development of the World Wide Web.
