
Improving the Delivery of Large Files With Streaming Miss and Large File Support

By Simon Wistow, Co-founder, VP Product Strategy, October 27, 2014 in Streaming, Performance, Product

Today, we’re excited to announce two related features that lower bandwidth costs and reduce origin load for Fastly customers, resulting in faster downloads for their users: Streaming Miss and Large File Support.

Streaming Miss

Until now, an object was fetched from the origin server in full, written to disk, and only then sent back to the client. With our new Streaming Miss feature, objects are streamed back to the client immediately, and written to cache only once the whole object has been fetched. This reduces first-byte latency, the time a client must wait before it starts receiving the response body. The larger the object, the more pronounced the benefit of using this feature.
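In VCL terms, streaming a miss corresponds to setting `beresp.do_stream` during fetch. A minimal sketch, assuming custom VCL (see the article linked under Further Reading for how to enable the feature and its caveats):

```vcl
sub vcl_fetch {
  # Stream the response to the client as it arrives from the origin,
  # instead of buffering the whole object before delivering it.
  set beresp.do_stream = true;
}
```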

To illustrate this, let’s look at a download example. If a 25MB application is being served from an origin over a connection that’s giving each client ~375KB/s, the download will take about 70 seconds. If that application were cached on Fastly without the Streaming Miss feature, then the first client to get a miss would have to wait 70 seconds while Fastly fetched it from the origin, and only then would they start downloading from our edge server.
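The arithmetic behind that estimate is simple. A quick sketch using the figures from the example above (decimal units assumed):

```python
# Back-of-the-envelope wait time for the first client on a cache miss
# without Streaming Miss: the full origin fetch must finish first.
size_kb = 25 * 1000        # 25 MB application, in KB
rate_kb_s = 375            # ~375 KB/s per-client throughput
seconds = size_kb / rate_kb_s
print(f"{seconds:.0f}s")   # prints 67s, i.e. about 70 seconds
```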

With Streaming Miss, the first client wouldn’t have to wait — they could start downloading from Fastly as soon as our edge servers receive the content.

Large File Support

The second feature we’re rolling out today is Large File Support. Previously, anything larger than 100MB was uncacheable on Fastly and streamed back to the user. Starting today, the maximum size is set to a whopping 5GB.

The practical upside of this is that our customers will now be able to offload content to us they couldn’t have before, including large PDFs, image-rich documents, audio and video files, and even application downloads and binary updates. Fastly will now be able to cache these files and greatly improve the end user experience.

Further Reading

Learn how to enable these two features here: “What support does Fastly have for large files?” The article also includes a few caveats and more information on potential failure cases.

As always, we welcome any questions, suggestions, and feature requests. Contact us by email.




Simon Wistow | Co-founder, VP Product Strategy

Simon is co-founder at Fastly, where he helps lead product strategy. Before helping found Fastly, Simon was a Senior Search Engineer at Yahoo! Europe, then worked at LiveJournal, SixApart, Scribd, and the social help desk company Zendesk. In a past life he worked in R&D for a leading VFX company on films like the Harry Potter series, Troy, Kingdom of Heaven, Sunshine, and Wallace and Gromit. At one point he worked as a cowboy in Australia, mostly because it seemed like a good idea at the time.