Node.js-style HTTP interfaces for Compute@Edge

Our Compute@Edge JavaScript platform provides Request and Response objects, but these are based on the Fetch standard, rather than the req and res objects traditionally seen in Node.js programs. If you have a program designed for Node.js that you are thinking about moving over to Compute@Edge, or if a library you want to use is designed for Node, our new open-source library, http-compute-js, has got your back.

For a long time, Node.js has provided IncomingMessage and ServerResponse, objects that represent requests to and responses from a web server. These objects don’t match up with the Request and Response objects defined by the modern fetch standard and implemented by JavaScript in Compute@Edge. Although fetch is natively supported in recent versions of Node.js, most Node.js code today is not written for it.

With http-compute-js, we aim to give developers objects that have the familiar Node.js-compatible interface, so that you, or some library that you want to use, can interact with them.

Look, ma! Node.js http objects in Compute@Edge!

Take a look at this:

import http from '@fastly/http-compute-js';

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ data: 'Hello World!' }));
});
server.listen();

If you didn’t see the import statement at the top, you might think this is a normal Node.js program. The createServer function accepts a handler callback whose req and res arguments can be used just as they would be in a Node.js program.

req is an IncomingMessage object, and its readable stream interface has been wired up to the body stream of the Compute@Edge request. As such, you can read from it using the standard stream mechanisms, such as on('data') and on('end'), by piping it to another stream, or by using libraries such as parse-body. You can also read the headers and other request information as you would in Node.js.
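Reading the whole body out of req with the event interface looks the same as reading any other Node.js Readable. Here’s a minimal, self-contained sketch using a stand-in Readable (in a real handler, req itself is the stream, and the JSON payload here is made up for illustration):

```javascript
import { Readable } from 'node:stream';

// Stand-in for req; in a real handler, req itself is the Readable.
const req = Readable.from([Buffer.from('{"name":'), Buffer.from('"edge"}')]);

// Collect chunks with on('data') and resolve once on('end') fires.
const body = await new Promise((resolve) => {
  const chunks = [];
  req.on('data', (chunk) => chunks.push(chunk));
  req.on('end', () => resolve(Buffer.concat(chunks).toString('utf8')));
});

console.log(body); // {"name":"edge"}
```

The same Promise-wrapping pattern appears in the fuller example below; it’s the idiomatic way to await a stream’s end inside an async handler.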

res is a ServerResponse object whose writable stream interface is wired to an in-memory buffer. Write to it normally using res.write() or res.end(), or pipe a readable stream into it using pipe(). You can also set headers and the status code the same way you would in Node.js.
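To see that this is exactly the interface Node.js itself provides, here’s a plain-Node sketch (no Compute@Edge involved) exercising the same ServerResponse methods; the greeting and the use of an ephemeral port are just for illustration:

```javascript
import http from 'node:http';

// A plain Node.js server using the same ServerResponse interface.
const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.write('Hello, '); // stream part of the body
  res.end('World!');    // finish the response
});

// Start on an ephemeral port and fetch from it (Node 18+ has global fetch).
await new Promise((resolve) => server.listen(0, resolve));
const { port } = server.address();
const text = await (await fetch(`http://127.0.0.1:${port}/`)).text();
console.log(text); // Hello, World!

server.close();
server.closeAllConnections();
```

Code written against this interface is what http-compute-js lets you reuse unchanged at the edge.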

Here’s a slightly more involved example that shows some of these features.

import http from '@fastly/http-compute-js';

const server = http.createServer(async (req, res) => {
  // Get URL, method, headers, and body from req
  const url = req.url;
  const method = req.method;
  const headers = {};
  for (let [key, value] of Object.entries(req.headers)) {
    if (!Array.isArray(value)) {
      value = [String(value)];
    }
    headers[key] = value.join(', ');
  }
  let body = null;
  if (method !== 'GET' && method !== 'HEAD') {
    // Reading data out of a stream.Readable
    body = await new Promise(resolve => {
      const data = [];
      req.on('data', (chunk) => {
        data.push(chunk);
      });
      req.on('end', () => {
        resolve(Buffer.concat(data).toString('utf8'));
      });
    });
  }

  // Write output to res
  res.setHeader('Content-Type', 'application/json');
  res.statusCode = 200;
  res.end(JSON.stringify({ url, method, headers, body }));
});
server.listen();

Notice that the callback passed to createServer is async. The request body is read after an await, but the library waits for res.end() to be called before constructing a Response and sending it back through the underlying Compute@Edge fetch event handler.

Polyfills help us do this right

To get the objects provided by http-compute-js to behave as they’re expected to, we decided to use existing, well-tested polyfills for some of the underlying parts, for example for streaming and buffering. Compute@Edge JavaScript programs use Webpack to bundle into a web worker, which is where we can add these polyfills. To use http-compute-js, you therefore need to make a few changes to the webpack.config.js file that comes with a JavaScript Compute@Edge project.

Add the following webpack.ProvidePlugin() to the plugins array, and add the following items to the alias and fallback sections, creating the resolve, alias, and fallback properties as needed. 

module.exports = {
  /* ...other config... */
  plugins: [
    new webpack.ProvidePlugin({
      Buffer: [ 'buffer', 'Buffer' ],
      process: 'process',
      setTimeout: [ 'timeout-polyfill', 'setTimeout' ],
      clearTimeout: [ 'timeout-polyfill', 'clearTimeout' ],
    }),
  ],
  resolve: {
    alias: {
      'timeout-polyfill': require.resolve('@fastly/http-compute-js/dist/polyfill'),
    },
    fallback: {
      'buffer': require.resolve('buffer/'),
      'process': require.resolve('process/browser'),
      'stream': require.resolve('stream-browserify'),
    },
  },
};

Once you’ve set these, the examples earlier in this post will work in the same way you’d expect them to work in Node.js.  We're hoping to ship setTimeout support in Compute@Edge soon, so we may also be able to reduce the number of polyfills at some point.

Manual instantiation of req and res

Sometimes, you may need to use Node.js-style request and response objects for only some parts of your program. Or, perhaps you want to make a one-off call to a function that works with them. For those cases, we provide utility functions that help you go back and forth between the Request and Response objects used in Compute@Edge and their Node.js-compatible counterparts.

/// <reference types='@fastly/js-compute' />
import { toReqRes, toComputeResponse } from '@fastly/http-compute-js';

addEventListener('fetch', (event) => event.respondWith(handleRequest(event)));

async function handleRequest(event) {
  // Create Node.js-compatible req and res from event.request
  const { req, res } = toReqRes(event.request);

  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({
    data: 'Hello World!',
    url: req.url,
  }));

  // Create a Compute@Edge Response object based on res, and return it
  const response = await toComputeResponse(res);
  return response;
}

This example shows how easy it is to create Node-compatible req and res during a fetch handler, as well as how to convert to a Compute@Edge Response object and return it.

We want you to run more code at the edge!

The internet as a programmable platform is always evolving. We’re proud to be a leader in this space, and we want Compute@Edge to continue to be an attractive platform for developers to target.

We understand that this constant evolution means everything is changing all the time. At Fastly, we continue to build tools that empower you to run even more code at the edge and develop for it faster, while aiming to enable the use of a wider range of tools. We can’t wait to see what you’ll be creating with this tool, and we’d love to know what this and our other tools have enabled you to create. Reach out to us on Twitter and let us know!

Katsuyuki Omuro
Software engineer, developer relations

Katsuyuki, or “Kats” for short, is a Japan-based developer and real-time web enthusiast on the Developer Relations team. He is particularly passionate about figuring out how things work and teaching others, to help them learn and grow as well.