Supercharge Your Web Performance: Optimizing Next.js Apps with Cloudflare Workers and Browser Caching

If you're a web developer, you know how critical web performance is when it comes to user experience. Slow websites lead to high bounce rates, fewer conversions, and unhappy users. But what can you do to improve web app performance beyond reducing the size of your images and code?

In this tutorial, we'll explore how to use Cloudflare Workers and browser caching to optimize Next.js web applications for lightning-fast performance. Let's get started!

What is Next.js?

Next.js is an increasingly popular React-based framework for building server-side rendered (SSR) web applications. When you build a Next.js app, each page is rendered on the server and sent to the client as HTML, rather than being rendered entirely in the browser. This brings significant performance benefits while retaining the flexibility and interactivity of React.
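
For example, with the pages router, a page can fetch its data on the server in getServerSideProps and ship fully rendered HTML to the browser. Here's a minimal, illustrative sketch; the API URL and data shape are placeholders rather than part of this tutorial's app:

  // pages/products.js (illustrative only)
  export async function getServerSideProps() {
    // Runs on the server for every request; the fetched data is rendered
    // into HTML before the response is sent to the browser.
    const res = await fetch('https://api.example.com/products'); // placeholder API
    const products = await res.json();
    return { props: { products } };
  }

  export default function Products({ products }) {
    return (
      <ul>
        {products.map((product) => (
          <li key={product.id}>{product.name}</li>
        ))}
      </ul>
    );
  }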

Why is performance important in Next.js?

Despite the performance benefits of Next.js, there are still some aspects you should consider to ensure your app runs smoothly and quickly.

  • First contentful paint and time to interactive: A delay in pages loading results in a poor user experience. Users expect your pages to appear within seconds of clicking or tapping, especially on mobile devices where data is often slower.
  • Repeated data fetching: Ideally, a web app should make the fewest data requests necessary to serve its purpose. Repeated requests for the same resource (like product info or user info) slow load times further.
  • Data size: The larger the amount of data that needs to load, the slower the page loads. Striving for smaller page sizes will result in faster load times for users.

In short, performance is essential in any web application, and the three factors above are among the most important to pay attention to.

How can Cloudflare Workers and Browser Caching Help?

Cloudflare Workers are a powerful way to enhance Next.js app performance. Because workers can modify incoming HTTP requests and outgoing responses at the edge, they let you improve your app's performance in ways the application alone cannot. In this tutorial we'll use workers in two ways: to control browser caching and to cache frequently fetched data.

Benefits of Cloudflare Workers

  • CDN load balancing: A single data center can buckle under heavy traffic; Cloudflare Workers run across Cloudflare's global network and can absorb high burst traffic.
  • Serverless: You don't have to manage servers, deployments, or scaling yourself.
  • Improved User Experience: Speeding up load times improves user experience, keeping users happy and engaged with your website.

Browser Caching

Browser caching is a technique that stores assets like images, CSS, and JavaScript files on a user's device after they visit your site for the first time. By caching these assets, the browser doesn't have to re-fetch them every time the same user returns to your site or clicks through to another page. This can significantly improve page loading speed as the browser can retrieve the files from the local cache immediately.

However, caching can be tricky, and it's not always clear which files should be cached and for how long. Fortunately, Cloudflare Workers provide a straightforward way to handle browser caching.

To leverage browser caching for your Next.js app using Cloudflare Workers:

  1. Add a Cache-Control header to your Next.js-generated pages by modifying the build-time configuration in the next.config.js file.
  2. Write a Cloudflare Worker to add the Cache-Control header to non-Next.js generated resources like images, CSS, and JS files.
  3. Configure the worker to add the Cache-Control header when the request path matches a specific regular expression.

Step 1: Add Cache-Control Headers for Next.js Pages

Next.js already sets a Cache-Control header on its responses by default; however, you can extend the caching times by modifying the configuration file as follows:


  // next.config.js
  
  module.exports = {
    poweredByHeader: false,
    generateEtags: true,
    compress: true,
    trailingSlash: true,
    devIndicators: { autoPrerender: false },
    eslint: {
      ignoreDuringBuilds: true,
    },
    future: {
      webpack5: true,
    },
    async headers() {
      return [
        {
          source: '/:path*',
          headers: [
            {
              key: 'Cache-Control',
              value: 'public, max-age=31536000, immutable',
            },
          ],
        }
      ]
    },
  };
  

This configuration extends the browser cache for your Next.js responses to one year, maximizing browser caching. (In practice you may want to scope such a long, immutable cache to fingerprinted static assets rather than every path, since HTML pages that change between deployments shouldn't be cached that aggressively.) Some files, like images and JavaScript files generated by third-party libraries, might still be served with shorter cache times; let's take care of those in the next step.

Step 2: Write a Cloudflare Worker to Add Cache-Control Headers for Non-Next.js Resources

A Cloudflare Worker is a JavaScript program that executes on the Cloudflare network. Luckily, writing a worker for cache control is simple. We'll write a worker that adds the `Cache-Control` header for files requested from the `/static` path, as well as for responses whose `Content-Type` indicates a JavaScript, CSS, or SVG file.


  addEventListener('fetch', event => {
    event.respondWith(handleRequest(event.request))
  })

  async function handleRequest(request) {
    const response = await fetch(request)

    const url = new URL(request.url)
    // Content-Type values look like "text/css" or "image/svg+xml", so match on the MIME type rather than a file extension
    const isStaticFileRequest = url.pathname.startsWith('/static/') || /javascript|css|svg/.test(response.headers.get('Content-Type') || '')

    if (isStaticFileRequest) {
      // A Headers object can't be spread like a plain object, so copy it explicitly
      const newHeaders = new Headers(response.headers)
      newHeaders.set('Cache-Control', 'public, max-age=31536000, immutable')

      return new Response(response.body, {
        status: response.status,
        statusText: response.statusText,
        headers: newHeaders,
      })
    }

    return response
  }
  

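Once the worker is deployed in front of your site (see the next step), you can sanity-check the header it adds from the browser console, or any runtime with `fetch`. The URL below is a placeholder for one of your own static assets:

  // Quick check: request a static asset through the worker and inspect its headers.
  const res = await fetch('https://your-app.example.com/static/styles.css'); // placeholder URL
  console.log(res.headers.get('Cache-Control'));
  // Expected: public, max-age=31536000, immutable
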
Step 3: Deploy Your Worker

By now, your worker code is ready, and it's time to deploy it. To do this:

  1. Log in to Cloudflare and navigate to the Workers section of your dashboard.
  2. Click the "Create a Worker" button.
  3. Copy the contents of the JavaScript file containing the worker code and paste it into the worker editor section.
  4. Save and deploy your worker.
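
Alternatively, you can deploy from the command line with Wrangler, Cloudflare's CLI. Here's a minimal sketch, assuming Wrangler v3+; the worker name, entry file, and date below are placeholders for your own project:

  # wrangler.toml (illustrative):
  #   name = "cache-control-worker"      # placeholder worker name
  #   main = "worker.js"                 # the file containing the Step 2 code
  #   compatibility_date = "2024-01-01"

  # Deploy the worker:
  npx wrangler deploy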

Optimizing Next.js App Performance with Cloudflare Workers

Besides browser caching, Cloudflare Workers also let you cache frequently fetched data by sitting between your client and your API or backend. The worker intercepts fetch requests and returns a cached response when one is available, using less network bandwidth and putting less load on your origin.

For example, you can cut down on database calls by caching frequently accessed information in memory or in a store such as Redis or Memcached, ensuring speedy delivery to clients.

Therefore, to optimize performance in Next.js with Cloudflare Workers, we can do the following:

  1. Create a worker that intercepts the API endpoint requests.
  2. Cache the data in Redis or in memory to allow speedy delivery to clients.
  3. Set a unique cache key by hashing the requested data to avoid serving wrong data to clients.
  4. Set the cache expiry to determine how long the data should remain cached.
  5. Periodically clear your cache to avoid feeding clients outdated data.

Here's an example:


  /**
  * Cache global variables
  */
  let cacheMemory = new Map();
  const REDIS_TTL = 60 * 60 * 24; // in seconds
  const CACHE_KEY_PREFIX = "nextjs-app:";

  /**
  * Get Redis client
  */
  function getRedisClient() {
    // REDIS_HOST, REDIS_PORT, and REDIS_PASSWORD are assumed to be defined as
    // Worker environment variables, and `Redis` is assumed to be a client
    // library that works in the Workers runtime (for example, one that talks
    // to Redis over HTTP rather than a raw TCP connection).
    let redisConfig = {
      host: REDIS_HOST,
      port: REDIS_PORT,
      password: REDIS_PASSWORD,
    };
    return new Redis(redisConfig);
  }

  /**
  * Get from cache (Redis or memory)
  */
  async function getFromCache(cacheKey, defaultValue, expiry = REDIS_TTL) {
    let cachedValue;
    let isRedisEnabled = false;

    // Check if Redis is enabled (typeof avoids a ReferenceError if the binding is unset)
    if (typeof REDIS_ENABLED !== 'undefined') {
      isRedisEnabled = (REDIS_ENABLED === 'true');
    }
    }

    if (isRedisEnabled) {
      // Get Redis client
      let redisClient = getRedisClient();

      // Get cached value from Redis
      cachedValue = await redisClient.get(cacheKey);

      // If Redis doesn't have the item, fall back to memory
      if (cachedValue === null) {
        cachedValue = cacheMemory.get(cacheKey);
      } else {
        // Stash Redis value in local memory cache for future requests
        cacheMemory.set(cacheKey, cachedValue);
      }
    } else {
      // Fall back to simple memory cache
      cachedValue = cacheMemory.get(cacheKey);
    }

    // If we have a cached value, return it
    if (cachedValue !== undefined) {
      return JSON.parse(cachedValue);
    }

    // Otherwise, return the provided default value
    return defaultValue;
  }

  /**
  * Save to cache (Redis or memory)
  */
  async function saveToCache(cacheKey, cacheValue, expiry = REDIS_TTL) {
    // Save to Redis only when it is enabled (mirroring getFromCache)
    if (typeof REDIS_ENABLED !== 'undefined' && REDIS_ENABLED === 'true') {
      let redisClient = getRedisClient();
      await redisClient.set(cacheKey, JSON.stringify(cacheValue), 'EX', expiry);
    }

    // Save also to memory for faster retrieval
    cacheMemory.set(cacheKey, JSON.stringify(cacheValue));
  }

  /**
  * Clear cache by prefix
  */
  async function clearCacheByPrefix(cacheKeyPrefix, redisPattern = "*") {
    // Clear the Redis cache when Redis is enabled. Match on the raw key prefix;
    // hashing the pattern would destroy the wildcard.
    if (typeof REDIS_ENABLED !== 'undefined' && REDIS_ENABLED === 'true') {
      let redisClient = getRedisClient();

      let keys = await redisClient.keys(CACHE_KEY_PREFIX + cacheKeyPrefix + ':' + redisPattern);
      if (keys.length > 0) {
        await redisClient.del(keys);
      }
    }

    // Clear the memory cache (keys are stored with the full prefix, see buildCacheKey)
    for (const key of cacheMemory.keys()) {
      if (key.startsWith(CACHE_KEY_PREFIX + cacheKeyPrefix)) {
        cacheMemory.delete(key);
      }
    }
  }

  /**
  * Build the cache key from the requested data
  */
  async function buildCacheKey(cacheKeyPrefix, data) {
    // Hash the request data with the Web Crypto API (Node's crypto.createHash
    // isn't available in the Workers runtime, so SHA-256 via crypto.subtle is used)
    const digest = await crypto.subtle.digest('SHA-256', new TextEncoder().encode(JSON.stringify(data)));
    const hash = [...new Uint8Array(digest)].map((b) => b.toString(16).padStart(2, '0')).join('');
    return CACHE_KEY_PREFIX + cacheKeyPrefix + ':' + hash;
  }

  /**
  * The main worker function
  */
  async function handleRequest(request) {
    // Extract variables from the JSON request body
    const { data } = await request.json();
    const uniqueIdentifier = data.identifier;

    // Get the cached response if it exists, else fetch from the server
    const cacheKey = await buildCacheKey('post', uniqueIdentifier);
    const cachedResponse = await getFromCache(
      cacheKey,
      undefined /* return undefined on a cache miss rather than a fallback value */
    );

    if (cachedResponse !== undefined) {
      console.log('Cache HIT for ' + cacheKey);
      // The cached value is a parsed object, so serialize it back to JSON
      const response = new Response(JSON.stringify(cachedResponse), {
        headers: {
          'content-type': 'application/json',
        },
      });
      return response;
    } else {
      console.log('Cache MISS for ' + cacheKey);
      const response = await fetch(request);

      // If response is okay, cache it
      if (response.ok) {
        try {
          // Clone before reading so the original body can still be returned to the client
          const jsonData = await response.clone().json();
          await saveToCache(cacheKey, jsonData);
        } catch (e) {
          console.error('Unable to parse JSON response:', e);
        }
      }
      return response;
    }
  }

  addEventListener('fetch', event => {
    event.respondWith(handleRequest(event.request))
  });
  

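To see this from the client side, here's a hedged sketch of how any client (or a Next.js page) might call the worker-fronted endpoint. The route is a placeholder, and the request body simply matches the `{ data: { identifier } }` shape that `handleRequest` above parses:

  // Illustrative only: fetch a post through the worker-fronted endpoint.
  async function fetchPost(identifier) {
    const res = await fetch('https://your-app.example.com/api/post', { // placeholder route
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({ data: { identifier } }),
    });
    return res.json();
  }

  // Usage: const post = await fetchPost('my-first-post');
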
By leveraging Cloudflare Workers as described above, we can optimize our Next.js app and reduce network traffic, all while keeping users engaged with fast load times.

Conclusion

Optimizing the performance of our Next.js web applications is vital to ensuring a seamless user experience. With Cloudflare Workers and browser caching strategies, we can supercharge the performance of our web apps, resulting in happy users and better conversion rates.

Browser caching is a quick and easy win with Cloudflare Workers: we simply extend how long browsers cache our Next.js pages and static assets. Workers can also be used to cache frequently accessed data.

While Cloudflare Workers can be an incredibly powerful tool, remember that performance optimization is a long-term game that requires monitoring and tweaking over time. By continuously analyzing and fine-tuning your Next.js app, you can build a web app that runs smoothly and consistently for your users.