Mastering SvelteKit Deployment Part 3: Deploying to Vercel

  • SvelteKit
  • Deployment
  • Vercel

In part 2 of the series, we explored the concept of SvelteKit’s adapters and experimented with adapter-node and adapter-static.

From this part onwards, we will deploy our app to different platforms. At each step, we will explain how the target platform works and what inputs it expects in order to get our app deployed.

For this blog post, our target deployment platform is Vercel, known for its simplicity, customizability, and top-notch DevEx. We will start by explaining its deployment pipeline, serverless nature, supported runtimes, and support for static apps. Then, we will dive into adapter-vercel and explain how it works behind the scenes. Finally, we will explore different ways to deploy our example app. Although this blog post may be longer than usual, I will try to keep it as simple as possible.

What Is Vercel?

Vercel is a cloud-based platform that lets developers build and deploy their apps. They are known as the “Frontend Cloud”, meaning that, while they are a cloud-based product, they aim to simplify the process of building and shipping web applications for frontend developers, offering a variety of features like hosting, storage, databases, caching, analytics, AI, and more.

They are also active in the open-source community: they maintain Next.js and fund the development of Svelte and SvelteKit. That said, SvelteKit offers broader platform support and compatibility and isn’t tied to Vercel.

How Does Vercel Work?

The Vercel platform is built on a serverless architecture, powered by AWS Lambda functions running the Amazon Linux 2 runtime. Being serverless means you cannot run long background tasks or HTTP streaming* unless you use an add-on like Inngest.

*Although Vercel doesn’t natively support HTTP streaming, their team built a custom solution that makes this feature possible by bridging a socket connection. It’s used by frameworks like Next.js and SvelteKit for their streaming support.

Continuous Integration with Git

Vercel recommends deploying to their platform via Git for additional features like Continuous Integration, previewing deployments before merging to the main branch, and live feedback. In this part of the series, we will set up automatic deployments from the vercel and vercel-edge branches and a preview URL for each.

Vercel’s Supported Runtimes

When you deploy your app on Vercel, it runs as serverless functions. These functions can use one of two runtime options:

  • Serverless Runtime: This is the default option. It fully supports languages and runtimes like Node.js, Go, Python, and Ruby. However, it’s slower due to cold starts and may even become expensive in some cases.

  • Edge Runtime: With this option, your functions run faster with minimal cold starts, as they use lightweight resources and Web API standards. Edge functions are deployed through Vercel’s Edge Network, which allows significantly faster response times as content is served closer to users.

    It’s important to keep in mind that the Edge Runtime has some differences compared to Node.js. For example, it doesn’t support Node’s filesystem module.

For a more detailed comparison, check out Vercel’s Runtime Comparison article.

Content Caching

One thing Vercel does especially well is its caching system. Here’s an overview of its top features on this topic:

  • For static files, Vercel automatically caches them on its CDN, Vercel’s Edge Network. This applies to files under the static directory, as well as Vite’s compiled, fingerprinted assets.
  • Vercel was the first platform to add Incremental Static Regeneration support, which allows regenerating static/prerendered pages based on time intervals.
  • For pages with static and dynamic data, Vercel has a built-in granular cache known as Vercel Data Cache, though, at the time of writing, this feature is only available for Next.js (It does this by wrapping the global fetch function).

Note that caching for files under the static directory relies on CDN caching only. At the time of writing this blog post, SvelteKit does not expose cache-control headers for those files, so you only get a boost from serving them over Vercel’s Edge Network; browsers won’t cache them for you.

Page-Specific Adapter Configs

Before taking another look at our example app, let me mention that SvelteKit supports exporting page-specific configs, which are adapter-specific. For example, if we want to enable Incremental Static Regeneration for one of our static pages, we can export a config object from +page.server.ts or +page.ts:

export const config = {
	isr: {
		expiration: 15 // regenerate every 15 seconds
	}
}

This way, our static page will be rebuilt at most once every 15 seconds. Note that page-specific configs are not supported by all adapters. They are currently supported by adapter-vercel, which we will use in this part of the series.
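
If you want type safety for these page-level options, adapter-vercel exports a Config type you can annotate the object with. A small sketch, assuming the adapter is already installed:

import type { Config } from '@sveltejs/adapter-vercel';

export const config: Config = {
	isr: {
		expiration: 15 // regenerate at most once every 15 seconds
	}
};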

A Closer Look at our Example App

To allow us to experiment with the different features offered by multiple platforms, we have 5 pages with different rendering and caching strategies:

  • /ssr: Displays a list of comments. This page is server-side rendered and allows posting new comments, which triggers a SvelteKit form action.
  • /ssr-streaming: Another server-rendered page, but the content here is streamed using SvelteKit’s streaming feature. Comments here are delayed by 2 seconds, and a loading skeleton is displayed while they load (see the sketch after this list).
  • /ssg: This page exports prerender = true. The comments displayed on this page are generated at build time, so comments posted on the SSR page won’t show up here.
  • /isr: An SSG page that uses Incremental Static Regeneration. It exports the isr config with an expiration interval of 15 seconds. As explained in the previous section, this awesome feature allows updating our static page so that newly posted comments are eventually reflected in it.
  • /cache-headers: A minimal page that displays the current time. What we are experimenting with here are cache headers: this page sets a Cache-Control header with a max-age of 10 seconds (a minimal sketch follows after the note below). Unlike ISR, the page is cached by the browser, not by our deployment platform.
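
For reference, here is roughly how the /ssr-streaming load function can be written. This is a sketch rather than the exact repo code: the 2-second delay and the fetchComments helper are stand-ins for the example app’s implementation, and depending on your SvelteKit version the promise may need to be nested under a key (SvelteKit 1.x only streamed nested promises, while SvelteKit 2 streams top-level ones).

import { fetchComments } from '$lib/server/utils';
import type { PageServerLoad } from './$types';

export const load: PageServerLoad = async ({ fetch }) => {
	// Return the promise without awaiting it, so SvelteKit sends the page
	// shell immediately and streams the comments once they resolve.
	const comments = new Promise((resolve) => setTimeout(resolve, 2000)).then(() =>
		fetchComments(fetch)
	);

	return { comments };
};

On the +page.svelte side, the promise is consumed with an {#await} block, which is what renders the loading skeleton until the data arrives.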

Note: As mentioned in Part 1, we store and post our data to Supabase. The env file is shared in the GitHub repo, so you don’t have to worry about setting up your own. However, please remember to add the variables manually to the target platform when your app is deployed, as SvelteKit does not include them in your build output.
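
For the /cache-headers page described above, the Cache-Control header can be set from the load function using SvelteKit’s setHeaders. A minimal sketch (the page in the repo may differ slightly):

import type { PageServerLoad } from './$types';

export const load: PageServerLoad = async ({ setHeaders }) => {
	// Tell the browser it may reuse this response for 10 seconds
	setHeaders({ 'cache-control': 'public, max-age=10' });

	return {
		time: new Date().toISOString()
	};
};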

Deploying our app to Vercel

In this section, we will set up automatic deployments from the vercel and vercel-edge branches and a preview URL for each.

SvelteKit’s Vercel Adapter

To get your app deployed to Vercel, you must follow their file structure convention, the Build Output API; however, the @sveltejs/adapter-vercel package handles this automatically for you.

We are going to use the adapter in our example app:

pnpm add -D @sveltejs/adapter-vercel

Modifying svelte.config.js to use the adapter:

import adapter from '@sveltejs/adapter-vercel';
import { vitePreprocess } from '@sveltejs/kit/vite';

/** @type {import('@sveltejs/kit').Config} */
const config = {
	preprocess: vitePreprocess(),

	kit: {
		adapter: adapter()
	}
};

export default config;

And finally, building our app while having the adapter configured:

pnpm build

Expand the content of the .vercel/output directory. You will see a file structure tree similar to the following:

├── config.json
├── functions
│   └── fn.func
│       ├── node_modules
│       │   ├── @supabase
│       │   ├── @sveltejs
│       └── package.json
└── static
    ├── _app
    │   ├── immutable
    ├── favicon.png
    ├── ssg
    │   └── __data.json
    ├── isr
    │   └── __data.json
    ├── isr.html
    └── ssg.html

This file structure follows Vercel’s Build Output API, so deploying our app to Vercel will work naturally and as expected. Here are some highlights from the tree above:

  • The file config.json is the core of our deployed app. It has the following properties:
    • version: The version of the Build Output API being used. At the time of writing this blog post, it is version 3.
    • routes: A set of regex patterns and destinations for every route defined in our SvelteKit app. A route has a source and a destination and can also contain the returned status code, headers, etc. (see the illustrative excerpt after this list).
      • Note that the "/_app/immutable/.+" source exports a cache-control header of 31536000 seconds (1 year). We will explain this later in this section.
      • Note the "/.*" source, which matches every route in our app. They all point to a destination of /fn, our serverless function. We will explain this below, too.
    • functions: The place where our serverless functions are stored. Note that the Vercel adapter has mapped all of our routes to a single function. This gives us some advantages:
      • Reduce cold starts: When our app uses one serverless function, the cold starts will be drastically reduced, and the chance of it being warmed up will increase.
      • Reduce costs: More serverless functions mean more costs. Combining multiple endpoints within one is more cost-effective.
      • Better performance: The startup penalty is only paid once for all routes in our application.
    • static: The place where our static assets and statically generated data are stored:
      • _app/immutable: Contains static files that will likely never change. It holds the compiled JavaScript and CSS files, which is why they are cached for 1 year; they are fingerprinted, so you can update them without worrying about stale caches.
      • Files prefixed with ssg and isr contain the data and HTML templates for your statically generated pages.
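
To make the routes discussion above more concrete, here is a simplified, illustrative excerpt of what the generated config.json can look like. It is trimmed and reformatted; the real output of adapter-vercel contains more entries:

{
	"version": 3,
	"routes": [
		{
			"src": "/_app/immutable/.+",
			"headers": { "cache-control": "public, immutable, max-age=31536000" }
		},
		{ "handle": "filesystem" },
		{ "src": "/.*", "dest": "/fn" }
	]
}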

Connecting our example app to GitHub

It’s highly recommended to deploy our app using Git. Vercel supports deployments from GitHub, GitLab, and Bitbucket. In this blog post, we will use GitHub.

Using VSCode, you could do this with a few keystrokes: press ⌘+Shift+p (Mac) or Ctrl+Shift+p (Linux/Windows) and select Publish to GitHub.

publish to github

Node Serverless runtime

We haven’t set any runtime options in our example app. In that case, Vercel uses the Serverless runtime by default. Let’s head over to Vercel and get our app deployed!

If you don’t have a Vercel account, create one. You can add a new project by connecting your GitHub repo to Vercel; the rest is automated for you!

Deploy to Vercel

Don’t forget to add the environment variables before deploying your app. All the values you need are included in the .env file within the repo.
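
If you prefer the terminal over the dashboard for this step, the Vercel CLI can handle it as well. A rough sketch, assuming the CLI is installed and using an illustrative variable name (the real names are the ones in the repo’s .env file):

pnpm add -g vercel                 # install the Vercel CLI
vercel link                        # connect the local repo to a Vercel project
vercel env add PUBLIC_SUPABASE_URL # repeat for each variable in .env (name is illustrative)
vercel --prod                      # trigger a production deployment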

Note that the code for this section lives in the vercel branch. Please fork the example app repo and set that branch as your main branch before deployment. This is a limitation on Vercel’s side: the initial deployment always uses the main branch.

And finally, we get our app deployed!

Deployed app to Vercel using Node serverless runtime

Edge Network

In earlier sections of this blog post, we explained what Vercel’s Edge Runtime is. Now let’s experiment with it and deploy our example app to the edge!

If you have already forked the example app repo, checkout the vercel-edge branch and set it as your main branch.

We are using the runtime: 'edge' option, which is completely supported by the Vercel adapter:

import adapter from '@sveltejs/adapter-vercel';
import { vitePreprocess } from '@sveltejs/kit/vite';

/** @type {import('@sveltejs/kit').Config} */
const config = {
	preprocess: vitePreprocess(),

	kit: {
		adapter: adapter({ runtime: 'edge' })
	}
};

export default config;

The isr page is not compatible with the edge runtime, so we will override that behavior in a page-specific config:

import { fetchComments } from "$lib/server/utils";
import type { PageServerLoad } from "./$types";

export const load: PageServerLoad = async ({ fetch }) => {
	const comments = await fetchComments(fetch);

	return {
		comments
	}
};

export const config = {
	isr: {
		expiration: 15 // regenerate every 15 seconds
	},
	runtime: 'nodejs20.x' // Override the edge runtime for this page
};

Note: Some well-known Node modules are unsupported in the edge runtime. For example, we cannot access the file system using the fs module as we would in Node. But this is not a problem for our app, since we don’t use it; we use Supabase as our database, which has great support for the edge runtime.

Note that the code for this section exists in the vercel-edge branch.

Deployment Notes

  • As you can see, the ssr and ssr-streaming pages are the most dynamic; whenever they are visited, they display the most up-to-date content.
  • The ssg page is the least dynamic; the data you see is embedded in an HTML file generated at build time.
  • The isr page is well supported by SvelteKit. It gets the best of both worlds: it is regenerated on a 15-second interval, though you will likely use longer intervals in real-world apps.
  • As expected, the cache-headers page updates every 10 seconds, which is the effect of the Cache-Control header. This behavior is browser-specific, since the page is cached there to prevent the request in the first place; if you visit the page in another browser for the first time, you will get fresh content (see the curl sketch below).
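
If you want to verify this behavior yourself, you can inspect the response headers of the deployed pages. A quick sketch using curl (replace the URL with your own deployment):

# The cache-headers page should return the 10-second cache-control header
curl -sI https://your-deployment.vercel.app/cache-headers | grep -i cache-control

# The isr page should include Vercel's cache status header (HIT, MISS, STALE, ...)
curl -sI https://your-deployment.vercel.app/isr | grep -i x-vercel-cache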

Other options

Along with the runtime and isr options mentioned above, adapter-vercel supports other options. Here are a few, with a combined sketch after the list:

  • split: This option is set to false by default. If enabled, your SvelteKit loaders, actions, and API routes will be split across separate serverless functions, but I wouldn’t recommend this unless you have a specific reason.
  • external: This option only applies to edge routes. It allows setting a list of optional dependencies that are not compatible with the edge runtime (Node-specific modules), so they are treated as external when the function is bundled.
  • There are other options like memory and maxDuration that you can pass, but it’s unlikely you will need them. For more info, check out SvelteKit’s docs.
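
As a rough illustration of how these options fit together (the values below are arbitrary examples, not recommendations), the adapter call in svelte.config.js can take them like this:

import adapter from '@sveltejs/adapter-vercel';

/** @type {import('@sveltejs/kit').Config} */
const config = {
	kit: {
		adapter: adapter({
			runtime: 'nodejs20.x', // serverless runtime used by default for all routes
			split: true,           // one function per route instead of a single shared one
			memory: 1024,          // memory in MB per function (serverless runtime only)
			maxDuration: 10        // max execution time in seconds (serverless runtime only)
			// external: ['some-node-only-package'] // edge runtime only; package name is hypothetical
		})
	}
};

export default config;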

You have reached the end of this blog post. Thank you so much for reading! Next, we will pick another platform: we will deploy our SvelteKit app to Cloudflare. See you!
