Serverless Node.js URL Shortener App powered by Upstash Kafka and Materialize

This is a simple Node.js URL shortener app that runs on Cloudflare Workers.

The app uses Upstash Redis for storing the short links as key-value pairs, Upstash Kafka for storing the click events, and Materialize for real-time analytics on those events.

Upstash offers serverless, low-latency, pay-as-you-go solutions for Kafka and Redis.

Materialize is a streaming database for real-time applications. Materialize accepts input data from a variety of streaming sources (like Kafka), data stores and databases (like S3 and Postgres), and files (like CSV and JSON), and lets you query them using SQL.

App structure

The demo app has the following structure:

  • A serverless Cloudflare Worker that lets you add short links and redirect them to other URLs.
  • All data is stored in an Upstash serverless Redis cluster as key-value pairs (short link -> long link).
  • Every time someone visits a short link, the Worker records a click event in Upstash Kafka.
  • Materialize then consumes the events from Upstash Kafka and analyzes them in real time.
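To make the flow concrete, here is a rough sketch of the click event the Worker could build on each visit before sending it to Kafka. The helper name `buildClickEvent` is mine, not the repository's actual code; the field names mirror the ones analyzed later in Materialize (`shortCode`, `longUrl`, `country`, `city`, `ip`), and Cloudflare Workers really do expose geo data on `request.cf` and the client IP in the `CF-Connecting-IP` header:

```javascript
// Hypothetical sketch of the click-event payload logged to Kafka.
// Field names match the columns extracted later in Materialize.
function buildClickEvent(shortCode, longUrl, request) {
  return {
    shortCode,
    longUrl,
    // Cloudflare attaches geo information to request.cf and the
    // client IP to the CF-Connecting-IP header.
    country: request.cf?.country ?? "unknown",
    city: request.cf?.city ?? "unknown",
    ip: request.headers.get("CF-Connecting-IP") ?? "unknown",
  };
}

// Minimal stand-in for a Workers request object, for illustration:
const fakeRequest = {
  cf: { country: "BG", city: "Sofia" },
  headers: new Map([["CF-Connecting-IP", "203.0.113.7"]]),
};
const event = buildClickEvent("abc123", "https://example.com", fakeRequest);
console.log(JSON.stringify(event));
```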

A demo of the app can be found here:

https://cf-url-shortener.bobbyiliev.workers.dev/admin

Diagram

The following is a diagram of the app structure:


Demo

Here is a quick demo of how the app works:

Prerequisites

Before you get started, make sure that you have the following:

  • A Redis cluster and a Kafka cluster in Upstash.
  • A Kafka topic in Upstash called visits-log.
  • The Cloudflare CLI tool, wrangler, installed on your local machine as described here.
  • A Materialize instance running on your local machine as described here or a Materialize Cloud instance.

Running this demo

Once you have all the prerequisites, you can proceed with the following steps:

  • Clone the repository and run the following command:
git clone https://github.com/bobbyiliev/cf-url-shortener.git
  • Access the directory:
cd cf-url-shortener
  • Install the npm dependencies:
npm install
  • Run the wrangler command to authenticate with Cloudflare:
wrangler login
  • Then in the wrangler.toml file, update the account_id to match your Cloudflare account ID:
account_id = "YOUR_ACCOUNT_ID_HERE"
  • Set the following secrets in Cloudflare using the wrangler tool:
wrangler secret put UPSTASH_REDIS_REST_URL
wrangler secret put UPSTASH_REDIS_REST_TOKEN
wrangler secret put UPSTASH_KAFKA_REST_URL
wrangler secret put UPSTASH_KAFKA_REST_USERNAME
wrangler secret put UPSTASH_KAFKA_REST_PASSWORD

Make sure to use the REST API URLs and not the Broker details.

  • Run the following command to deploy the CF Worker:
wrangler deploy

With the CF Worker deployed, you can visit the admin URL where you can add short links and redirect them to other URLs.
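The Worker talks to both Upstash services over plain HTTPS using the secrets configured above. As a sketch of what those calls look like, the helpers below only assemble the request descriptions (the helper names and example values are mine; the shapes follow Upstash's documented REST interfaces, a `GET /get/<key>` call for Redis with a Bearer token, and a `POST /produce/<topic>` call for Kafka with HTTP Basic auth — double-check the Upstash docs for your cluster):

```javascript
// Hypothetical helpers that assemble Upstash REST calls from the
// Worker secrets. They only build request descriptions, so the
// shapes are easy to inspect without touching the network.
function redisGetRequest(restUrl, token, key) {
  return {
    url: `${restUrl}/get/${encodeURIComponent(key)}`,
    method: "GET",
    headers: { Authorization: `Bearer ${token}` },
  };
}

function kafkaProduceRequest(restUrl, username, password, topic, event) {
  // Upstash Kafka's REST produce endpoint uses HTTP Basic auth.
  const basic = Buffer.from(`${username}:${password}`).toString("base64");
  return {
    url: `${restUrl}/produce/${topic}`,
    method: "POST",
    headers: { Authorization: `Basic ${basic}` },
    body: JSON.stringify({ value: JSON.stringify(event) }),
  };
}

const get = redisGetRequest("https://example.upstash.io", "TOKEN", "abc123");
const produce = kafkaProduceRequest(
  "https://example-kafka.upstash.io", "user", "pass",
  "visits-log", { shortCode: "abc123" }
);
console.log(get.url, produce.url);
```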

Setup Materialize

Once you’ve deployed the CF Worker, you can set up Materialize to analyze the data in Upstash Kafka in real-time.

Start by creating a new Materialize instance in Materialize Cloud, or alternatively install Materialize locally.

After you’ve created the instance, you can connect to it using the psql command as shown in the docs.

A CREATE SOURCE statement connects Materialize to an external data source, like Kafka, and lets you interact with its data as if the data were in a SQL table.

To create a new Kafka source in Materialize run the following statement:

CREATE SOURCE click_stats
FROM KAFKA BROKER 'UPSTASH_KAFKA_BROKER_URL' TOPIC 'visits-log'
WITH (
    security_protocol = 'SASL_SSL',
    sasl_mechanisms = 'SCRAM-SHA-256',
    sasl_username = 'UPSTASH_KAFKA_BROKER_USERNAME',
    sasl_password = 'UPSTASH_KAFKA_BROKER_PASSWORD'
)
FORMAT BYTES;

Change the Kafka details to match your Upstash Kafka cluster Broker and credentials.

Next, we will create a non-materialized view, which you can think of as a reusable query template to be used in other materialized views:

CREATE VIEW click_stats_v AS
SELECT
    *
FROM (
    SELECT
        (data->>'shortCode')::string AS short_code,
        (data->>'longUrl')::string AS long_url,
        (data->>'country')::string AS country,
        (data->>'city')::string AS city,
        (data->>'ip')::string AS ip
    FROM (
        SELECT CAST(data AS jsonb) AS data
        FROM (
            SELECT convert_from(data, 'utf8') AS data
            FROM click_stats
        )
    )
);
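The nested SELECTs above are a simple decode chain: raw Kafka bytes → UTF-8 text → JSON → individual columns. The same transformation in plain JavaScript, applied to an illustrative sample payload, looks like this:

```javascript
// The same bytes -> utf8 -> JSON -> columns pipeline, in JavaScript.
// rawBytes stands in for one message read from the visits-log topic
// (the sample payload is illustrative).
const rawBytes = Buffer.from(
  JSON.stringify({
    shortCode: "abc123",
    longUrl: "https://example.com",
    country: "BG",
    city: "Sofia",
    ip: "203.0.113.7",
  })
);

// convert_from(data, 'utf8')
const text = rawBytes.toString("utf8");
// CAST(data AS jsonb)
const data = JSON.parse(text);
// (data->>'shortCode')::string AS short_code, etc.
const row = {
  short_code: data.shortCode,
  long_url: data.longUrl,
  country: data.country,
  city: data.city,
  ip: data.ip,
};
console.log(row);
```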

Finally, create a materialized view to analyze the data in the Kafka source:

CREATE MATERIALIZED VIEW click_stats_m AS
SELECT
    *
FROM click_stats_v;

Then you can query the materialized view using standard SQL and get up-to-date results in real time, with sub-millisecond latency:

SELECT * FROM click_stats_m;

You can stack materialized views on top of each other, so let's count the number of clicks per short link:

CREATE MATERIALIZED VIEW order_by_clicks AS
SELECT
    short_code,
    COUNT(*) AS clicks
FROM click_stats_m
GROUP BY short_code;
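Conceptually, this view maintains the same result you would get from a batch reduce over all click events, except Materialize keeps the counts updated incrementally as new messages arrive. A plain JavaScript analogue, with illustrative sample events:

```javascript
// Batch analogue of the order_by_clicks view: count clicks per short
// code, then sort descending. Materialize maintains the counts
// incrementally; here we simply recompute them over sample events.
const clicks = [
  { short_code: "abc123" },
  { short_code: "abc123" },
  { short_code: "xyz789" },
];

const byCode = new Map();
for (const { short_code } of clicks) {
  byCode.set(short_code, (byCode.get(short_code) ?? 0) + 1);
}

// Most-clicked short links first.
const ordered = [...byCode.entries()]
  .map(([short_code, count]) => ({ short_code, clicks: count }))
  .sort((a, b) => b.clicks - a.clicks);
console.log(ordered);
```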

One of the great features of Materialize is TAIL.

TAIL streams updates from a source, table, or view as they occur.

So to stream the data from our materialized view using TAIL, we can use the following statement:

COPY ( TAIL ( SELECT * FROM order_by_clicks ) ) TO STDOUT;

For more information about TAIL, check out this blog post:

Subscribe to changes in a view with TAIL in Materialize

Display the results in Metabase

As Materialize is Postgres-wire compatible, you can use BI tools like Metabase to create business intelligence dashboards using the real-time data streams in your Materialize instance.

For more information about Metabase + Materialize, check out the official documentation:

Metabase + Materialize

Example dashboard that shows the number of clicks per short link:

Conclusion

Using Materialize to analyze the data in your Upstash Kafka serverless instance is a great way to get real-time insights into your data.

As a next step, here are some other great resources to learn about Materialize and Upstash:

Bobby Iliev

I am a professional System Administrator with a demonstrated history of working in the internet industry. I am a Linux lover