Realtime

Postgres Changes

Listen to Postgres changes using Supabase Realtime.

Let's explore how to use Realtime's Postgres Changes feature to listen to database events.

Quick start

In this example we'll set up a database table, secure it with Row Level Security, and subscribe to all changes using the Supabase client libraries.

1

Set up a Supabase project with a 'todos' table

Create a new project in the Supabase Dashboard.

After your project is ready, create a table in your Supabase database. You can do this with either the Table interface or the SQL Editor.


-- Create a table called "todos"
-- with a column to store tasks.
create table todos (
  id serial primary key,
  task text
);

2

Allow anonymous access

In this example we'll turn on Row Level Security for this table and allow anonymous access. In production, be sure to secure your application with the appropriate permissions.


-- Turn on security
alter table "todos"
enable row level security;

-- Allow anonymous access
create policy "Allow anonymous access"
on todos
for select
to anon
using (true);

3

Enable Postgres replication

Go to your project's Publications settings, and under supabase_realtime, toggle on the tables you want to listen to.

4

Install the client

Install the Supabase JavaScript client.


npm install @supabase/supabase-js

5

Create the client

This client will be used to listen to Postgres changes.


import { createClient } from '@supabase/supabase-js'

const client = createClient(
  'https://<project>.supabase.co',
  '<your-anon-key>'
)

6

Listen to changes by schema

Listen to changes on all tables in the public schema by setting the schema property to 'public' and the event name to '*'. The event name can be one of:

  • INSERT
  • UPDATE
  • DELETE
  • *

The channel name can be any string except 'realtime'.


const channelA = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

7

Insert dummy data

Now we can add some data to our table, which will trigger the channelA event handler.


insert into todos (task)
values
  ('Change!');
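
If everything is wired up, the insert above triggers the channelA handler. The sketch below illustrates the rough shape of the payload it receives (field names as in supabase-js v2; the values shown are just examples):

// A handler illustrating the rough shape of a Postgres Changes payload
// (field names per supabase-js v2; values are illustrative)
const logChange = (payload) => {
  console.log(payload.eventType)        // 'INSERT' | 'UPDATE' | 'DELETE'
  console.log(payload.schema)           // 'public'
  console.log(payload.table)            // 'todos'
  console.log(payload.new)              // e.g. { id: 1, task: 'Change!' }
  console.log(payload.old)              // previous values; empty unless replica identity is full
  console.log(payload.commit_timestamp) // when the change was committed
}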

Usage

You can use the Supabase client libraries to subscribe to database changes.

Listening to specific schemas

Subscribe to specific schema events using the schema parameter:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      schema: 'public', // Subscribes to the "public" schema in Postgres
      event: '*', // Listen to all changes
    },
    (payload) => console.log(payload)
  )
  .subscribe()

The channel name can be any string except 'realtime'.
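
If you want to confirm that a subscription was established, subscribe() also accepts an optional status callback. A minimal sketch, assuming the supabase-js v2 client created earlier:

const monitored = client
  .channel('status-db-changes')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public' },
    (payload) => console.log(payload)
  )
  .subscribe((status, err) => {
    // status is one of 'SUBSCRIBED', 'TIMED_OUT', 'CLOSED', or 'CHANNEL_ERROR'
    if (status === 'SUBSCRIBED') console.log('Listening for changes')
    if (err) console.error(err)
  })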

Listening to INSERT events

Use the event parameter to listen only to database INSERTs:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT', // Listen only to INSERTs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Listening to UPDATE events

Use the event parameter to listen only to database UPDATEs:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE', // Listen only to UPDATEs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Listening to DELETE events

Use the event parameter to listen only to database DELETEs:


const changes = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'DELETE', // Listen only to DELETEs
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Listening to specific tables

Subscribe to specific table events using the table parameter:


const changes = client
  .channel('table-db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'todos',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Listening to multiple changes

To listen to multiple events and schema/table/filter combinations on the same channel:


const channel = supabase
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
    },
    (payload) => console.log(payload)
  )
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'users',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
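
When a subscription is no longer needed, remove the channel so the client stops receiving events and releases the underlying connection. A minimal sketch using supabase-js v2's cleanup methods:

// Stop listening on a single channel
await supabase.removeChannel(channel)

// Or tear down every channel created by this client
await supabase.removeAllChannels()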

Filtering for specific changes

Use the filter parameter for granular changes:


const changes = client
  .channel('table-filter-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'todos',
      filter: 'id=eq.1',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Available filters

Realtime offers filters so you can specify the data your client receives at a more granular level.

Equal to (eq)

To listen to changes when a column's value in a table equals a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.hey',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's = filter.

Not equal to (neq)

To listen to changes when a column's value in a table does not equal a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'messages',
      filter: 'body=neq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's != filter.

Less than (lt)

To listen to changes when a column's value in a table is less than a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'profiles',
      filter: 'age=lt.65',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's < filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Less than or equal to (lte)

To listen to changes when a column's value in a table is less than or equal to a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'profiles',
      filter: 'age=lte.65',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's <= filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Greater than (gt)

To listen to changes when a column's value in a table is greater than a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gt.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's > filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Greater than or equal to (gte)

To listen to changes when a column's value in a table is greater than or equal to a client-specified value:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gte.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's >= filter, so it works for non-numeric types. Make sure to check the expected behavior of the compared data's type.

Contained in list (in)

To listen to changes when a column's value in a table equals any client-specified values:


const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'colors',
      filter: 'name=in.(red, blue, yellow)',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

This filter uses Postgres's = ANY. Realtime allows a maximum of 100 values for this filter.

Receiving old records

By default, only the new record is sent. If you also want to receive the old record (its previous values) whenever you UPDATE or DELETE a row, set the replica identity of your table to full:


alter table messages replica identity full;
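
With the replica identity set to full, the old field of UPDATE and DELETE payloads carries the entire previous row. A sketch of what a handler might then see (column values are illustrative):

const channel = supabase
  .channel('old-record-changes')
  .on(
    'postgres_changes',
    { event: 'UPDATE', schema: 'public', table: 'messages' },
    (payload) => {
      console.log(payload.old) // full previous row, e.g. { id: 1, body: 'hey' }
      console.log(payload.new) // the updated row
    }
  )
  .subscribe()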

Private schemas

Postgres Changes works out of the box for tables in the public schema. You can listen to tables in your private schemas by granting table SELECT permissions to the database role found in your access token. You can run a query similar to the following:


grant select on "private_schema"."some_table" to authenticated;
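
Once the role has SELECT on the table, point the schema parameter at it. A sketch using the schema and table names from the grant above:

const channel = supabase
  .channel('private-schema-changes')
  .on(
    'postgres_changes',
    { event: '*', schema: 'private_schema', table: 'some_table' },
    (payload) => console.log(payload)
  )
  .subscribe()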

Custom tokens

You may choose to sign your own tokens to customize claims that can be checked in your RLS policies.

Your project JWT secret is found with your Project API keys in your dashboard.

To use your own JWT with Realtime, make sure to set the token after instantiating the Supabase client and before connecting to a Channel.


const { createClient } = require('@supabase/supabase-js')

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_KEY, {})

// Set your custom JWT here
supabase.realtime.setAuth('your-custom-jwt')

const channel = supabase
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Refreshed tokens

You will need to refresh tokens on your own, but once generated, you can pass them to Realtime.

For example, if you're using the supabase-js v2 client, you can pass your token like this:


// Client setup

supabase.realtime.setAuth('fresh-token')
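
For instance, you might re-sign your custom JWT on a timer and hand each fresh token to Realtime before the previous one expires. A sketch, where mintCustomJwt is a hypothetical function that signs a new token on your server:

setInterval(async () => {
  const freshToken = await mintCustomJwt() // hypothetical: re-sign your custom JWT
  supabase.realtime.setAuth(freshToken)    // pass the fresh token to Realtime
}, 55 * 60 * 1000) // e.g. every 55 minutes for tokens that expire after an hour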

Limitations

DELETE events are not filterable

You can't filter DELETE events when tracking Postgres Changes. This limitation is due to the way changes are pulled from Postgres.
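
If you need per-row handling of deletes, one workaround is to subscribe to DELETE events without a filter and check the payload in your handler. A sketch (with the default replica identity, payload.old only contains the primary key):

const channel = supabase
  .channel('delete-changes')
  .on(
    'postgres_changes',
    { event: 'DELETE', schema: 'public', table: 'todos' },
    (payload) => {
      // Filter client-side; `old` carries only the primary key by default
      if (payload.old.id === 1) {
        console.log('Todo 1 was deleted')
      }
    }
  )
  .subscribe()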

Spaces in table names

Realtime currently does not work when table names contain spaces.

Database instance and realtime performance

Realtime systems usually require forethought because of their scaling dynamics. For the Postgres Changes feature, every change event must be checked to see if the subscribed user has access. For instance, if you have 100 users subscribed to a table where you make a single insert, it will then trigger 100 "reads": one for each user.

There can be a database bottleneck which limits message throughput. If your database cannot authorize the changes rapidly enough, the changes will be delayed until you receive a timeout.

Database changes are processed on a single thread to maintain the change order. That means compute upgrades don't have a large effect on the performance of Postgres change subscriptions. You can estimate the expected maximum throughput for your database below.

If you are using Postgres Changes at scale, you should consider using a separate "public" table without RLS and filters. Alternatively, you can use Realtime server-side only and then re-stream the changes to your clients using Realtime Broadcast, as sketched below.
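
As a rough sketch of the re-streaming approach (channel and event names are illustrative, not a drop-in implementation):

// Server-side: subscribe to Postgres Changes once...
const stream = supabase
  .channel('todos-stream')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public', table: 'todos' },
    (payload) => {
      // ...and fan each change out over Broadcast so client load
      // never reaches the database
      stream.send({
        type: 'broadcast',
        event: 'todo-change',
        payload,
      })
    }
  )
  .subscribe()

// Client-side: consume the Broadcast stream instead of Postgres Changes
supabase
  .channel('todos-stream')
  .on('broadcast', { event: 'todo-change' }, (message) => console.log(message.payload))
  .subscribe()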

For example, one estimate of the current maximum possible throughput:

  • Total DB changes /sec: 64
  • Max messages per client /sec: 64
  • Max total messages /sec: 32,000
  • Latency p95: 238 ms

Don't forget to run your own benchmarks to make sure that the performance is acceptable for your use case.

We are making many improvements to Realtime's Postgres Changes. If you are uncertain about the performance of your use case, please reach out using the Support Form and we will be happy to help. We have a team of engineers who can advise you on the best solution for your use case.