| title | impact | impactDescription | tags |
|---|---|---|---|
| Realtime Performance and Cleanup | HIGH | Prevents memory leaks and ensures reliable subscriptions | realtime, subscriptions, cleanup, channels, broadcast, postgres-changes |
# Realtime Performance and Cleanup
Realtime subscriptions require proper cleanup to prevent memory leaks.
## Basic Subscription

```ts
const channel = supabase
  .channel('messages')
  .on(
    'postgres_changes',
    { event: 'INSERT', schema: 'public', table: 'messages' },
    (payload) => console.log('New message:', payload.new)
  )
  .subscribe()
```
## React Cleanup Pattern (Critical)

Incorrect:

```ts
// Memory leak - subscription never cleaned up
useEffect(() => {
  const supabase = createClient()

  supabase
    .channel('messages')
    .on('postgres_changes', { event: '*', schema: 'public', table: 'messages' }, handler)
    .subscribe()
}, [])
```
Correct:

```ts
useEffect(() => {
  const supabase = createClient()

  const channel = supabase
    .channel('messages')
    .on('postgres_changes', { event: '*', schema: 'public', table: 'messages' }, handler)
    .subscribe()

  // Cleanup on unmount
  return () => {
    supabase.removeChannel(channel)
  }
}, [])
```
## Filter Subscriptions

Reduce server load by filtering at the source:

```ts
const channel = supabase
  .channel('user-messages')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'messages',
      filter: `recipient_id=eq.${userId}`,
    },
    handleMessage
  )
  .subscribe()
```
## Connection Status Handling

```ts
const channel = supabase.channel('my-channel')

channel.subscribe((status) => {
  if (status === 'SUBSCRIBED') {
    console.log('Connected')
  } else if (status === 'CHANNEL_ERROR') {
    console.error('Connection error')
  } else if (status === 'TIMED_OUT') {
    console.log('Connection timed out, retrying...')
  } else if (status === 'CLOSED') {
    console.log('Connection closed')
  }
})
```
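The `TIMED_OUT` branch only logs that a retry should happen. One way to space out re-subscribe attempts is exponential backoff; here is a minimal sketch of a delay helper (the `backoffDelay` name and its defaults are illustrative, not part of supabase-js):

```typescript
// Compute a capped exponential-backoff delay for retry attempt N (0-based).
function backoffDelay(attempt: number, baseMs = 1000, maxMs = 30000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs)
}

// Sketch of use inside the status callback:
//   let attempt = 0
//   channel.subscribe((status) => {
//     if (status === 'TIMED_OUT' || status === 'CHANNEL_ERROR') {
//       setTimeout(() => channel.subscribe(), backoffDelay(attempt++))
//     } else if (status === 'SUBSCRIBED') {
//       attempt = 0  // reset after a successful connection
//     }
//   })
```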
## Use Broadcast for Scale
Postgres Changes don't scale horizontally. For high-throughput use cases, use Broadcast from Database. This requires private channels, Realtime Authorization RLS policies, and setAuth():
```sql
-- RLS policy for broadcast authorization
create policy "Authenticated users can receive broadcasts"
on "realtime"."messages"
for select
to authenticated
using ( true );
```
```ts
// Client subscribes to broadcast (requires authorization setup)
await supabase.realtime.setAuth()

const channel = supabase
  .channel('messages-broadcast', {
    config: { private: true },
  })
  .on('broadcast', { event: 'INSERT' }, (payload) => {
    console.log('New message:', payload)
  })
  .subscribe()
```
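The client only receives broadcasts; with Broadcast from Database, a trigger on the source table is what sends them, via `realtime.broadcast_changes()`. A minimal sketch (the function and trigger names are illustrative; the topic must match the channel name the client subscribes to):

```sql
-- Emit each row change on the 'messages-broadcast' topic
create or replace function public.messages_broadcast_trigger()
returns trigger
language plpgsql
security definer
as $$
begin
  perform realtime.broadcast_changes(
    'messages-broadcast',  -- topic: matches the client channel name
    TG_OP,                 -- event, e.g. 'INSERT'
    TG_OP,                 -- operation
    TG_TABLE_NAME,
    TG_TABLE_SCHEMA,
    NEW,
    OLD
  );
  return null;
end;
$$;

create trigger broadcast_messages_changes
after insert or update or delete on public.messages
for each row execute function public.messages_broadcast_trigger();
```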
## Limitations
| Limitation | Description |
|---|---|
| Single thread | Postgres Changes processed on one thread |
| DELETE filters | Cannot filter DELETE events by column |
| RLS per subscriber | 100 subscribers = 100 RLS checks per change |
| Table names | Cannot contain spaces |
## Multiple Tables

```ts
const channel = supabase
  .channel('db-changes')
  .on('postgres_changes', { event: '*', schema: 'public', table: 'messages' }, handleMessages)
  .on('postgres_changes', { event: 'INSERT', schema: 'public', table: 'users' }, handleNewUser)
  .subscribe()
```