Goodbye PlanetScale, hello Supabase
Posted on 11 March, 2024
As previously posted, the database provider for the LSERSA booking system is pulling their free tier.
Having done some searching, I determined that Supabase was the only alternative 'free tier' provider that I was prepared to trust. A little more research showed that even their paid tier is quite affordable, starting at $25/month. Given the limitations on the free tier (pausing after 1 week of inactivity, and no automated backups), it's likely that LSERSA will actually just switch to the paid account.
The pricing is "$25/month including $10 compute credits", and that $10 basically buys a micro instance (I assume on AWS). So effectively it's $15/month for an account plus (from) $10/month for each database. There are some account limits (storage, transfer, etc.) beyond which additional fees apply, but the limits are more than enough for my purposes.
All of which has meant an update to the codebase. Originally I used PlanetScale's own package,
@planetscale/database, for making database calls. We're now moving to a Postgres-based provider,
and whilst there is a bespoke package I think it's better to use a provider-agnostic approach in
the rewrite. So, npm i postgres here we come.
Overall the process has been fairly clean, with the main changes/takeaways:
Simpler import and config
Rather than doing this in each file:
// top of every api route handler that hits the database, with planetscale
// fetch imported because node instances may not have it
import { fetch } from 'undici';
import { connect } from '@planetscale/database';
const config = {
fetch,
host: process.env.DATABASE_HOST,
username: process.env.DATABASE_USERNAME,
password: process.env.DATABASE_PASSWORD
}
Instead we write one module ...
// separate file, eg /src/lib/db.js
import postgres from 'postgres'
const connectionString = process.env.DATABASE_URL
const sql = postgres(connectionString)
export default sql
... and import that everywhere ...
// top of every api route handler that hits the database, with postgres
import sql from "<whatever-path-to-file>"
Simpler use of connection
Rather than invoking a connection in each function:
// every call, or set of calls ...
const conn = await connect(config);
const result = await conn.execute(`SELECT * FROM users`);
We have the simpler
// every call
const result = await sql`SELECT * FROM users`;
Gotchas
First, and most serious, gotcha: other than SELECT, SQL commands do not return anything by
default (you get an empty array). Add RETURNING * at the end of the SQL command to get back an
array of everything you've done ...
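For example (a sketch; the users table and its columns are hypothetical, and sql is the client exported from /src/lib/db.js, passed in here so the example is self-contained):

```javascript
// `sql` is the postgres client from /src/lib/db.js; the table and
// columns here are hypothetical.
async function createUser(sql, name, email) {
  // Without RETURNING * this INSERT resolves to an empty array
  const rows = await sql`
    INSERT INTO users (name, email)
    VALUES (${name}, ${email})
    RETURNING *
  `;
  return rows[0]; // the full inserted row, including generated columns
}
```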
Second, the planetscale package has a nice way of parametrising queries: you put placeholders
in the query and provide an array of values to substitute. I imagine that Supabase's own package
does something very similar. The go-to solution in postgres is template literals with ${}
substitution:
// would have done
const result = await conn.execute('SELECT * FROM users WHERE id = ?', [id]);
// now
const result = await sql`SELECT * FROM users WHERE id = ${id}`;
There is support in the postgres package for more dynamic replacements (particularly where, for instance, you're updating a row but the columns to update are dynamic).
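A sketch of that helper (my own example, not from the LSERSA codebase): the postgres package lets you pass an object, plus the keys to use, to sql() inside a query, and it expands into escaped column = value pairs. The table and columns here are hypothetical.

```javascript
// `sql` is the postgres client from /src/lib/db.js.
// sql(changes, ...columns) expands into "col1 = v1, col2 = v2" with
// proper escaping; the users table and its columns are hypothetical.
async function updateUser(sql, id, changes, columns) {
  return sql`
    UPDATE users SET ${sql(changes, ...columns)}
    WHERE id = ${id}
    RETURNING *
  `;
}

// usage:
// updateUser(sql, 42, { name: 'Ann', email: 'ann@example.com' }, ['name', 'email'])
```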
And lastly (possibly one of my own making): transactions. I'd assumed that I could manually
send BEGIN, COMMIT and ROLLBACK instructions, but no: there's a function provided,
sql.begin(), instead. Once that's clear, all good.
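A sketch of what that looks like (the function, table and column names are my own invention; sql is the client from /src/lib/db.js, passed in so the example stands alone):

```javascript
// sql.begin() runs the callback inside BEGIN ... COMMIT, and rolls back
// automatically if the callback throws. Names here are hypothetical.
async function transferCredit(sql, fromId, toId, amount) {
  return sql.begin(async (sql) => {
    await sql`UPDATE accounts SET balance = balance - ${amount} WHERE id = ${fromId}`;
    await sql`UPDATE accounts SET balance = balance + ${amount} WHERE id = ${toId}`;
  });
}
```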
UPDATE: One other huge gotcha that I only found after the data had migrated. Whereas Postgres (and, I thought, MySQL) stores booleans as
TRUE and FALSE, PlanetScale stored them as the integers 1 and 0. Since those values are truthy/falsy it should have made little difference, but there was one line in my codebase where I was checking the value of a 'boolean' by strict equivalence, value === 0, so that immediately broke until I changed it to !value
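In code, the before and after look roughly like this (the flag name is made up):

```javascript
// PlanetScale/MySQL returned the flag as 0 or 1; Postgres returns true/false.
const isDisabled = (flag) => flag === 0;     // broke: false === 0 is false
const isDisabledPortable = (flag) => !flag;  // works for 0/1 and false/true
```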
Overall — including re-writing all queries, testing and finding out that only SELECT
commands were returning anything, and testing and finding that I'd done the wrong thing on
transactions — it took about 6 hours to change the code and test/fix each API route. Not bad,
but not something I'm keen to do again in a hurry ...