go / sqlc
I use sqlc to generate type-safe Go from SQL queries.
Write SQL, run sqlc generate, get functions with proper types.
Setup
Create sqlc.yaml at the project root:
version: "2"
sql:
  - schema: "db/migrate/*.sql"
    queries: "db/queries/"
    engine: postgresql
    gen:
      go:
        package: db
        out: db
        sql_package: pgx/v5
        emit_json_tags: true
        emit_empty_slices: true
This tells sqlc to:
- Read schema from migration files in db/migrate/
- Read queries from db/queries/
- Generate Go code in the db package
- Use pgx/v5 as the database driver
Writing queries
Put queries in db/queries/*.sql with sqlc annotations:
-- name: GetUser :one
SELECT id, email, created_at
FROM users
WHERE id = $1;
-- name: ListUsers :many
SELECT id, email, created_at
FROM users
ORDER BY created_at DESC
LIMIT $1;
-- name: CreateUser :exec
INSERT INTO users (email)
VALUES ($1);
-- name: UpdateEmail :exec
UPDATE users
SET email = $2
WHERE id = $1;
The annotation format is -- name: FunctionName :return_type.
Return types:
- :one: returns a single row (error if not found)
- :many: returns a slice
- :exec: returns only error (INSERT, UPDATE, DELETE)
- :execrows: returns affected row count
Generated code
Run sqlc generate to create Go code:
type User struct {
	ID        int64     `json:"id"`
	Email     string    `json:"email"`
	CreatedAt time.Time `json:"created_at"`
}
func (q *Queries) GetUser(ctx context.Context, id int64) (User, error)
func (q *Queries) ListUsers(ctx context.Context, limit int32) ([]User, error)
func (q *Queries) CreateUser(ctx context.Context, email string) error
func (q *Queries) UpdateEmail(ctx context.Context, arg UpdateEmailParams) error
The struct fields and function signatures match your SQL exactly, and the compiler catches type mismatches between Go code and database schema. Note that UpdateEmail takes an UpdateEmailParams struct: sqlc bundles parameters into a struct when a query has more than one.
This matters because database errors at runtime are expensive. A renamed column, changed type, or missing field becomes a compile error instead of a 3 AM page. The feedback loop tightens from "deploy and hope" to "won't build."
Wrapper functions
I wrap generated functions to provide a stable API:
// db/queries.go
package db

func (db *DB) GetUser(ctx context.Context, id int64) (User, error) {
	return db.Queries().GetUser(ctx, id)
}

func (db *DB) CreateUser(ctx context.Context, email string) error {
	return db.Queries().CreateUser(ctx, email)
}
Application code calls db.GetUser(), not db.Queries().GetUser().
The wrapper can add logging, metrics, or parameter conversion
without changing callers.
Schema changes
When the schema changes, update two places:
- Migrations (db/migrate/): DDL changes
- Queries (db/queries/): if column names changed
Then run sqlc generate. The compiler shows what Go code needs updating.
This centralization pays off for refactors. Renaming a table:
-- db/migrate/20240101_rename_table.sql
ALTER TABLE user_accounts RENAME TO users;
Update table name in db/queries/*.sql, run sqlc generate.
Application code calling the wrapper functions doesn't change at all.
Without sqlc, you'd grep for raw SQL strings scattered across the codebase.
When to use raw SQL
sqlc handles most queries: joins, CTEs, window functions, aggregations, upserts. If Postgres can parse it, sqlc can too.
The exception is queries whose structure isn't known at compile time:
- Search with user-selected filter combinations
- Dynamic ORDER BY from query parameters
- One-off admin scripts not worth committing
For everything else, sqlc wins. Centralizing SQL in db/queries/
means schema refactors touch one directory instead of grepping the codebase.
Batch inserts
In 2021, Brandur wrote about going all in on sqlc
and noted that batch inserts required an unnest workaround
since sqlc couldn't handle variadic parameters.
sqlc now supports :copyfrom, which uses PostgreSQL's COPY protocol:
-- name: CreateUsers :copyfrom
INSERT INTO users (name, email)
VALUES ($1, $2);
The generated function accepts a slice:
func (q *Queries) CreateUsers(ctx context.Context, arg []CreateUsersParams) (int64, error)
COPY is significantly faster than individual INSERTs for bulk data. No workarounds needed.
Real Postgres parser
sqlc uses pg_query_go, which embeds the same parser that Postgres itself uses. This means complex queries with CTEs, window functions, and Postgres-specific syntax just work.
A hand-rolled parser would struggle with edge cases. Using the real thing means sqlc can parse anything Postgres can.
Codegen speed
sqlc generate runs fast. Under 100ms for a typical project.
The development loop stays tight:
$ time sqlc generate
real 0.05s
user 0.06s
sys 0.01s
Fast enough to run on every save if you want.
CI verification
Add a GitHub Action to verify generated code stays in sync:
- name: Verify sqlc
  run: |
    sqlc generate
    git diff --exit-code db/
This fails if someone edits queries but forgets to regenerate. Catches schema drift before it hits main.
Comparison with ORMs
ORMs hide SQL behind method chains:
db.Where("email = ?", email).First(&user)
sqlc keeps SQL visible:
-- name: GetUserByEmail :one
SELECT * FROM users WHERE email = $1;
I prefer seeing the actual query. Easier to optimize, easier to run in psql, easier to explain to the database.
Project layout
db/
├── migrate/ # DDL (CREATE TABLE, ALTER, etc.)
│ ├── 001_create_users.sql
│ └── 002_add_email_index.sql
├── queries/ # DML (SELECT, INSERT, UPDATE, DELETE)
│ └── users.sql
├── db.go # Connection pool, Queries() method
├── queries.go # Wrapper functions (stable API)
├── models.go # Generated structs
└── users.sql.go # Generated query functions
sqlc.yaml
The split between migrate/ and queries/ keeps DDL and DML separate.
Migrations run once; queries run constantly.