Core Concepts

Migrations

Migrations are the process of keeping your database schema synchronized with your application's schema definitions in code. This ORM provides a powerful migration system that can handle both simple and complex schema changes.

Destructive Migrations Warning: Adding or removing UNIQUE constraints, changing column types, or modifying NOT NULL constraints will recreate the entire table. While data is preserved during recreation, this is a destructive operation that temporarily locks the table. Always backup your database before running such migrations, especially in production.

How It Works: The Schema Signature

The migrateIfDirty() method relies on a schema signature.

  1. It computes a unique string based on your current code schema (tables, columns, types, constraints).
  2. It compares this signature to one stored in a special _schema_meta table in your database.
  3. If the signatures don't match, the ORM considers the schema "dirty" and runs the migration process.
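The exact signature algorithm isn't specified here, but the idea can be sketched as a stable hash over the serialized schema. In this minimal sketch, the `Schema` shape, the sorted-key serialization, and the `sha256` hash are all assumptions, not the ORM's actual implementation:

```typescript
import { createHash } from 'node:crypto';

// Hypothetical shape of a code-side schema definition.
type Column = { type: string; notNull?: boolean; unique?: boolean };
type Schema = Record<string, Record<string, Column>>;

// Serialize the schema deterministically (sorted object keys), then hash it.
// Any change to tables, columns, types, or constraints changes the hash.
function schemaSignature(schema: Schema): string {
  const canonical = JSON.stringify(schema, (_key, value) =>
    value && typeof value === 'object' && !Array.isArray(value)
      ? Object.fromEntries(
          Object.entries(value).sort(([a], [b]) => a.localeCompare(b))
        )
      : value
  );
  return createHash('sha256').update(canonical).digest('hex');
}

const v1: Schema = { users: { id: { type: 'integer' }, email: { type: 'text' } } };
const v2: Schema = { users: { id: { type: 'integer' }, email: { type: 'text', unique: true } } };

console.log(schemaSignature(v1) === schemaSignature(v1)); // true: stable across runs
console.log(schemaSignature(v1) === schemaSignature(v2)); // false: schema is "dirty"
```

Because the serialization sorts keys, reordering columns in code does not change the signature; only a real schema difference does.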

Migration Strategies

The ORM uses two strategies for migrations:

Strategy 1: Simple Column Addition (Safe & Fast)

Used for adding new nullable columns without constraints:

// Adding a simple nullable column - uses ALTER TABLE
userId: text('userId')

Strategy 2: Table Recreation (Destructive but Safe)

Used when column definitions change or constraints are added/removed:

// These changes trigger table recreation:
email: text('email').unique()    // ← Adding/removing UNIQUE
age: integer('age')              // ← Changing the column type (e.g. was text('age'))
name: text('name').notNull()     // ← Adding/removing NOT NULL
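The choice between the two strategies can be sketched as a pure decision function. The `strategyFor` helper and `Col` shape below are hypothetical; the ORM's real diffing logic may differ:

```typescript
type Col = { type: string; notNull?: boolean; unique?: boolean; primaryKey?: boolean };

// Decide which strategy applies for a single column change.
// Sketch of the rule described above: only a brand-new, nullable,
// unconstrained column can use ALTER TABLE ADD COLUMN; any change to an
// existing column's definition forces table recreation.
function strategyFor(oldCol: Col | undefined, newCol: Col): 'alter-add' | 'recreate' | 'none' {
  if (!oldCol) {
    return !newCol.notNull && !newCol.unique && !newCol.primaryKey
      ? 'alter-add'
      : 'recreate';
  }
  const changed =
    oldCol.type !== newCol.type ||
    !!oldCol.notNull !== !!newCol.notNull ||
    !!oldCol.unique !== !!newCol.unique ||
    !!oldCol.primaryKey !== !!newCol.primaryKey;
  return changed ? 'recreate' : 'none';
}

console.log(strategyFor(undefined, { type: 'text' }));                      // 'alter-add'
console.log(strategyFor({ type: 'text' }, { type: 'text', unique: true })); // 'recreate'
```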

Table Recreation Process:

  1. Creates temporary table with new schema
  2. Copies all data from old table
  3. Drops old table
  4. Renames temporary table to original name
Table recreation requires exclusive table access and can be slow for large tables. The table is temporarily unavailable during migration.
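The four steps above map to four SQL statements, roughly as follows. The `recreateTableSql` helper is hypothetical; the ORM's real quoting, temp-table naming, and column handling will differ:

```typescript
// Build the SQL statements for the recreate-and-copy strategy.
// `columns` maps column names to their new SQL definitions.
// Simplifying assumption: every new column also exists in the old table;
// a real migrator only copies the intersection of old and new columns.
function recreateTableSql(table: string, columns: Record<string, string>): string[] {
  const tmp = `${table}__migration_tmp`;
  const names = Object.keys(columns).join(', ');
  const defs = Object.entries(columns)
    .map(([name, def]) => `${name} ${def}`)
    .join(', ');
  return [
    `CREATE TABLE ${tmp} (${defs})`,                               // 1. temp table with new schema
    `INSERT INTO ${tmp} (${names}) SELECT ${names} FROM ${table}`, // 2. copy all data
    `DROP TABLE ${table}`,                                         // 3. drop old table
    `ALTER TABLE ${tmp} RENAME TO ${table}`,                       // 4. rename temp to original
  ];
}

console.log(recreateTableSql('users', { id: 'INTEGER PRIMARY KEY', email: 'TEXT UNIQUE' }));
```

Running these four statements inside a single transaction is what makes the operation "destructive but safe": either all steps apply or none do.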

Migration Methods

You have four primary methods for managing migrations, each with a different purpose.

db.migrateIfDirty()

This is the recommended method for development and simple deployments. It's an all-in-one, non-destructive command that:

  1. Automatically checks if the schema is dirty using the signature comparison.
  2. If dirty, it runs the migration logic to add any missing tables or columns.
  3. After migrating, it updates the stored signature to match the new schema.
  4. It returns a Promise<boolean> which resolves to true if a migration was performed.

This method will never delete tables or columns, making it safe to run on application startup.

src/startup.ts
// Place this in your app's startup logic
const migrated = await db.migrateIfDirty();
if (migrated) console.log('Schema migration applied');

db.checkMigration()

NEW: Preview migration changes before applying them. This method analyzes your schema and database without making any changes.

Returns: Promise<{ safe: boolean; changes: { ... } }>

src/check.ts
const result = await db.checkMigration();

console.log('Is migration safe?', result.safe);
console.log('Tables to create:', result.changes.tablesToCreate);
console.log('Tables to recreate (DESTRUCTIVE):', result.changes.tablesToRecreate);
console.log('Tables to drop:', result.changes.tablesToDrop);
console.log('Columns to add:', result.changes.columnsToAdd);

// Only migrate if safe
if (result.safe) {
  await db.migrate();
} else {
  console.warn('⚠️ Migration requires table recreation!');
  console.warn('Tables affected:', result.changes.tablesToRecreate);
  // Require explicit user confirmation before proceeding
}
Use checkMigration() in production to warn users before destructive migrations, or to log changes for audit purposes.

db.migrate(options?)

This is the lower-level, more powerful migration method. It directly compares your code schema with the actual database structure and applies the necessary changes.

Options:

  • options.performDestructiveActions: boolean (default: false) - Allow dropping tables/columns
  • options.dryRun: boolean (default: false) - Preview changes without applying

Without options (Default Mode):

  • CREATE TABLE for any new tables
  • ALTER TABLE ADD COLUMN for simple nullable columns
  • Recreates tables when column definitions change (UNIQUE, type, NOT NULL, etc.)
  • ❌ Never drops tables or columns

With `{ performDestructiveActions: true }` (Destructive Mode):

  • Performs all actions from Default Mode
  • DROP TABLE for tables not in schema
  • ✅ Removes columns not in schema (via table recreation)

With `{ dryRun: true }` (Preview Mode):

  • Logs planned changes to console
  • Does NOT apply any changes
  • Useful for debugging and CI/CD pipelines

src/migrate.ts
// Safe: add new columns/tables, recreate tables for constraint changes
await db.migrate();

// Preview changes without applying
await db.migrate({ dryRun: true });

// Destructive: also remove extra tables/columns
await db.migrate({ performDestructiveActions: true });
You Have Been Warned: Destructive mode (performDestructiveActions: true) will permanently delete tables and columns not in your schema. This is intended ONLY for controlled scenarios like development database resets. NEVER use this on application startup in production!
Even in default mode (without performDestructiveActions), migrations will recreate tables when you:
  • Add or remove UNIQUE constraints
  • Change column types (e.g., TEXT → INTEGER)
  • Add or remove NOT NULL constraints
  • Add or remove PRIMARY KEY
Always backup your database before running migrations that change column definitions!

db.isSchemaDirty()

This method checks the schema signature without performing any migrations. It's useful for UI feedback or custom logic.

Returns: Promise<{ dirty: boolean; current: string; stored: string | null }>
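A sketch of driving UI or health-check feedback from the return value. The `db` object here is a stub with the documented return shape, so the example is self-contained; in real code it would be your actual connection:

```typescript
// Stub matching the documented isSchemaDirty() return shape.
const db = {
  async isSchemaDirty(): Promise<{ dirty: boolean; current: string; stored: string | null }> {
    return { dirty: true, current: 'a1b2c3', stored: 'd4e5f6' };
  },
};

// e.g. surface schema drift in a health endpoint or admin banner,
// without triggering any migration.
async function schemaStatus(): Promise<string> {
  const { dirty, current, stored } = await db.isSchemaDirty();
  if (!dirty) return 'Schema is up to date';
  return `Schema drift detected: stored ${stored ?? '(none)'} vs current ${current}`;
}

schemaStatus().then(console.log);
```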

What's Supported

The migration system now supports:

  • ✅ Creating new tables
  • ✅ Adding simple nullable columns (via ALTER TABLE)
  • ✅ Changing column types (via table recreation)
  • ✅ Adding/removing UNIQUE constraints (via table recreation)
  • ✅ Adding/removing NOT NULL constraints (via table recreation)
  • ✅ Adding/removing PRIMARY KEY (via table recreation)
  • ✅ Adding/removing DEFAULT values (via table recreation)
  • ✅ Dropping tables (with performDestructiveActions: true)
  • ✅ Dropping columns (with performDestructiveActions: true)

Limitations

The migrator DOES NOT support:

  • Renaming columns or tables - Use the manual migration helpers:
    await db.renameTable('old_name', 'new_name')
    await db.renameColumn('table', 'old_col', 'new_col')
    
  • Custom migration logic - For complex data transformations, write custom SQL
For operations not supported by automatic migrations, use the Manual Schema Management helpers within a custom migration script.

Best Practices

Development Workflow

src/db.ts
// Use auto-migration during development
await db.migrateIfDirty()

Production Workflow

src/db.ts
// Check before migrating
const check = await db.checkMigration()

if (!check.safe) {
  console.warn('⚠️ Migration requires table recreation!')
  console.warn('Affected tables:', check.changes.tablesToRecreate)
  
  // Log to monitoring system
  await logToMonitoring({
    event: 'destructive_migration_required',
    tables: check.changes.tablesToRecreate
  })
  
  // Require explicit approval or rollback
  const approved = await askUserForConfirmation()
  if (!approved) {
    throw new Error('Migration cancelled by user')
  }
}

// Backup before migration
await backupDatabase()

// Run migration
await db.migrate()

CI/CD Pipeline

scripts/check-migrations.ts
// In your CI/CD pipeline, fail if destructive migrations are detected
const check = await db.checkMigration()

if (!check.safe) {
  console.error('❌ Destructive migration detected!')
  console.error('Tables to recreate:', check.changes.tablesToRecreate)
  process.exit(1)
}

console.log('✅ Migration is safe')

Large Production Databases

For tables with millions of rows, table recreation can be slow. Consider:

  1. Schedule during maintenance windows
  2. Use read replicas during migration
  3. Write custom migration scripts with batching:
scripts/custom-migration.ts
// For very large tables, consider custom batched migrations
const batchSize = 10000
let offset = 0

// Offset pagination assumes a stable row order; in practice, add an
// explicit ORDER BY (e.g. on the primary key) so rows aren't skipped or repeated.
while (true) {
  const rows = await db.select(oldTable)
    .limit(batchSize)
    .offset(offset)
    .all()
  
  if (rows.length === 0) break
  
  await db.insert(newTable).values(rows).execute()
  offset += batchSize
  
  console.log(`Migrated ${offset} rows...`)
}
Copyright © 2026