# Database Migrations
Learn how to manage database schema changes safely using Drizzle's migration system. This guide covers schema evolution, migration generation, and deployment strategies for Solo Kit's PostgreSQL-first architecture.
## 🎯 Migration Overview

### What Are Migrations?
Migrations are version-controlled scripts that evolve your database schema over time. They ensure:
- **Consistent schemas** across development, staging, and production
- **Reversible changes** with rollback capabilities
- **Team collaboration** without schema conflicts
- **Safe deployments** with automated schema updates
### Solo Kit Migration Architecture
Solo Kit uses a dual-approach migration system:
```
📊 Schema Definition (schema/*.ts)
        ↓ (drizzle-kit generate)
📄 Migration Files (migrations/*.sql)
        ↓ (drizzle-kit push/migrate)
🐘 PostgreSQL Database
```
**Two Migration Modes:**

- **Development Mode** (`pnpm db:push`) - Direct schema synchronization
- **Production Mode** (`pnpm db:migrate`) - Migration file execution
## 🔧 Migration Commands

### Development Commands
```bash
# Push schema changes directly (development only)
pnpm db:push

# Generate migration files from schema changes
pnpm db:generate

# Check for schema issues
pnpm db:check

# Open visual database browser
pnpm db:studio
```
### Production Commands

```bash
# Run migrations (production safe)
pnpm db:migrate

# Run migrations with custom scripts
pnpm db:migrate:dev   # Generate migrations
pnpm db:migrate:prod  # Apply migrations in production
```
## 📝 Schema-First Development

### 1. Define Schema Changes
Solo Kit follows a schema-first approach. Edit schema files to define changes:
```typescript
// packages/database/schema/users.ts
import { randomUUID } from 'node:crypto';
import { boolean, index, pgTable, text, timestamp } from 'drizzle-orm/pg-core';

export const users = pgTable(
  'users',
  {
    id: text('id').primaryKey().$defaultFn(() => randomUUID()),
    email: text('email').notNull().unique(),
    name: text('name').notNull(),

    // ✅ Add new columns here
    bio: text('bio'), // New column - nullable by default
    isActive: boolean('is_active').default(true), // With default value

    // Timestamps
    createdAt: timestamp('created_at', { mode: 'date' }).notNull().defaultNow(),
    updatedAt: timestamp('updated_at', { mode: 'date' }).notNull().defaultNow(),
  },
  (table) => ({
    // Add indexes for new columns
    emailIdx: index('users_email_idx').on(table.email),
    isActiveIdx: index('users_is_active_idx').on(table.isActive), // New index
  })
);
```
### 2. Development Workflow

For rapid development iteration:

```bash
# 1. Edit schema files

# 2. Push changes directly to development database
pnpm db:push

# 3. Verify changes in Drizzle Studio
pnpm db:studio
```
Schema push handles:
- Adding/removing columns
- Modifying column types
- Creating/dropping indexes
- Adding/removing constraints
### 3. Production Preparation

Before deploying to production:

```bash
# 1. Generate migration files from schema changes
pnpm db:generate

# 2. Review generated migration files
ls packages/database/drizzle/

# 3. Test migrations on staging database
NODE_ENV=staging pnpm db:migrate

# 4. Commit migration files to git
git add packages/database/drizzle/
git commit -m "feat: add user bio and active status fields"
```
## 🏭 Migration Generation

### Automatic Migration Generation

Drizzle analyzes schema changes and generates SQL migrations:

```bash
pnpm db:generate
```

Example generated migration:

```sql
-- packages/database/drizzle/0001_add_user_bio.sql
ALTER TABLE "users" ADD COLUMN "bio" text;
ALTER TABLE "users" ADD COLUMN "is_active" boolean DEFAULT true;
CREATE INDEX "users_is_active_idx" ON "users" ("is_active");
```
### Migration Metadata

Drizzle tracks migrations in metadata files:

```json
// packages/database/drizzle/meta/_journal.json
{
  "version": "7",
  "dialect": "postgresql",
  "entries": [
    {
      "idx": 0,
      "version": "7",
      "when": 1234567890,
      "tag": "0001_add_user_bio",
      "breakpoints": false
    }
  ]
}
```
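Tooling can read this journal programmatically, e.g. to print which migrations a build will apply. A minimal sketch (the interfaces and helper name here are assumptions for illustration, not part of Drizzle's public API):

```typescript
// Shape of drizzle/meta/_journal.json as generated by drizzle-kit.
interface JournalEntry {
  idx: number;
  version: string;
  when: number; // epoch millis when the migration was generated
  tag: string;  // matches the migration file name, e.g. "0001_add_user_bio"
  breakpoints: boolean;
}

interface Journal {
  version: string;
  dialect: string;
  entries: JournalEntry[];
}

// Returns migration tags in application order (sorted by idx).
function listMigrationTags(journal: Journal): string[] {
  return [...journal.entries].sort((a, b) => a.idx - b.idx).map((e) => e.tag);
}
```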
## 🚀 Migration Execution

### Development Migration

**Using `db:push` (Development Only):**

```bash
# Direct schema synchronization
pnpm db:push
```
**What happens:**
- Drizzle compares schema with database
- Generates and executes SQL changes
- Updates database immediately
- No migration files created

**Advantages:**
- Fast iteration
- No migration file management
- Perfect for development

**Disadvantages:**
- Not suitable for production
- No rollback capability
- No migration history
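Both modes read the same drizzle-kit configuration. A minimal sketch of what that file might look like (paths and options here are assumptions, not Solo Kit's exact config — match them to your repo layout):

```typescript
// drizzle.config.ts — a minimal sketch
import { defineConfig } from 'drizzle-kit';

export default defineConfig({
  dialect: 'postgresql',
  // Schema files that db:push and db:generate read
  schema: './packages/database/schema/*.ts',
  // Folder where db:generate writes migration files
  out: './packages/database/drizzle',
  dbCredentials: {
    url: process.env.DATABASE_URL!,
  },
});
```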
### Production Migration

**Using `db:migrate` (Production Safe):**

```bash
# Apply migration files
pnpm db:migrate
```
Migration execution process:
```typescript
// scripts/migrate.ts (Solo Kit's custom migration runner)
import postgres from 'postgres';
import { drizzle } from 'drizzle-orm/postgres-js';
import { migrate } from 'drizzle-orm/postgres-js/migrator';

async function runMigrations() {
  console.log('🚀 Starting database migrations...');

  // Create PostgreSQL connection
  const client = postgres(process.env.DATABASE_URL!, {
    max: 1, // Single connection for migrations
    prepare: false, // Better compatibility
    connect_timeout: 60, // Longer timeout for migrations
  });
  const db = drizzle(client);

  try {
    // Apply migrations from the drizzle folder
    await migrate(db, { migrationsFolder: './packages/database/drizzle' });
    console.log('✅ Migrations completed successfully!');
  } finally {
    await client.end(); // Always release the connection
  }
}
```
## 📊 Migration Patterns

### Safe Schema Changes

**✅ Safe Operations (No downtime):**

```typescript
// Add nullable columns
newField: text('new_field'), // Safe - nullable by default

// Add columns with defaults
isActive: boolean('is_active').default(true), // Safe - default provided

// Add new tables
export const userProfiles = pgTable('user_profiles', {
  // ... table definition
});
```

Add indexes in the migration SQL; with `CONCURRENTLY`, PostgreSQL builds them without blocking writes:

```sql
CREATE INDEX CONCURRENTLY "users_email_idx" ON "users" ("email");
```
**⚠️ Potentially Breaking Operations:**

```typescript
// Remove columns (loses data)
// Don't do this: remove existing fields

// Rename columns (requires coordinated deployment)
// Don't do this directly - use a multi-step process

// Change column types (may require data conversion)
// Be careful with: text to integer, etc.

// Add NOT NULL constraints to existing columns
// Don't do this: .notNull() on existing nullable columns
```
### Multi-Step Migrations

For breaking changes, use a multi-step approach:

**Step 1: Add new column**

```typescript
export const users = pgTable('users', {
  // Existing columns...
  email: text('email').notNull().unique(),
  // Add new column alongside old
  newEmail: text('new_email'), // Nullable initially
});
```

**Step 2: Populate new column**

```sql
-- Data migration
UPDATE users SET new_email = email WHERE new_email IS NULL;
```

**Step 3: Switch application to use new column**

```typescript
// Update application code to use newEmail
```

**Step 4: Make new column required, drop old column**

```typescript
export const users = pgTable('users', {
  // Remove old column, make new one required
  newEmail: text('new_email').notNull().unique(),
});
```
## 🔄 Advanced Migration Scenarios

### Data Migrations

Combining schema and data changes:

```sql
-- 0002_user_name_split.sql
-- Add new columns
ALTER TABLE "users" ADD COLUMN "first_name" text;
ALTER TABLE "users" ADD COLUMN "last_name" text;

-- Populate new columns from existing data
UPDATE "users"
SET
  "first_name" = split_part("name", ' ', 1),
  "last_name" = CASE
    WHEN array_length(string_to_array("name", ' '), 1) > 1
    THEN substring("name" from position(' ' in "name") + 1)
    ELSE ''
  END
WHERE "first_name" IS NULL;

-- Add constraints after data population
ALTER TABLE "users" ALTER COLUMN "first_name" SET NOT NULL;
```
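The split logic above can be sketched in plain TypeScript, which is handy for checking the SQL against sample names before running the migration (the helper name is illustrative, not part of Solo Kit):

```typescript
// Mirrors the SQL: the token before the first space becomes first_name;
// everything after the first space becomes last_name (empty if no space).
function splitName(name: string): { firstName: string; lastName: string } {
  const spaceIndex = name.indexOf(' ');
  if (spaceIndex === -1) {
    return { firstName: name, lastName: '' };
  }
  return {
    firstName: name.slice(0, spaceIndex),
    lastName: name.slice(spaceIndex + 1),
  };
}
```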
### Custom Migration Scripts

For complex migrations, create custom scripts:

```typescript
// scripts/migrations/custom-user-migration.ts
import { eq } from 'drizzle-orm';
import { db } from '@packages/database';
import { users } from '@packages/database/schema';

export async function migrateUserData() {
  console.log('🔄 Running custom user data migration...');

  // Custom migration logic
  const usersToUpdate = await db.select().from(users).where(/* conditions */);

  for (const user of usersToUpdate) {
    // Complex data transformation
    await db.update(users)
      .set({
        // Custom updates
        updatedAt: new Date(),
      })
      .where(eq(users.id, user.id));
  }

  console.log(`✅ Migrated ${usersToUpdate.length} users`);
}
```
### Index Management

Creating indexes safely:

```sql
-- PostgreSQL: Use CONCURRENTLY for zero-downtime index creation
CREATE INDEX CONCURRENTLY "users_email_idx" ON "users" ("email");

-- Composite indexes for common query patterns
CREATE INDEX CONCURRENTLY "users_active_created_idx"
  ON "users" ("is_active", "created_at" DESC);

-- Partial indexes for filtered queries
CREATE INDEX CONCURRENTLY "active_users_email_idx"
  ON "users" ("email") WHERE "is_active" = true;
```
## 🚨 Migration Best Practices

### 1. Test Migrations Thoroughly

```bash
# Test migration generation
pnpm db:generate --verbose

# Test on staging database first
NODE_ENV=staging pnpm db:migrate

# Verify schema after migration
pnpm db:studio
```
### 2. Backup Before Production Migrations

```bash
# Create database backup (Neon example)
neonctl branches create --parent main backup-$(date +%Y%m%d)

# Or traditional pg_dump
pg_dump $DATABASE_URL > backup_$(date +%Y%m%d).sql
```
### 3. Monitor Migration Performance

```typescript
// Add timing to the migration script
const startTime = Date.now();
await migrate(db, { migrationsFolder: migrationsPath });
const duration = Date.now() - startTime;
console.log(`Migration completed in ${duration}ms`);
```
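The same timing can be factored into a small reusable wrapper (the `timed` helper is illustrative, not part of Solo Kit):

```typescript
// Runs an async step, logs how long it took, and returns the step's result.
async function timed<T>(label: string, fn: () => Promise<T>): Promise<T> {
  const start = Date.now();
  try {
    return await fn();
  } finally {
    console.log(`${label} completed in ${Date.now() - start}ms`);
  }
}
```

Usage would then look like `await timed('Migration', () => migrate(db, { migrationsFolder: migrationsPath }))`, keeping the timing concern out of the migration call itself.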
### 4. Rollback Strategy

```sql
-- Keep rollback migrations ready
-- 0003_remove_user_bio_rollback.sql
ALTER TABLE "users" DROP COLUMN "bio";
DROP INDEX "users_is_active_idx";
```
### 5. Review Migration Files

```bash
# Always review generated SQL before production
cat packages/database/drizzle/0001_*.sql

# Check for potentially dangerous operations
grep -i "DROP\|ALTER.*NOT NULL\|TRUNCATE" packages/database/drizzle/*.sql
```
## 🔍 Troubleshooting Migrations

### Common Issues

**Migration Fails: Column Already Exists**

```bash
# Error: column "bio" of relation "users" already exists
# Solution: reset migration state or create a new migration
pnpm db:push --force
```
**Schema Drift (Database doesn't match schema)**

```bash
# Detect schema drift
pnpm db:check

# Fix by pushing current schema
pnpm db:push
```
**Migration Timeout**

```typescript
// Increase timeout in drizzle config
export default defineConfig({
  // ... other config
  dbCredentials: {
    url: enhancedDatabaseUrl(), // Includes timeout parameters
  },
});
```
**Connection Issues During Migration**

```bash
# Check DATABASE_URL
echo $DATABASE_URL | head -c 50

# Test connection
pnpm db:studio

# Run migration with verbose logging
pnpm db:migrate --verbose
```
### Recovery Procedures

**Reset Migration State:**

```sql
-- Manually reset migration tracking (emergency only)
DELETE FROM "__drizzle_migrations" WHERE tag = 'problematic_migration';
```
**Force Schema Sync:**

```bash
# Force push current schema (development only)
pnpm db:push --force

# Regenerate migration files from scratch
rm -rf packages/database/drizzle/
pnpm db:generate
```
## 🎯 Production Deployment

### CI/CD Integration

```yaml
# .github/workflows/deploy.yml
- name: Run Database Migrations
  run: |
    pnpm db:migrate
  env:
    DATABASE_URL: ${{ secrets.DATABASE_URL }}
```
### Zero-Downtime Deployments

1. Run backward-compatible migrations first
2. Deploy the application code
3. Remove deprecated columns in a follow-up migration
### Monitoring Migration Success

```typescript
// Add migration verification
const tablesResult = await client`
  SELECT table_name
  FROM information_schema.tables
  WHERE table_schema = 'public'
`;
console.log(`Database has ${tablesResult.length} tables after migration`);
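To turn that count into an actual check, a small helper can compare the reported tables against the ones your schema expects (the function name and expected list here are illustrative assumptions):

```typescript
// Returns expected tables missing from the database, e.g. to fail a deploy
// step before traffic reaches code that depends on a skipped migration.
function findMissingTables(actualTables: string[], expectedTables: string[]): string[] {
  const present = new Set(actualTables);
  return expectedTables.filter((table) => !present.has(table));
}
```

If the returned array is non-empty, the deploy step can log the missing tables and exit non-zero.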
## 🎯 Next Steps
Master database migrations with these advanced guides:
- Performance - Optimize migration performance
- Security - Secure migration practices
- Monitoring - Monitor database changes
- Advanced - Complex migration patterns
Proper migration management ensures your Solo Kit application can evolve safely and reliably as your requirements grow.