
Database Migrations

Learn how to manage database schema changes safely using Convex. This guide covers schema evolution, deployment strategies, and best practices for Solo Kit's Convex-powered architecture.

Migration Overview

What Are Schema Changes?

Schema changes are modifications to your database structure over time. Convex's declarative schema system ensures:

  • Consistent schemas across development, staging, and production
  • Type-safe changes with automatic validation
  • Safe deployments with automatic schema updates
  • Zero-downtime schema evolution

Solo Kit Schema Architecture

Solo Kit uses Convex's declarative schema system:

Schema Definition (convex/schema.ts)
    ↓ (npx convex dev / deploy)
Automatic Schema Sync
    ↓
Convex Database

Key Benefits:

  • Declarative Schema: Define your schema in TypeScript
  • Automatic Sync: Convex handles schema changes automatically
  • Type Safety: Full TypeScript integration
  • No Migration Files: No SQL migration files to manage

Schema Commands

Development Commands

# Start development server (watches for schema changes)
npx convex dev

# Push schema changes to development
npx convex dev --once

# Inspect schema and data in the dashboard
npx convex dashboard

Production Commands

# Deploy schema and functions to production
npx convex deploy

# Deploy, running a frontend build command first
npx convex deploy --cmd "pnpm build"

Schema-First Development

1. Define Schema Changes

Solo Kit follows a schema-first approach. Edit the schema file to define changes:

// convex/schema.ts
import { defineSchema, defineTable } from 'convex/server';
import { v } from 'convex/values';

export default defineSchema({
  users: defineTable({
    email: v.string(),
    name: v.string(),

    // Add new optional fields
    bio: v.optional(v.string()), // New field; optional keeps existing documents valid
    isActive: v.optional(v.boolean()), // Handle undefined in application code

    // Timestamps
    createdAt: v.number(),
    updatedAt: v.number(),
  })
    .index('by_email', ['email'])
    .index('by_isActive', ['isActive']), // New index
});

2. Development Workflow

For rapid development iteration:

# 1. Edit convex/schema.ts
# 2. Convex dev server automatically syncs changes
npx convex dev

# 3. Verify changes in Convex Dashboard
npx convex dashboard

Schema sync handles:

  • Adding/removing fields
  • Modifying field types (with care)
  • Creating/dropping indexes
  • Adding new tables

3. Production Deployment

Before deploying to production:

# 1. Test schema changes in development
npx convex dev

# 2. Review changes in dashboard
npx convex dashboard

# 3. Deploy to production
npx convex deploy

# 4. Verify deployment
npx convex dashboard --prod

Schema Evolution Patterns

Adding Optional Fields

Safe: Adding optional fields is always backward compatible:

// convex/schema.ts
export default defineSchema({
  users: defineTable({
    email: v.string(),
    name: v.string(),

    // New optional fields - safe to add
    bio: v.optional(v.string()),
    avatarUrl: v.optional(v.string()),
    preferences: v.optional(v.string()),

    createdAt: v.number(),
    updatedAt: v.number(),
  }),
});

Adding Required Fields

Requires data migration for existing documents:

// Step 1: Add as optional first
export default defineSchema({
  users: defineTable({
    email: v.string(),
    name: v.string(),
    role: v.optional(v.union(v.literal('user'), v.literal('admin'))), // Optional first
    createdAt: v.number(),
    updatedAt: v.number(),
  }),
});

// Step 2: Run migration to populate existing documents
// convex/migrations.ts
import { internalMutation } from './_generated/server';

export const migrateUserRoles = internalMutation({
  handler: async (ctx) => {
    const users = await ctx.db.query('users').collect();

    for (const user of users) {
      if (!user.role) {
        await ctx.db.patch(user._id, { role: 'user' });
      }
    }
  },
});

// Step 3: After migration, make field required
export default defineSchema({
  users: defineTable({
    email: v.string(),
    name: v.string(),
    role: v.union(v.literal('user'), v.literal('admin')), // Now required
    createdAt: v.number(),
    updatedAt: v.number(),
  }),
});
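
While the field is still optional (between steps 1 and 3), application code should tolerate documents that have not been migrated yet. A minimal sketch; the `effectiveRole` helper and `UserDoc` shape are illustrative, not part of Solo Kit:

```typescript
type Role = 'user' | 'admin';

interface UserDoc {
  email: string;
  name: string;
  role?: Role; // optional during the migration window
}

// Resolve a role while the field may still be missing,
// using the same default the migration backfills.
function effectiveRole(user: UserDoc): Role {
  return user.role ?? 'user';
}
```

Using the same fallback value as the migration keeps reads consistent whether or not a given document has been backfilled yet.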

Adding New Tables

Safe: New tables don't affect existing data:

// convex/schema.ts
export default defineSchema({
  // Existing tables...
  users: defineTable({
    // ... existing fields
  }),

  // New table - safe to add
  userPreferences: defineTable({
    userId: v.id('users'),
    theme: v.optional(v.union(v.literal('light'), v.literal('dark'), v.literal('system'))),
    emailNotifications: v.optional(v.boolean()),
    createdAt: v.number(),
    updatedAt: v.number(),
  }).index('by_userId', ['userId']),
});

Adding Indexes

Safe: Indexes are created automatically:

// convex/schema.ts
export default defineSchema({
  users: defineTable({
    email: v.string(),
    name: v.string(),
    role: v.union(v.literal('user'), v.literal('admin')),
    createdAt: v.number(),
  })
    .index('by_email', ['email'])
    .index('by_role', ['role']) // New index
    .index('by_createdAt', ['createdAt']), // New index
});

Data Migrations

Running Data Migrations

Use internal mutations for data migrations:

// convex/migrations.ts
import { internalMutation } from './_generated/server';

// Migrate user data
export const migrateUserNames = internalMutation({
  handler: async (ctx) => {
    console.log('Starting user name migration...');

    const users = await ctx.db.query('users').collect();

    let migratedCount = 0;
    for (const user of users) {
      // Split full name into first/last
      if (user.name && !user.firstName) {
        const parts = user.name.split(' ');
        await ctx.db.patch(user._id, {
          firstName: parts[0] || '',
          lastName: parts.slice(1).join(' ') || '',
          updatedAt: Date.now(),
        });
        migratedCount++;
      }
    }

    console.log(`Migrated ${migratedCount} users`);
    return { migratedCount };
  },
});

// Backfill missing data
export const backfillUserDefaults = internalMutation({
  handler: async (ctx) => {
    const users = await ctx.db.query('users').collect();

    for (const user of users) {
      const updates: Record<string, any> = {};

      if (user.role === undefined) {
        updates.role = 'user';
      }
      if (user.isActive === undefined) {
        updates.isActive = true;
      }
      if (user.createdAt === undefined) {
        updates.createdAt = Date.now();
      }

      if (Object.keys(updates).length > 0) {
        await ctx.db.patch(user._id, {
          ...updates,
          updatedAt: Date.now(),
        });
      }
    }
  },
});

Running Migrations via CLI

# Run migration function
npx convex run migrations:migrateUserNames

# Run with arguments (passed as a JSON object)
npx convex run migrations:batchMigration '{"batchSize": 50}'

Batch Migrations for Large Datasets

// convex/migrations.ts
import { internalMutation } from './_generated/server';
import { v } from 'convex/values';

export const batchMigration = internalMutation({
  args: {
    batchSize: v.optional(v.number()),
    cursor: v.optional(v.id('users')),
  },
  handler: async (ctx, args) => {
    const batchSize = args.batchSize ?? 100;

    let query = ctx.db.query('users').order('asc');

    if (args.cursor) {
      const cursorDoc = await ctx.db.get(args.cursor);
      if (cursorDoc) {
        query = query.filter((q) => q.gt(q.field('_creationTime'), cursorDoc._creationTime));
      }
    }

    const users = await query.take(batchSize + 1);
    const hasMore = users.length > batchSize;
    const batch = hasMore ? users.slice(0, -1) : users;

    // Process batch
    for (const user of batch) {
      await ctx.db.patch(user._id, {
        // migration updates
        migratedAt: Date.now(),
      });
    }

    const nextCursor = hasMore ? batch[batch.length - 1]._id : null;

    return {
      processed: batch.length,
      hasMore,
      nextCursor,
    };
  },
});
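
The cursor arithmetic above (fetch `batchSize + 1` documents, trim the sentinel, report `hasMore`) can be checked in isolation. A self-contained sketch with plain arrays standing in for the Convex table; all names here are illustrative:

```typescript
// Simulate one batch step over an ordered list of items.
function takeBatch<T>(
  items: T[],
  batchSize: number
): { batch: T[]; hasMore: boolean } {
  const page = items.slice(0, batchSize + 1); // fetch one extra as a sentinel
  const hasMore = page.length > batchSize;
  const batch = hasMore ? page.slice(0, -1) : page;
  return { batch, hasMore };
}

// Drive batches to exhaustion, as a caller re-invoking the mutation would.
function processAll<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  let remaining = items;
  for (;;) {
    const { batch, hasMore } = takeBatch(remaining, batchSize);
    if (batch.length > 0) batches.push(batch);
    if (!hasMore) break;
    remaining = remaining.slice(batch.length);
  }
  return batches;
}
```

The sentinel pattern avoids an extra round trip to learn whether another batch exists.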

Safe Schema Changes

Safe Operations (Zero Downtime)

// Adding optional fields
bio: v.optional(v.string()), // Safe

// Adding new tables
notifications: defineTable({...}), // Safe

// Adding indexes
.index('by_createdAt', ['createdAt']), // Safe

// Removing fields from the schema
// With schema validation enabled (the default), existing documents
// must not contain the removed field - unset it via a migration first

Potentially Breaking Operations

// Changing field types
// Be careful: may fail for existing documents

// Making optional fields required
// Requires migration first

// Removing fields that code still references
// Causes runtime errors

// Renaming fields
// Requires coordinated deployment

Multi-Step Schema Changes

For breaking changes, use a multi-step approach:

// Step 1: Add new field alongside old
export default defineSchema({
  users: defineTable({
    email: v.string(), // Old field
    emailAddress: v.optional(v.string()), // New field (optional)
  }),
});

// Step 2: Run migration to copy data
export const migrateEmailField = internalMutation({
  handler: async (ctx) => {
    const users = await ctx.db.query('users').collect();
    for (const user of users) {
      if (!user.emailAddress) {
        await ctx.db.patch(user._id, {
          emailAddress: user.email,
        });
      }
    }
  },
});

// Step 3: Update application code to use new field

// Step 4: Make new field required, remove old field
export default defineSchema({
  users: defineTable({
    emailAddress: v.string(), // Now required
    // email removed
  }),
});

Production Deployment

Deployment Checklist

# 1. Test schema changes locally
npx convex dev

# 2. Run any required migrations in development
npx convex run migrations:yourMigration

# 3. Review changes in dashboard
npx convex dashboard

# 4. Deploy to production
npx convex deploy

# 5. Run migrations in production if needed
npx convex run migrations:yourMigration --prod

# 6. Verify deployment
npx convex dashboard --prod

Zero-Downtime Deployments

  1. Add new optional fields first
  2. Deploy application code that handles both old and new
  3. Run migrations to populate new fields
  4. Deploy application code that uses new fields
  5. Remove old fields in follow-up deployment
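
Step 2 is the part that is easiest to get wrong: during the rollout window, reads must accept either field. A minimal dual-read sketch, reusing the email rename from the multi-step example above (the helper is hypothetical):

```typescript
interface UserDoc {
  email?: string;        // old field, removed in the final step
  emailAddress?: string; // new field, populated by the migration
}

// Prefer the new field; fall back to the old one during rollout.
function readEmail(user: UserDoc): string {
  const value = user.emailAddress ?? user.email;
  if (value === undefined) throw new Error('user has no email field');
  return value;
}
```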

Rollback Strategy

// Keep rollback migrations ready
export const rollbackMigration = internalMutation({
  handler: async (ctx) => {
    console.log('Rolling back migration...');

    const users = await ctx.db.query('users').collect();

    for (const user of users) {
      // Revert changes
      await ctx.db.patch(user._id, {
        // rollback updates
      });
    }

    console.log('Rollback complete');
  },
});

Best Practices

1. Test Schema Changes Thoroughly

# Test in development first
npx convex dev

# Verify all functions work with new schema
pnpm test

# Check dashboard for any issues
npx convex dashboard

2. Use Optional Fields for New Data

// Always start with optional for new fields
newField: v.optional(v.string()),

// Handle undefined in your code
const value = user.newField ?? 'default';

3. Plan Migrations Carefully

// Create migration functions for each schema change
// convex/migrations.ts
export const migration_2024_01_15_addUserRoles = internalMutation({
  handler: async (ctx) => {
    // Clear migration logic
  },
});

4. Monitor Deployments

# Watch logs during deployment
npx convex logs

# Check function health
npx convex dashboard

Troubleshooting

Common Issues

Schema Validation Error:

# Error: Field 'role' is required but missing in some documents
# Solution: Make field optional or run migration first

Type Mismatch:

# Error: Expected string, got number
# Solution: Run migration to fix existing data

Index Creation:

# Indexes are created automatically
# Large tables may take time to index

Recovery Procedures

Revert Schema Change:

# Revert convex/schema.ts to the previous definition, then redeploy
npx convex deploy

Fix Data Issues:

// Create correction migration
export const fixDataIssue = internalMutation({
  handler: async (ctx) => {
    // Fix problematic documents
  },
});
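
Correction migrations are usually a pure value transform applied via `ctx.db.patch`. The transform itself can be written and tested separately; a sketch for a hypothetical type mismatch where a string field was stored as a number in some documents:

```typescript
// Hypothetical: some documents stored `zipCode` as a number,
// losing leading zeros; normalize everything back to strings.
function normalizeZip(zip: string | number): string {
  return typeof zip === 'number' ? String(zip).padStart(5, '0') : zip;
}
```

Inside the migration, run the transform on each document and patch only when the value actually changed, to avoid unnecessary writes.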

CI/CD Integration

GitHub Actions

# .github/workflows/deploy.yml
name: Deploy to Convex

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install dependencies
        run: pnpm install

      - name: Deploy to Convex
        env:
          CONVEX_DEPLOY_KEY: ${{ secrets.CONVEX_DEPLOY_KEY }}
        run: npx convex deploy

Next Steps

Master schema management with these advanced guides:

  1. Performance - Optimize schema performance
  2. Security - Secure schema practices
  3. Monitoring - Monitor schema changes
  4. Advanced - Complex schema patterns

Proper schema management ensures your Solo Kit application can evolve safely and reliably as your requirements grow.
