Database Migration Guidelines
This document is the single source of truth for all database migration procedures, rules, and best practices for the Family Shapes project. All database schema changes must follow these guidelines to ensure consistent, reliable, and safe database evolution.
Note: This document consolidates all migration-related guidance. AI assistants and developers should refer to this document for all migration-related questions.
Core Principles
- Migrations as Source of Truth: All database schema changes must be made through migrations in the `supabase/migrations/` directory.
- No Direct Production Edits: Never modify the production database schema directly through the Supabase Dashboard.
- Verification Required: All migrations must be verified locally before applying to production.
- Type Safety: TypeScript types must be regenerated after schema changes.
- Alignment with Roadmap: All database changes must align with the current phase and priorities in the ROADMAP.md.
Migration Workflow
1. Preparation
```bash
# Ensure environment is set up
# Create .env file at repo root with:
SUPABASE_PROJECT_REF=nhkufibfwskdpzdjwirr
SUPABASE_DB_PASSWORD=your-remote-db-password
# Reset local database to match migrations
make db/reset
# Verify no drift exists
make db/check   # Should show: ✅ No drift.
```

2. Check Project Priorities
Before creating any database migrations:
Review the ROADMAP.md file, focusing on:
- The "Immediate Priorities" section
- The "Key Milestones" table
- The current development phase
Verify that your database changes:
- Support tasks in the immediate priorities
- Align with the current development phase
- Don't conflict with planned architecture changes
3. Creating Migrations
```bash
# After making local schema changes
make db/diff name=0002_short_description
# Verify migration
make db/reset
make db/check
```

4. Updating Types

```bash
# Regenerate TypeScript types
make db/types
```

5. Applying to Production

```bash
# Only after PR approval and CI passing
make db/push
```

Migration Guidelines
DO
- Use descriptive migration names with sequential numbering
- Include comments in SQL explaining complex changes
- Test migrations locally before committing
- Update TypeScript types after schema changes
- Use Row Level Security (RLS) policies for all tables
- Include rollback procedures for complex migrations
DON'T
- Modify the baseline migration (`0001_baseline_prod_schema.sql`)
- Edit production schema directly in the Supabase Dashboard
- Create migrations outside the `supabase/migrations/` directory
- Commit sensitive data or credentials
- Make breaking changes without migration plans
Critical Timestamp Rules
- ALWAYS use the CURRENT timestamp when creating migrations manually
- NEVER use past or future dates - migrations must be in chronological order
- If creating a migration manually, use `date +%Y%m%d%H%M%S` to get the current timestamp
- Migration order is determined by timestamp, NOT by the number in the filename
- Example: `20250917124423_0006_fix_something.sql` (September 17, 2025, 12:44:23)
Row Level Security
All tables should have appropriate RLS policies:
```sql
-- Example RLS policy for organization-scoped data
CREATE POLICY "Users can view their organization's data"
ON table_name
FOR SELECT
USING (
auth.uid() IN (
SELECT user_id FROM organization_members
WHERE organization_id = table_name.organization_id
)
);
```

Common Migration Patterns
Adding a New Table
```sql
-- Create the table
CREATE TABLE public.new_table (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
created_at TIMESTAMPTZ DEFAULT now(),
updated_at TIMESTAMPTZ DEFAULT now(),
name TEXT NOT NULL,
description TEXT,
user_id UUID REFERENCES auth.users(id) ON DELETE CASCADE,
organization_id UUID REFERENCES public.organizations(id) ON DELETE CASCADE
);
-- Add RLS policies
ALTER TABLE public.new_table ENABLE ROW LEVEL SECURITY;
-- Add policies
CREATE POLICY "Users can view their organization's data"
ON public.new_table FOR SELECT USING (
auth.uid() IN (
SELECT user_id FROM organization_members
WHERE organization_id = new_table.organization_id
)
);
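-- The trigger below assumes a set_updated_at() helper function already exists
-- in the schema. If it does not, a minimal sketch of such a function is:
CREATE OR REPLACE FUNCTION public.set_updated_at()
RETURNS trigger AS $$
BEGIN
  NEW.updated_at = now();
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;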
-- Add triggers
CREATE TRIGGER set_updated_at
BEFORE UPDATE ON public.new_table
FOR EACH ROW
EXECUTE FUNCTION public.set_updated_at();
```

Modifying an Existing Table

```sql
-- Add a column
ALTER TABLE public.existing_table ADD COLUMN new_column TEXT;
-- Modify a column
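-- Backfill first if existing rows may contain NULLs, otherwise SET NOT NULL fails
-- (the 'default' value below is an illustrative placeholder, not part of the schema)
UPDATE public.existing_table SET existing_column = 'default' WHERE existing_column IS NULL;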
ALTER TABLE public.existing_table ALTER COLUMN existing_column SET NOT NULL;
-- Add a constraint
ALTER TABLE public.existing_table ADD CONSTRAINT unique_name UNIQUE (name);
```

Creating Functions and Triggers

```sql
-- Create a function
CREATE OR REPLACE FUNCTION public.function_name()
RETURNS trigger AS $$
BEGIN
-- Function logic
RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
-- Create a trigger
CREATE TRIGGER trigger_name
AFTER INSERT ON public.table_name
FOR EACH ROW
EXECUTE FUNCTION public.function_name();
```

Environment Setup
Create a local `.env` file at the repo root (never committed):

```env
SUPABASE_PROJECT_REF=nhkufibfwskdpzdjwirr
SUPABASE_DB_PASSWORD=your-remote-db-password
# Optional: quiet logs for CI-like runs
# SUPABASE_CI=1
```

Requirements:
- Supabase CLI must be installed
- Project should be linked (the scripts auto-link if needed):

```bash
supabase link --project-ref $SUPABASE_PROJECT_REF
```
Available Commands
Always use the Makefile targets; do not call raw CLI unless noted.
```bash
make # list all DB targets
make db/status # local stack status
make db/reset # rebuild local from migrations (+ seed.sql if present)
make db/diff name=0002_short_description # create a new migration from local changes vs PROD
make db/types # regenerate TS types to src/types/database.ts
make db/check # ensure no drift vs PROD (non-interactive)
make db/push   # apply migrations to PROD (only after PR review / CI green)
```

Notes:
- `db/diff` generates `YYYYMMDDHHMMSS_name.sql` under `supabase/migrations/`
- `db/reset` applies all migrations to a clean local DB and (optionally) `supabase/seed.sql` if present
- `db/check` fails if migrations and the linked remote schema differ
Prohibited Actions
- Editing production schema directly in the Supabase Dashboard
- Modifying or deleting the baseline migration: `*_0001_baseline_prod_schema.sql`
- Writing schema migrations outside `supabase/migrations/`
- Committing `.env` or any secrets (keys/tokens/passwords)
File Conventions & Layout
- Migrations live in `supabase/migrations/` and are timestamped by the CLI
- Baseline already exists: `supabase/migrations/20250806171912_0001_baseline_prod_schema.sql`. Do not edit this file. New work starts at `0002_*`
- Archived pre-baseline files live in `supabase/_archived_migrations/`. Do not move them back into the loader path
- Seeds (optional, idempotent test data) belong in `supabase/seed.sql` (see the sketch after this list)
- Backups (local-only) live in `/backup/` and are gitignored
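Seed data should be idempotent so that repeated `make db/reset` runs stay clean. A minimal sketch of such a seed, where the seeded table columns and values are illustrative rather than taken from the actual schema:

```sql
-- supabase/seed.sql (illustrative)
-- Idempotent: re-running the seed does not duplicate rows.
INSERT INTO public.organizations (id, name)
VALUES ('00000000-0000-0000-0000-000000000001', 'Test Organization')
ON CONFLICT (id) DO NOTHING;
```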
Complete Migration Checklist
Pre-Migration Checklist
Before creating any database migration, complete these steps:
✅ 1. Verify Alignment with Roadmap
- [ ] Reviewed ROADMAP.md "Immediate Priorities" section
- [ ] Confirmed this migration supports a current priority task
- [ ] Verified migration aligns with current development phase
- [ ] Checked for dependencies on other migrations
- [ ] Confirmed no conflicts with planned architecture changes
✅ 2. Verify Clean Database State
```bash
# Reset local database to match migrations
make db/reset
# Verify no drift exists
make db/check   # Should show: ✅ No drift.
```

- [ ] `make db/reset` completed successfully
- [ ] `make db/check` shows no drift
- [ ] Local database matches production schema
- [ ] No uncommitted migrations exist
✅ 3. Verify Environment Configuration
- [ ] `.env` file exists at repository root
- [ ] `SUPABASE_PROJECT_REF=nhkufibfwskdpzdjwirr` is set
- [ ] `SUPABASE_DB_PASSWORD` is set correctly
- [ ] Can connect to local Supabase instance
- [ ] Can connect to remote Supabase instance
Migration Creation Checklist
✅ 4. Create Migration with Descriptive Name
```bash
# Use sequential numbering and short description
make db/diff name=0002_short_description
```

- [ ] Migration name follows pattern: `NNNN_short_description`
- [ ] Description is clear and concise
- [ ] Sequential number is correct (check existing migrations)
- [ ] Migration file created in `supabase/migrations/`
✅ 5. Review Generated Migration SQL
- [ ] SQL syntax is correct
- [ ] All dependencies exist (referenced tables/columns)
- [ ] No conflicts with existing objects
- [ ] Includes appropriate comments explaining complex changes
- [ ] Follows PostgreSQL best practices
✅ 6. Add Row Level Security (RLS) Policies
If creating or modifying tables:
```sql
-- Enable RLS
ALTER TABLE public.table_name ENABLE ROW LEVEL SECURITY;
-- Add policies for each operation
CREATE POLICY "policy_name_select"
ON public.table_name FOR SELECT
USING (/* condition */);
CREATE POLICY "policy_name_insert"
ON public.table_name FOR INSERT
WITH CHECK (/* condition */);
CREATE POLICY "policy_name_update"
ON public.table_name FOR UPDATE
USING (/* condition */)
WITH CHECK (/* condition */);
CREATE POLICY "policy_name_delete"
ON public.table_name FOR DELETE
USING (/* condition */);
```

- [ ] RLS is enabled on all new tables
- [ ] SELECT policy exists
- [ ] INSERT policy exists (if applicable)
- [ ] UPDATE policy exists (if applicable)
- [ ] DELETE policy exists (if applicable)
- [ ] Policies enforce proper data isolation
- [ ] Policies align with organization/user permissions
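As a concrete instance of the pattern above, an organization-scoped INSERT policy might look like the following sketch (the table and membership lookup follow the examples used elsewhere in this document):

```sql
-- Sketch: only members of the row's organization may insert it.
CREATE POLICY "org_members_can_insert"
ON public.table_name FOR INSERT
WITH CHECK (
  auth.uid() IN (
    SELECT user_id FROM public.organization_members
    WHERE organization_id = table_name.organization_id
  )
);
```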
✅ 7. Add Triggers and Functions
If needed:
```sql
-- Create function
CREATE OR REPLACE FUNCTION public.function_name()
RETURNS trigger AS $$
BEGIN
-- Function logic
RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
-- Create trigger
CREATE TRIGGER trigger_name
AFTER INSERT ON public.table_name
FOR EACH ROW
EXECUTE FUNCTION public.function_name();
```

- [ ] Functions have clear names
- [ ] Functions include error handling (see the sketch after this list)
- [ ] Functions use `SECURITY DEFINER` appropriately
- [ ] Triggers are on correct events (INSERT/UPDATE/DELETE)
- [ ] Triggers are BEFORE or AFTER as appropriate
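The error-handling item is easiest to see in an example. A minimal sketch, where the function name, audit table, and event label are illustrative and not part of the actual schema:

```sql
-- Illustrative only: log the new row, but never abort the original write.
CREATE OR REPLACE FUNCTION public.log_member_insert()
RETURNS trigger AS $$
BEGIN
  INSERT INTO public.audit_log (event, payload)
  VALUES ('member_insert', to_jsonb(NEW));
  RETURN NEW;
EXCEPTION
  WHEN OTHERS THEN
    -- Surface the problem without failing the triggering statement.
    RAISE WARNING 'log_member_insert failed: %', SQLERRM;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
```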
✅ 8. Add Indexes for Performance
```sql
-- Add indexes for foreign keys and frequently queried columns
CREATE INDEX idx_table_column ON public.table_name(column_name);
CREATE INDEX idx_table_fk ON public.table_name(foreign_key_id);
```

- [ ] Indexes added for foreign keys
- [ ] Indexes added for frequently queried columns
- [ ] Composite indexes considered for multi-column queries
- [ ] Index names follow convention: `idx_table_column`
Testing Checklist
✅ 9. Test Migration Locally
```bash
# Reset and apply migration
make db/reset
# Verify no drift
make db/check
```

- [ ] Migration applies without errors
- [ ] No drift detected after migration
- [ ] All tables created successfully
- [ ] All policies created successfully
- [ ] All functions/triggers created successfully
✅ 10. Test RLS Policies
```sql
-- Test as different users
SET ROLE authenticated;
SET request.jwt.claims TO '{"sub": "user-id"}';
-- Try SELECT, INSERT, UPDATE, DELETE
SELECT * FROM public.table_name;
INSERT INTO public.table_name (...) VALUES (...);
UPDATE public.table_name SET ... WHERE ...;
DELETE FROM public.table_name WHERE ...;
```

- [ ] Policies allow authorized operations
- [ ] Policies block unauthorized operations
- [ ] Policies enforce data isolation correctly
- [ ] No security vulnerabilities identified
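When the checks above are done, the session can be returned to its original context; a minimal sketch:

```sql
-- Undo the role and claim overrides used during testing.
RESET ROLE;
RESET request.jwt.claims;
```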
✅ 11. Update TypeScript Types
```bash
# Regenerate types after schema changes
make db/types
```

- [ ] Types generated successfully
- [ ] New types appear in `src/integrations/supabase/types.ts`
- [ ] Types match database schema
- [ ] No TypeScript errors in codebase
✅ 12. Test Application Integration
- [ ] Application starts without errors
- [ ] New tables/columns are accessible
- [ ] RLS policies work in application context
- [ ] No breaking changes to existing features
- [ ] All related services updated
Documentation Checklist
✅ 13. Document Migration
- [ ] Add comments in SQL explaining complex logic
- [ ] Document breaking changes in migration file
- [ ] Update CHANGELOG.md with changes
- [ ] Update relevant documentation in
DOCS/ - [ ] Document any manual steps required
✅ 14. Document Rollback Procedure
If migration is complex or risky:
```sql
-- Rollback procedure (in comments)
-- 1. Drop new table: DROP TABLE public.new_table;
-- 2. Restore old column: ALTER TABLE public.old_table ADD COLUMN old_column TEXT;
-- 3. Migrate data back: UPDATE public.old_table SET old_column = ...;
```

- [ ] Rollback steps documented
- [ ] Rollback tested locally
- [ ] Data migration rollback considered
- [ ] Backup strategy documented
Pre-Production Checklist
✅ 15. Final Verification
- [ ] All tests pass (`npm run test`)
- [ ] E2E tests pass (`npx playwright test`)
- [ ] No console errors in application
- [ ] Migration reviewed by another developer
- [ ] PR approved and merged
✅ 16. Production Deployment
```bash
# Only after PR approval and CI passing
make db/push
```

- [ ] PR is approved
- [ ] CI/CD pipeline passes
- [ ] Backup of production database taken
- [ ] Migration applied to production
- [ ] Production application tested
- [ ] No errors in production logs
Pull Request Checklist
When submitting a PR with database changes, ensure:
- [ ] New migration added under `supabase/migrations/` with a clear name
- [ ] Local rebuild OK: `make db/reset`
- [ ] Drift check OK: `make db/check` prints "No drift"
- [ ] Types updated if relevant: `make db/types`
- [ ] PR description summarizes schema/RLS/trigger changes, indexes, and any rollout or backfill considerations
- [ ] Changes align with current phase in ROADMAP.md
- [ ] Changes support tasks in the "Immediate Priorities" section of ROADMAP.md (if applicable)
Additional Migration Patterns
Renaming Tables/Columns
```sql
-- Rename table
ALTER TABLE public.old_name RENAME TO new_name;
-- Rename column
ALTER TABLE public.table_name RENAME COLUMN old_column TO new_column;
-- Update policies (they reference table/column names)
DROP POLICY IF EXISTS "old_policy_name" ON public.new_name;
CREATE POLICY "new_policy_name" ON public.new_name FOR SELECT USING (...);
```

Data Migration

```sql
-- Backfill data for new column
UPDATE public.table_name
SET new_column = CASE
WHEN condition THEN 'value1'
ELSE 'value2'
END
WHERE new_column IS NULL;
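-- Assumption, not from the original guidelines: for large tables, consider
-- backfilling in batches to limit lock time (batch size and values are illustrative).
DO $$
DECLARE
  rows_updated integer;
BEGIN
  LOOP
    UPDATE public.table_name
    SET new_column = 'value2'
    WHERE ctid IN (
      SELECT ctid FROM public.table_name
      WHERE new_column IS NULL
      LIMIT 10000
    );
    GET DIAGNOSTICS rows_updated = ROW_COUNT;
    EXIT WHEN rows_updated = 0;
  END LOOP;
END $$;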
-- Migrate data between tables
INSERT INTO public.new_table (id, name, created_at)
SELECT id, name, created_at FROM public.old_table;
```

Adding Indexes

```sql
-- Add indexes for foreign keys and frequently queried columns
CREATE INDEX idx_table_column ON public.table_name(column_name);
CREATE INDEX idx_table_fk ON public.table_name(foreign_key_id);
-- Composite index for multi-column queries
CREATE INDEX idx_table_composite ON public.table_name(column1, column2);
```

Troubleshooting
Drift Detection Issues
If `make db/check` shows drift:
- Verify your local database is reset: `make db/reset`
- Check for uncommitted migrations
- Ensure environment variables are correct
- Check for direct production changes (should never happen)
Drift check hangs or prompts:
- Ensure `.env` has exact keys (no quotes, no trailing spaces): `SUPABASE_PROJECT_REF=...` and `SUPABASE_DB_PASSWORD=...`
- Re-run: `make db/check`
- One-off bypass push (local only): `git push --no-verify`, then fix the cause
Migration Failures
If migrations fail to apply:
- Check SQL syntax
- Verify dependencies (referenced tables/columns exist)
- Check for conflicts with existing objects
- Review error messages in logs
- Test migration in isolation
RLS Policy Issues
If policies don't work as expected:
- Verify RLS is enabled: `ALTER TABLE ... ENABLE ROW LEVEL SECURITY;`
- Test policies with different user contexts
- Check policy conditions are correct
- Verify `auth.uid()` is available in context
- Check for policy conflicts (multiple permissive policies combine with OR; see the sketch below)
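The last point is worth illustrating. In the sketch below, two permissive SELECT policies exist on a hypothetical table, so a row is visible whenever either condition passes; an overly broad second policy silently widens the access granted by the first:

```sql
-- Hypothetical table: a row is readable if EITHER policy's USING clause is true.
CREATE POLICY "owners_can_select"
ON public.documents FOR SELECT
USING (auth.uid() = owner_id);

CREATE POLICY "org_members_can_select"
ON public.documents FOR SELECT
USING (
  auth.uid() IN (
    SELECT user_id FROM public.organization_members
    WHERE organization_id = documents.organization_id
  )
);
```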
Local Postgres Crash-Loop After CLI Update
```bash
supabase stop
# Remove ONLY local dev volumes for this project (safe for PROD)
docker volume rm $(docker volume ls -q --filter label=com.supabase.cli.project=nhkufibfwskdpzdjwirr)
supabase start
```

Pre-push Hook Blocks Push
- Use `git push --no-verify` once, then run `make db/check` and resolve
Security Issues
- Never commit secrets or `.env`. Use GitHub Actions secrets for CI
- If keys are exposed publicly, rotate them in the Supabase dashboard and update `.env`
Standard Agent Flow (Quick Reference)
- Ensure `.env` exists and contains `SUPABASE_PROJECT_REF` (+ `SUPABASE_DB_PASSWORD` if needed)
- Sync local DB to migrations:

```bash
make db/reset
make db/check   # should end with: ✅ No drift.
```

- Implement schema changes locally (SQL, RLS, triggers, functions)
- Create a migration and verify:

```bash
make db/diff name=0002_<short_description>
make db/reset
make db/check
```

- Regenerate types if tables/views/functions used by the app changed:

```bash
make db/types
```

- Commit and open a PR with the checklist above
- Apply to production only after PR is approved and CI is green:

```bash
make db/push
```
Resources
- Supabase CLI Documentation
- PostgreSQL Documentation
- Row Level Security Guide
- Project Priorities Guidelines
- ROADMAP.md - Check "Immediate Priorities" before creating migrations
- CHANGELOG.md - Document all migration changes here