Adding a new column sounds trivial, but done wrong it can take down production. Whether you use PostgreSQL, MySQL, or another SQL database, schema changes require planning: a column addition executed without care can hold an exclusive lock for the duration of a full table rewrite, blocking every write. In a high-traffic system, that's downtime.
First, check whether your database can add a column with a default value without rewriting the table (PostgreSQL 11+ does this for constant defaults; MySQL 8.0 supports it via instant DDL). If not, split the change into two steps: add the column without a default, then backfill the data in batches, keeping each transaction short so row locks stay brief. Always run migrations in a staging environment that mirrors production, and test against full-size datasets, not subsets; a migration that finishes instantly on ten thousand rows can take hours on a hundred million.
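The two-step approach can be sketched as follows. This is a minimal PostgreSQL-flavored sketch; the table `orders` and column `fulfillment_status` are hypothetical stand-ins for your own schema, and the batch size is a tuning knob, not a recommendation.

```sql
-- Step 1: add the column with no default. On modern PostgreSQL this is a
-- metadata-only change that needs only a brief lock.
ALTER TABLE orders ADD COLUMN fulfillment_status text;

-- Step 2: backfill in small batches so each UPDATE holds row locks
-- only briefly. Run repeatedly until it reports 0 rows affected.
UPDATE orders
SET    fulfillment_status = 'unknown'
WHERE  id IN (
    SELECT id
    FROM   orders
    WHERE  fulfillment_status IS NULL
    LIMIT  10000
);
```

In practice the batch loop lives in a migration script that sleeps between iterations, which gives autovacuum and replication a chance to keep up.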
When naming the new column, follow your established conventions: keep names short but descriptive, and align the data type with intended usage. For numeric data, pick the smallest type that fits the expected range. For strings, add an index only if query patterns require it; indexes speed up reads but slow every write.
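If an index does turn out to be necessary, it can be built without blocking writes. The sketch below assumes PostgreSQL and the same hypothetical `orders` table as above; MySQL's online DDL offers an analogous non-blocking build.

```sql
-- CONCURRENTLY builds the index without taking a write-blocking lock.
-- It cannot run inside a transaction block, and a failed build leaves
-- an INVALID index behind that must be dropped and retried.
CREATE INDEX CONCURRENTLY idx_orders_fulfillment_status
    ON orders (fulfillment_status);
```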
Deploy migrations during low-traffic windows when possible. If you use feature flags, ship application code that tolerates NULL in the new column before the backfill begins. Once the data is fully populated, update the application to treat the column as required, and enforce that at the schema level.
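The final tightening step might look like this, continuing the hypothetical `orders` example. On PostgreSQL, `SET NOT NULL` normally scans the whole table; adding a `NOT VALID` check constraint first and validating it separately keeps each lock short.

```sql
-- Add the constraint without validating existing rows (brief lock only),
-- then validate in a separate statement that takes a weaker lock.
ALTER TABLE orders
    ADD CONSTRAINT orders_fulfillment_status_not_null
    CHECK (fulfillment_status IS NOT NULL) NOT VALID;

ALTER TABLE orders
    VALIDATE CONSTRAINT orders_fulfillment_status_not_null;

-- With a validated constraint in place (PostgreSQL 12+), SET NOT NULL
-- skips the table scan. Set the default for future inserts as well.
ALTER TABLE orders
    ALTER COLUMN fulfillment_status SET DEFAULT 'unknown',
    ALTER COLUMN fulfillment_status SET NOT NULL;
```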