Generate Zero schema from database
#!/usr/bin/env node
/**
* @file db-to-zero
* @copyright 2024 Brandon Kalinowski
* @description Generate Zero Schema direct from Postgres (or any ORM) via atlas
*
* NOTE: The tool expects tables to have a comment zero-gen=true. This is a declarative way to
* only include a subset of tables in the Zero schema
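*
* For example, a table can be opted in with a Postgres table comment (the table
* name below is hypothetical): `COMMENT ON TABLE public.users IS 'zero-gen=true';`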
*
* Declarative: Similar to Terraform, Atlas compares the current state of the database
* to the desired state, as defined in an HCL, SQL, or ORM schema.
* Based on this comparison, it generates and executes a migration plan to
* transition the database to its desired state.
* @see https://atlasgo.io/atlas-schema/hcl
* @see https://atlasgo.io/orms/why-atlas-for-your-orm
*
* @example
* ```bash
* atlas schema inspect -u $ZERO_UPSTREAM_DB --schema public --format '{{ json . }}' | \
* node ./scripts/db-to-zero.js | \
* biome format --stdin-file-path=src/generated-zero-schema.ts > src/generated-zero-schema.ts
* ```
*/
const USAGE_DOCS = true;
/**
* @typedef {'string' | 'number' | 'boolean' | 'json'} ColumnType
*/
/**
* @typedef {import('@rocicorp/zero').Schema} Schema
*/
/**
* @typedef {Object} SchemaValue
* @property {ColumnType} type
* @property {boolean} [optional]
* @property {unknown} [customType]
*/
/**
* @typedef {Object} PostgresSchema
* @property {Array<{
* name: string,
* tables: Array<{
* name: string,
* comment?: string,
* columns: Array<{
* name: string,
* type: string,
* null?: boolean
* }>,
* primary_key: {
* parts: Array<{ column: string }>
* },
* foreign_keys?: Array<{
* columns: string[],
* references: {
* table: string,
* columns: string[]
* }
* }>
* }>
* }>} schemas
*/
/**
* @typedef {Object} FieldRelationship
* @property {string} sourceField - The source field name
* @property {string} destField - The destination field name
* @property {function(): Schema} destSchema - Function that returns the destination schema
*/
// Read from stdin
let inputData = "";
process.stdin.setEncoding("utf-8");
process.stdin.on("data", (chunk) => {
inputData += chunk;
});
process.stdin.on("end", () => {
try {
main(inputData);
} catch (error) {
console.error("Error:", error.message);
process.exit(1);
}
});
/**
* Convert Postgres type to schema value
* @param {string} pgType
* @returns {SchemaValue}
*/
function convertType(pgType) {
// Handle array types
if (pgType === "ARRAY") {
return { type: "json", customType: [] };
}
// Handle JSON types
if (pgType === "jsonb" || pgType === "json") {
return { type: "json" };
}
// Handle numeric types
if (
pgType === "integer" ||
pgType === "numeric" ||
pgType === "bigint" ||
pgType === "smallint"
) {
return { type: "number" };
}
// Handle boolean
if (pgType === "boolean") {
return { type: "boolean" };
}
// Handle text/varchar types
if (pgType.includes("character varying") || pgType.includes("text")) {
return { type: "string" };
}
// Handle timestamp
if (pgType.includes("timestamp")) {
return { type: "string" }; // Store timestamps as ISO strings
}
// Default to string for unknown types
console.warn(`Unknown type: ${pgType}, defaulting to string`);
return { type: "string" };
}
/**
* Main function to process the schema
* @param {string} input
*/
function main(input) {
/** @type {PostgresSchema} */
const schemaJson = JSON.parse(input);
// Find the public schema
const publicSchema = schemaJson.schemas.find((s) => s.name === "public");
if (!publicSchema) {
throw new Error("Public schema not found");
}
// Generate table schemas
/** @type {Record<string, unknown>} */
const tableSchemas = {};
const combinedSchema = {
version: 1,
tables: {},
};
const defaultPermissions = {
row: {
select: [],
insert: [],
update: { preMutation: [] },
delete: [],
},
};
/** @type {Record<string, unknown>} */
const permissionsOutput = {};
// First pass: Create basic table schemas
for (const table of publicSchema.tables) {
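// Skip tables that have not opted in via a "zero-gen=" table comment (see file header)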
if (!table.comment || !table.comment.includes("zero-gen=")) continue;
/** @type {Record<string, SchemaValue>} */
const columns = {};
// Process columns
for (const column of table.columns) {
const typeInfo = convertType(column.type);
columns[column.name] = {
...typeInfo,
optional: column.null === true,
};
}
/** @type {Record<string, FieldRelationship>} */
const relationships = {};
let hasRelationships = false;
const primaryKey = table.primary_key.parts.map((v) => v.column);
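// Translate foreign keys into Zero relationships. The 🎾 markers are placeholders so the
// destSchema arrow function survives JSON.stringify and is later emitted as code, not a string.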
if (table.foreign_keys) {
for (const fkey of table.foreign_keys) {
const key_name = fkey.columns[0];
if (key_name) {
hasRelationships = true;
relationships[key_name] = {
sourceField: fkey.columns[0],
destField: fkey.references.columns[0],
destSchema: `🎾() => ${fkey.references.table}Schema🎾`,
};
}
}
}
// Create table schema
tableSchemas[table.name] = {
tableName: table.name,
primaryKey: primaryKey.length === 1 ? primaryKey[0] : primaryKey,
columns,
relationships: hasRelationships ? relationships : undefined,
};
combinedSchema.tables[table.name] = `🎾${table.name}Schema🎾`;
permissionsOutput[table.name] = defaultPermissions;
}
// Generate TypeScript code
const output = `// Generated by scripts/db-to-zero.js
import {
createTableSchema,
definePermissions,
NOBODY_CAN,
ANYONE_CAN,
type Schema,
type TableSchema,
type ExpressionBuilder,
type Row
} from '@rocicorp/zero';
// Auth data type from JWT
type AuthData = {
sub: string;
roles?: string[];
};
${Object.entries(tableSchemas)
.map(
([name, schema]) =>
`/**
* Schema for the ${name} table
*/
export const ${name}Schema = createTableSchema(${JSON.stringify(schema, null, 2)});`,
)
.join("\n\n")}
/**
* Combined schema containing all tables
*/
export const schema: Schema = ${JSON.stringify(combinedSchema, null, 2)};
// Define permissions
export const permissions = definePermissions<AuthData, typeof schema>(schema, () => {
return ${JSON.stringify(permissionsOutput, null, 2)};
});
// Export types
export type { AuthData };
`
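// Strip the 🎾 sentinel quotes so destSchema and table references are emitted as code rather than strings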
.replaceAll(`"🎾`, "")
.replaceAll(`🎾"`, "");
// Write to stdout
process.stdout.write(output);
}
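
For orientation, here is a rough sketch of what the generated `src/generated-zero-schema.ts` might contain for a single hypothetical `users` table (table and column names are illustrative, and the permissions block and unused type imports are trimmed; the real output comes from the template above):

```ts
// Sketch only: approximate generator output for one hypothetical "users" table.
import { createTableSchema, type Schema } from "@rocicorp/zero";

/**
 * Schema for the users table
 */
export const usersSchema = createTableSchema({
  tableName: "users",
  primaryKey: "id",
  columns: {
    id: { type: "string", optional: false },
    email: { type: "string", optional: false },
    created_at: { type: "string", optional: true }, // timestamps are emitted as ISO strings
  },
});

/**
 * Combined schema containing all tables
 */
export const schema: Schema = {
  version: 1,
  tables: {
    users: usersSchema,
  },
};
```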

Inkjet Executable Markdown

https://github.com/brandonkal/inkjet

## gen//default

Generate Zero schema from upstream database

```bash
atlas schema inspect -u $ZERO_UPSTREAM_DB --schema public --format '{{ json . }}' | \
  node ./scripts/db-to-zero.js | \
  biome format --stdin-file-path=src/generated-zero-schema.ts > src/generated-zero-schema.ts
```

## atlas

Generate Atlas schema JSON for development

```bash
atlas schema inspect -u $ZERO_UPSTREAM_DB --schema public --format '{{ json . }}'
```
@brandonkal (Author):
This approach uses Atlas to introspect the database state. If you prefer, you can point Atlas at any ORM instead. The generated permissions default to NOBODY_CAN (empty rule arrays), which is the saner default; update the arrays to grant access where needed.
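
For example, to allow everyone to read a table while keeping writes closed, the generated arrays can be swapped for the ANYONE_CAN / NOBODY_CAN constants the file already imports. A minimal sketch, assuming a hypothetical `users` table:

```ts
// Hand-edited permissions in generated-zero-schema.ts (sketch; "users" is a hypothetical table).
export const permissions = definePermissions<AuthData, typeof schema>(schema, () => {
  return {
    users: {
      row: {
        select: ANYONE_CAN, // anyone may read rows
        insert: NOBODY_CAN, // writes stay closed until real rules are written
        update: { preMutation: NOBODY_CAN },
        delete: NOBODY_CAN,
      },
    },
  };
});
```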
