# CLI & Migrations
The audit CLI generates the `audit_logs` migration, verifies your schema, exports log entries, and purges old data. It runs install-free via `npx`, so nothing is added to your package.json.
## Migration workflow

1. **Generate the `audit_logs` migration**

   The CLI auto-detects your ORM and dialect from `package.json` and `DATABASE_URL`.

   Drizzle:

   ```shell
   # Create an empty custom migration file for Drizzle Kit to manage
   npx drizzle-kit generate --custom --name=audit_logs --prefix=none
   # Fill it with the audit_logs DDL
   npx @usebetterdev/audit-cli migrate -o drizzle/_audit_logs.sql
   ```

   Prisma:

   ```shell
   # Create a draft migration without applying it
   npx prisma migrate dev --create-only --name audit_logs
   # Fill it with the audit_logs DDL
   npx @usebetterdev/audit-cli migrate \
     -o prisma/migrations/*_audit_logs/migration.sql
   ```
2. **Apply the migration**

   Drizzle:

   ```shell
   npx drizzle-kit migrate
   ```

   Prisma:

   ```shell
   npx prisma migrate dev
   ```
3. **Verify with `check`**

   ```shell
   npx @usebetterdev/audit-cli check --database-url $DATABASE_URL
   ```

   All checks should pass before wiring up the audit instance in your application.
## Commands

### migrate

Generates the `audit_logs` table DDL for your database dialect. Outputs SQL to a file or stdout.
| Flag | Default | Description |
|---|---|---|
| `-o, --output <path>` | — | File path or directory to write the SQL (omit to print to stdout). Supports glob patterns (e.g., `prisma/migrations/*_audit_logs/migration.sql`). When a directory is given, the CLI generates a timestamped filename. |
| `--adapter <orm>` | auto-detected | ORM to target: `drizzle` or `prisma`. |
| `--dialect <db>` | auto-detected | Database dialect: `postgres`, `mysql`, or `sqlite`. |
| `--dry-run` | `false` | Print SQL to stdout without writing a file. |
```shell
# Preview the SQL without writing a file
npx @usebetterdev/audit-cli migrate --dry-run

# Write to a specific directory
npx @usebetterdev/audit-cli migrate -o drizzle/

# Explicit adapter and dialect override
npx @usebetterdev/audit-cli migrate --adapter drizzle --dialect postgres -o ./migrations/
```

ORM and dialect are inferred from installed packages (`drizzle-orm`, `@prisma/client`) and the `DATABASE_URL` environment variable. Pass `--adapter` and `--dialect` explicitly to override. If auto-detection fails, the CLI exits with a clear error listing which flag to add.
### check

Connects to your database and verifies that the `audit_logs` table exists with the expected schema. Reports any missing columns or indexes.
| Flag | Default | Description |
|---|---|---|
| `--database-url <url>` | `$DATABASE_URL` | Connection string to your database. |
```shell
npx @usebetterdev/audit-cli check --database-url $DATABASE_URL
```

A passing run prints ✓ for each check. Any ✗ line identifies the missing column or index: re-run `migrate`, apply the migration, and run `check` again. See Quick Start for the full migration workflow.
### stats

Reports aggregate counts and storage size for the `audit_logs` table.
### purge

Deletes audit entries older than the configured retention period. Always run with `--dry-run` first to see how many rows would be deleted. See Retention Policies for scheduling and archiving strategies.
| Flag | Default | Description |
|---|---|---|
| `--database-url <url>` | `$DATABASE_URL` | Connection string to your database. |
| `--since <value>` | from config | Override the cutoff for this run. Accepts ISO dates (`2025-01-01`) or duration shorthands (`90d`, `4w`, `3m`, `1y`). |
| `--batch-size <n>` | `1000` | Rows per `DELETE` batch. |
| `--dry-run` | `false` | Print the number of rows that would be deleted without deleting anything. |
| `--yes` | `false` | Skip the confirmation prompt. Required for non-interactive use (CI, cron jobs). |
```shell
# See how many rows would be deleted (safe — no changes made)
npx @usebetterdev/audit-cli purge --dry-run --since 90d --database-url $DATABASE_URL

# Delete entries older than 90 days (prompts for confirmation)
npx @usebetterdev/audit-cli purge --since 90d --database-url $DATABASE_URL

# Non-interactive deletion (CI/cron — skips confirmation prompt)
npx @usebetterdev/audit-cli purge --since 90d --yes --database-url $DATABASE_URL
```
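The non-interactive form is the one you would schedule. A minimal crontab sketch, assuming a weekly retention run (the schedule, connection string, and log path are placeholders to adapt):

```shell
# m h dom mon dow  command
# Run the retention purge every Sunday at 03:00 (illustrative schedule).
# --yes is required because cron has no TTY for the confirmation prompt.
0 3 * * 0  DATABASE_URL=postgres://... npx @usebetterdev/audit-cli purge --since 90d --yes >> /var/log/audit-purge.log 2>&1
```

Redirecting stdout and stderr to a log file keeps a record of how many rows each run deleted.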
### export

Exports audit log entries to stdout in JSON or CSV format. Supports filtering by time range, table, actor, severity, and compliance tag. Use either `--since` (relative) or `--from`/`--to` (absolute range), not both.
| Flag | Default | Description |
|---|---|---|
| `--database-url <url>` | `$DATABASE_URL` | Connection string to your database. |
| `--format <fmt>` | `json` | Output format: `json` or `csv`. |
| `--since <duration>` | (none) | Relative time filter, e.g. `1h`, `7d`, `30d`. |
| `--from <iso>` | (none) | Absolute start timestamp (ISO 8601). |
| `--to <iso>` | (none) | Absolute end timestamp (ISO 8601). |
| `--table <name>` | (none) | Filter by table name. |
| `--actor <id>` | (none) | Filter by actor ID. |
| `--severity <level>` | (none) | Filter by severity: `low`, `medium`, `high`, `critical`. |
| `--compliance <tag>` | (none) | Filter by compliance tag, e.g. `gdpr`, `soc2`. |
| `-o, --output <path>` | — | File path to write output (omit to print to stdout). |
```shell
# Export all entries from the last hour as JSON
npx @usebetterdev/audit-cli export --since 1h --database-url $DATABASE_URL

# Export critical-severity entries from the last 7 days as CSV
npx @usebetterdev/audit-cli export \
  --since 7d \
  --severity critical \
  --format csv \
  --database-url $DATABASE_URL

# Export GDPR-tagged entries for a specific actor in a date range, saved to a file
npx @usebetterdev/audit-cli export \
  --from 2025-01-01T00:00:00Z \
  --to 2025-01-31T23:59:59Z \
  --compliance gdpr \
  --actor user-42 \
  --format json \
  -o audit-export.json \
  --database-url $DATABASE_URL
```
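JSON output pairs well with quick ad-hoc analysis. A minimal sketch, tallying entries by severity with only the standard library; it uses an inline sample in place of a real export, and the entry shape (a JSON array of objects with a `severity` field) is an assumption to verify against your actual output:

```shell
# Stand-in for a real export file; replace with the output of `export -o audit-export.json`.
cat > audit-export.json <<'EOF'
[{"severity":"critical"},{"severity":"low"},{"severity":"critical"}]
EOF

# Count entries per severity level.
python3 - <<'EOF'
import collections
import json

with open("audit-export.json") as f:
    entries = json.load(f)

for severity, count in collections.Counter(e["severity"] for e in entries).items():
    print(f"{severity}: {count}")
EOF
```

For larger exports the same loop works unchanged, since the file is read once and counted in memory.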