CLI & Migrations

The audit CLI generates the audit_logs migration, verifies your schema, exports log entries, and purges old data. It runs install-free via npx, so nothing is added to your package.json.

  1. Generate the audit_logs migration

    The CLI auto-detects your ORM and dialect from package.json and DATABASE_URL.

    ```sh
    # Create an empty custom migration file for Drizzle Kit to manage
    npx drizzle-kit generate --custom --name=audit_logs --prefix=none

    # Fill it with the audit_logs DDL
    npx @usebetterdev/audit-cli migrate -o drizzle/_audit_logs.sql
    ```
  2. Apply the migration

    ```sh
    npx drizzle-kit migrate
    ```
  3. Verify with check

    ```sh
    npx @usebetterdev/audit-cli check --database-url $DATABASE_URL
    ```

    All checks should pass before wiring up the audit instance in your application.

The migrate command generates the audit_logs table DDL for your database dialect and writes the SQL to a file or stdout.

| Flag | Default | Description |
| --- | --- | --- |
| -o, --output <path> | (stdout) | File path or directory to write the SQL (omit to print to stdout). Supports glob patterns (e.g., prisma/migrations/*_audit_logs/migration.sql). When a directory is given, the CLI generates a timestamped filename. |
| --adapter <orm> | auto-detected | ORM to target: drizzle or prisma. |
| --dialect <db> | auto-detected | Database dialect: postgres, mysql, or sqlite. |
| --dry-run | false | Print SQL to stdout without writing a file. |
```sh
# Preview the SQL without writing a file
npx @usebetterdev/audit-cli migrate --dry-run

# Write to a specific directory
npx @usebetterdev/audit-cli migrate -o drizzle/

# Explicit adapter and dialect override
npx @usebetterdev/audit-cli migrate --adapter drizzle --dialect postgres -o ./migrations/
```

ORM and dialect are inferred from installed packages (drizzle-orm, @prisma/client) and the DATABASE_URL environment variable. Pass --adapter and --dialect explicitly to override. If auto-detection fails, the CLI exits with a clear error listing which flag to add.
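In practice, exporting DATABASE_URL before running is usually enough for detection. A minimal sketch (the mysql:// URL is a placeholder, and inferring the dialect from the URL scheme is an assumption consistent with the description above):

```sh
# Placeholder connection string; the dialect is inferred from DATABASE_URL
export DATABASE_URL="mysql://user:pass@localhost:3306/app"
# No --dialect flag needed when detection succeeds
npx @usebetterdev/audit-cli migrate -o ./migrations/
```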

The check command connects to your database and verifies that the audit_logs table exists with the expected schema, reporting any missing columns or indexes.

| Flag | Default | Description |
| --- | --- | --- |
| --database-url <url> | $DATABASE_URL | Connection string to your database. |
```sh
npx @usebetterdev/audit-cli check --database-url $DATABASE_URL
```

A passing run prints a check mark for each item. Any failing line identifies the missing column or index: re-run migrate, apply the migration, and run check again. See Quick Start for the full migration workflow.
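Because check is read-only, it also works as a CI gate. The sketch below assumes the CLI follows the usual convention of exiting non-zero when a check fails (not confirmed on this page):

```sh
# Fail the pipeline on schema drift; assumes a non-zero exit code on failure
if npx @usebetterdev/audit-cli check --database-url "$DATABASE_URL"; then
  echo "audit schema OK"
else
  echo "audit schema drift detected" >&2
  exit 1
fi
```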

Reports aggregate counts and storage size for the audit_logs table.

The purge command deletes audit entries older than the configured retention period. Always run with --dry-run first to see how many rows would be deleted. See Retention Policies for scheduling and archiving strategies.

| Flag | Default | Description |
| --- | --- | --- |
| --database-url <url> | $DATABASE_URL | Connection string to your database. |
| --since <value> | from config | Override the cutoff for this run. Accepts ISO dates (2025-01-01) or duration shorthands (90d, 4w, 3m, 1y). |
| --batch-size <n> | 1000 | Rows per DELETE batch. |
| --dry-run | false | Print the number of rows that would be deleted without deleting anything. |
| --yes | false | Skip the confirmation prompt. Required for non-interactive use (CI, cron jobs). |
```sh
# See how many rows would be deleted (safe — no changes made)
npx @usebetterdev/audit-cli purge --dry-run --since 90d --database-url $DATABASE_URL

# Delete entries older than 90 days (prompts for confirmation)
npx @usebetterdev/audit-cli purge --since 90d --database-url $DATABASE_URL

# Non-interactive deletion (CI/cron — skips confirmation prompt)
npx @usebetterdev/audit-cli purge --since 90d --yes --database-url $DATABASE_URL
```
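For scheduled cleanup, the non-interactive form drops straight into cron. A sample crontab entry (the schedule and retention window are illustrative, and DATABASE_URL must be available in the cron environment):

```sh
# Purge entries older than 90 days every night at 02:00
0 2 * * * npx @usebetterdev/audit-cli purge --since 90d --yes --database-url "$DATABASE_URL"
```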

The export command writes audit log entries as JSON or CSV to stdout or, with -o, to a file. Entries can be filtered by time range, table, actor, severity, and compliance tag. Use either --since (relative) or --from/--to (absolute range), not both.

| Flag | Default | Description |
| --- | --- | --- |
| --database-url <url> | $DATABASE_URL | Connection string to your database. |
| --format <fmt> | json | Output format: json or csv. |
| --since <duration> | (none) | Relative time filter, e.g. 1h, 7d, 30d. |
| --from <iso> | (none) | Absolute start timestamp (ISO 8601). |
| --to <iso> | (none) | Absolute end timestamp (ISO 8601). |
| --table <name> | (none) | Filter by table name. |
| --actor <id> | (none) | Filter by actor ID. |
| --severity <level> | (none) | Filter by severity: low, medium, high, critical. |
| --compliance <tag> | (none) | Filter by compliance tag, e.g. gdpr, soc2. |
| -o, --output <path> | (stdout) | File path to write output (omit to print to stdout). |
```sh
# Export all entries from the last hour as JSON
npx @usebetterdev/audit-cli export --since 1h --database-url $DATABASE_URL

# Export critical-severity entries from the last 7 days as CSV
npx @usebetterdev/audit-cli export \
  --since 7d \
  --severity critical \
  --format csv \
  --database-url $DATABASE_URL

# Export GDPR-tagged entries for a specific actor in a date range, saved to a file
npx @usebetterdev/audit-cli export \
  --from 2025-01-01T00:00:00Z \
  --to 2025-01-31T23:59:59Z \
  --compliance gdpr \
  --actor user-42 \
  --format json \
  -o audit-export.json \
  --database-url $DATABASE_URL
```
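JSON output on stdout pipes straight into tools like jq for quick aggregation. The entry shape used below (an array of objects with a severity field) is an assumption; echoed sample data stands in for real export output:

```sh
# Count entries per severity; in practice, replace the echo with
#   npx @usebetterdev/audit-cli export --since 7d --format json --database-url $DATABASE_URL
echo '[{"severity":"high"},{"severity":"critical"},{"severity":"high"}]' |
  jq -c 'group_by(.severity) | map({severity: .[0].severity, count: length})'
# → [{"severity":"critical","count":1},{"severity":"high","count":2}]
```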