Compare commits

...

103 commits

Author SHA1 Message Date
52844d7a09 ci(mysql): add @emigrate/mysql integration tests to GitHub Actions
Some checks failed
Deploy to GitHub Pages / build (push) Failing after 2m38s
Deploy to GitHub Pages / deploy (push) Has been skipped
Integration Tests / Emigrate MySQL integration tests (push) Failing after 4m0s
Release / Release (push) Failing after 12s
CI / Build and Test (push) Has been cancelled
2025-04-25 09:48:34 +02:00
github-actions[bot]
fa3fb20dc5 chore(release): version packages 2025-04-24 16:06:29 +02:00
26240f49ff fix(mysql): make sure migrations are run in order when run concurrently
Now we either lock all of the pending migrations or none of them,
to make sure they are not run out of order when multiple instances of Emigrate run concurrently.
2025-04-24 15:57:44 +02:00
6eb60177c5 fix: use another changesets-action version 2024-08-09 16:03:34 +02:00
b3b603b2fc feat: make aggregated GitHub releases instead of one per package
And also publish packages with unreleased changes tagged with `next` to NPM
2024-08-09 15:49:22 +02:00
bb9d674cd7 chore: turn off Turbo's UI as it messes with the terminal and is not as intuitive as it seems 2024-06-27 16:05:45 +02:00
c151031d41 chore(deps): upgrade Turbo and opt out from telemetry 2024-06-27 16:05:45 +02:00
dependabot[bot]
48181d88b7 chore(deps): bump turbo from 1.10.16 to 2.0.5
Bumps [turbo](https://github.com/vercel/turbo) from 1.10.16 to 2.0.5.
- [Release notes](https://github.com/vercel/turbo/releases)
- [Changelog](https://github.com/vercel/turbo/blob/main/release.md)
- [Commits](https://github.com/vercel/turbo/compare/v1.10.16...v2.0.5)

---
updated-dependencies:
- dependency-name: turbo
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-06-27 16:05:45 +02:00
d779286084 chore(deps): upgrade TypeScript to v5.5 and enable isolatedDeclarations 2024-06-27 15:38:50 +02:00
ef848a0553 chore(deps): re-add the specific PNPM version for the deploy workflow 2024-06-27 13:27:34 +02:00
4d12402595 chore(deps): make sure the correct PNPM version is used (everywhere) 2024-06-27 11:55:33 +02:00
be5c4d28b6 chore(deps): make sure the correct PNPM version is used 2024-06-27 11:47:40 +02:00
2cefa2508b chore(deps): upgrade PNPM to v9.4.0 2024-06-27 11:12:21 +02:00
0ff9f60d59 chore(deps): upgrade all action dependencies
Closes #70, #128, #135, #145
2024-06-27 10:59:47 +02:00
github-actions[bot]
31693ddb3c chore(release): version packages 2024-06-25 09:21:37 +02:00
57498db248 fix(mysql): close database connections gracefully when using Bun 2024-06-25 08:22:56 +02:00
github-actions[bot]
cf620a191d chore(release): version packages 2024-05-30 10:16:07 +02:00
ca154fadeb fix: exclude tsbuildinfo files from published packages for smaller bundles 2024-05-30 10:12:37 +02:00
github-actions[bot]
f300f147fa chore(release): version packages 2024-05-29 16:23:49 +02:00
44426042cf feat(mysql,postgres): automatically create the database if it doesn't exist (fixes #147) 2024-05-29 16:19:32 +02:00
aef2d7c861 fix(mysql): handle table initialization better in clustered database environments
CREATE TABLE IF NOT EXISTS takes more locks than first checking whether the table exists with a SELECT before running CREATE TABLE.
This makes more sense as the table usually already exists, so we optimize for the happy path.
2024-05-29 15:10:59 +02:00
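The check-then-create pattern from the commit above can be sketched as follows. `query` is an assumed helper that runs SQL and returns rows — not Emigrate's real storage API — and the table schema is illustrative only:

```javascript
// Hypothetical sketch: probe for the table with a SELECT first, and only
// fall back to CREATE TABLE IF NOT EXISTS when it is missing, since a
// plain SELECT takes far fewer locks in clustered MySQL setups.
async function ensureHistoryTable(query, table) {
  const rows = await query(
    'SELECT 1 FROM information_schema.tables WHERE table_name = ?',
    [table],
  );
  if (rows.length > 0) {
    return; // Happy path: the table usually already exists.
  }
  await query(
    `CREATE TABLE IF NOT EXISTS ${table} (name VARCHAR(255) PRIMARY KEY)`,
    [],
  );
}
```

Keeping IF NOT EXISTS on the fallback still makes the create race-safe when two instances both see the table as missing.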
github-actions[bot]
e396266f3d chore(release): version packages 2024-04-04 14:46:54 +02:00
081ab34cb4 fix(reporter-pino): make sure the Pino reporter outputs logs in Bun environments 2024-04-04 14:43:38 +02:00
dependabot[bot]
520fdd94ef chore(deps): bump changesets/action from 1.4.5 to 1.4.6
Bumps [changesets/action](https://github.com/changesets/action) from 1.4.5 to 1.4.6.
- [Release notes](https://github.com/changesets/action/releases)
- [Changelog](https://github.com/changesets/action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/changesets/action/compare/v1.4.5...v1.4.6)

---
updated-dependencies:
- dependency-name: changesets/action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-15 10:23:19 +01:00
github-actions[bot]
d1bd8fc74f chore(release): version packages 2024-03-15 09:40:25 +01:00
41522094dd fix(cli): handle the case where the config is returned as an object with a nested default property 2024-02-19 10:59:02 +01:00
dependabot[bot]
6763f338ce chore(deps): bump the commitlint group with 2 updates
Bumps the commitlint group with 2 updates: [@commitlint/cli](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/cli) and [@commitlint/config-conventional](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/config-conventional).


Updates `@commitlint/cli` from 18.4.3 to 18.6.1
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/cli/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.6.1/@commitlint/cli)

Updates `@commitlint/config-conventional` from 18.4.3 to 18.6.1
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/config-conventional/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.6.1/@commitlint/config-conventional)

---
updated-dependencies:
- dependency-name: "@commitlint/cli"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: commitlint
- dependency-name: "@commitlint/config-conventional"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: commitlint
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-14 11:55:46 +01:00
github-actions[bot]
6c4e441eff chore(release): version packages 2024-02-13 13:00:15 +01:00
57a099169e fix(cli): cleanup AbortSignal event listeners to avoid MaxListenersExceededWarning 2024-02-12 20:59:26 +01:00
github-actions[bot]
ae9e8b1b04 chore(release): version packages 2024-02-12 13:56:28 +01:00
1065322435 fix(pino): show correct statuses for the "list" and "new" commands 2024-02-12 13:47:55 +01:00
17feb2d2c2 fix(mysql): only unreference connections in a Bun environment as it has problems with Node for some reason 2024-02-12 13:35:18 +01:00
dependabot[bot]
98e3ed5c1b chore(deps): bump pnpm/action-setup from 2.4.0 to 3.0.0
Bumps [pnpm/action-setup](https://github.com/pnpm/action-setup) from 2.4.0 to 3.0.0.
- [Release notes](https://github.com/pnpm/action-setup/releases)
- [Commits](https://github.com/pnpm/action-setup/compare/v2.4.0...v3.0.0)

---
updated-dependencies:
- dependency-name: pnpm/action-setup
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-12 09:05:01 +01:00
1d33d65135 docs(cli): change URL path from /commands/ to /cli/ 2024-02-09 14:48:51 +01:00
0c597fd7a8 docs(cli): add a main page for Emigrate's CLI 2024-02-09 14:48:51 +01:00
github-actions[bot]
0360d0b82f chore(release): version packages 2024-02-09 14:05:35 +01:00
c838ffb7f3 fix(typescript): load config written in TypeScript without the typescript package when using Bun, Deno or tsx 2024-02-09 14:00:24 +01:00
198aa545eb fix(mysql): unreference all connections so that the process can exit cleanly
In a NodeJS environment it will just work as before, but in a Bun environment it will make the "forced exit" error message disappear and remove the 10 s waiting period when migrations are done.
2024-02-09 13:13:27 +01:00
e7ec75d9e1 docs(faq): add note on using Emigrate for existing databases 2024-02-06 09:29:53 +01:00
b62c692846 docs(reporters): add "json" reporter and rename "default" to "pretty" 2024-02-06 09:22:35 +01:00
18382ce961 feat(reporters): add built-in "json" reporter and rename "default" to "pretty" 2024-02-06 09:22:35 +01:00
github-actions[bot]
4e8ac5294d chore(release): version packages 2024-02-05 15:51:38 +01:00
61cbcbd691 fix(cli): force exiting after 10 seconds should not change the exit code
If all migrations have been run successfully we want the exit code to be 0 even though we had to force exit the process.
This is because on some platforms (e.g. Bun) not all handles are cleaned up the same way as in NodeJS, so let's be forgiving.
2024-02-05 15:48:55 +01:00
github-actions[bot]
f720aae83d chore(release): version packages 2024-02-05 15:14:33 +01:00
543b7f6f77 fix(bun): import setTimeout/setInterval from "node:timers" for .unref() to correctly work 2024-02-05 15:12:30 +01:00
db656c2310 chore: enable NPM provenance 2024-02-05 15:08:47 +01:00
github-actions[bot]
ff89dd4f86 chore(release): version packages 2024-02-05 14:54:05 +01:00
f8a5cc728d fix(storage): make sure the storage initialization crashes when db connection can't be established 2024-02-05 14:50:17 +01:00
f6761fe434 chore: add missing docs changeset 2024-02-05 14:29:30 +01:00
ef45be9233 fix(reporters): show number of skipped migrations correctly in command output 2024-02-05 14:17:30 +01:00
69bd88afdb chore: allow many parameters in test files 2024-01-26 16:09:49 +01:00
0faebbe647 docs(cli): document the relative file path support for the "remove" command 2024-01-26 16:09:49 +01:00
2f6b4d23e0 fix(reporter-default): don't dim decimal points in durations in the default reporter 2024-01-26 16:09:49 +01:00
1f139fd975 feat(remove): rework the "remove" command to be more similar to "up" and "list"
The old reporter methods related to the "remove" command are no longer used; instead the shared `onMigrationStart`, `onMigrationSuccess` and `onMigrationError` methods are used.
Some preparation has also been made to support removing multiple migrations at once in the future, similar to how the `--from` and `--to` CLI options work for the "up" command.
2024-01-26 16:09:49 +01:00
86e0d52e5c feat(reporter-pino): adapt to the new Reporter interface 2024-01-26 16:09:49 +01:00
94ad9feae9 feat(types): simplify the EmigrateReporter interface by removing the "remove" specific methods 2024-01-26 16:09:49 +01:00
f2d4bb346e fix(cli): make sure errors passed to the storage are serialized correctly 2024-01-26 16:09:49 +01:00
f1b9098750 fix(migrations): don't include folders when collecting migrations
It should be possible to have folders inside your migrations folder
2024-01-26 09:26:49 +01:00
9109238b86 feat(cli): improve the "up" commands --from and --to options
The given values can either be migration names or relative paths to migration files.
The given migration must exist, to avoid accidentally running migrations that weren't intended to run.
2024-01-26 09:13:03 +01:00
github-actions[bot]
986456b038 chore(release): version packages 2024-01-23 11:44:05 +01:00
b56b6daf73 fix(cli): handle migration history entries without file extensions correctly
...even when the migration file names include periods.
2024-01-23 11:36:47 +01:00
github-actions[bot]
ea327bbc49 chore(release): version packages 2024-01-22 13:49:54 +01:00
121492b303 fix(cli): sort migrations lexicographically for real 2024-01-22 13:48:09 +01:00
github-actions[bot]
bddb2d6b14 chore(release): version packages 2024-01-22 11:32:48 +01:00
a4da353d5a feat(cli): add graceful process abort
Using an AbortSignal and Promise.race, we abandon running migrations that take longer than the given abortRespite period to complete after the process is aborted.
2024-01-22 11:30:06 +01:00
ce15648251 feat(types): add type for the onAbort Reporter method 2024-01-22 11:30:06 +01:00
github-actions[bot]
576dfbb124 chore(release): version packages 2024-01-19 13:48:24 +01:00
49d8925778 fix(docs): remove access control from package config 2024-01-19 13:43:59 +01:00
98adcda37e fix(reporters): use better wording in the header in the default reporter
Also show the number of skipped migrations
2024-01-19 13:43:59 +01:00
cbc35bd646 chore: start writing changesets for the documentation 2024-01-19 13:43:59 +01:00
e739e453d7 docs: add Baseline guide 2024-01-19 13:43:59 +01:00
f515c8a854 feat(cli): add --no-execution option to the "up" command
...which can be used to log manually run migrations as successful or for baselining a database.
2024-01-19 13:43:59 +01:00
e71c318ea5 test(up): structure the up tests in a better way 2024-01-19 13:43:59 +01:00
9ef0fa2776 feat(cli): add --from and --to options to limit what migrations to run 2024-01-19 13:43:59 +01:00
02c142e39a feat(up): add --limit option to limit the number of migrations to run 2024-01-19 13:43:59 +01:00
bf4d596980 fix(cli): clarify which options that takes parameters 2024-01-19 13:43:59 +01:00
github-actions[bot]
424d3e9903 chore(release): version packages 2024-01-18 15:25:51 +01:00
73a8a42e5f fix(history): support a migration history with entries without file extensions (.js is assumed in such case) 2024-01-18 15:18:35 +01:00
github-actions[bot]
114979f154 chore(release): version packages 2024-01-18 14:52:48 +01:00
dependabot[bot]
b083e88bac chore(deps): bump cosmiconfig from 8.3.6 to 9.0.0
Bumps [cosmiconfig](https://github.com/cosmiconfig/cosmiconfig) from 8.3.6 to 9.0.0.
- [Release notes](https://github.com/cosmiconfig/cosmiconfig/releases)
- [Changelog](https://github.com/cosmiconfig/cosmiconfig/blob/main/CHANGELOG.md)
- [Commits](https://github.com/cosmiconfig/cosmiconfig/compare/cosmiconfig-v8.3.6...v9.0.0)

---
updated-dependencies:
- dependency-name: cosmiconfig
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-18 14:50:14 +01:00
cbc3193626 chore(deps): downgrade Turbo to 1.10.16 as it did work in the CI env 2024-01-18 11:13:28 +01:00
1b8439a530 ci: and does this make any difference? 2024-01-18 11:09:12 +01:00
891402c7d4 ci: does this make the release flow work? 2024-01-18 11:02:07 +01:00
github-actions[bot]
9130af7b12 chore(release): version packages 2024-01-18 10:50:05 +01:00
83dc618c2e fix(cli): remove --enable-source-maps flag 2024-01-18 10:46:04 +01:00
dependabot[bot]
a6e096bcbc chore(deps): bump turbo from 1.10.16 to 1.11.3
Bumps [turbo](https://github.com/vercel/turbo) from 1.10.16 to 1.11.3.
- [Release notes](https://github.com/vercel/turbo/releases)
- [Changelog](https://github.com/vercel/turbo/blob/main/release.md)
- [Commits](https://github.com/vercel/turbo/compare/v1.10.16...v1.11.3)

---
updated-dependencies:
- dependency-name: turbo
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-16 14:06:06 +01:00
dependabot[bot]
9bfd0e44c3 chore(deps): bump typescript from 5.2.2 to 5.3.3
Bumps [typescript](https://github.com/Microsoft/TypeScript) from 5.2.2 to 5.3.3.
- [Release notes](https://github.com/Microsoft/TypeScript/releases)
- [Commits](https://github.com/Microsoft/TypeScript/compare/v5.2.2...v5.3.3)

---
updated-dependencies:
- dependency-name: typescript
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-15 10:21:31 +01:00
dependabot[bot]
af83bf6d7f chore(deps): bump tsx from 4.6.2 to 4.7.0
Bumps [tsx](https://github.com/privatenumber/tsx) from 4.6.2 to 4.7.0.
- [Release notes](https://github.com/privatenumber/tsx/releases)
- [Changelog](https://github.com/privatenumber/tsx/blob/develop/release.config.cjs)
- [Commits](https://github.com/privatenumber/tsx/compare/v4.6.2...v4.7.0)

---
updated-dependencies:
- dependency-name: tsx
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-15 09:57:03 +01:00
dependabot[bot]
a5264ab3d4 chore(deps): bump lint-staged from 15.1.0 to 15.2.0
Bumps [lint-staged](https://github.com/okonet/lint-staged) from 15.1.0 to 15.2.0.
- [Release notes](https://github.com/okonet/lint-staged/releases)
- [Changelog](https://github.com/lint-staged/lint-staged/blob/master/CHANGELOG.md)
- [Commits](https://github.com/okonet/lint-staged/compare/v15.1.0...v15.2.0)

---
updated-dependencies:
- dependency-name: lint-staged
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-15 09:49:09 +01:00
dependabot[bot]
0cce84743d chore(deps): bump actions/checkout from 3 to 4
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-11 06:44:34 +01:00
a130248687 docs: update loader plugin intro after adding TypeScript support 2024-01-08 11:06:59 +01:00
github-actions[bot]
3c54917c35 chore(release): version packages 2023-12-28 09:20:03 +01:00
9a605a85f1 feat: add support for TypeScript migration files
And add a guide to the documentation on how to set it up for NodeJS
2023-12-20 15:27:03 +01:00
github-actions[bot]
59eb90b8cb chore(release): version packages 2023-12-20 11:24:17 +01:00
9f91bdcfa0 feat(cli): add the --import option for importing modules/packages before commands are run
Can for instance be used to load environment variables using Dotenv
2023-12-20 11:08:27 +01:00
e6e4433018 feat(cli): rename extension short option from -e to -x
BREAKING CHANGE: if you've been using the `-e` short option you should change it to `-x` or use the long option name `--extension`
2023-12-20 09:27:43 +01:00
f9a16d87a1 feat: add color option to CLI and configuration file
The option is used to force enable/disable color output and is passed to the reporter which should respect it
2023-12-20 09:11:01 +01:00
7bae76f496 docs: include Deno usage instructions in the documentation 2023-12-19 15:40:05 +01:00
github-actions[bot]
e8e35215be chore(release): version packages 2023-12-19 14:51:40 +01:00
a6c6e6dc78 fix(types): forgot about the bun key in one package 2023-12-19 14:49:29 +01:00
github-actions[bot]
e67ce0de1e chore(release): version packages 2023-12-19 14:41:04 +01:00
beb6cf7719 chore(deps): upgrade ansis package 2023-12-19 14:34:54 +01:00
3a8b06b3b1 fix: revert usage of bun key in package.json exports 2023-12-19 14:29:42 +01:00
111 changed files with 12920 additions and 6102 deletions


@@ -13,6 +1,7 @@ jobs:
env:
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
TURBO_TEAM: ${{ secrets.TURBO_TEAM }}
DO_NOT_TRACK: 1
steps:
- name: Check out code
@@ -20,14 +21,12 @@ jobs:
with:
fetch-depth: 2
- uses: pnpm/action-setup@v2.4.0
with:
version: 8.3.1
- uses: pnpm/action-setup@v4.0.0
- name: Setup Node.js environment
uses: actions/setup-node@v4
with:
node-version: 20.9.0
node-version: 22.15.0
cache: 'pnpm'
- name: Install dependencies


@@ -10,6 +1,7 @@ on:
# Allow this job to clone the repo and create a page deployment
permissions:
actions: read
contents: read
pages: write
id-token: write
@@ -23,17 +24,16 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout your repository using git
uses: actions/checkout@v3
uses: actions/checkout@v4
- name: Show vars
run: |
echo $ASTRO_SITE
echo $ASTRO_BASE
- name: Install, build, and upload your site output
uses: withastro/action@v1
uses: withastro/action@v2
with:
path: ./docs # The root location of your Astro project inside the repository. (optional)
node-version: 20 # The specific version of Node that should be used to build your site. Defaults to 18. (optional)
package-manager: pnpm@8.10.2 # The Node package manager that should be used to install dependencies and build your site. Automatically detected based on your lockfile. (optional)
package-manager: pnpm@9.4.0 # The Node package manager that should be used to install dependencies and build your site. Automatically detected based on your lockfile. (optional)
deploy:
needs: build
@@ -44,4 +44,4 @@ jobs:
steps:
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v1
uses: actions/deploy-pages@v4

62
.github/workflows/integration.yaml vendored Normal file

@@ -0,0 +1,62 @@
name: Integration Tests
on:
push:
branches: ['main', 'changeset-release/main']
pull_request:
jobs:
mysql_integration:
name: Emigrate MySQL integration tests
timeout-minutes: 15
runs-on: ubuntu-latest
env:
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
TURBO_TEAM: ${{ secrets.TURBO_TEAM }}
DO_NOT_TRACK: 1
services:
mysql:
image: mysql:8.0
env:
MYSQL_ROOT_PASSWORD: root
MYSQL_DATABASE: emigrate
MYSQL_USER: emigrate
MYSQL_PASSWORD: emigrate
ports:
- 3306:3306
options: --health-cmd="mysqladmin ping -h localhost" --health-interval=10s --health-timeout=5s --health-retries=5
steps:
- name: Check out code
uses: actions/checkout@v4
with:
fetch-depth: 2
- uses: pnpm/action-setup@v4.0.0
- name: Setup Node.js environment
uses: actions/setup-node@v4
with:
node-version: 22.15.0
cache: 'pnpm'
- name: Install dependencies
run: pnpm install
- name: Wait for MySQL to be ready
run: |
for i in {1..30}; do
nc -z localhost 3306 && echo "MySQL is up!" && break
echo "Waiting for MySQL..."
sleep 2
done
- name: Build package
run: pnpm build --filter @emigrate/mysql
- name: Integration Tests
env:
MYSQL_HOST: '127.0.0.1'
MYSQL_PORT: 3306
run: pnpm --filter @emigrate/mysql integration


@@ -15,31 +15,58 @@ jobs:
contents: write
packages: write
pull-requests: write
actions: read
id-token: write
steps:
- name: Checkout Repo
uses: actions/checkout@v4
with:
token: ${{ secrets.PAT_GITHUB_TOKEN }}
persist-credentials: false
fetch-depth: 0
- uses: pnpm/action-setup@v2.4.0
with:
version: 8.3.1
- uses: pnpm/action-setup@v4.0.0
- name: Setup Node.js environment
uses: actions/setup-node@v4
with:
node-version: 20.9.0
node-version: 22.15.0
cache: 'pnpm'
- name: Install Dependencies
run: pnpm install
- name: Create Release Pull Request
uses: changesets/action@v1
id: changesets
uses: aboviq/changesets-action@v1.5.2
with:
publish: pnpm run release
commit: 'chore(release): version packages'
title: 'chore(release): version packages'
createGithubReleases: aggregate
env:
GITHUB_TOKEN: ${{ secrets.PAT_GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Release to @next tag on npm
if: github.ref_name == 'main' && steps.changesets.outputs.published != 'true'
run: |
git checkout main
CHANGESET_FILE=$(git diff-tree --no-commit-id --name-only HEAD -r ".changeset/*-*-*.md")
if [ -z "$CHANGESET_FILE" ]; then
echo "No changesets found, skipping release to @next tag"
exit 0
fi
AFFECTED_PACKAGES=$(sed -n '/---/,/---/p' "$CHANGESET_FILE" | sed '/---/d')
if [ -z "$AFFECTED_PACKAGES" ]; then
echo "No packages affected by changesets, skipping release to @next tag"
exit 0
fi
pnpm changeset version --snapshot next
pnpm changeset publish --tag next
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
GITHUB_TOKEN: ${{ secrets.PAT_GITHUB_TOKEN }}


@@ -30,27 +30,79 @@ It's effectively a successor of [klei-migrate](https://www.npmjs.com/package/kle
Install the Emigrate CLI in your project:
```bash
npm install --save-dev @emigrate/cli
npm install @emigrate/cli
# or
pnpm add --save-dev @emigrate/cli
pnpm add @emigrate/cli
# or
yarn add --dev @emigrate/cli
yarn add @emigrate/cli
# or
bun add --dev @emigrate/cli
bun add @emigrate/cli
```
## Usage
```text
Usage: emigrate up [options]
Run all pending migrations
Options:
-h, --help Show this help message and exit
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before running the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-s, --storage <name> The storage to use for where to store the migration history (required)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-r, --reporter <name> The reporter to use for reporting the migration progress
-l, --limit <count> Limit the number of migrations to run
-f, --from <name/path> Start running migrations from the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those after it lexicographically will be run
-t, --to <name/path> Skip migrations after the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those before it lexicographically will be run
--dry List the pending migrations that would be run without actually running them
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
--no-execution Mark the migrations as executed and successful without actually running them,
which is useful if you want to mark migrations as successful after running them manually
--abort-respite <sec> The number of seconds to wait before abandoning running migrations after the command has been aborted (default: 10)
Examples:
emigrate up --directory src/migrations -s fs
emigrate up -d ./migrations --storage @emigrate/mysql
emigrate up -d src/migrations -s postgres -r json --dry
emigrate up -d ./migrations -s mysql --import dotenv/config
emigrate up --limit 1
emigrate up --to 20231122120529381_some_migration_file.js
emigrate up --to 20231122120529381_some_migration_file.js --no-execution
```
### Examples
Create a new migration:
```bash
npx emigrate new -d migrations -e .js create some fancy table
npx emigrate new -d migrations create some fancy table
# or
pnpm emigrate new -d migrations -e .js create some fancy table
pnpm emigrate new -d migrations create some fancy table
# or
yarn emigrate new -d migrations -e .js create some fancy table
yarn emigrate new -d migrations create some fancy table
# or
bunx --bun emigrate new -d migrations -e .js create some fancy table
bunx --bun emigrate new -d migrations create some fancy table
```
Will create a new empty JavaScript migration file with the name "YYYYMMDDHHmmssuuu_create_some_fancy_table.js" in the `migrations` directory.

43
docs/CHANGELOG.md Normal file

@@ -0,0 +1,43 @@
# @emigrate/docs
## 1.0.0
### Major Changes
- 1d33d65: Rename the URL path "/commands/" to "/cli/" to make it more clear that those pages are the documentation for the CLI. This change is a BREAKING CHANGE because it changes the URL path of the pages.
### Minor Changes
- 0c597fd: Add a separate page for the Emigrate CLI itself, with all the commands as sub pages
## 0.4.0
### Minor Changes
- b62c692: Add documentation for the built-in "json" reporter
- b62c692: The "default" reporter is now named "pretty"
- e7ec75d: Add note in FAQ on using Emigrate for existing databases
### Patch Changes
- c838ffb: Add note on how to write Emigrate's config using TypeScript in a production environment without having `typescript` installed.
## 0.3.0
### Minor Changes
- f6761fe: Document the changes to the "remove" command, specifically that it also accepts relative file paths now
- 9109238: Document the changes to the "up" command's `--from` and `--to` options, specifically that they can take relative file paths and that the given migration must exist.
## 0.2.0
### Minor Changes
- a4da353: Document the --abort-respite CLI option and the corresponding abortRespite config
## 0.1.0
### Minor Changes
- cbc35bd: Add first version of the [Baseline guide](https://emigrate.dev/guides/baseline)
- cbc35bd: Document the new --limit, --from and --to options for the ["up" command](https://emigrate.dev/cli/up/)


@@ -77,24 +77,46 @@ export default defineConfig({
},
],
},
{
label: 'Command Line Interface',
items: [
{
label: 'Introduction',
link: '/cli/',
},
{
label: 'Commands',
items: [
{
label: 'emigrate up',
link: '/commands/up/',
link: '/cli/up/',
},
{
label: 'emigrate list',
link: '/commands/list/',
link: '/cli/list/',
},
{
label: 'emigrate new',
link: '/commands/new/',
link: '/cli/new/',
},
{
label: 'emigrate remove',
link: '/commands/remove/',
link: '/cli/remove/',
},
],
},
],
},
{
label: 'Guides',
items: [
{
label: 'Using TypeScript',
link: '/guides/typescript/',
},
{
label: 'Baseline existing database',
link: '/guides/baseline/',
},
],
},
@@ -102,7 +124,7 @@ export default defineConfig({
label: 'Plugins',
items: [
{
label: 'Introduction',
label: 'Plugins Introduction',
link: '/plugins/',
},
{
@@ -110,7 +132,7 @@ export default defineConfig({
collapsed: true,
items: [
{
label: 'Introduction',
label: 'Storage Plugins',
link: '/plugins/storage/',
},
{
@@ -132,7 +154,7 @@ export default defineConfig({
collapsed: true,
items: [
{
label: 'Introduction',
label: 'Loader Plugins',
link: '/plugins/loaders/',
},
{
@@ -154,12 +176,16 @@ export default defineConfig({
collapsed: true,
items: [
{
label: 'Introduction',
label: 'Reporters',
link: '/plugins/reporters/',
},
{
label: 'Default Reporter',
link: '/plugins/reporters/default/',
label: 'Pretty Reporter (default)',
link: '/plugins/reporters/pretty/',
},
{
label: 'JSON Reporter',
link: '/plugins/reporters/json/',
},
{
label: 'Pino Reporter',
@@ -172,7 +198,7 @@ export default defineConfig({
collapsed: true,
items: [
{
label: 'Introduction',
label: 'Generator Plugins',
link: '/plugins/generators/',
},
{


@@ -1,11 +1,8 @@
{
"name": "@emigrate/docs",
"private": true,
"publishConfig": {
"access": "public"
},
"type": "module",
"version": "0.0.1",
"version": "1.0.0",
"scripts": {
"dev": "astro dev",
"start": "astro dev",
@@ -14,6 +11,7 @@
"astro": "astro"
},
"dependencies": {
"@astrojs/check": "^0.7.0",
"@astrojs/starlight": "^0.15.0",
"@astrojs/starlight-tailwind": "2.0.1",
"@astrojs/tailwind": "^5.0.3",
@@ -23,5 +21,6 @@
},
"volta": {
"extends": "../package.json"
}
},
"packageManager": "pnpm@9.4.0"
}


@@ -0,0 +1,73 @@
---
title: "CLI Introduction"
description: "Some basic information about the Emigrate CLI."
---
import { Tabs, TabItem, LinkCard } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate comes with a CLI that you can use to manage your migrations. The CLI is a powerful tool that allows you to create, run, and manage migrations.
### Installing the Emigrate CLI
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/cli
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/cli
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/cli
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/cli
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
</TabItem>
</Tabs>
### Existing commands
<LinkCard
href="up/"
title="emigrate up"
description="The command for executing migrations, or showing pending migrations in dry run mode."
/>
<LinkCard
href="list/"
title="emigrate list"
description="The command for listing all migrations and their status."
/>
<LinkCard
href="new/"
title="emigrate new"
description="The command for creating new migration files."
/>
<LinkCard
href="remove/"
title="emigrate remove"
description="The command for removing migrations from the migration history."
/>


@@ -34,14 +34,21 @@ It then sorts the migrations by filename in ascending order and outputs them and
bunx --bun emigrate list [options]
```
</TabItem>
<TabItem label="package.json">
```json {3}
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate list [options]"
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate list [options]
```
</TabItem>
</Tabs>
@@ -55,6 +62,12 @@ Show command help and exit
The directory where the migration files are located. The given path should be absolute or relative to the current working directory.
### `-i`, `--import <module>`
A module to import before listing the migrations. This option can be specified multiple times.
Can for instance be used to load environment variables using [dotenv](https://github.com/motdotla/dotenv) with `--import dotenv/config`.
### `-s`, `--storage <name>`
The <Link href="/plugins/storage/">storage plugin</Link> to use, which is responsible for where to store the migration history.
@ -73,6 +86,9 @@ In case you have both a `emigrate-storage-somedb` and a `somedb` package install
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
@ -84,3 +100,7 @@ The name can be either a path to a module or a package name. For package names E
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output, option is passed to the reporter which should respect it.


@ -33,14 +33,21 @@ The migration file can be based on a template, generated by a <Link href="/plugi
bunx --bun emigrate new [options] <name>
```
</TabItem>
<TabItem label="package.json">
```json {3}
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate new [options] <name>"
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate new [options] <name>
```
</TabItem>
</Tabs>
@ -67,12 +74,16 @@ The directory where the migration files are located. The given path should be ab
The template file to use for generating the migration file. The given path should be absolute or relative to the current working directory.
The template can contain a `{{name}}` placeholder which will be replaced with the migration name provided to the command. The generated file will have the same extension as the template file, unless the [`--extension`](#-e---extension-ext) option is used.
The template can contain a `{{name}}` placeholder which will be replaced with the migration name provided to the command. The generated file will have the same extension as the template file, unless the [`--extension`](#-x---extension-ext) option is used.
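As an illustrative sketch (not Emigrate's actual implementation), the substitution and extension handling described above amounts to:

```javascript
// Illustrative sketch of the template substitution (names are made up).
function renderTemplate(templateContents, templatePath, migrationName, extensionOverride) {
  // Every {{name}} placeholder is replaced with the migration name.
  const contents = templateContents.replaceAll('{{name}}', migrationName);
  // The generated file keeps the template's extension unless --extension overrides it.
  const extension = extensionOverride ?? templatePath.slice(templatePath.lastIndexOf('.'));
  return { contents, extension };
}

console.log(renderTemplate('-- Migration: {{name}}\n', 'template.sql', 'create users table'));
// { contents: '-- Migration: create users table\n', extension: '.sql' }
```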
### `-e`, `--extension <ext>`
### `-x`, `--extension <ext>`
The extension to use for the migration file. Unless the [`--template`](#-t---template-path) option is also specified the file will be empty.
If both the `--template` and `--extension` options are specified, the extension will override the template file extension.
**Example:** `--extension .sql` will generate a file with the `.sql` extension.
### `-p`, `--plugin <name>`
The <Link href="/plugins/generators">generator plugin</Link> to use. The generator plugin is responsible for generating the migration filename and its contents.
@ -90,6 +101,9 @@ In case you have both a `emigrate-plugin-someplugin` and a `someplugin` package
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
@ -101,3 +115,7 @@ The name can be either a path to a module or a package name. For package names E
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output, option is passed to the reporter which should respect it.


@ -13,40 +13,49 @@ The `remove` command is used to remove a migration from the history. This is use
<Tabs>
<TabItem label="npm">
```bash
npx emigrate remove [options] <name>
npx emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate remove [options] <name>
pnpm emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate remove [options] <name>
yarn emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate remove [options] <name>
bunx --bun emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="package.json">
```json {3}
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate remove [options] <name>"
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate remove [options] <name/path>
```
</TabItem>
</Tabs>
## Arguments
### `<name>`
### `<name/path>`
The name of the migration file to remove, including the extension, e.g. `20200101000000_some_migration.js`.
The name of the migration file to remove, including the extension, e.g. `20200101000000_some_migration.js`, or a relative file path to a migration file to remove, e.g. `migrations/20200101000000_some_migration.js`.
Using relative file paths is useful in terminals that support autocomplete, and also when you copy and use the relative migration file path from the output of the <Link href="/cli/list/">`list`</Link> command.
## Options
@ -62,6 +71,12 @@ The directory where the migration files are located. The given path should be ab
Force removal of the migration history entry even if the migration file does not exist or the entry is in a non-failed state.
### `-i`, `--import <module>`
A module to import before removing the migration. This option can be specified multiple times.
Can for instance be used to load environment variables using [dotenv](https://github.com/motdotla/dotenv) with `--import dotenv/config`.
### `-s`, `--storage <name>`
The <Link href="/plugins/storage/">storage plugin</Link> to use, which is responsible for where to store the migration history.
@ -80,6 +95,9 @@ In case you have both a `emigrate-storage-somedb` and a `somedb` package install
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
@ -91,3 +109,7 @@ The name can be either a path to a module or a package name. For package names E
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output, option is passed to the reporter which should respect it.


@ -0,0 +1,183 @@
---
title: "`emigrate up`"
description: "Run migrations"
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The `up` command is used to either list or run all pending migrations, i.e. migrations that haven't been run yet.
Emigrate takes all migration files in the given directory and compares them to the migration history so that it knows which migrations are pending.
It then sorts the pending migrations by filename in ascending order and runs them one by one.
If any of the migrations fail, the command will be aborted and the rest of the migrations will not be run.
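Conceptually, the selection and ordering works like this (a simplified sketch, not Emigrate's actual implementation):

```javascript
// Simplified sketch: migrations already in the history are skipped,
// the rest are sorted by filename in ascending order.
const historyNames = new Set(['20240101000000_init.sql']);
const files = [
  '20240103000000_add_posts.sql',
  '20240101000000_init.sql',
  '20240102000000_add_users.sql',
];

const pending = files
  .filter((name) => !historyNames.has(name))
  .sort(); // default sort is ascending lexicographic order

console.log(pending);
// ['20240102000000_add_users.sql', '20240103000000_add_posts.sql']
```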
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up [options]
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up [options]
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up [options]
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate up [options]
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate up [options]
```
</TabItem>
</Tabs>
## Options
### `-h`, `--help`
Show command help and exit
### `--dry`
List the pending migrations that would be run without actually running them
### `-l, --limit <count>`
**type:** `number`
Limit the number of migrations to run. Can be combined with `--dry` which will show "pending" for the migrations that would be run if not in dry-run mode,
and "skipped" for the migrations that also haven't been run but won't because of the set limit.
### `-d`, `--directory <path>`
The directory where the migration files are located. The given path should be absolute or relative to the current working directory.
### `-f`, `--from <name/path>`
The name of the migration to start from. This can be used to run only a subset of the pending migrations.
The given migration needs to exist and is compared in lexicographical order with all migrations; the migration with the same name and those lexicographically after it will be migrated.
It's okay to use an already executed migration as the "from" migration, it won't be executed again.
The reason the given migration name must exist and cannot be just a prefix is to avoid accidentally running migrations that you didn't intend to run.
The given name can also be a relative path to a migration file, which makes it easier to use with terminals that support tab completion
or when copying the output from Emigrate and using it directly as the value of the `--from` option.
Relative paths are resolved relative to the current working directory.
Can be combined with `--dry` which will show "pending" for the migrations that would be run if not in dry-run mode,
and "skipped" for the migrations that also haven't been run but won't because of the set "from".
### `-t`, `--to <name/path>`
The name of the migration to end at. This can be used to run only a subset of the pending migrations.
The given migration name needs to exist and is compared in lexicographical order with all migrations; the migration with the same name and those lexicographically before it will be migrated.
It's okay to use an already executed migration as the "to" migration, it won't be executed again.
The reason the given migration name must exist and cannot be just a prefix is to avoid accidentally running migrations that you didn't intend to run.
The given name can also be a relative path to a migration file, which makes it easier to use with terminals that support tab completion
or when copying the output from Emigrate and using it directly as the value of the `--to` option.
Relative paths are resolved relative to the current working directory.
Can be combined with `--dry` which will show "pending" for the migrations that would be run if not in dry-run mode,
and "skipped" for the migrations that also haven't been run but won't because of the set "to".
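Both `--from` and `--to` can be thought of as inclusive lexicographic bounds on the sorted migration names (an illustrative sketch, not the actual implementation):

```javascript
// Illustrative sketch: --from/--to act as inclusive lexicographic bounds
// on the already-sorted list of migration names.
const migrations = [
  '20240101000000_init.sql',
  '20240102000000_add_users.sql',
  '20240103000000_add_posts.sql',
  '20240104000000_add_comments.sql',
];

const from = '20240102000000_add_users.sql';
const to = '20240103000000_add_posts.sql';

// Plain string comparison gives lexicographic order.
const selected = migrations.filter((name) => name >= from && name <= to);

console.log(selected);
// ['20240102000000_add_users.sql', '20240103000000_add_posts.sql']
```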
### `-i`, `--import <module>`
A module to import before running the migrations. This option can be specified multiple times.
Can for instance be used to load environment variables using [dotenv](https://github.com/motdotla/dotenv) with `--import dotenv/config`,
or for running migrations in NodeJS written in TypeScript with [tsx](https://github.com/privatenumber/tsx) (`--import tsx`), see the <Link href="/guides/typescript/">TypeScript guide</Link> for more information.
### `-s`, `--storage <name>`
The <Link href="/plugins/storage/">storage plugin</Link> to use, which is responsible for where to store the migration history.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/storage-`
- `emigrate-storage-`
- `@emigrate/plugin-storage-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-storage-somedb` package, you can specify either `emigrate-storage-somedb` or just `somedb` as the name.
In case you have both a `emigrate-storage-somedb` and a `somedb` package installed, the `emigrate-storage-somedb` package will be used.
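The lookup order described above can be sketched as follows (illustrative only; the real resolution lives inside Emigrate):

```javascript
// Illustrative sketch of the storage plugin name resolution order:
// each prefix is tried in order, and the bare name is tried last.
const prefixes = [
  '@emigrate/storage-',
  'emigrate-storage-',
  '@emigrate/plugin-storage-',
  '@emigrate/',
];

function candidates(name) {
  return [...prefixes.map((prefix) => prefix + name), name];
}

console.log(candidates('somedb'));
// ['@emigrate/storage-somedb', 'emigrate-storage-somedb',
//  '@emigrate/plugin-storage-somedb', '@emigrate/somedb', 'somedb']
```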
### `-p`, `--plugin <name>`
The <Link href="/plugins/loaders/">loader plugin(s)</Link> to use. Can be specified multiple times to use multiple plugins.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/plugin-`
- `emigrate-plugin-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-plugin-someplugin` package, you can specify either `emigrate-plugin-someplugin` or just `someplugin` as the name.
In case you have both a `emigrate-plugin-someplugin` and a `someplugin` package installed, the `emigrate-plugin-someplugin` package will be used.
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for reporting the migration progress.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/reporter-`
- `emigrate-reporter-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output, option is passed to the reporter which should respect it.
### `--no-execution`
Mark the migrations as executed and successful without actually running them,
which is useful if you want to mark migrations as successful after having run them manually.
:::tip
See the <Link href="/guides/baseline/">Baseline guide</Link> for example usage of the `--no-execution` option
:::
### `--abort-respite`
**type:** `number`
**default:** `10`
Customize the number of seconds to wait before abandoning a running migration when the process is about to shut down, for instance when the user presses `Ctrl+C` or when the container the process runs in is being stopped.


@ -1,107 +0,0 @@
---
title: "`emigrate up`"
description: "Run migrations"
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The `up` command is used to either list or run all pending migrations, i.e. migrations that haven't been run yet.
Emigrate takes all migration files in the given directory and compares them to the migration history so that it knows which migrations are pending.
It then sorts the pending migrations by filename in ascending order and runs them one by one.
If any of the migrations fail, the command will be aborted and the rest of the migrations will not be run.
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up [options]
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up [options]
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up [options]
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate up [options]
```
</TabItem>
<TabItem label="package.json">
```json {3}
{
"scripts": {
"emigrate": "emigrate up [options]"
}
}
```
</TabItem>
</Tabs>
## Options
### `-h`, `--help`
Show command help and exit
### `--dry`
List the pending migrations that would be run without actually running them
### `-d`, `--directory <path>`
The directory where the migration files are located. The given path should be absolute or relative to the current working directory.
### `-s`, `--storage <name>`
The <Link href="/plugins/storage/">storage plugin</Link> to use, which is responsible for where to store the migration history.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/storage-`
- `emigrate-storage-`
- `@emigrate/plugin-storage-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-storage-somedb` package, you can specify either `emigrate-storage-somedb` or just `somedb` as the name.
In case you have both a `emigrate-storage-somedb` and a `somedb` package installed, the `emigrate-storage-somedb` package will be used.
### `-p`, `--plugin <name>`
The <Link href="/plugins/loaders/">loader plugin(s)</Link> to use. Can be specified multiple times to use multiple plugins.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/plugin-`
- `emigrate-plugin-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-plugin-someplugin` package, you can specify either `emigrate-plugin-someplugin` or just `someplugin` as the name.
In case you have both a `emigrate-plugin-someplugin` and a `someplugin` package installed, the `emigrate-plugin-someplugin` package will be used.
### `-r`, `--reporter <name>`
The <Link href="/plugins/reporters/">reporter</Link> to use for reporting the migration progress.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/reporter-`
- `emigrate-reporter-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.


@ -0,0 +1,255 @@
---
title: Baseline
description: A guide on how to baseline an existing database at a specific version
---
import { Tabs, TabItem, LinkCard } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
A common scenario is to have an existing database that you want to start managing with Emigrate. This is called baselining.
## Baselining an existing database schema
Let's assume you have a PostgreSQL database with the following schema:
```sql
CREATE TABLE public.users (
id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL,
email VARCHAR(255) NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE public.posts (
id SERIAL PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES public.users(id),
title VARCHAR(255) NOT NULL,
body TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```
<LinkCard
href="../../plugins/storage/postgres/"
title="PostgreSQL Storage Plugin"
description="See how to configure the PostgreSQL storage plugin here..."
/>
<LinkCard
href="../../plugins/storage/"
title="Storage Plugins"
description="Learn more about storage plugins here..."
/>
### Create a baseline migration
You can baseline this database by first creating a baseline migration (here we name it "baseline"):
<Tabs>
<TabItem label="npm">
```bash
npx emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate new --plugin postgres baseline
```
</TabItem>
</Tabs>
Which will generate an empty migration file in your migration directory:
```sql title="migrations/20240118123456789_baseline.sql"
-- Migration: baseline
```
You can then add the SQL statements for your database schema to this migration file:
```sql title="migrations/20240118123456789_baseline.sql"
-- Migration: baseline
CREATE TABLE public.users (
id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL,
email VARCHAR(255) NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE public.posts (
id SERIAL PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES public.users(id),
title VARCHAR(255) NOT NULL,
body TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```
### Log the baseline migration
For new environments this baseline migration will automatically be run when you run <Link href="/cli/up/">`emigrate up`</Link>.
For any existing environments you will need to run `emigrate up` with the <Link href="/cli/up/#--no-execution">`--no-execution`</Link> flag to prevent the migration from being executed and only log the migration:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up --no-execution
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up --no-execution
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up --no-execution
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate up --no-execution
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate up --no-execution
```
</TabItem>
</Tabs>
In case you have already added more migration files to your migration directory you can limit the "up" command to just log the baseline migration by specifying the <Link href="/cli/up/#-t---to-name">`--to`</Link> option:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
</Tabs>
### Verify the baseline migration status
You can verify the status of the baseline migration by running the <Link href="/cli/list/">`emigrate list`</Link> command:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate list
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate list
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate list
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate list
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate list
```
</TabItem>
</Tabs>
Which should output something like this:
```txt title="emigrate list"
Emigrate list v0.14.1 /your/project/path
✔ migrations/20240118123456789_baseline.sql (done)
1 done (1 total)
```
### Happy migrating!
You can now start adding new migrations to your migration directory and run <Link href="/cli/up/">`emigrate up`</Link> to apply them to your database.
Running `emigrate up` should be part of your CD pipeline to ensure that your database schema is always up to date in each environment.


@ -1,11 +0,0 @@
---
title: Example Guide
description: A guide in my new Starlight docs site.
---
Guides lead a user through a specific task they want to accomplish, often with a sequence of steps.
Writing a good guide requires thinking about what your users are trying to do.
## Further reading
- Read [about how-to guides](https://diataxis.fr/how-to-guides/) in the Diátaxis framework


@ -0,0 +1,136 @@
---
title: Using TypeScript
description: A guide on how to support migration files written in TypeScript
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
:::tip[Using Bun or Deno?]
If you are using [Bun](https://bun.sh) or [Deno](https://deno.land) you are already good to go as they both support TypeScript out of the box!
:::
If you're using NodeJS you have at least the following two options for running TypeScript migration files.
## Using `tsx`
If you want to be able to write and run migration files written in TypeScript an easy way is to install the [`tsx`](https://github.com/privatenumber/tsx) package.
### Installing `tsx`
<Tabs>
<TabItem label="npm">
```bash
npm install tsx
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add tsx
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add tsx
```
</TabItem>
</Tabs>
:::note
You must install `tsx` as an ordinary dependency, not as a dev dependency,
in case you are pruning your development dependencies before deploying your application (which you should).
:::
### Loading TypeScript migrations
After installing `tsx` you can load it in two ways.
#### Via CLI
Using the <Link href="/cli/up/#-i---import-module">`--import`</Link> flag you can load `tsx` before running your migration files.
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up --import tsx
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up --import tsx
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up --import tsx
```
</TabItem>
</Tabs>
:::note
This method is necessary if you want to write your configuration file in TypeScript without having `typescript` installed in your production environment, as `tsx` must be loaded before the configuration file is loaded.
:::
#### Via configuration file
You can also import `tsx` directly in your configuration file (this only works if the configuration file itself is not written in TypeScript).
```js title="emigrate.config.js" {1}
import 'tsx';
export default {
// ...
};
```
Then you can run your migration files as usual:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up
```
</TabItem>
</Tabs>
## Building TypeScript migrations
If you don't want to have `tsx` (or similar) as a dependency included in your production environment then
you can build your TypeScript migration files using the [`tsc`](https://www.typescriptlang.org/docs/handbook/compiler-options.html) compiler or
some other tool that is already part of your build process for transpiling your TypeScript code to JavaScript.
Assume that you have all of your migrations in a `src/migrations` directory and you have built them to a `dist/migrations` directory.
Then you can run your migration files by pointing to the `dist/migrations` directory:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up -d dist/migrations
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up -d dist/migrations
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up -d dist/migrations
```
</TabItem>
</Tabs>
:::note
If you're mixing languages for your migration files, e.g. you have both `.sql` and `.ts` files in `src/migrations`, make sure that they are all copied to the destination directory if not part of the TypeScript build process.
:::
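For example, a build script along these lines copies `.sql` files next to the compiled output (the `cp` step and paths are illustrative; adapt them to your own setup):

```json title="package.json"
{
  "scripts": {
    "build": "tsc && cp src/migrations/*.sql dist/migrations/"
  }
}
```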


@ -3,11 +3,13 @@ title: "FAQ"
description: "Frequently asked questions about Emigrate."
---
import Link from '@components/Link.astro';
## Why no `down` migrations?
> Always forward never backwards.
Many other migration tools support `down` migrations, but in all the years we have been
Many other migration tools support `down` (or undo) migrations, but in all the years we have been
doing migrations we have never needed to roll back a migration in production;
if a problem occurs, we just write a new migration to fix it.
@ -17,3 +19,7 @@ and in such case you just revert the migration manually and fix the `up` migrati
The benefit of this is that you don't have to worry about writing `down` migrations, and you can focus on writing the `up` migrations.
This way you only ever have to write `down` migrations when they are really necessary instead of for every migration,
which makes them the exception rather than the rule.
## Can I use Emigrate with my existing database?
Yes, you can use Emigrate with an existing database. See the <Link href="/guides/baseline/">Baseline guide</Link> for more information.


@ -40,6 +40,18 @@ But for now, this is the way to go.
bun add @emigrate/cli
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
</TabItem>
</Tabs>
### Pick a storage plugin
@ -69,17 +81,65 @@ Install the plugin you want to use, for example the <Link href="/plugins/storage
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {4}
{
"dependencies": {
"@emigrate/cli": "*",
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
### Create your first migration
<LinkCard
href="../../guides/baseline/"
title="Baseline your database"
description="Learn how to create a baseline of your existing database."
/>
Create a new migration file in your project using:
<Tabs>
<TabItem label="npm">
```bash title="Create a new migration file"
npx emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="pnpm">
```bash title="Create a new migration file"
pnpm emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="yarn">
```bash title="Create a new migration file"
yarn emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="bun">
```bash title="Create a new migration file"
bunx --bun emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash title="Create a new migration file"
deno task emigrate new --plugin postgres create users table
```
</TabItem>
</Tabs>
```txt title="emigrate new"
Emigrate new v0.10.0 /your/project/path
✔ migrations/20231215125421364_create_users_table.sql (done) 3ms
@ -119,11 +179,43 @@ There's no magic about the first line comment as when using Liquibase, it's just
To show both pending and already applied migrations (or previously failed), use the `list` command:
<Tabs>
<TabItem label="npm">
```bash title="Show all migrations"
npx emigrate list --storage postgres
```
</TabItem>
<TabItem label="pnpm">
```bash title="Show all migrations"
pnpm emigrate list --storage postgres
```
</TabItem>
<TabItem label="yarn">
```bash title="Show all migrations"
yarn emigrate list --storage postgres
```
</TabItem>
<TabItem label="bun">
```bash title="Show all migrations"
bunx --bun emigrate list --storage postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash title="Show all migrations"
deno task emigrate list --storage postgres
```
</TabItem>
</Tabs>
```txt title="emigrate list"
Emigrate list v0.10.0 /your/project/path
✔ migrations/20231211090830577_another_table.sql (done)
@ -137,9 +229,41 @@ Emigrate list v0.10.0 /your/project/path
A good way to test your configuration is to run the migrations in dry mode:
<Tabs>
<TabItem label="npm">
```bash title="Show pending migrations"
npx emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="pnpm">
```bash title="Show pending migrations"
pnpm emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="yarn">
```bash title="Show pending migrations"
yarn emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="bun">
```bash title="Show pending migrations"
bunx --bun emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash title="Show pending migrations"
deno task emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
</Tabs>
:::note
This will connect to the database using some default values. For ways to configure the connection, see <Link href="/reference/configuration/">Configuration</Link>.


@ -8,7 +8,7 @@ import Link from '@components/Link.astro';
Emigrate is written in [TypeScript](https://www.typescriptlang.org) and is a migration tool for any database or data.
* It's database agnostic - you can use it with any database, or even with non-database data.
* It can be run on multiple platforms - currently [NodeJS](https://nodejs.org) and [Bun](https://bun.sh) is supported, but more platforms is planned.
* It can be run on multiple platforms - currently [NodeJS](https://nodejs.org), [Bun](https://bun.sh) and [Deno](https://deno.com) are supported, but more platforms are planned.
* It's the successor of [klei-migrate](https://github.com/klei/migrate) and is designed to be compatible with [Immigration](https://github.com/blakeembrey/node-immigration) and many of its storage plugins, as well as [Migrate](https://github.com/tj/node-migrate).
* It supports migration files written using <Link href="/plugins/loaders/default/">CommonJS or ES Modules out of the box</Link>, with any of the following extensions: `.js`, `.cjs` or `.mjs`, and supports [async functions](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function), [Promises](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise) or using the [NodeJS Callback Pattern](https://nodejs.org/en/learn/asynchronous-work/javascript-asynchronous-programming-and-callbacks#handling-errors-in-callbacks).
* Other languages can be used by using a <Link href="/plugins/loaders/">Loader Plugin</Link>.


@ -22,5 +22,5 @@ The generator is responsible for generating migration files in a specific format
</CardGrid>
:::note
Instead of having to install a generator plugin, you can also use the much simpler <Link href="/commands/new/#-t---template-path">`--template`</Link> option to specify a custom template file for new migrations.
Instead of having to install a generator plugin, you can also use the much simpler <Link href="/cli/new/#-t---template-path">`--template`</Link> option to specify a custom template file for new migrations.
:::


@ -30,12 +30,53 @@ A <Link href="/plugins/generators/">generator plugin</Link> for generating new m
bun add @emigrate/plugin-generate-js
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/plugin-generate-js": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
<Tabs>
<TabItem label="npm">
```bash
emigrate new --plugin generate-js create some fancy table
npx emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
For more information see <Link href="/commands/new/">the `new` command</Link>'s documentation.
```bash
deno task emigrate new --plugin generate-js create some fancy table
```
</TabItem>
</Tabs>
For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.


@ -30,12 +30,53 @@ The MySQL generator creates new migration files with the `.sql` extension. In th
bun add @emigrate/mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/mysql": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
<Tabs>
<TabItem label="npm">
```bash
emigrate new --plugin mysql create some fancy table
npx emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
For more information see <Link href="/commands/new/">the `new` command</Link>'s documentation.
```bash
deno task emigrate new --plugin mysql create some fancy table
```
</TabItem>
</Tabs>
For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.


@ -30,12 +30,53 @@ The PostgreSQL generator creates new migration files with the `.sql` extension.
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
<Tabs>
<TabItem label="npm">
```bash
emigrate new --plugin postgres create some fancy table
npx emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
For more information see <Link href="/commands/new/">the `new` command</Link>'s documentation.
```bash
deno task emigrate new --plugin postgres create some fancy table
```
</TabItem>
</Tabs>
For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.


@ -3,8 +3,9 @@ title: Default Loader Plugin
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The default loader plugin is responsible for importing migration files written in JavaScript.
The default loader plugin is responsible for importing migration files written in JavaScript or TypeScript.
Migration files can be written using either CommonJS or ES Modules.
## Supported extensions
@ -14,6 +15,13 @@ The default loader plugin supports the following extensions:
* `.js` - either CommonJS or ES Modules depending on your package.json's [`type` field](https://nodejs.org/api/packages.html#type)
* `.cjs` - CommonJS
* `.mjs` - ES Modules
* `.ts` - either CommonJS or ES Modules written in TypeScript
* `.cts` - CommonJS written in TypeScript
* `.mts` - ES Modules written in TypeScript
:::note
To enable TypeScript support in NodeJS you also need to follow the <Link href="/guides/typescript/">TypeScript setup guide</Link>.
:::
## Supported exports


@ -7,7 +7,7 @@ import Link from '@components/Link.astro';
Loader plugins are used to transform any file type into a JavaScript function that will be called when the migration file is executed.
Out of the box, Emigrate supports the following file extensions: `.js`, `.cjs` and `.mjs`. And both CommonJS and ES Modules are supported. See the <Link href="/plugins/loaders/default/">Default Loader</Link> for more information.
Out of the box, Emigrate supports the following file extensions: `.js`, `.cjs`, `.mjs`, `.ts`, `.cts` and `.mts`. And both CommonJS and ES Modules are supported. See the <Link href="/plugins/loaders/default/">Default Loader</Link> for more information.
## Using a loader plugin
@ -21,14 +21,14 @@ Or set it up in your configuration file, see <Link href="/reference/configuratio
:::tip[Did you know?]
You can specify multiple loader plugins at the same time, which is needed when you mix file types in your migrations folder.
For example, you can use the `postgres` or `mysql` loader for `.sql` files and the `typescript` loader for `.ts` files.
For example, you can use the `postgres` or `mysql` loader for `.sql` files and a `yaml` loader for `.yml` files.
The <Link href="/plugins/loaders/default/">default loader</Link> will be used for all other file types, and doesn't need to be specified.
:::
## Available Loader Plugins
<CardGrid>
<LinkCard title="Default Loader" href="default/" description="The loader responsible for loading .js, .cjs and .mjs files" />
<LinkCard title="Default Loader" href="default/" description="The loader responsible for loading .js, .cjs, .mjs, .ts, .cts and .mts files" />
<LinkCard title="PostgreSQL Loader" href="postgres/" description="Can load and execute .sql files against a PostgreSQL database" />
<LinkCard title="MySQL Loader" href="mysql/" description="Can load and execute .sql files against a MySQL database" />
</CardGrid>


@ -30,6 +30,15 @@ The MySQL loader plugin transforms `.sql` files into JavaScript functions that E
bun add @emigrate/mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/mysql": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration
@ -78,9 +87,41 @@ The `MYSQL_URL` environment variable takes precedence over the other environment
The environment variables are used when the plugin is used using the `--plugin` command line option:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate list --plugin mysql
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate list --plugin mysql
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate list --plugin mysql
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate list --plugin mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate list --plugin mysql
```
</TabItem>
</Tabs>
Or when specifying the plugin in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link> as a string:


@ -30,6 +30,15 @@ The PostgreSQL loader plugin transforms `.sql` files into JavaScript functions t
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration
@ -78,9 +87,41 @@ The `POSTGRES_URL` environment variable takes precedence over the other environm
The environment variables are used when the plugin is used using the `--plugin` command line option:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate list --plugin postgres
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate list --plugin postgres
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate list --plugin postgres
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate list --plugin postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate list --plugin postgres
```
</TabItem>
</Tabs>
Or when specifying the plugin in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link> as a string:


@ -1,23 +0,0 @@
---
title: Default Reporter
---
Emigrate's default reporter. The default reporter recognizes if the current terminal is an interactive shell (or if it's a CI environment), if that's the case _no_ animations will be shown.
## Usage
By default, Emigrate uses the default reporter.
## Example output
```bash
Emigrate up v0.10.0 /Users/joakim/dev/@aboviq/test-emigrate (dry run)
1 pending migrations to run
migration-folder/20231218135441244_create_some_table.sql (pending)
1 pending (1 total)
```


@ -20,6 +20,7 @@ Or set it up in your configuration file, see <Link href="/reference/configuratio
## Available Reporters
<CardGrid>
<LinkCard title="Default Reporter" href="default/" />
<LinkCard title="Pino Reporter" href="pino/" />
<LinkCard title="Pretty Reporter" description="The default reporter" href="pretty/" />
<LinkCard title="JSON Reporter" description="A built-in reporter for outputting a JSON object" href="json/" />
<LinkCard title="Pino Reporter" description="A reporter package for outputting new line delimited JSON" href="pino/" />
</CardGrid>


@ -0,0 +1,102 @@
---
title: JSON Reporter
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
An Emigrate reporter that outputs a JSON object.
The reporter is included by default and does not need to be installed separately.
## Usage
### Via CLI
<Tabs>
<TabItem label="npm">
```bash
npx emigrate <command> --reporter json
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter json
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter json
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter json
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate <command> --reporter json
```
</TabItem>
</Tabs>
See for instance the <Link href="/cli/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
### Via configuration file
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'json',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'json',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.
## Example output
```json
{
"command": "up",
"version": "0.17.2",
"numberTotalMigrations": 1,
"numberDoneMigrations": 0,
"numberSkippedMigrations": 0,
"numberFailedMigrations": 0,
"numberPendingMigrations": 1,
"success": true,
"startTime": 1707206599968,
"endTime": 1707206600005,
"migrations": [
{
"name": "/your/project/migrations/20240206075446123_some_other_table.sql",
"status": "pending",
"duration": 0
}
]
}
```
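Since the reporter emits a single JSON object, the output is easy to consume in scripts, for example to fail a CI job when any migration failed. A sketch of such a consumer (the field names follow the example output above; `EmigrateJsonOutput` is a hypothetical type name, not part of the package):

```typescript
// Consume the JSON reporter's output; field names follow the documented example.
interface MigrationResult {
  name: string;
  status: string;
  duration: number;
}

interface EmigrateJsonOutput {
  command: string;
  success: boolean;
  numberFailedMigrations: number;
  migrations: MigrationResult[];
}

// In practice this would be the captured stdout of `emigrate up --reporter json`.
const raw = '{"command":"up","success":true,"numberFailedMigrations":0,"migrations":[]}';
const result = JSON.parse(raw) as EmigrateJsonOutput;

if (!result.success || result.numberFailedMigrations > 0) {
  throw new Error(`emigrate ${result.command} failed`);
}

console.log(`emigrate ${result.command} succeeded`);
```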


@ -32,6 +32,15 @@ This is useful in production environments where you want all logs as JSON, which
bun add @emigrate/reporter-pino
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/reporter-pino": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
@ -42,19 +51,67 @@ The `@emigrate/reporter-` prefix is optional when using this reporter.
### Via CLI
<Tabs>
<TabItem label="npm">
```bash
emigrate <command> --reporter pino
npx emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
See for instance the <Link href="/commands/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
```bash
deno task emigrate <command> --reporter pino
```
</TabItem>
</Tabs>
See for instance the <Link href="/cli/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
### Via configuration file
```js title="emigrate.config.js" {2}
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'pino',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'pino',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.


@ -0,0 +1,90 @@
---
title: Pretty Reporter (default)
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate's default reporter. It detects whether the current terminal is an interactive shell or a CI environment; in a CI or non-interactive environment, _no_ animations will be shown.
The reporter is included by default and does not need to be installed separately.
## Usage
By default, Emigrate uses the "pretty" reporter, but it can also be explicitly set by using the <Link href="/cli/up/#-r---reporter-name">`--reporter`</Link> flag.
<Tabs>
<TabItem label="npm">
```bash
npx emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate <command> --reporter pretty
```
</TabItem>
</Tabs>
Or by setting it in the configuration file.
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'pretty',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'pretty',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.
## Example output
```bash
Emigrate up v0.17.2 /your/working/directory (dry run)
1 pending migrations to run
migration-folder/20231218135441244_create_some_table.sql (pending)
1 pending (1 total)
```


@ -34,6 +34,15 @@ This is suitable for simple setups, but for more advanced setups for instance wh
bun add @emigrate/storage-fs
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/storage-fs": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration


@ -30,6 +30,15 @@ The MySQL storage plugin uses a MySQL database to store the migration history (*
bun add @emigrate/mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/mysql": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration


@ -30,6 +30,15 @@ The PostgreSQL storage plugin uses a PostgreSQL database to store the migration
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration


@ -45,8 +45,9 @@ Set the directory where your migrations are located, relative to the project roo
### `reporter`
**type:** `string | EmigrateReporter | Promise<EmigrateReporter> | (() => Promise<EmigrateReporter>)`
**default:** `"default"` - the default reporter
**type:** `"pretty" | "json" | string | EmigrateReporter | Promise<EmigrateReporter> | (() => Promise<EmigrateReporter>)`
**default:** `"pretty"` - the default reporter
Set the reporter to use for the different commands. Specifying a <Link href="/plugins/reporters/">reporter</Link> is most useful in a CI or production environment where you either ship logs or want to have a machine-readable format.
@ -63,6 +64,9 @@ export default {
up: {
reporter: 'json',
},
new: {
reporter: 'pretty', // Not really necessary, as it's the default
},
};
```
@ -74,6 +78,20 @@ Commands that are not specified will use the default reporter.
The default reporter automatically detects if the current environment is an interactive terminal or not, and will only render animations and similar if it is.
:::
### `color`
**type:** `boolean | undefined`
**default:** `undefined`
Set whether to force colors in the output or not. This option is passed to the reporter which should respect it.
```js title="emigrate.config.js" {2}
export default {
color: false,
};
```
### `storage`
**type:** `string | EmigrateStorage | Promise<EmigrateStorage> | (() => Promise<EmigrateStorage>)`
@ -142,3 +160,16 @@ export default {
```
Will create new migration files with the `.ts` extension.
### `abortRespite`
**type:** `number`
**default:** `10`
Customize the number of seconds to wait before abandoning a running migration when the process is about to shut down, for instance when the user presses `Ctrl+C` or when the process's container is being stopped.
```js title="emigrate.config.js" {2}
export default {
abortRespite: 10,
};
```


@ -37,9 +37,10 @@
"bugs": "https://github.com/aboviq/emigrate/issues",
"license": "MIT",
"volta": {
"node": "20.9.0",
"pnpm": "8.10.2"
"node": "22.15.0",
"pnpm": "9.4.0"
},
"packageManager": "pnpm@9.4.0",
"engines": {
"node": ">=18"
},
@ -61,26 +62,31 @@
},
"overrides": [
{
"files": "packages/**/*.test.ts",
"files": [
"packages/**/*.test.ts",
"packages/**/*.integration.ts"
],
"rules": {
"@typescript-eslint/no-floating-promises": 0
"@typescript-eslint/no-floating-promises": 0,
"max-params": 0
}
}
]
},
"dependencies": {
"@changesets/cli": "2.27.1",
"@commitlint/cli": "18.4.3",
"@commitlint/config-conventional": "18.4.3",
"@commitlint/cli": "18.6.1",
"@commitlint/config-conventional": "18.6.1",
"@types/node": "20.10.4",
"glob": "10.3.10",
"husky": "8.0.3",
"lint-staged": "15.1.0",
"lint-staged": "15.2.0",
"npm-run-all": "4.1.5",
"prettier": "3.1.1",
"tsx": "4.6.2",
"turbo": "1.10.16",
"typescript": "5.2.2",
"testcontainers": "10.24.2",
"tsx": "4.15.7",
"turbo": "2.0.5",
"typescript": "5.5.2",
"xo": "0.56.0"
}
}


@ -1,5 +1,173 @@
# @emigrate/cli
## 0.18.4
### Patch Changes
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
## 0.18.3
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.18.2
### Patch Changes
- 4152209: Handle the case where the config is returned as an object with a nested `default` property
## 0.18.1
### Patch Changes
- 57a0991: Cleanup AbortSignal listeners when they are not needed to avoid MaxListenersExceededWarning when migrating many migrations at once
## 0.18.0
### Minor Changes
- c838ffb: Make it possible to write the Emigrate configuration file in TypeScript and load it using `tsx` in a NodeJS environment by importing packages provided using the `--import` CLI option before loading the configuration file. This makes it possible to run Emigrate in production with a configuration file written in TypeScript without having the `typescript` package installed.
- 18382ce: Add a built-in "json" reporter for outputting a single JSON object
- 18382ce: Rename the "default" reporter to "pretty" and make it possible to specify it using the `--reporter` CLI option or in the configuration file
### Patch Changes
- c838ffb: Don't use the `typescript` package for loading an Emigrate configuration file written in TypeScript in a Bun or Deno environment
## 0.17.2
### Patch Changes
- 61cbcbd: Force exiting after 10 seconds should not change the exit code, i.e. if all migrations have run successfully the exit code should be 0
## 0.17.1
### Patch Changes
- 543b7f6: Use setTimeout/setInterval from "node:timers" so that .unref() correctly works with Bun
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/plugin-tools@0.9.6
- @emigrate/types@0.12.1
## 0.17.0
### Minor Changes
- 0faebbe: Add support for passing the relative path to a migration file to remove from the history using the "remove" command
- 9109238: When the `--from` or `--to` CLI options are used the given migration name (or path to migration file) must exist. This is a BREAKING CHANGE from before. The reasoning is that by forcing the migrations to exist you avoid accidentally running migrations you don't intend to, because a simple typo could have the effect that many unwanted migrations are executed, so it's better to show an error in that case.
- 1f139fd: Completely rework how the "remove" command is run, this is to make it more similar to the "up" and "list" command as now it will also use the `onMigrationStart`, `onMigrationSuccess` and `onMigrationError` reporter methods when reporting the command progress. It's also in preparation for adding `--from` and `--to` CLI options for the "remove" command, similar to how the same options work for the "up" command.
- 9109238: Add support for passing relative paths to migration files as the `--from` and `--to` CLI options. This is very useful from terminals that support autocomplete for file paths. It also makes it possible to copy the path to a migration file from Emigrate's output and use that as either `--from` and `--to` directly.
### Patch Changes
- f1b9098: Only include files when collecting migrations, i.e. it should be possible to have folders inside your migrations folder.
- 2f6b4d2: Don't dim decimal points in durations in the default reporter
- f2d4bb3: Set Emigrate error instance names from their respective constructor's name for consistency and correct error deserialization.
- ef45be9: Show number of skipped migrations correctly in the command output
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
- @emigrate/plugin-tools@0.9.5
## 0.16.2
### Patch Changes
- b56b6da: Handle migration history entries without file extensions for migration files with periods in their names that are not part of the file extension. Previously Emigrate would attempt to re-run these migrations, but now it will correctly ignore them. E.g. if the migration history contains an entry for "migration.file.name" and the migration file is named "migration.file.name.js", it will not be re-run.
## 0.16.1
### Patch Changes
- 121492b: Sort migration files lexicographically correctly by using the default Array.sort implementation
## 0.16.0
### Minor Changes
- a4da353: Handle process interruptions gracefully, e.g. due to receiving a SIGINT or SIGTERM signal. If a migration is currently running when the process is about to shut down it will have a maximum of 10 more seconds to finish before being deserted (there's no way to cancel a promise sadly, and many database queries are not easy to abort either). The 10 second respite length can be customized using the --abort-respite CLI option or the abortRespite config.
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
- @emigrate/plugin-tools@0.9.4
## 0.15.0
### Minor Changes
- f515c8a: Add support for the --no-execution option to the "up" command to be able to log migrations as successful without actually running them. Can for instance be used for baselining a database or logging manually run migrations as successful.
- 9ef0fa2: Add --from and --to CLI options to control which migrations to include or skip when executing migrations.
- 02c142e: Add --limit option to the "up" command, for limiting the number of migrations to run
### Patch Changes
- bf4d596: Clarify which CLI options need parameters
- 98adcda: Use better wording in the header in the console output from the default reporter
## 0.14.1
### Patch Changes
- 73a8a42: Support stored migration histories that have only stored the migration file names without file extensions, and assume they are .js files in that case. This is to be compatible with a migration history generated by Immigration.
## 0.14.0
### Minor Changes
- b083e88: Upgrade cosmiconfig to 9.0.0
## 0.13.1
### Patch Changes
- 83dc618: Remove the --enable-source-maps flag from the shebang for better NodeJS compatibility
## 0.13.0
### Minor Changes
- 9a605a8: Add support for loading TypeScript migration files in the default loader
- 9a605a8: Add a guide for running migration files written in TypeScript to the documentation
## 0.12.0
### Minor Changes
- 9f91bdc: Add support for the `--import` option to import modules/packages before any command is run. This can for instance be used to load environment variables using the [dotenv](https://github.com/motdotla/dotenv) package with `--import dotenv/config`.
- f9a16d8: Add `color` option to the CLI and configuration file, which is used to force enable/disable color output from the reporter (the option is passed to the chosen reporter which should respect it)
- e6e4433: BREAKING CHANGE: Rename the `extension` short CLI option from `-e` to `-x` in preparation for an upcoming option that will take its place
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
- @emigrate/plugin-tools@0.9.3
## 0.11.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
- @emigrate/plugin-tools@0.9.2
## 0.11.1
### Patch Changes
- Updated dependencies [3a8b06b]
- @emigrate/plugin-tools@0.9.1
## 0.11.0
### Minor Changes


@ -2,20 +2,104 @@
Emigrate is a tool for managing database migrations. It is designed to be simple, yet to support advanced setups, and to be modular and extensible.
📖 Read the [documentation](https://emigrate.dev) for more information!
## Installation
Install the Emigrate CLI in your project:
```bash
npm install --save-dev @emigrate/cli
npm install @emigrate/cli
# or
pnpm add @emigrate/cli
# or
yarn add @emigrate/cli
# or
bun add @emigrate/cli
```
## Usage
```text
Usage: emigrate <options>/<command>
Options:
-h, --help Show this help message and exit
-v, --version Print version number and exit
Commands:
up Run all pending migrations (or do a dry run)
new Create a new migration file
list List all migrations and their status
remove Remove entries from the migration history
```
### `emigrate up`
```text
Usage: emigrate up [options]
Run all pending migrations
Options:
-h, --help Show this help message and exit
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before running the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-s, --storage <name> The storage to use for where to store the migration history (required)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-r, --reporter <name> The reporter to use for reporting the migration progress
-l, --limit <count> Limit the number of migrations to run
-f, --from <name/path> Start running migrations from the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those after it lexicographically will be run
-t, --to <name/path> Skip migrations after the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those before it lexicographically will be run
--dry List the pending migrations that would be run without actually running them
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
--no-execution Mark the migrations as executed and successful without actually running them,
which is useful if you want to mark migrations as successful after running them manually
--abort-respite <sec> The number of seconds to wait before abandoning running migrations after the command has been aborted (default: 10)
Examples:
emigrate up --directory src/migrations -s fs
emigrate up -d ./migrations --storage @emigrate/mysql
emigrate up -d src/migrations -s postgres -r json --dry
emigrate up -d ./migrations -s mysql --import dotenv/config
emigrate up --limit 1
emigrate up --to 20231122120529381_some_migration_file.js
emigrate up --to 20231122120529381_some_migration_file.js --no-execution
```
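The `--from`/`--to` semantics described above (lexicographic and inclusive on both ends) can be sketched as follows. This is an illustration only: `filterRange` is a hypothetical name, not Emigrate's actual code, and it omits the requirement that the given name must exist:

```typescript
// Illustration of the inclusive, lexicographic --from/--to filtering.
function filterRange(names: string[], from?: string, to?: string): string[] {
  // Migrations run in lexicographic order, so sort with the default comparator.
  const sorted = [...names].sort();
  return sorted.filter((name) => (!from || name >= from) && (!to || name <= to));
}

const names = [
  '20240102000000000_b.sql',
  '20240101000000000_a.sql',
  '20240103000000000_c.sql',
];

console.log(filterRange(names, undefined, '20240102000000000_b.sql'));
// Everything up to and including the "b" migration
```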
### Examples
Create a new migration:
```bash
emigrate new -d migrations -e .js create some fancy table
npx emigrate new -d migrations create some fancy table
# or
pnpm emigrate new -d migrations create some fancy table
# or
yarn emigrate new -d migrations create some fancy table
# or
bunx --bun emigrate new -d migrations create some fancy table
```
Each of these will create a new, empty JavaScript migration file named "YYYYMMDDHHmmssuuu_create_some_fancy_table.js" in the `migrations` directory.

View file

@@ -1,8 +1,9 @@
{
"name": "@emigrate/cli",
"version": "0.11.0",
"version": "0.18.4",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "",
"type": "module",
@@ -18,7 +19,8 @@
"emigrate": "dist/cli.js"
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo"
],
"scripts": {
"build": "tsc --pretty",
@@ -35,7 +37,9 @@
"immigration"
],
"devDependencies": {
"@emigrate/tsconfig": "workspace:*"
"@emigrate/tsconfig": "workspace:*",
"@types/bun": "1.0.5",
"bun-types": "1.0.26"
},
"author": "Aboviq AB <dev@aboviq.com> (https://www.aboviq.com)",
"homepage": "https://github.com/aboviq/emigrate/tree/main/packages/cli#readme",
@@ -45,10 +49,11 @@
"dependencies": {
"@emigrate/plugin-tools": "workspace:*",
"@emigrate/types": "workspace:*",
"ansis": "2.0.2",
"cosmiconfig": "8.3.6",
"ansis": "2.0.3",
"cosmiconfig": "9.0.0",
"elegant-spinner": "3.0.0",
"figures": "6.0.1",
"import-from-esm": "1.3.3",
"is-interactive": "2.0.0",
"log-update": "6.0.0",
"pretty-ms": "8.0.0",

View file

@@ -0,0 +1,5 @@
export async function* arrayMapAsync<T, U>(iterable: AsyncIterable<T>, mapper: (item: T) => U): AsyncIterable<U> {
for await (const item of iterable) {
yield mapper(item);
}
}
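The `arrayMapAsync` helper added above lazily maps each item of an async iterable without buffering it. A minimal usage sketch (the `numbers` generator below is illustrative, not part of the package):

```typescript
async function* arrayMapAsync<T, U>(iterable: AsyncIterable<T>, mapper: (item: T) => U): AsyncIterable<U> {
  for await (const item of iterable) {
    yield mapper(item);
  }
}

// Illustrative async source; in the CLI this would be e.g. a stream of migration entries
async function* numbers(): AsyncIterable<number> {
  yield 1;
  yield 2;
  yield 3;
}

const doubled: number[] = [];
for await (const value of arrayMapAsync(numbers(), (n) => n * 2)) {
  doubled.push(value);
}
console.log(doubled); // [ 2, 4, 6 ]
```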

View file

@@ -1,13 +1,29 @@
#!/usr/bin/env node --enable-source-maps
#!/usr/bin/env node
import process from 'node:process';
import { parseArgs } from 'node:util';
import { ShowUsageError } from './errors.js';
import { setTimeout } from 'node:timers';
import importFromEsm from 'import-from-esm';
import { CommandAbortError, ShowUsageError } from './errors.js';
import { getConfig } from './get-config.js';
import { DEFAULT_RESPITE_SECONDS } from './defaults.js';
type Action = (args: string[]) => Promise<void>;
type Action = (args: string[], abortSignal: AbortSignal) => Promise<void>;
const up: Action = async (args) => {
const config = await getConfig('up');
const useColors = (values: { color?: boolean; 'no-color'?: boolean }) => {
if (values['no-color']) {
return false;
}
return values.color;
};
const importAll = async (cwd: string, modules: string[]) => {
for await (const module of modules) {
await importFromEsm(cwd, module);
}
};
const up: Action = async (args, abortSignal) => {
const { values } = parseArgs({
args,
options: {
@@ -19,6 +35,12 @@ const up: Action = async (args) => {
type: 'string',
short: 'd',
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
reporter: {
type: 'string',
short: 'r',
@@ -27,6 +49,18 @@ const up: Action = async (args) => {
type: 'string',
short: 's',
},
limit: {
type: 'string',
short: 'l',
},
from: {
type: 'string',
short: 'f',
},
to: {
type: 'string',
short: 't',
},
dry: {
type: 'boolean',
},
@@ -36,6 +70,18 @@ const up: Action = async (args) => {
multiple: true,
default: [],
},
color: {
type: 'boolean',
},
'no-execution': {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
'abort-respite': {
type: 'string',
},
},
allowPositionals: false,
});
@@ -47,17 +93,46 @@ Run all pending migrations
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-s, --storage The storage to use for where to store the migration history (required)
-p, --plugin The plugin(s) to use (can be specified multiple times)
-r, --reporter The reporter to use for reporting the migration progress
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before running the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-s, --storage <name> The storage to use for where to store the migration history (required)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-r, --reporter <name> The reporter to use for reporting the migration progress (default: pretty)
-l, --limit <count> Limit the number of migrations to run
-f, --from <name/path> Start running migrations from the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those after it lexicographically will be run
-t, --to <name/path> Skip migrations after the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those before it lexicographically will be run
--dry List the pending migrations that would be run without actually running them
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
--no-execution Mark the migrations as executed and successful without actually running them,
which is useful if you want to mark migrations as successful after running them manually
--abort-respite <sec> The number of seconds to wait before abandoning running migrations after the command has been aborted (default: ${DEFAULT_RESPITE_SECONDS})
Examples:
emigrate up --directory src/migrations -s fs
emigrate up -d ./migrations --storage @emigrate/mysql
emigrate up -d src/migrations -s postgres -r json --dry
emigrate up -d ./migrations -s mysql --import dotenv/config
emigrate up --limit 1
emigrate up --to 20231122120529381_some_migration_file.js
emigrate up --to 20231122120529381_some_migration_file.js --no-execution
`;
if (values.help) {
@@ -66,12 +141,65 @@ Examples:
return;
}
const { directory = config.directory, storage = config.storage, reporter = config.reporter, dry } = values;
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('up', forceImportTypeScriptAsIs);
const {
directory = config.directory,
storage = config.storage,
reporter = config.reporter,
dry,
from,
to,
limit: limitString,
'abort-respite': abortRespiteString,
'no-execution': noExecution,
} = values;
const plugins = [...(config.plugins ?? []), ...(values.plugin ?? [])];
const limit = limitString === undefined ? undefined : Number.parseInt(limitString, 10);
const abortRespite = abortRespiteString === undefined ? config.abortRespite : Number.parseInt(abortRespiteString, 10);
if (Number.isNaN(limit)) {
console.error('Invalid limit value, expected an integer but was:', limitString);
console.log(usage);
process.exitCode = 1;
return;
}
if (Number.isNaN(abortRespite)) {
console.error(
'Invalid abortRespite value, expected an integer but was:',
abortRespiteString ?? config.abortRespite,
);
console.log(usage);
process.exitCode = 1;
return;
}
try {
const { default: upCommand } = await import('./commands/up.js');
process.exitCode = await upCommand({ storage, reporter, directory, plugins, dry });
process.exitCode = await upCommand({
storage,
reporter,
directory,
plugins,
cwd,
dry,
limit,
from,
to,
noExecution,
abortSignal,
abortRespite: (abortRespite ?? DEFAULT_RESPITE_SECONDS) * 1000,
color: useColors(values),
});
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@@ -85,7 +213,6 @@ Examples:
};
const newMigration: Action = async (args) => {
const config = await getConfig('new');
const { values, positionals } = parseArgs({
args,
options: {
@@ -107,7 +234,7 @@ const newMigration: Action = async (args) => {
},
extension: {
type: 'string',
short: 'e',
short: 'x',
},
plugin: {
type: 'string',
@@ -115,6 +242,18 @@ const newMigration: Action = async (args) => {
multiple: true,
default: [],
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
color: {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
},
allowPositionals: true,
});
@@ -130,22 +269,34 @@ Arguments:
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-r, --reporter The reporter to use for reporting the migration file creation progress
-p, --plugin The plugin(s) to use (can be specified multiple times)
-t, --template A template file to use as contents for the new migration file
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before creating the migration (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-r, --reporter <name> The reporter to use for reporting the migration file creation progress (default: pretty)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-t, --template <path> A template file to use as contents for the new migration file
(if the extension option is not provided the template file's extension will be used)
-e, --extension The extension to use for the new migration file
-x, --extension <ext> The extension to use for the new migration file
(if no template or plugin is provided an empty migration file will be created with the given extension)
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
One of the --template, --extension or the --plugin options must be specified
Examples:
emigrate new -d src/migrations -t migration-template.js create users table
emigrate new --directory ./migrations --plugin @emigrate/postgres create_users_table
emigrate new -d ./migrations -e .sql create_users_table
emigrate new -d ./migrations -t .migration-template -e .sql "drop some table"
emigrate new -d ./migrations -x .sql create_users_table
emigrate new -d ./migrations -t .migration-template -x .sql "drop some table"
`;
if (values.help) {
@@ -154,6 +305,15 @@ Examples:
return;
}
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('new', forceImportTypeScriptAsIs);
const {
directory = config.directory,
template = config.template,
@@ -165,7 +325,7 @@ Examples:
try {
const { default: newCommand } = await import('./commands/new.js');
await newCommand({ directory, template, plugins, extension, reporter }, name);
await newCommand({ directory, template, plugins, extension, reporter, cwd, color: useColors(values) }, name);
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@@ -179,7 +339,6 @@ Examples:
};
const list: Action = async (args) => {
const config = await getConfig('list');
const { values } = parseArgs({
args,
options: {
@@ -191,6 +350,12 @@ const list: Action = async (args) => {
type: 'string',
short: 'd',
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
reporter: {
type: 'string',
short: 'r',
@@ -199,6 +364,12 @@ const list: Action = async (args) => {
type: 'string',
short: 's',
},
color: {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
},
allowPositionals: false,
});
@@ -210,9 +381,19 @@ List all migrations and their status. This command does not run any migrations.
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-r, --reporter The reporter to use for reporting the migrations
-s, --storage The storage to use to get the migration history (required)
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before listing the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables
-r, --reporter <name> The reporter to use for reporting the migrations (default: pretty)
-s, --storage <name> The storage to use to get the migration history (required)
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
Examples:
@@ -226,11 +407,20 @@ Examples:
return;
}
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('list', forceImportTypeScriptAsIs);
const { directory = config.directory, storage = config.storage, reporter = config.reporter } = values;
try {
const { default: listCommand } = await import('./commands/list.js');
process.exitCode = await listCommand({ directory, storage, reporter });
process.exitCode = await listCommand({ directory, storage, reporter, cwd, color: useColors(values) });
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@@ -244,7 +434,6 @@ Examples:
};
const remove: Action = async (args) => {
const config = await getConfig('remove');
const { values, positionals } = parseArgs({
args,
options: {
@@ -256,6 +445,12 @@ const remove: Action = async (args) => {
type: 'string',
short: 'd',
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
force: {
type: 'boolean',
short: 'f',
@@ -268,32 +463,50 @@ const remove: Action = async (args) => {
type: 'string',
short: 's',
},
color: {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
},
allowPositionals: true,
});
const usage = `Usage: emigrate remove [options] <name>
const usage = `Usage: emigrate remove [options] <name/path>
Remove entries from the migration history.
This is useful if you want to retry a migration that has failed.
Arguments:
name The name of the migration file to remove from the history (required)
name/path The name of or relative path to the migration file to remove from the history (required)
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-r, --reporter The reporter to use for reporting the removal process
-s, --storage The storage to use to get the migration history (required)
-f, --force Force removal of the migration history entry even if the migration file does not exist
or it's in a non-failed state
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before removing the migration (can be specified multiple times)
For example if you want to use Dotenv to load environment variables
-r, --reporter <name> The reporter to use for reporting the removal process (default: pretty)
-s, --storage <name> The storage to use to get the migration history (required)
-f, --force Force removal of the migration history entry even if the migration is not in a failed state
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
Examples:
emigrate remove -d migrations -s fs 20231122120529381_some_migration_file.js
emigrate remove --directory ./migrations --storage postgres 20231122120529381_some_migration_file.sql
emigrate remove -i dotenv/config -d ./migrations -s postgres 20231122120529381_some_migration_file.sql
emigrate remove -i dotenv/config -d ./migrations -s postgres migrations/20231122120529381_some_migration_file.sql
`;
if (values.help) {
@@ -302,11 +515,23 @@ Examples:
return;
}
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('remove', forceImportTypeScriptAsIs);
const { directory = config.directory, storage = config.storage, reporter = config.reporter, force } = values;
try {
const { default: removeCommand } = await import('./commands/remove.js');
process.exitCode = await removeCommand({ directory, storage, reporter, force }, positionals[0] ?? '');
process.exitCode = await removeCommand(
{ directory, storage, reporter, force, cwd, color: useColors(values) },
positionals[0] ?? '',
);
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@@ -326,7 +551,7 @@ const commands: Record<string, Action> = {
new: newMigration,
};
const main: Action = async (args) => {
const main: Action = async (args, abortSignal) => {
const { values, positionals } = parseArgs({
args,
options: {
@@ -378,16 +603,13 @@ Commands:
return;
}
await action(process.argv.slice(3));
};
try {
await main(process.argv.slice(2));
await action(process.argv.slice(3), abortSignal);
} catch (error) {
if (error instanceof Error) {
console.error(error.message);
console.error(error);
if (error.cause instanceof Error) {
console.error(error.cause.stack);
console.error(error.cause);
}
} else {
console.error(error);
@@ -395,3 +617,29 @@ try {
process.exitCode = 1;
}
};
const controller = new AbortController();
process.on('SIGINT', () => {
controller.abort(CommandAbortError.fromSignal('SIGINT'));
});
process.on('SIGTERM', () => {
controller.abort(CommandAbortError.fromSignal('SIGTERM'));
});
process.on('uncaughtException', (error) => {
controller.abort(CommandAbortError.fromReason('Uncaught exception', error));
});
process.on('unhandledRejection', (error) => {
controller.abort(CommandAbortError.fromReason('Unhandled rejection', error));
});
await main(process.argv.slice(2), controller.signal);
setTimeout(() => {
console.error('Process did not exit within 10 seconds, forcing exit');
process.exit(process.exitCode);
}, 10_000).unref();
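The signal wiring above follows a common graceful-shutdown pattern: a single shared `AbortController` whose signal is handed down to the running command, aborted with a descriptive reason, with an unref'ed timeout as a last-resort forced exit. A standalone sketch of the observable part (the `runMigrations` function is illustrative, not the CLI's actual command):

```typescript
const controller = new AbortController();

// Illustrative stand-in for a long-running command that honors the shared signal
async function runMigrations(signal: AbortSignal): Promise<string> {
  if (signal.aborted) {
    // The abort reason set below surfaces here as the thrown error
    throw signal.reason instanceof Error ? signal.reason : new Error(String(signal.reason));
  }
  return 'migrations complete';
}

// In the CLI this abort happens inside process.on('SIGINT', ...)
controller.abort(new Error('Command aborted due to signal: SIGINT'));

let outcome: string;
try {
  outcome = await runMigrations(controller.signal);
} catch (error) {
  outcome = (error as Error).message;
}
console.log(outcome); // Command aborted due to signal: SIGINT
```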

View file

@@ -0,0 +1,99 @@
import { describe, it } from 'node:test';
import assert from 'node:assert';
import { collectMigrations } from './collect-migrations.js';
import { toEntries, toEntry, toMigration, toMigrations } from './test-utils.js';
import { arrayFromAsync } from './array-from-async.js';
import { MigrationHistoryError } from './errors.js';
describe('collect-migrations', () => {
it('returns all migrations from the history and all pending migrations', async () => {
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* toEntries(['migration1.js', 'migration2.js']);
},
};
const getMigrations = async () => toMigrations(cwd, directory, ['migration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{
...toMigration(cwd, directory, 'migration1.js'),
duration: 0,
status: 'done',
},
{
...toMigration(cwd, directory, 'migration2.js'),
duration: 0,
status: 'done',
},
toMigration(cwd, directory, 'migration3.js'),
]);
});
it('includes any errors from the history', async () => {
const entry = toEntry('migration1.js', 'failed');
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* [entry];
},
};
const getMigrations = async () => toMigrations(cwd, directory, ['migration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{
...toMigration(cwd, directory, 'migration1.js'),
duration: 0,
status: 'failed',
error: MigrationHistoryError.fromHistoryEntry(entry),
},
toMigration(cwd, directory, 'migration2.js'),
toMigration(cwd, directory, 'migration3.js'),
]);
});
it('can handle a migration history without file extensions', async () => {
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* toEntries(['migration1']);
},
};
const getMigrations = async () => toMigrations(cwd, directory, ['migration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{ ...toMigration(cwd, directory, 'migration1.js'), duration: 0, status: 'done' },
toMigration(cwd, directory, 'migration2.js'),
toMigration(cwd, directory, 'migration3.js'),
]);
});
it('can handle a migration history without file extensions even if the migration name contains periods', async () => {
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* toEntries(['mig.ration1']);
},
};
const getMigrations = async () =>
toMigrations(cwd, directory, ['mig.ration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{ ...toMigration(cwd, directory, 'mig.ration1.js'), duration: 0, status: 'done' },
toMigration(cwd, directory, 'migration2.js'),
toMigration(cwd, directory, 'migration3.js'),
]);
});
});

View file

@@ -1,26 +1,28 @@
import { type MigrationHistoryEntry, type MigrationMetadata, type MigrationMetadataFinished } from '@emigrate/types';
import { toMigrationMetadata } from './to-migration-metadata.js';
import { getMigrations as getMigrationsOriginal } from './get-migrations.js';
import { getMigrations as getMigrationsOriginal, type GetMigrationsFunction } from './get-migrations.js';
export async function* collectMigrations(
cwd: string,
directory: string,
history: AsyncIterable<MigrationHistoryEntry>,
getMigrations = getMigrationsOriginal,
getMigrations: GetMigrationsFunction = getMigrationsOriginal,
): AsyncIterable<MigrationMetadata | MigrationMetadataFinished> {
const allMigrations = await getMigrations(cwd, directory);
const seen = new Set<string>();
for await (const entry of history) {
const index = allMigrations.findIndex((migrationFile) => migrationFile.name === entry.name);
const migration = allMigrations.find((migrationFile) => {
return migrationFile.name === entry.name || migrationFile.name === `${entry.name}.js`;
});
if (index === -1) {
if (!migration) {
continue;
}
yield toMigrationMetadata(entry, { cwd, directory });
yield toMigrationMetadata({ ...entry, name: migration.name }, { cwd, directory });
seen.add(entry.name);
seen.add(migration.name);
}
yield* allMigrations.filter((migration) => !seen.has(migration.name));
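The lookup change above matches a history entry to a migration file either by exact name or by the entry name plus a `.js` extension, which is what lets extension-less history entries (even names containing periods) resolve correctly. An isolated sketch of that rule (type and function names are illustrative):

```typescript
type MigrationFile = { name: string };

// Mirrors the matching rule in collectMigrations: exact name, or name + '.js'
function findMigration(files: MigrationFile[], entryName: string): MigrationFile | undefined {
  return files.find((file) => file.name === entryName || file.name === `${entryName}.js`);
}

const files: MigrationFile[] = [{ name: 'mig.ration1.js' }, { name: 'migration2.js' }];

console.log(findMigration(files, 'mig.ration1')?.name); // mig.ration1.js (extension-less entry)
console.log(findMigration(files, 'migration2.js')?.name); // migration2.js (exact match)
console.log(findMigration(files, 'missing')); // undefined
```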

View file

@@ -1,28 +1,34 @@
import process from 'node:process';
import { getOrLoadReporter, getOrLoadStorage } from '@emigrate/plugin-tools';
import { BadOptionError, MissingOptionError, StorageInitError, toError } from '../errors.js';
import { type Config } from '../types.js';
import { exec } from '../exec.js';
import { migrationRunner } from '../migration-runner.js';
import { arrayFromAsync } from '../array-from-async.js';
import { collectMigrations } from '../collect-migrations.js';
import { version } from '../get-package-info.js';
import { getStandardReporter } from '../reporters/get.js';
const lazyDefaultReporter = async () => import('../reporters/default.js');
type ExtraFlags = {
cwd: string;
};
export default async function listCommand({ directory, reporter: reporterConfig, storage: storageConfig }: Config) {
export default async function listCommand({
directory,
reporter: reporterConfig,
storage: storageConfig,
color,
cwd,
}: Config & ExtraFlags): Promise<number> {
if (!directory) {
throw MissingOptionError.fromOption('directory');
}
const cwd = process.cwd();
const storagePlugin = await getOrLoadStorage([storageConfig]);
if (!storagePlugin) {
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
}
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw BadOptionError.fromOption(
@@ -31,7 +37,7 @@ export default async function listCommand({ directory, reporter: reporterConfig,
);
}
await reporter.onInit?.({ command: 'list', version, cwd, dry: false, directory });
await reporter.onInit?.({ command: 'list', version, cwd, dry: false, directory, color });
const [storage, storageError] = await exec(async () => storagePlugin.initializeStorage());
@@ -48,13 +54,19 @@ export default async function listCommand({ directory, reporter: reporterConfig,
dry: true,
reporter,
storage,
migrations: await arrayFromAsync(collectedMigrations),
migrations: collectedMigrations,
async validate() {
// No-op
},
async execute() {
throw new Error('Unexpected execute call');
},
async onSuccess() {
throw new Error('Unexpected onSuccess call');
},
async onError() {
throw new Error('Unexpected onError call');
},
});
return error ? 1 : 0;

View file

@@ -1,4 +1,4 @@
import process from 'node:process';
import { hrtime } from 'node:process';
import fs from 'node:fs/promises';
import path from 'node:path';
import { getTimestampPrefix, sanitizeMigrationName, getOrLoadPlugin, getOrLoadReporter } from '@emigrate/plugin-tools';
@@ -15,13 +15,16 @@ import { type Config } from '../types.js';
import { withLeadingPeriod } from '../with-leading-period.js';
import { version } from '../get-package-info.js';
import { getDuration } from '../get-duration.js';
import { getStandardReporter } from '../reporters/get.js';
const lazyDefaultReporter = async () => import('../reporters/default.js');
type ExtraFlags = {
cwd: string;
};
export default async function newCommand(
{ directory, template, reporter: reporterConfig, plugins = [], extension }: Config,
{ directory, template, reporter: reporterConfig, plugins = [], cwd, extension, color }: Config & ExtraFlags,
name: string,
) {
): Promise<void> {
if (!directory) {
throw MissingOptionError.fromOption('directory');
}
@@ -34,9 +37,7 @@ export default async function newCommand(
throw MissingOptionError.fromOption(['extension', 'template', 'plugin']);
}
const cwd = process.cwd();
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw BadOptionError.fromOption(
@@ -45,16 +46,16 @@ export default async function newCommand(
);
}
await reporter.onInit?.({ command: 'new', version, cwd, dry: false, directory });
await reporter.onInit?.({ command: 'new', version, cwd, dry: false, directory, color });
const start = process.hrtime();
const start = hrtime();
let filename: string | undefined;
let content: string | undefined;
if (template) {
const fs = await import('node:fs/promises');
const templatePath = path.resolve(process.cwd(), template);
const templatePath = path.resolve(cwd, template);
const fileExtension = path.extname(templatePath);
try {
@@ -98,7 +99,7 @@
);
}
const directoryPath = path.resolve(process.cwd(), directory);
const directoryPath = path.resolve(cwd, directory);
const filePath = path.resolve(directoryPath, filename);
const migration: MigrationMetadata = {

View file

@@ -0,0 +1,305 @@
import { describe, it } from 'node:test';
import assert from 'node:assert';
import { type EmigrateReporter, type Storage, type Plugin, type MigrationMetadataFinished } from '@emigrate/types';
import { deserializeError } from 'serialize-error';
import { version } from '../get-package-info.js';
import {
BadOptionError,
MigrationNotRunError,
MigrationRemovalError,
OptionNeededError,
StorageInitError,
} from '../errors.js';
import {
assertErrorEqualEnough,
getErrorCause,
getMockedReporter,
getMockedStorage,
toEntry,
toMigrations,
type Mocked,
} from '../test-utils.js';
import removeCommand from './remove.js';
describe('remove', () => {
it("returns 1 and finishes with an error when the storage couldn't be initialized", async () => {
const { reporter, run } = getRemoveCommand([]);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFailed(reporter, StorageInitError.fromError(new Error('No storage configured')));
});
it('returns 1 and finishes with an error when the given migration has not been executed', async () => {
const storage = getMockedStorage(['some_other_migration.js']);
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[
{
name: 'some_migration.js',
status: 'failed',
error: new MigrationNotRunError('Migration "some_migration.js" is not in the migration history'),
},
],
new MigrationNotRunError('Migration "some_migration.js" is not in the migration history'),
);
});
it('returns 1 and finishes with an error when the given migration is not in a failed state in the history', async () => {
const storage = getMockedStorage(['1_old_migration.js', '2_some_migration.js', '3_new_migration.js']);
const { reporter, run } = getRemoveCommand(['2_some_migration.js'], storage);
const exitCode = await run('2_some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[
{
name: '2_some_migration.js',
status: 'failed',
error: OptionNeededError.fromOption(
'force',
'The migration "2_some_migration.js" is not in a failed state. Use the "force" option to force its removal',
),
},
],
OptionNeededError.fromOption(
'force',
'The migration "2_some_migration.js" is not in a failed state. Use the "force" option to force its removal',
),
);
});
it('returns 1 and finishes with an error when the given migration does not exist at all', async () => {
const storage = getMockedStorage(['some_migration.js']);
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_other_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[],
BadOptionError.fromOption('name', 'The migration: "migrations/some_other_migration.js" was not found'),
);
});
it('returns 0, removes the migration from the history and finishes without an error when the given migration is in a failed state', async () => {
const storage = getMockedStorage([toEntry('some_migration.js', 'failed')]);
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 0, 'Exit code');
assertPreconditionsFulfilled(reporter, storage, [{ name: 'some_migration.js', status: 'done', started: true }]);
});
it('returns 0, removes the migration from the history and finishes without an error when the given migration is not in a failed state but "force" is true', async () => {
const storage = getMockedStorage(['1_old_migration.js', '2_some_migration.js', '3_new_migration.js']);
const { reporter, run } = getRemoveCommand(['2_some_migration.js'], storage);
const exitCode = await run('2_some_migration.js', { force: true });
assert.strictEqual(exitCode, 0, 'Exit code');
assertPreconditionsFulfilled(reporter, storage, [{ name: '2_some_migration.js', status: 'done', started: true }]);
});
it('returns 1 and finishes with an error when the removal of the migration crashes', async () => {
const storage = getMockedStorage([toEntry('some_migration.js', 'failed')]);
storage.remove.mock.mockImplementation(async () => {
throw new Error('Some error');
});
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[
{
name: 'some_migration.js',
status: 'failed',
error: new Error('Some error'),
started: true,
},
],
new MigrationRemovalError('Failed to remove migration: migrations/some_migration.js', {
cause: new Error('Some error'),
}),
);
});
});
function getRemoveCommand(migrationFiles: string[], storage?: Mocked<Storage>, plugins?: Plugin[]) {
const reporter = getMockedReporter();
const run = async (
name: string,
options?: Omit<Parameters<typeof removeCommand>[0], 'cwd' | 'directory' | 'storage' | 'reporter' | 'plugins'>,
) => {
return removeCommand(
{
cwd: '/emigrate',
directory: 'migrations',
storage: {
async initializeStorage() {
if (!storage) {
throw new Error('No storage configured');
}
return storage;
},
},
reporter,
plugins: plugins ?? [],
async getMigrations(cwd, directory) {
return toMigrations(cwd, directory, migrationFiles);
},
...options,
},
name,
);
};
return {
reporter,
storage,
run,
};
}
function assertPreconditionsFailed(reporter: Mocked<Required<EmigrateReporter>>, finishedError?: Error) {
assert.strictEqual(reporter.onInit.mock.calls.length, 1);
assert.deepStrictEqual(reporter.onInit.mock.calls[0]?.arguments, [
{
command: 'remove',
cwd: '/emigrate',
version,
dry: false,
color: undefined,
directory: 'migrations',
},
]);
assert.strictEqual(reporter.onCollectedMigrations.mock.calls.length, 0, 'Collected call');
assert.strictEqual(reporter.onLockedMigrations.mock.calls.length, 0, 'Locked call');
assert.strictEqual(reporter.onMigrationStart.mock.calls.length, 0, 'Started migrations');
assert.strictEqual(reporter.onMigrationSuccess.mock.calls.length, 0, 'Successful migrations');
assert.strictEqual(reporter.onMigrationError.mock.calls.length, 0, 'Failed migrations');
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
// Stack traces will never match, so copy the actual stack onto the expected error before comparing:
if (finishedError) {
finishedError.stack = error?.stack;
}
assert.deepStrictEqual(error, finishedError, 'Finished error');
const cause = getErrorCause(error);
const expectedCause = finishedError?.cause;
assert.deepStrictEqual(
cause,
expectedCause ? deserializeError(expectedCause) : expectedCause,
'Finished error cause',
);
assert.strictEqual(entries?.length, 0, 'Finished entries length');
}
function assertPreconditionsFulfilled(
reporter: Mocked<Required<EmigrateReporter>>,
storage: Mocked<Storage>,
expected: Array<{ name: string; status: MigrationMetadataFinished['status']; started?: boolean; error?: Error }>,
finishedError?: Error,
) {
assert.strictEqual(reporter.onInit.mock.calls.length, 1);
assert.deepStrictEqual(reporter.onInit.mock.calls[0]?.arguments, [
{
command: 'remove',
cwd: '/emigrate',
version,
dry: false,
color: undefined,
directory: 'migrations',
},
]);
let started = 0;
let done = 0;
let failed = 0;
let skipped = 0;
let pending = 0;
let failedAndStarted = 0;
const failedEntries: typeof expected = [];
const successfulEntries: typeof expected = [];
for (const entry of expected) {
if (entry.started) {
started++;
}
// eslint-disable-next-line default-case
switch (entry.status) {
case 'done': {
done++;
if (entry.started) {
successfulEntries.push(entry);
}
break;
}
case 'failed': {
failed++;
failedEntries.push(entry);
if (entry.started) {
failedAndStarted++;
}
break;
}
case 'skipped': {
skipped++;
break;
}
case 'pending': {
pending++;
break;
}
}
}
assert.strictEqual(reporter.onCollectedMigrations.mock.calls.length, 1, 'Collected call');
assert.strictEqual(storage.lock.mock.calls.length, 0, 'Storage lock never called');
assert.strictEqual(storage.unlock.mock.calls.length, 0, 'Storage unlock never called');
assert.strictEqual(reporter.onLockedMigrations.mock.calls.length, 0, 'Locked call');
assert.strictEqual(reporter.onMigrationStart.mock.calls.length, started, 'Started migrations');
assert.strictEqual(reporter.onMigrationSuccess.mock.calls.length, successfulEntries.length, 'Successful migrations');
assert.strictEqual(storage.remove.mock.calls.length, started, 'Storage remove called');
assert.strictEqual(reporter.onMigrationError.mock.calls.length, failedEntries.length, 'Failed migrations');
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
assertErrorEqualEnough(error, finishedError, 'Finished error');
assert.strictEqual(entries?.length, expected.length, 'Finished entries length');
assert.deepStrictEqual(
entries.map((entry) => `${entry.name} (${entry.status})`),
expected.map((entry) => `${entry.name} (${entry.status})`),
'Finished entries',
);
assert.strictEqual(storage.end.mock.calls.length, 1, 'Storage end called once');
}


@@ -1,30 +1,45 @@
import process from 'node:process';
import path from 'node:path';
import { getOrLoadReporter, getOrLoadStorage } from '@emigrate/plugin-tools';
import { type MigrationHistoryEntry, type MigrationMetadataFinished } from '@emigrate/types';
import { type MigrationMetadata, isFinishedMigration } from '@emigrate/types';
import {
BadOptionError,
MigrationNotRunError,
MigrationRemovalError,
MissingArgumentsError,
MissingOptionError,
OptionNeededError,
StorageInitError,
toError,
} from '../errors.js';
import { type Config } from '../types.js';
import { getMigration } from '../get-migration.js';
import { getDuration } from '../get-duration.js';
import { exec } from '../exec.js';
import { version } from '../get-package-info.js';
import { collectMigrations } from '../collect-migrations.js';
import { migrationRunner } from '../migration-runner.js';
import { arrayMapAsync } from '../array-map-async.js';
import { type GetMigrationsFunction } from '../get-migrations.js';
import { getStandardReporter } from '../reporters/get.js';
type ExtraFlags = {
cwd: string;
force?: boolean;
getMigrations?: GetMigrationsFunction;
};
const lazyDefaultReporter = async () => import('../reporters/default.js');
type RemovableMigrationMetadata = MigrationMetadata & { originalStatus?: 'done' | 'failed' };
export default async function removeCommand(
{ directory, reporter: reporterConfig, storage: storageConfig, force }: Config & ExtraFlags,
{
directory,
reporter: reporterConfig,
storage: storageConfig,
color,
cwd,
force = false,
getMigrations,
}: Config & ExtraFlags,
name: string,
) {
): Promise<number> {
if (!directory) {
throw MissingOptionError.fromOption('directory');
}
@@ -33,14 +48,13 @@ export default async function removeCommand(
throw MissingArgumentsError.fromArgument('name');
}
const cwd = process.cwd();
const storagePlugin = await getOrLoadStorage([storageConfig]);
if (!storagePlugin) {
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
}
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw BadOptionError.fromOption(
@@ -49,6 +63,8 @@ export default async function removeCommand(
);
}
await reporter.onInit?.({ command: 'remove', version, cwd, dry: false, directory, color });
const [storage, storageError] = await exec(async () => storagePlugin.initializeStorage());
if (storageError) {
@@ -57,73 +73,79 @@
return 1;
}
await reporter.onInit?.({ command: 'remove', version, cwd, dry: false, directory });
try {
const collectedMigrations = arrayMapAsync(
collectMigrations(cwd, directory, storage.getHistory(), getMigrations),
(migration) => {
if (isFinishedMigration(migration)) {
if (migration.status === 'failed') {
const { status, duration, error, ...pendingMigration } = migration;
const removableMigration: RemovableMigrationMetadata = { ...pendingMigration, originalStatus: status };
const [migrationFile, fileError] = await exec(async () => getMigration(cwd, directory, name, !force));
return removableMigration;
}
if (fileError) {
await reporter.onFinished?.([], fileError);
if (migration.status === 'done') {
const { status, duration, ...pendingMigration } = migration;
const removableMigration: RemovableMigrationMetadata = { ...pendingMigration, originalStatus: status };
await storage.end();
return removableMigration;
}
throw new Error(`Unexpected migration status: ${migration.status}`);
}
return migration as RemovableMigrationMetadata;
},
);
if (!name.includes(path.sep)) {
name = path.join(directory, name);
}
const error = await migrationRunner({
dry: false,
lock: false,
name,
reporter,
storage,
migrations: collectedMigrations,
migrationFilter(migration) {
return migration.relativeFilePath === name;
},
async validate(migration) {
if (migration.originalStatus === 'done' && !force) {
throw OptionNeededError.fromOption(
'force',
`The migration "${migration.name}" is not in a failed state. Use the "force" option to force its removal`,
);
}
if (!migration.originalStatus) {
throw MigrationNotRunError.fromMetadata(migration);
}
},
async execute(migration) {
try {
await storage.remove(migration);
} catch (error) {
throw MigrationRemovalError.fromMetadata(migration, toError(error));
}
},
async onSuccess() {
// No-op
},
async onError() {
// No-op
},
});
return error ? 1 : 0;
} catch (error) {
await reporter.onFinished?.([], toError(error));
return 1;
}
const finishedMigrations: MigrationMetadataFinished[] = [];
let historyEntry: MigrationHistoryEntry | undefined;
let removalError: Error | undefined;
for await (const migrationHistoryEntry of storage.getHistory()) {
if (migrationHistoryEntry.name !== migrationFile.name) {
continue;
}
if (migrationHistoryEntry.status === 'done' && !force) {
removalError = OptionNeededError.fromOption(
'force',
`The migration "${migrationFile.name}" is not in a failed state. Use the "force" option to force its removal`,
);
} else {
historyEntry = migrationHistoryEntry;
}
}
await reporter.onMigrationRemoveStart?.(migrationFile);
const start = process.hrtime();
if (historyEntry) {
try {
await storage.remove(migrationFile);
const duration = getDuration(start);
const finishedMigration: MigrationMetadataFinished = { ...migrationFile, status: 'done', duration };
await reporter.onMigrationRemoveSuccess?.(finishedMigration);
finishedMigrations.push(finishedMigration);
} catch (error) {
removalError = error instanceof Error ? error : new Error(String(error));
}
} else if (!removalError) {
removalError = MigrationNotRunError.fromMetadata(migrationFile);
}
if (removalError) {
const duration = getDuration(start);
const finishedMigration: MigrationMetadataFinished = {
...migrationFile,
status: 'failed',
error: removalError,
duration,
};
await reporter.onMigrationRemoveError?.(finishedMigration, removalError);
finishedMigrations.push(finishedMigration);
}
await reporter.onFinished?.(finishedMigrations, removalError);
} finally {
await storage.end();
return removalError ? 1 : 0;
}
}

File diff suppressed because it is too large


@@ -1,33 +1,51 @@
import process from 'node:process';
import path from 'node:path';
import { getOrLoadPlugins, getOrLoadReporter, getOrLoadStorage } from '@emigrate/plugin-tools';
import { isFinishedMigration, type LoaderPlugin } from '@emigrate/types';
import { BadOptionError, MigrationLoadError, MissingOptionError, StorageInitError, toError } from '../errors.js';
import {
BadOptionError,
MigrationLoadError,
MissingOptionError,
StorageInitError,
toError,
toSerializedError,
} from '../errors.js';
import { type Config } from '../types.js';
import { withLeadingPeriod } from '../with-leading-period.js';
import { type GetMigrationsFunction } from '../get-migrations.js';
import { exec } from '../exec.js';
import { migrationRunner } from '../migration-runner.js';
import { filterAsync } from '../filter-async.js';
import { collectMigrations } from '../collect-migrations.js';
import { arrayFromAsync } from '../array-from-async.js';
import { version } from '../get-package-info.js';
import { getStandardReporter } from '../reporters/get.js';
type ExtraFlags = {
cwd?: string;
cwd: string;
dry?: boolean;
limit?: number;
from?: string;
to?: string;
noExecution?: boolean;
getMigrations?: GetMigrationsFunction;
abortSignal?: AbortSignal;
abortRespite?: number;
};
const lazyDefaultReporter = async () => import('../reporters/default.js');
const lazyPluginLoaderJs = async () => import('../plugin-loader-js.js');
export default async function upCommand({
storage: storageConfig,
reporter: reporterConfig,
directory,
color,
limit,
from,
to,
noExecution,
abortSignal,
abortRespite,
dry = false,
plugins = [],
cwd = process.cwd(),
cwd,
getMigrations,
}: Config & ExtraFlags): Promise<number> {
if (!directory) {
@@ -40,7 +58,7 @@ export default async function upCommand({
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
}
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw BadOptionError.fromOption(
@@ -49,7 +67,7 @@ export default async function upCommand({
);
}
await reporter.onInit?.({ command: 'up', version, cwd, dry, directory });
await reporter.onInit?.({ command: 'up', version, cwd, dry, directory, color });
const [storage, storageError] = await exec(async () => storagePlugin.initializeStorage());
@@ -60,10 +78,7 @@
}
try {
const collectedMigrations = filterAsync(
collectMigrations(cwd, directory, storage.getHistory(), getMigrations),
(migration) => !isFinishedMigration(migration) || migration.status === 'failed',
);
const collectedMigrations = collectMigrations(cwd, directory, storage.getHistory(), getMigrations);
const loaderPlugins = await getOrLoadPlugins('loader', [lazyPluginLoaderJs, ...plugins]);
@@ -81,12 +96,32 @@
return loaderByExtension.get(extension);
};
if (from && !from.includes(path.sep)) {
from = path.join(directory, from);
}
if (to && !to.includes(path.sep)) {
to = path.join(directory, to);
}
const error = await migrationRunner({
dry,
limit,
from,
to,
abortSignal,
abortRespite,
reporter,
storage,
migrations: await arrayFromAsync(collectedMigrations),
migrations: collectedMigrations,
migrationFilter(migration) {
return !isFinishedMigration(migration) || migration.status === 'failed';
},
async validate(migration) {
if (noExecution) {
return;
}
const loader = getLoaderByExtension(migration.extension);
if (!loader) {
@@ -97,6 +132,10 @@
}
},
async execute(migration) {
if (noExecution) {
return;
}
const loader = getLoaderByExtension(migration.extension)!;
const [migrationFunction, loadError] = await exec(async () => loader.loadMigration(migration));
@@ -106,6 +145,12 @@
await migrationFunction();
},
async onSuccess(migration) {
await storage.onSuccess(migration);
},
async onError(migration, error) {
await storage.onError(migration, toSerializedError(error));
},
});
return error ? 1 : 0;


@@ -0,0 +1,2 @@
// eslint-disable-next-line @typescript-eslint/naming-convention
export const DEFAULT_RESPITE_SECONDS = 10;

packages/cli/src/deno.d.ts (vendored)

@@ -0,0 +1,6 @@
declare global {
// eslint-disable-next-line @typescript-eslint/naming-convention
const Deno: any;
}
export {};


@@ -8,7 +8,7 @@ import { serializeError, errorConstructors, deserializeError } from 'serialize-e
const formatter = new Intl.ListFormat('en', { style: 'long', type: 'disjunction' });
export const toError = (error: unknown) => (error instanceof Error ? error : new Error(String(error)));
export const toError = (error: unknown): Error => (error instanceof Error ? error : new Error(String(error)));
export const toSerializedError = (error: unknown) => {
const errorInstance = toError(error);
@@ -23,13 +23,14 @@
public code?: string,
) {
super(message, options);
this.name = this.constructor.name;
}
}
export class ShowUsageError extends EmigrateError {}
export class MissingOptionError extends ShowUsageError {
static fromOption(option: string | string[]) {
static fromOption(option: string | string[]): MissingOptionError {
return new MissingOptionError(
`Missing required option: ${Array.isArray(option) ? formatter.format(option) : option}`,
undefined,
@@ -47,7 +48,7 @@
}
export class MissingArgumentsError extends ShowUsageError {
static fromArgument(argument: string) {
static fromArgument(argument: string): MissingArgumentsError {
return new MissingArgumentsError(`Missing required argument: ${argument}`, undefined, argument);
}
@@ -61,7 +62,7 @@
}
export class OptionNeededError extends ShowUsageError {
static fromOption(option: string, message: string) {
static fromOption(option: string, message: string): OptionNeededError {
return new OptionNeededError(message, undefined, option);
}
@@ -75,7 +76,7 @@
}
export class BadOptionError extends ShowUsageError {
static fromOption(option: string, message: string) {
static fromOption(option: string, message: string): BadOptionError {
return new BadOptionError(message, undefined, option);
}
@@ -95,7 +96,7 @@
}
export class MigrationHistoryError extends EmigrateError {
static fromHistoryEntry(entry: FailedMigrationHistoryEntry) {
static fromHistoryEntry(entry: FailedMigrationHistoryEntry): MigrationHistoryError {
return new MigrationHistoryError(`Migration ${entry.name} is in a failed state, it should be fixed and removed`, {
cause: deserializeError(entry.error),
});
@@ -107,7 +108,7 @@
}
export class MigrationLoadError extends EmigrateError {
static fromMetadata(metadata: MigrationMetadata, cause?: Error) {
static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationLoadError {
return new MigrationLoadError(`Failed to load migration file: ${metadata.relativeFilePath}`, { cause });
}
@@ -117,7 +118,7 @@
}
export class MigrationRunError extends EmigrateError {
static fromMetadata(metadata: FailedMigrationMetadata) {
static fromMetadata(metadata: FailedMigrationMetadata): MigrationRunError {
return new MigrationRunError(`Failed to run migration: ${metadata.relativeFilePath}`, { cause: metadata.error });
}
@@ -127,7 +128,7 @@
}
export class MigrationNotRunError extends EmigrateError {
static fromMetadata(metadata: MigrationMetadata, cause?: Error) {
static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationNotRunError {
return new MigrationNotRunError(`Migration "${metadata.name}" is not in the migration history`, { cause });
}
@@ -136,8 +137,18 @@
}
}
export class MigrationRemovalError extends EmigrateError {
static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationRemovalError {
return new MigrationRemovalError(`Failed to remove migration: ${metadata.relativeFilePath}`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_MIGRATION_REMOVE');
}
}
export class StorageInitError extends EmigrateError {
static fromError(error: Error) {
static fromError(error: Error): StorageInitError {
return new StorageInitError('Could not initialize storage', { cause: error });
}
@@ -146,6 +157,30 @@
}
}
export class CommandAbortError extends EmigrateError {
static fromSignal(signal: NodeJS.Signals): CommandAbortError {
return new CommandAbortError(`Command aborted due to signal: ${signal}`);
}
static fromReason(reason: string, cause?: unknown): CommandAbortError {
return new CommandAbortError(`Command aborted: ${reason}`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_COMMAND_ABORT');
}
}
export class ExecutionDesertedError extends EmigrateError {
static fromReason(reason: string, cause?: Error): ExecutionDesertedError {
return new ExecutionDesertedError(`Execution deserted: ${reason}`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_EXECUTION_DESERTED');
}
}
errorConstructors.set('EmigrateError', EmigrateError as ErrorConstructor);
errorConstructors.set('ShowUsageError', ShowUsageError as ErrorConstructor);
errorConstructors.set('MissingOptionError', MissingOptionError as unknown as ErrorConstructor);
@@ -157,4 +192,7 @@ errorConstructors.set('MigrationHistoryError', MigrationHistoryError as unknown
errorConstructors.set('MigrationLoadError', MigrationLoadError as unknown as ErrorConstructor);
errorConstructors.set('MigrationRunError', MigrationRunError as unknown as ErrorConstructor);
errorConstructors.set('MigrationNotRunError', MigrationNotRunError as unknown as ErrorConstructor);
errorConstructors.set('MigrationRemovalError', MigrationRemovalError as unknown as ErrorConstructor);
errorConstructors.set('StorageInitError', StorageInitError as unknown as ErrorConstructor);
errorConstructors.set('CommandAbortError', CommandAbortError as unknown as ErrorConstructor);
errorConstructors.set('ExecutionDesertedError', ExecutionDesertedError as unknown as ErrorConstructor);


@@ -1,22 +1,85 @@
import { toError } from './errors.js';
import { setTimeout } from 'node:timers';
import prettyMs from 'pretty-ms';
import { ExecutionDesertedError, toError } from './errors.js';
import { DEFAULT_RESPITE_SECONDS } from './defaults.js';
type Fn<Args extends any[], Result> = (...args: Args) => Result;
type Result<T> = [value: T, error: undefined] | [value: undefined, error: Error];
type ExecOptions = {
abortSignal?: AbortSignal;
abortRespite?: number;
};
/**
* Execute a function and return a result tuple
*
* This is a helper function to make it easier to handle errors without the extra nesting of try/catch.
* If an abort signal is provided, the function will reject with an ExecutionDesertedError if the signal is aborted
* and the given function has not yet resolved within the given respite time (10 seconds by default)
*
* @param fn The function to execute
* @param options Options for the execution
*/
export const exec = async <Args extends any[], Return extends Promise<any>>(
fn: Fn<Args, Return>,
...args: Args
export const exec = async <Return extends Promise<any>>(
fn: () => Return,
options: ExecOptions = {},
): Promise<Result<Awaited<Return>>> => {
try {
const result = await fn(...args);
const aborter = options.abortSignal ? getAborter(options.abortSignal, options.abortRespite) : undefined;
const result = await Promise.race(aborter ? [aborter, fn()] : [fn()]);
aborter?.cancel();
return [result, undefined];
} catch (error) {
return [undefined, toError(error)];
}
};
/**
* Returns a cancelable promise that rejects a given time after the given signal is aborted
*
* @param signal The abort signal to listen to
* @param respite The time in milliseconds to wait before rejecting
*/
const getAborter = (
signal: AbortSignal,
respite = DEFAULT_RESPITE_SECONDS * 1000,
): PromiseLike<never> & { cancel: () => void } => {
const cleanups: Array<() => void> = [];
const aborter = new Promise<never>((_, reject) => {
const abortListener = () => {
const timer = setTimeout(
reject,
respite,
ExecutionDesertedError.fromReason(`Deserted after ${prettyMs(respite)}`, toError(signal.reason)),
);
timer.unref();
cleanups.push(() => {
clearTimeout(timer);
});
};
if (signal.aborted) {
abortListener();
return;
}
signal.addEventListener('abort', abortListener, { once: true });
cleanups.push(() => {
signal.removeEventListener('abort', abortListener);
});
});
const cancel = () => {
for (const cleanup of cleanups) {
cleanup();
}
cleanups.length = 0;
};
return Object.assign(aborter, { cancel });
};


@@ -1,13 +0,0 @@
export function filterAsync<T, S extends T>(
iterable: AsyncIterable<T>,
filter: (item: T) => item is S,
): AsyncIterable<S>;
export function filterAsync<T>(iterable: AsyncIterable<T>, filter: (item: T) => unknown): AsyncIterable<T>;
export async function* filterAsync<T>(iterable: AsyncIterable<T>, filter: (item: T) => unknown): AsyncIterable<T> {
for await (const item of iterable) {
if (filter(item)) {
yield item;
}
}
}


@@ -1,11 +1,28 @@
import { cosmiconfig } from 'cosmiconfig';
import process from 'node:process';
import { cosmiconfig, defaultLoaders } from 'cosmiconfig';
import { type Config, type EmigrateConfig } from './types.js';
const commands = ['up', 'list', 'new', 'remove'] as const;
type Command = (typeof commands)[number];
const canImportTypeScriptAsIs = Boolean(process.isBun) || typeof Deno !== 'undefined';
export const getConfig = async (command: Command): Promise<Config> => {
const explorer = cosmiconfig('emigrate');
const getEmigrateConfig = (config: any): EmigrateConfig => {
if ('default' in config && typeof config.default === 'object' && config.default !== null) {
return config.default as EmigrateConfig;
}
if (typeof config === 'object' && config !== null) {
return config as EmigrateConfig;
}
return {};
};
export const getConfig = async (command: Command, forceImportTypeScriptAsIs = false): Promise<Config> => {
const explorer = cosmiconfig('emigrate', {
// eslint-disable-next-line @typescript-eslint/naming-convention
loaders: forceImportTypeScriptAsIs || canImportTypeScriptAsIs ? { '.ts': defaultLoaders['.js'] } : undefined,
});
const result = await explorer.search();
@@ -13,7 +30,7 @@ export const getConfig = async (command: Command): Promise<Config> => {
return {};
}
const config = result.config as EmigrateConfig;
const config = getEmigrateConfig(result.config);
const commandConfig = config[command];


@@ -1,6 +1,6 @@
import process from 'node:process';
export const getDuration = (start: [number, number]) => {
export const getDuration = (start: [number, number]): number => {
const [seconds, nanoseconds] = process.hrtime(start);
return seconds * 1000 + nanoseconds / 1_000_000;
};


@@ -0,0 +1,190 @@
import fs from 'node:fs/promises';
import { afterEach, beforeEach, describe, it, mock } from 'node:test';
import assert from 'node:assert';
import { getMigrations } from './get-migrations.js';
const originalOpendir = fs.opendir;
const opendirMock = mock.fn(originalOpendir);
describe('get-migrations', () => {
beforeEach(() => {
fs.opendir = opendirMock;
});
afterEach(() => {
opendirMock.mock.restore();
fs.opendir = originalOpendir;
});
it('should skip files with leading periods', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: '.foo.js', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should skip files with leading underscores', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: '_foo.js', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should skip files without file extensions', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: 'foo', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should skip non-files', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: 'foo.js', isFile: () => false },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should sort them in lexicographical order', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: 'foo.js', isFile: () => true },
{ name: 'bar_data.js', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'bar_data.js',
filePath: '/cwd/directory/bar_data.js',
relativeFilePath: 'directory/bar_data.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'foo.js',
filePath: '/cwd/directory/foo.js',
relativeFilePath: 'directory/foo.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
});


@@ -1,31 +1,35 @@
import path from 'node:path';
import fs from 'node:fs/promises';
import { type Dirent } from 'node:fs';
import { type MigrationMetadata } from '@emigrate/types';
import { withLeadingPeriod } from './with-leading-period.js';
import { BadOptionError } from './errors.js';
import { arrayFromAsync } from './array-from-async.js';
export type GetMigrationsFunction = typeof getMigrations;
const tryReadDirectory = async (directoryPath: string): Promise<Dirent[]> => {
async function* tryReadDirectory(directoryPath: string): AsyncIterable<string> {
try {
return await fs.readdir(directoryPath, {
withFileTypes: true,
});
for await (const entry of await fs.opendir(directoryPath)) {
if (
entry.isFile() &&
!entry.name.startsWith('.') &&
!entry.name.startsWith('_') &&
path.extname(entry.name) !== ''
) {
yield entry.name;
}
}
} catch {
throw BadOptionError.fromOption('directory', `Couldn't read directory: ${directoryPath}`);
}
};
}
export const getMigrations = async (cwd: string, directory: string): Promise<MigrationMetadata[]> => {
const directoryPath = path.resolve(cwd, directory);
const allFilesInMigrationDirectory = await tryReadDirectory(directoryPath);
const allFilesInMigrationDirectory = await arrayFromAsync(tryReadDirectory(directoryPath));
const migrationFiles: MigrationMetadata[] = allFilesInMigrationDirectory
.filter((file) => file.isFile() && !file.name.startsWith('.') && !file.name.startsWith('_'))
.sort((a, b) => a.name.localeCompare(b.name))
.map(({ name }) => {
return allFilesInMigrationDirectory.sort().map((name) => {
const filePath = path.join(directoryPath, name);
return {
@@ -37,6 +41,4 @@ export const getMigrations = async (cwd: string, directory: string): Promise<Mig
cwd,
};
});
return migrationFiles;
};


@@ -28,4 +28,7 @@ const getPackageInfo = async () => {
throw new UnexpectedError(`Could not read package info from: ${packageInfoPath}`);
};
export const { version } = await getPackageInfo();
const packageInfo = await getPackageInfo();
// eslint-disable-next-line prefer-destructuring
export const version: string = packageInfo.version;


@@ -1,5 +1,5 @@
export * from './types.js';
export const emigrate = () => {
export const emigrate = (): void => {
// console.log('Done!');
};


@@ -1,4 +1,4 @@
import process from 'node:process';
import { hrtime } from 'node:process';
import {
isFinishedMigration,
isFailedMigration,
@@ -9,56 +9,121 @@ import {
type FailedMigrationMetadata,
type SuccessfulMigrationMetadata,
} from '@emigrate/types';
import { toError, EmigrateError, MigrationRunError, toSerializedError } from './errors.js';
import { toError, EmigrateError, MigrationRunError, BadOptionError } from './errors.js';
import { exec } from './exec.js';
import { getDuration } from './get-duration.js';
type MigrationRunnerParameters = {
type MigrationRunnerParameters<T extends MigrationMetadata | MigrationMetadataFinished> = {
dry: boolean;
lock?: boolean;
limit?: number;
name?: string;
from?: string;
to?: string;
abortSignal?: AbortSignal;
abortRespite?: number;
reporter: EmigrateReporter;
storage: Storage;
migrations: Array<MigrationMetadata | MigrationMetadataFinished>;
validate: (migration: MigrationMetadata) => Promise<void>;
execute: (migration: MigrationMetadata) => Promise<void>;
migrations: AsyncIterable<T>;
migrationFilter?: (migration: T) => boolean;
validate: (migration: T) => Promise<void>;
execute: (migration: T) => Promise<void>;
onSuccess: (migration: SuccessfulMigrationMetadata) => Promise<void>;
onError: (migration: FailedMigrationMetadata, error: Error) => Promise<void>;
};
export const migrationRunner = async ({
export const migrationRunner = async <T extends MigrationMetadata | MigrationMetadataFinished>({
dry,
lock = true,
limit,
name,
from,
to,
abortSignal,
abortRespite,
reporter,
storage,
migrations,
validate,
execute,
}: MigrationRunnerParameters): Promise<Error | undefined> => {
await reporter.onCollectedMigrations?.(migrations);
const finishedMigrations: MigrationMetadataFinished[] = [];
const migrationsToRun: MigrationMetadata[] = [];
onSuccess,
onError,
migrationFilter = () => true,
}: MigrationRunnerParameters<T>): Promise<Error | undefined> => {
const validatedMigrations: Array<MigrationMetadata | MigrationMetadataFinished> = [];
const migrationsToLock: MigrationMetadata[] = [];
let skip = false;
abortSignal?.addEventListener(
'abort',
() => {
skip = true;
reporter.onAbort?.(toError(abortSignal.reason))?.then(
() => {
/* noop */
},
() => {
/* noop */
},
);
},
{ once: true },
);
let nameFound = false;
let fromFound = false;
let toFound = false;
for await (const migration of migrations) {
if (name && migration.relativeFilePath === name) {
nameFound = true;
}
if (from && migration.relativeFilePath === from) {
fromFound = true;
}
if (to && migration.relativeFilePath === to) {
toFound = true;
}
if (!migrationFilter(migration)) {
continue;
}
if (isFinishedMigration(migration)) {
skip ||= migration.status === 'failed' || migration.status === 'skipped';
finishedMigrations.push(migration);
} else if (skip) {
finishedMigrations.push({
validatedMigrations.push(migration);
} else if (
skip ||
Boolean(from && migration.relativeFilePath < from) ||
Boolean(to && migration.relativeFilePath > to) ||
(limit && migrationsToLock.length >= limit)
) {
validatedMigrations.push({
...migration,
status: dry ? 'pending' : 'skipped',
status: 'skipped',
});
} else {
try {
await validate(migration);
migrationsToRun.push(migration);
migrationsToLock.push(migration);
validatedMigrations.push(migration);
} catch (error) {
for await (const migration of migrationsToRun) {
finishedMigrations.push({ ...migration, status: 'skipped' });
for (const migration of migrationsToLock) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
migrationsToRun.length = 0;
migrationsToLock.length = 0;
finishedMigrations.push({
validatedMigrations.push({
...migration,
status: 'failed',
duration: 0,
@@ -70,45 +135,99 @@ export const migrationRunner = async ({
}
}
const [lockedMigrations, lockError] = dry ? [migrationsToRun] : await exec(async () => storage.lock(migrationsToRun));
await reporter.onCollectedMigrations?.(validatedMigrations);
if (lockError) {
for await (const migration of migrationsToRun) {
finishedMigrations.push({ ...migration, status: 'skipped' });
let optionError: Error | undefined;
if (name && !nameFound) {
optionError = BadOptionError.fromOption('name', `The migration: "${name}" was not found`);
} else if (from && !fromFound) {
optionError = BadOptionError.fromOption('from', `The "from" migration: "${from}" was not found`);
} else if (to && !toFound) {
optionError = BadOptionError.fromOption('to', `The "to" migration: "${to}" was not found`);
}
migrationsToRun.length = 0;
if (optionError) {
dry = true;
skip = true;
for (const migration of migrationsToLock) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
migrationsToLock.length = 0;
}
const [lockedMigrations, lockError] =
dry || !lock
? [migrationsToLock]
: await exec(async () => storage.lock(migrationsToLock), { abortSignal, abortRespite });
if (lockError) {
for (const migration of migrationsToLock) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
migrationsToLock.length = 0;
skip = true;
} else {
} else if (lock) {
for (const migration of migrationsToLock) {
const isLocked = lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name);
if (!isLocked) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
}
await reporter.onLockedMigrations?.(lockedMigrations);
}
for await (const finishedMigration of finishedMigrations) {
switch (finishedMigration.status) {
const finishedMigrations: MigrationMetadataFinished[] = [];
for await (const migration of validatedMigrations) {
if (isFinishedMigration(migration)) {
switch (migration.status) {
case 'failed': {
await reporter.onMigrationError?.(finishedMigration, finishedMigration.error);
await reporter.onMigrationError?.(migration, migration.error);
break;
}
case 'pending': {
await reporter.onMigrationSkip?.(finishedMigration);
await reporter.onMigrationSkip?.(migration);
break;
}
case 'skipped': {
await reporter.onMigrationSkip?.(finishedMigration);
await reporter.onMigrationSkip?.(migration);
break;
}
default: {
await reporter.onMigrationSuccess?.(finishedMigration);
await reporter.onMigrationSuccess?.(migration);
break;
}
}
finishedMigrations.push(migration);
continue;
}
for await (const migration of lockedMigrations ?? []) {
if (dry || skip) {
const finishedMigration: MigrationMetadataFinished = {
...migration,
@@ -123,9 +242,9 @@ export const migrationRunner = async ({
await reporter.onMigrationStart?.(migration);
const start = process.hrtime();
const start = hrtime();
const [, migrationError] = await exec(async () => execute(migration));
const [, migrationError] = await exec(async () => execute(migration as T), { abortSignal, abortRespite });
const duration = getDuration(start);
@@ -136,7 +255,7 @@ export const migrationRunner = async ({
duration,
error: migrationError,
};
await storage.onError(finishedMigration, toSerializedError(migrationError));
await onError(finishedMigration, migrationError);
await reporter.onMigrationError?.(finishedMigration, migrationError);
finishedMigrations.push(finishedMigration);
skip = true;
@@ -146,13 +265,14 @@ export const migrationRunner = async ({
status: 'done',
duration,
};
await storage.onSuccess(finishedMigration);
await onSuccess(finishedMigration);
await reporter.onMigrationSuccess?.(finishedMigration);
finishedMigrations.push(finishedMigration);
}
}
const [, unlockError] = dry ? [] : await exec(async () => storage.unlock(lockedMigrations ?? []));
const [, unlockError] =
dry || !lock ? [] : await exec(async () => storage.unlock(lockedMigrations ?? []), { abortSignal, abortRespite });
// eslint-disable-next-line unicorn/no-array-callback-reference
const firstFailed = finishedMigrations.find(isFailedMigration);
@@ -162,7 +282,12 @@ export const migrationRunner = async ({
: firstFailed
? MigrationRunError.fromMetadata(firstFailed)
: undefined;
const error = unlockError ?? firstError ?? lockError;
const error =
optionError ??
unlockError ??
firstError ??
lockError ??
(abortSignal?.aborted ? toError(abortSignal.reason) : undefined);
await reporter.onFinished?.(finishedMigrations, error);
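The skip conditions added to the runner above (a global skip flag, falling outside the `from`/`to` window, or a full lock batch) can be sketched in isolation; `shouldSkip` and its options shape are illustrative, not part of the actual runner API:

```typescript
type Candidate = { relativeFilePath: string };

type SkipOptions = {
  skip?: boolean;
  from?: string;
  to?: string;
  limit?: number;
  lockedCount: number;
};

// A migration is skipped when skipping is already in effect, when its path
// falls outside the [from, to] range (lexicographic comparison, matching the
// sorted file names), or when the batch has reached the configured limit.
const shouldSkip = (migration: Candidate, options: SkipOptions): boolean =>
  Boolean(options.skip) ||
  Boolean(options.from && migration.relativeFilePath < options.from) ||
  Boolean(options.to && migration.relativeFilePath > options.to) ||
  Boolean(options.limit && options.lockedCount >= options.limit);
```

Because the comparison is lexicographic, this only behaves predictably with the sorted, timestamp-prefixed file names the runner already guarantees.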


@@ -17,7 +17,7 @@ const promisifyIfNeeded = <T extends Function>(fn: T) => {
};
const loaderJs: LoaderPlugin = {
loadableExtensions: ['.js', '.cjs', '.mjs'],
loadableExtensions: ['.js', '.cjs', '.mjs', '.ts', '.cts', '.mts'],
async loadMigration(migration) {
const migrationModule: unknown = await import(migration.filePath);


@@ -1,4 +1,5 @@
import { black, blueBright, bold, cyan, dim, faint, gray, green, red, redBright, yellow } from 'ansis';
import { setInterval } from 'node:timers';
import { black, blueBright, bold, cyan, dim, faint, gray, green, red, redBright, yellow, yellowBright } from 'ansis';
import logUpdate from 'log-update';
import elegantSpinner from 'elegant-spinner';
import figures from 'figures';
@@ -13,6 +14,7 @@ import {
} from '@emigrate/types';
type Status = ReturnType<typeof getMigrationStatus>;
type Command = ReporterInitParameters['command'];
const interactive = isInteractive();
const spinner = interactive ? elegantSpinner() : () => figures.pointerSmall;
@@ -20,21 +22,26 @@ const spinner = interactive ? elegantSpinner() : () => figures.pointerSmall;
const formatDuration = (duration: number): string => {
const pretty = prettyMs(duration);
return yellow(pretty.replaceAll(/([^\s\d]+)/g, dim('$1')));
return yellow(pretty.replaceAll(/([^\s\d.]+)/g, dim('$1')));
};
const getTitle = ({ command, version, dry, cwd }: ReporterInitParameters) => {
return `${black.bgBlueBright(' Emigrate ').trim()} ${blueBright.bold(command)} ${blueBright(`v${version}`)} ${gray(
cwd,
)}${dry ? yellow` (dry run)` : ''}`;
return `${black.bgBlueBright` Emigrate `.trim()} ${blueBright.bold(command)} ${blueBright`v${version}`} ${gray(cwd)}${
dry ? yellow` (dry run)` : ''
}`;
};
const getMigrationStatus = (
command: Command,
migration: MigrationMetadata | MigrationMetadataFinished,
activeMigration?: MigrationMetadata,
) => {
if ('status' in migration) {
return migration.status;
return command === 'remove' && migration.status === 'done' ? 'removed' : migration.status;
}
if (command === 'remove' && migration.name === activeMigration?.name) {
return 'removing';
}
return migration.name === activeMigration?.name ? 'running' : 'pending';
@@ -42,6 +49,10 @@ const getMigrationStatus = (
const getIcon = (status: Status) => {
switch (status) {
case 'removing': {
return cyan(spinner());
}
case 'running': {
return cyan(spinner());
}
@@ -50,6 +61,10 @@ const getIcon = (status: Status) => {
return gray(figures.pointerSmall);
}
case 'removed': {
return green(figures.tick);
}
case 'done': {
return green(figures.tick);
}
@@ -89,20 +104,19 @@ const getName = (name: string, status?: Status) => {
};
const getMigrationText = (
command: Command,
migration: MigrationMetadata | MigrationMetadataFinished,
activeMigration?: MigrationMetadata,
) => {
const pathWithoutName = migration.relativeFilePath.slice(0, -migration.name.length);
const nameWithoutExtension = migration.name.slice(0, -migration.extension.length);
const status = getMigrationStatus(migration, activeMigration);
const status = getMigrationStatus(command, migration, activeMigration);
const parts = [' ', getIcon(status)];
parts.push(`${dim(pathWithoutName)}${getName(nameWithoutExtension, status)}${dim(migration.extension)}`);
if ('status' in migration) {
parts.push(gray(`(${migration.status})`));
} else if (migration.name === activeMigration?.name) {
parts.push(gray`(running)`);
if ('status' in migration || migration.name === activeMigration?.name) {
parts.push(gray`(${status})`);
}
if ('duration' in migration && migration.duration) {
@@ -165,6 +179,20 @@ const getError = (error?: ErrorLike, indent = ' ') => {
return parts.join('\n');
};
const getAbortMessage = (reason?: Error) => {
if (!reason) {
return '';
}
const parts = [` ${red.bold(reason.message)}`];
if (isErrorLike(reason.cause)) {
parts.push(getError(reason.cause, ' '));
}
return parts.join('\n');
};
const getSummary = (
command: ReporterInitParameters['command'],
migrations: Array<MigrationMetadata | MigrationMetadataFinished> = [],
@@ -232,26 +260,39 @@ const getHeaderMessage = (
}
if (migrations.length === 0) {
return ' No pending migrations found';
return ' No migrations found';
}
const statusText = command === 'list' ? 'migrations are pending' : 'pending migrations to run';
if (migrations.length === lockedMigrations.length) {
return ` ${bold(migrations.length.toString())} ${dim('pending migrations to run')}`;
return ` ${bold(migrations.length.toString())} ${dim(statusText)}`;
}
const nonLockedMigrations = migrations.filter(
(migration) => !lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name),
);
const failedMigrations = nonLockedMigrations.filter(
(migration) => 'status' in migration && migration.status === 'failed',
);
const unlockableCount = command === 'up' ? nonLockedMigrations.length - failedMigrations.length : 0;
let skippedCount = 0;
let failedCount = 0;
for (const migration of migrations) {
const isLocked = lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name);
if (isLocked) {
continue;
}
if ('status' in migration) {
if (migration.status === 'failed') {
failedCount += 1;
} else if (migration.status === 'skipped') {
skippedCount += 1;
}
}
}
const parts = [
bold(`${lockedMigrations.length} of ${migrations.length}`),
dim`pending migrations to run`,
unlockableCount > 0 ? yellow(`(${unlockableCount} locked)`) : '',
failedMigrations.length > 0 ? redBright(`(${failedMigrations.length} failed)`) : '',
dim(statusText),
skippedCount > 0 ? yellowBright(`(${skippedCount} skipped)`) : '',
failedCount > 0 ? redBright(`(${failedCount} failed)`) : '',
].filter(Boolean);
return ` ${parts.join(' ')}`;
@@ -264,6 +305,7 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
#error: Error | undefined;
#parameters!: ReporterInitParameters;
#interval: NodeJS.Timeout | undefined;
#abortReason: Error | undefined;
onInit(parameters: ReporterInitParameters): void | PromiseLike<void> {
this.#parameters = parameters;
@@ -271,6 +313,10 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
this.#start();
}
onAbort(reason: Error): void | PromiseLike<void> {
this.#abortReason = reason;
}
onCollectedMigrations(migrations: MigrationMetadata[]): void | PromiseLike<void> {
this.#migrations = migrations;
}
@@ -283,19 +329,6 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
this.#migrations = [migration];
}
onMigrationRemoveStart(migration: MigrationMetadata): Awaitable<void> {
this.#migrations = [migration];
this.#activeMigration = migration;
}
onMigrationRemoveSuccess(migration: MigrationMetadataFinished): Awaitable<void> {
this.#finishMigration(migration);
}
onMigrationRemoveError(migration: MigrationMetadataFinished, _error: Error): Awaitable<void> {
this.#finishMigration(migration);
}
onMigrationStart(migration: MigrationMetadata): void | PromiseLike<void> {
this.#activeMigration = migration;
}
@@ -340,7 +373,10 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
const parts = [
getTitle(this.#parameters),
getHeaderMessage(this.#parameters.command, this.#migrations, this.#lockedMigrations),
this.#migrations?.map((migration) => getMigrationText(migration, this.#activeMigration)).join('\n') ?? '',
this.#migrations
?.map((migration) => getMigrationText(this.#parameters.command, migration, this.#activeMigration))
.join('\n') ?? '',
getAbortMessage(this.#abortReason),
getSummary(this.#parameters.command, this.#migrations),
getError(this.#error),
];
@@ -386,6 +422,12 @@ class DefaultReporter implements Required<EmigrateReporter> {
console.log('');
}
onAbort(reason: Error): void | PromiseLike<void> {
console.log('');
console.error(getAbortMessage(reason));
console.log('');
}
onCollectedMigrations(migrations: MigrationMetadata[]): void | PromiseLike<void> {
this.#migrations = migrations;
}
@@ -398,35 +440,23 @@ class DefaultReporter implements Required<EmigrateReporter> {
}
onNewMigration(migration: MigrationMetadata, _content: string): Awaitable<void> {
console.log(getMigrationText(migration));
}
onMigrationRemoveStart(migration: MigrationMetadata): Awaitable<void> {
console.log(getMigrationText(migration));
}
onMigrationRemoveSuccess(migration: MigrationMetadataFinished): Awaitable<void> {
console.log(getMigrationText(migration));
}
onMigrationRemoveError(migration: MigrationMetadataFinished, _error: Error): Awaitable<void> {
console.error(getMigrationText(migration));
console.log(getMigrationText(this.#parameters.command, migration));
}
onMigrationStart(migration: MigrationMetadata): void | PromiseLike<void> {
console.log(getMigrationText(migration, migration));
console.log(getMigrationText(this.#parameters.command, migration, migration));
}
onMigrationSuccess(migration: MigrationMetadataFinished): void | PromiseLike<void> {
console.log(getMigrationText(migration));
console.log(getMigrationText(this.#parameters.command, migration));
}
onMigrationError(migration: MigrationMetadataFinished, _error: Error): void | PromiseLike<void> {
console.error(getMigrationText(migration));
console.error(getMigrationText(this.#parameters.command, migration));
}
onMigrationSkip(migration: MigrationMetadataFinished): void | PromiseLike<void> {
console.log(getMigrationText(migration));
console.log(getMigrationText(this.#parameters.command, migration));
}
onFinished(migrations: MigrationMetadataFinished[], error?: Error | undefined): void | PromiseLike<void> {
@@ -441,6 +471,6 @@ class DefaultReporter implements Required<EmigrateReporter> {
}
}
const reporterDefault = interactive ? new DefaultFancyReporter() : new DefaultReporter();
const reporterDefault: EmigrateReporter = interactive ? new DefaultFancyReporter() : new DefaultReporter();
export default reporterDefault;


@@ -0,0 +1,15 @@
import type { EmigrateReporter } from '@emigrate/types';
import { type Config } from '../types.js';
import * as reporters from './index.js';
export const getStandardReporter = (reporter?: Config['reporter']): EmigrateReporter | undefined => {
if (!reporter) {
return reporters.pretty;
}
if (typeof reporter === 'string' && reporter in reporters) {
return reporters[reporter as keyof typeof reporters];
}
return undefined;
};


@@ -0,0 +1,2 @@
export { default as pretty } from './default.js';
export { default as json } from './json.js';


@@ -0,0 +1,60 @@
import { type ReporterInitParameters, type EmigrateReporter, type MigrationMetadataFinished } from '@emigrate/types';
import { toSerializedError } from '../errors.js';
class JsonReporter implements EmigrateReporter {
#parameters!: ReporterInitParameters;
#startTime!: number;
onInit(parameters: ReporterInitParameters): void {
this.#startTime = Date.now();
this.#parameters = parameters;
}
onFinished(migrations: MigrationMetadataFinished[], error?: Error | undefined): void {
const { command, version } = this.#parameters;
let numberDoneMigrations = 0;
let numberSkippedMigrations = 0;
let numberFailedMigrations = 0;
let numberPendingMigrations = 0;
for (const migration of migrations) {
// eslint-disable-next-line unicorn/prefer-switch
if (migration.status === 'done') {
numberDoneMigrations++;
} else if (migration.status === 'skipped') {
numberSkippedMigrations++;
} else if (migration.status === 'failed') {
numberFailedMigrations++;
} else {
numberPendingMigrations++;
}
}
const result = {
command,
version,
numberTotalMigrations: migrations.length,
numberDoneMigrations,
numberSkippedMigrations,
numberFailedMigrations,
numberPendingMigrations,
success: !error,
startTime: this.#startTime,
endTime: Date.now(),
error: error ? toSerializedError(error) : undefined,
migrations: migrations.map((migration) => ({
name: migration.filePath,
status: migration.status,
duration: 'duration' in migration ? migration.duration : 0,
error: 'error' in migration ? toSerializedError(migration.error) : undefined,
})),
};
console.log(JSON.stringify(result, undefined, 2));
}
}
const jsonReporter: EmigrateReporter = new JsonReporter();
export default jsonReporter;


@@ -0,0 +1,134 @@
import { mock, type Mock } from 'node:test';
import path from 'node:path';
import assert from 'node:assert';
import {
type SerializedError,
type EmigrateReporter,
type FailedMigrationHistoryEntry,
type MigrationHistoryEntry,
type MigrationMetadata,
type NonFailedMigrationHistoryEntry,
type Storage,
} from '@emigrate/types';
import { toSerializedError } from './errors.js';
export type Mocked<T> = {
// @ts-expect-error - This is a mock
[K in keyof T]: Mock<T[K]>;
};
export async function noop(): Promise<void> {
// noop
}
export function getErrorCause(error: Error | undefined): Error | SerializedError | undefined {
if (error?.cause instanceof Error) {
return error.cause;
}
if (typeof error?.cause === 'object' && error.cause !== null) {
return error.cause as unknown as SerializedError;
}
return undefined;
}
export function getMockedStorage(historyEntries: Array<string | MigrationHistoryEntry>): Mocked<Storage> {
return {
lock: mock.fn(async (migrations) => migrations),
unlock: mock.fn(async () => {
// void
}),
getHistory: mock.fn(async function* () {
yield* toEntries(historyEntries);
}),
remove: mock.fn(),
onSuccess: mock.fn(),
onError: mock.fn(),
end: mock.fn(),
};
}
export function getMockedReporter(): Mocked<Required<EmigrateReporter>> {
return {
onFinished: mock.fn(noop),
onInit: mock.fn(noop),
onAbort: mock.fn(noop),
onCollectedMigrations: mock.fn(noop),
onLockedMigrations: mock.fn(noop),
onNewMigration: mock.fn(noop),
onMigrationStart: mock.fn(noop),
onMigrationSuccess: mock.fn(noop),
onMigrationError: mock.fn(noop),
onMigrationSkip: mock.fn(noop),
};
}
export function toMigration(cwd: string, directory: string, name: string): MigrationMetadata {
return {
name,
filePath: `${cwd}/${directory}/${name}`,
relativeFilePath: `${directory}/${name}`,
extension: path.extname(name),
directory,
cwd,
};
}
export function toMigrations(cwd: string, directory: string, names: string[]): MigrationMetadata[] {
return names.map((name) => toMigration(cwd, directory, name));
}
export function toEntry(name: MigrationHistoryEntry): MigrationHistoryEntry;
export function toEntry<S extends MigrationHistoryEntry['status']>(
name: string,
status?: S,
): S extends 'failed' ? FailedMigrationHistoryEntry : NonFailedMigrationHistoryEntry;
export function toEntry(name: string | MigrationHistoryEntry, status?: 'done' | 'failed'): MigrationHistoryEntry {
if (typeof name !== 'string') {
return name.status === 'failed' ? name : name;
}
if (status === 'failed') {
return {
name,
status,
date: new Date(),
error: { name: 'Error', message: 'Failed' },
};
}
return {
name,
status: status ?? 'done',
date: new Date(),
};
}
export function toEntries(
names: Array<string | MigrationHistoryEntry>,
status?: MigrationHistoryEntry['status'],
): MigrationHistoryEntry[] {
return names.map((name) => (typeof name === 'string' ? toEntry(name, status) : name));
}
export function assertErrorEqualEnough(actual?: Error | SerializedError, expected?: Error, message?: string): void {
if (expected === undefined) {
assert.strictEqual(actual, undefined);
return;
}
const {
cause: actualCause,
stack: actualStack,
...actualError
} = actual instanceof Error ? toSerializedError(actual) : actual ?? {};
const { cause: expectedCause, stack: expectedStack, ...expectedError } = toSerializedError(expected);
// @ts-expect-error Ignore
const { stack: actualCauseStack, ...actualCauseRest } = actualCause ?? {};
// @ts-expect-error Ignore
const { stack: expectedCauseStack, ...expectedCauseRest } = expectedCause ?? {};
assert.deepStrictEqual(actualError, expectedError, message);
assert.deepStrictEqual(actualCauseRest, expectedCauseRest, message ? `${message} (cause)` : undefined);
}


@@ -1,4 +1,7 @@
import { type EmigrateStorage, type Awaitable, type Plugin, type EmigrateReporter } from '@emigrate/types';
import type * as reporters from './reporters/index.js';
export type StandardReporter = keyof typeof reporters;
export type EmigratePlugin = Plugin;
@@ -6,11 +9,13 @@ type StringOrModule<T> = string | T | (() => Awaitable<T>) | (() => Awaitable<{
export type Config = {
storage?: StringOrModule<EmigrateStorage>;
reporter?: StringOrModule<EmigrateReporter>;
reporter?: StandardReporter | StringOrModule<EmigrateReporter>;
plugins?: Array<StringOrModule<EmigratePlugin>>;
directory?: string;
template?: string;
extension?: string;
color?: boolean;
abortRespite?: number;
};
export type EmigrateConfig = Config & {


@@ -1 +1 @@
export const withLeadingPeriod = (string: string) => (string.startsWith('.') ? string : `.${string}`);
export const withLeadingPeriod = (string: string): string => (string.startsWith('.') ? string : `.${string}`);
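For reference, the now-annotated helper accepts both bare and dotted extensions (the one-liner is reproduced below so the example is self-contained):

```typescript
// Normalizes file extensions so "ts" and ".ts" are treated the same.
const withLeadingPeriod = (string: string): string =>
  string.startsWith(".") ? string : `.${string}`;
```

Both `withLeadingPeriod("ts")` and `withLeadingPeriod(".ts")` yield `".ts"`.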


@@ -1,8 +1,3 @@
{
"extends": "@emigrate/tsconfig/build.json",
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
"extends": "@emigrate/tsconfig/build.json"
}


@@ -1,5 +1,103 @@
# @emigrate/mysql
## 0.3.3
### Patch Changes
- 26240f4: Make sure we can initialize multiple running instances of Emigrate using @emigrate/mysql concurrently without issues creating the history table (for instance in a Kubernetes environment and/or with a Percona cluster).
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- 26240f4: Either lock all or none of the migrations to run, to make sure they run in order when multiple instances of Emigrate run concurrently (for instance in a Kubernetes environment)
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
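The all-or-none locking described in this entry reduces to a simple rule; `lockAllOrNone` below is an illustrative sketch of that rule, not the actual implementation (which enforces it inside a single SQL transaction against the history table):

```typescript
type Migration = { name: string };

// If any requested migration is already locked by another Emigrate instance,
// lock nothing, so two concurrent runs can never interleave out of order;
// otherwise lock the entire batch.
const lockAllOrNone = (
  requested: Migration[],
  alreadyLocked: ReadonlySet<string>,
): Migration[] =>
  requested.some((migration) => alreadyLocked.has(migration.name)) ? [] : requested;
```

This is exactly the behavior the new integration test below asserts: one process ends up with all the locks and the other with none.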
## 0.3.2
### Patch Changes
- 57498db: Unreference all connections when run using Bun, to not keep the process open unnecessarily long
## 0.3.1
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.3.0
### Minor Changes
- 4442604: Automatically create the database if it doesn't exist, provided the user has permission to do so
### Patch Changes
- aef2d7c: Avoid "CREATE TABLE IF NOT EXISTS" as it locks too aggressively in a clustered database when run concurrently
## 0.2.8
### Patch Changes
- 17feb2d: Only unreference connections in a Bun environment, as doing so in Node crashes the process for some reason, without even throwing an error
## 0.2.7
### Patch Changes
- 198aa54: Unreference all connections automatically so that they don't hinder the process from exiting. This is especially needed in Bun environments as it seems to handle sockets differently regarding this matter than NodeJS.
## 0.2.6
### Patch Changes
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/plugin-tools@0.9.6
- @emigrate/types@0.12.1
## 0.2.5
### Patch Changes
- f8a5cc7: Make sure the storage initialization crashes when a database connection can't be established
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
- @emigrate/plugin-tools@0.9.5
## 0.2.4
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
- @emigrate/plugin-tools@0.9.4
## 0.2.3
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
- @emigrate/plugin-tools@0.9.3
## 0.2.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
- @emigrate/plugin-tools@0.9.2
## 0.2.1
### Patch Changes
- 3a8b06b: Don't use the `bun` key in `exports` as that would mean we have to include both built files and source files in each package, which is a bit wasteful. Maybe reconsider in the future if we can package only source files.
- Updated dependencies [3a8b06b]
- @emigrate/plugin-tools@0.9.1
## 0.2.0
### Minor Changes


@@ -17,7 +17,13 @@ This plugin is actually three different Emigrate plugins in one:
Install the plugin in your project, alongside the Emigrate CLI:
```bash
npm install --save-dev @emigrate/cli @emigrate/mysql
npm install @emigrate/cli @emigrate/mysql
# or
pnpm add @emigrate/cli @emigrate/mysql
# or
yarn add @emigrate/cli @emigrate/mysql
# or
bun add @emigrate/cli @emigrate/mysql
```
## Usage


@@ -1,8 +1,9 @@
{
"name": "@emigrate/mysql",
"version": "0.2.0",
"version": "0.3.3",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "A MySQL plugin for Emigrate. Uses a MySQL database for storing migration history. Can load and generate .sql migration files.",
"main": "dist/index.js",
@@ -10,18 +11,22 @@
"type": "module",
"exports": {
".": {
"bun": "./src/index.ts",
"import": "./dist/index.js",
"types": "./dist/index.d.ts"
}
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo",
"!dist/**/*.test.js",
"!dist/tests/*"
],
"scripts": {
"build": "tsc --pretty",
"build:watch": "tsc --pretty --watch",
"lint": "xo --cwd=../.. $(pwd)"
"lint": "xo --cwd=../.. $(pwd)",
"integration": "glob -c \"node --import tsx --test-reporter spec --test\" \"./src/**/*.integration.ts\"",
"integration:watch": "glob -c \"node --watch --import tsx --test-reporter spec --test\" \"./src/**/*.integration.ts\""
},
"keywords": [
"emigrate",
@@ -43,7 +48,9 @@
"mysql2": "3.6.5"
},
"devDependencies": {
"@emigrate/tsconfig": "workspace:*"
"@emigrate/tsconfig": "workspace:*",
"@types/bun": "1.1.2",
"bun-types": "1.1.8"
},
"volta": {
"extends": "../../package.json"


@@ -0,0 +1,103 @@
import assert from 'node:assert';
import path from 'node:path';
import { before, after, describe, it } from 'node:test';
import type { MigrationMetadata } from '@emigrate/types';
import { startDatabase, stopDatabase } from './tests/database.js';
import { createMysqlStorage } from './index.js';
let db: { port: number; host: string };
const toEnd = new Set<{ end: () => Promise<void> }>();
describe('emigrate-mysql', async () => {
before(
async () => {
db = await startDatabase();
},
{ timeout: 60_000 },
);
after(
async () => {
for (const storage of toEnd) {
// eslint-disable-next-line no-await-in-loop
await storage.end();
}
toEnd.clear();
await stopDatabase();
},
{ timeout: 10_000 },
);
describe('migration locks', async () => {
it('either locks none or all of the given migrations', async () => {
const { initializeStorage } = createMysqlStorage({
table: 'migrations',
connection: {
host: db.host,
user: 'emigrate',
password: 'emigrate',
database: 'emigrate',
port: db.port,
},
});
const [storage1, storage2] = await Promise.all([initializeStorage(), initializeStorage()]);
toEnd.add(storage1);
toEnd.add(storage2);
const migrations = toMigrations('/emigrate', 'migrations', [
'2023-10-01-01-test.js',
'2023-10-01-02-test.js',
'2023-10-01-03-test.js',
'2023-10-01-04-test.js',
'2023-10-01-05-test.js',
'2023-10-01-06-test.js',
'2023-10-01-07-test.js',
'2023-10-01-08-test.js',
'2023-10-01-09-test.js',
'2023-10-01-10-test.js',
'2023-10-01-11-test.js',
'2023-10-01-12-test.js',
'2023-10-01-13-test.js',
'2023-10-01-14-test.js',
'2023-10-01-15-test.js',
'2023-10-01-16-test.js',
'2023-10-01-17-test.js',
'2023-10-01-18-test.js',
'2023-10-01-19-test.js',
'2023-10-01-20-test.js',
]);
const [locked1, locked2] = await Promise.all([storage1.lock(migrations), storage2.lock(migrations)]);
assert.strictEqual(
locked1.length === 0 || locked2.length === 0,
true,
'One of the processes should have no locks',
);
assert.strictEqual(
locked1.length === 20 || locked2.length === 20,
true,
'One of the processes should have all locks',
);
});
});
});
function toMigration(cwd: string, directory: string, name: string): MigrationMetadata {
return {
name,
filePath: `${cwd}/${directory}/${name}`,
relativeFilePath: `${directory}/${name}`,
extension: path.extname(name),
directory,
cwd,
};
}
function toMigrations(cwd: string, directory: string, names: string[]): MigrationMetadata[] {
return names.map((name) => toMigration(cwd, directory, name));
}
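The invariant the test above asserts (concurrent lockers get either all migrations or none) can be sketched outside MySQL. This is a hypothetical in-memory model of the contract only, not the actual storage implementation; the `Migration`, `claimed`, and `lock` names are illustrative:

```typescript
// Hypothetical in-memory model of the all-or-none lock contract
// (not the MySQL implementation): a locker only ever wins the whole batch.
type Migration = { name: string };

const claimed = new Set<string>();

function lock(migrations: Migration[]): Migration[] {
  // In a single-threaded runtime this check-then-claim is atomic:
  // if any name is already claimed, take nothing.
  if (migrations.some((m) => claimed.has(m.name))) {
    return [];
  }
  for (const m of migrations) {
    claimed.add(m.name);
  }
  return migrations;
}

const batch: Migration[] = [{ name: '2023-10-01-01-test.js' }, { name: '2023-10-01-02-test.js' }];
const first = lock(batch);
const second = lock(batch);
console.log(first.length, second.length); // 2 0
```

Under this model the two assertions in the test hold by construction: whichever call runs first claims everything, so the other claims nothing.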


@@ -1,5 +1,6 @@
import process from 'node:process';
import fs from 'node:fs/promises';
import { setTimeout } from 'node:timers/promises';
import {
createConnection,
createPool,
@@ -9,10 +10,13 @@ import {
type Pool,
type ResultSetHeader,
type RowDataPacket,
type Connection,
} from 'mysql2/promise';
import { getTimestampPrefix, sanitizeMigrationName } from '@emigrate/plugin-tools';
import {
type Awaitable,
type MigrationMetadata,
type MigrationFunction,
type EmigrateStorage,
type LoaderPlugin,
type Storage,
@@ -40,27 +44,39 @@ export type MysqlLoaderOptions = {
connection: ConnectionOptions | string;
};
const getConnection = async (connection: ConnectionOptions | string) => {
if (typeof connection === 'string') {
const uri = new URL(connection);
const getConnection = async (options: ConnectionOptions | string) => {
let connection: Connection;
if (typeof options === 'string') {
const uri = new URL(options);
// client side connectTimeout is unstable in mysql2 library
// it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled)
uri.searchParams.set('connectTimeout', '0');
uri.searchParams.set('multipleStatements', 'true');
uri.searchParams.set('flags', '-FOUND_ROWS');
return createConnection(uri.toString());
}
return createConnection({
...connection,
connection = await createConnection(uri.toString());
} else {
connection = await createConnection({
...options,
// client side connectTimeout is unstable in mysql2 library
// it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled)
connectTimeout: 0,
multipleStatements: true,
flags: ['-FOUND_ROWS'],
});
}
if (process.isBun) {
// @ts-expect-error the connection is not in the types but it's there
// eslint-disable-next-line @typescript-eslint/no-unsafe-call
connection.connection.stream.unref();
}
return connection;
};
const getPool = (connection: PoolOptions | string) => {
@@ -71,6 +87,7 @@ const getPool = (connection: PoolOptions | string) => {
// it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled)
uri.searchParams.set('connectTimeout', '0');
uri.searchParams.set('flags', '-FOUND_ROWS');
return createPool(uri.toString());
}
@@ -81,6 +98,7 @@ const getPool = (connection: PoolOptions | string) => {
// it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled)
connectTimeout: 0,
flags: ['-FOUND_ROWS'],
});
};
@@ -91,8 +109,8 @@ type HistoryEntry = {
error?: SerializedError;
};
const lockMigration = async (pool: Pool, table: string, migration: MigrationMetadata) => {
const [result] = await pool.execute<ResultSetHeader>({
const lockMigration = async (connection: Connection, table: string, migration: MigrationMetadata) => {
const [result] = await connection.execute<ResultSetHeader>({
sql: `
INSERT INTO ${escapeId(table)} (name, status, date)
VALUES (?, ?, NOW())
@@ -155,40 +173,186 @@ const deleteMigration = async (pool: Pool, table: string, migration: MigrationMe
return result.affectedRows === 1;
};
const initializeTable = async (pool: Pool, table: string) => {
const getDatabaseName = (config: ConnectionOptions | string) => {
if (typeof config === 'string') {
const uri = new URL(config);
return uri.pathname.replace(/^\//u, '');
}
return config.database ?? '';
};
const setDatabaseName = <T extends ConnectionOptions | string>(config: T, databaseName: string): T => {
if (typeof config === 'string') {
const uri = new URL(config);
uri.pathname = `/${databaseName}`;
return uri.toString() as T;
}
if (typeof config === 'object') {
return {
...config,
database: databaseName,
};
}
throw new Error('Invalid connection config');
};
const initializeDatabase = async (config: ConnectionOptions | string) => {
let connection: Connection | undefined;
try {
connection = await getConnection(config);
await connection.query('SELECT 1');
await connection.end();
} catch (error) {
await connection?.end();
// The ER_BAD_DB_ERROR error code is thrown when the database does not exist but the user might have the permissions to create it
// Otherwise the error code is ER_DBACCESS_DENIED_ERROR
if (error && typeof error === 'object' && 'code' in error && error.code === 'ER_BAD_DB_ERROR') {
const databaseName = getDatabaseName(config);
const informationSchemaConfig = setDatabaseName(config, 'information_schema');
const informationSchemaConnection = await getConnection(informationSchemaConfig);
try {
await informationSchemaConnection.query(`CREATE DATABASE ${escapeId(databaseName)}`);
// Any database creation error here will be propagated
} finally {
await informationSchemaConnection.end();
}
} else {
// In this case we don't know how to handle the error, so we rethrow it
throw error;
}
}
};
const lockWaitTimeout = 10; // seconds
const isHistoryTableExisting = async (connection: Connection, table: string) => {
const [result] = await connection.execute<RowDataPacket[]>({
sql: `
SELECT
1 as table_exists
FROM
information_schema.tables
WHERE
table_schema = DATABASE()
AND table_name = ?
`,
values: [table],
});
return result[0]?.['table_exists'] === 1;
};
const initializeTable = async (config: ConnectionOptions | string, table: string) => {
const connection = await getConnection(config);
if (await isHistoryTableExisting(connection, table)) {
await connection.end();
return;
}
const lockName = `emigrate_init_table_lock_${table}`;
const [lockResult] = await connection.query<RowDataPacket[]>(`SELECT GET_LOCK(?, ?) AS got_lock`, [
lockName,
lockWaitTimeout,
]);
const didGetLock = lockResult[0]?.['got_lock'] === 1;
if (didGetLock) {
try {
// This table definition is compatible with the one used by the immigration-mysql package
await pool.execute(`
await connection.execute(`
CREATE TABLE IF NOT EXISTS ${escapeId(table)} (
name varchar(255) not null primary key,
status varchar(32),
date datetime not null
) Engine=InnoDB;
`);
} finally {
await connection.query(`SELECT RELEASE_LOCK(?)`, [lockName]);
await connection.end();
}
return;
}
// Didn't get the lock, wait to see if the table was created by another process
const maxWait = lockWaitTimeout * 1000; // milliseconds
const checkInterval = 250; // milliseconds
const start = Date.now();
try {
while (Date.now() - start < maxWait) {
// eslint-disable-next-line no-await-in-loop
if (await isHistoryTableExisting(connection, table)) {
return;
}
// eslint-disable-next-line no-await-in-loop
await setTimeout(checkInterval);
}
throw new Error(`Timeout waiting for table ${table} to be created by another process`);
} finally {
await connection.end();
}
};
export const createMysqlStorage = ({ table = defaultTable, connection }: MysqlStorageOptions): EmigrateStorage => {
return {
async initializeStorage() {
await initializeDatabase(connection);
await initializeTable(connection, table);
const pool = getPool(connection);
try {
await initializeTable(pool, table);
} catch (error) {
await pool.end();
throw error;
if (process.isBun) {
pool.on('connection', (connection) => {
// @ts-expect-error stream is not in the types but it's there
// eslint-disable-next-line @typescript-eslint/no-unsafe-call
connection.stream.unref();
});
}
const storage: Storage = {
async lock(migrations) {
const connection = await pool.getConnection();
try {
await connection.beginTransaction();
const lockedMigrations: MigrationMetadata[] = [];
for await (const migration of migrations) {
if (await lockMigration(pool, table, migration)) {
if (await lockMigration(connection, table, migration)) {
lockedMigrations.push(migration);
}
}
if (lockedMigrations.length === migrations.length) {
await connection.commit();
return lockedMigrations;
}
await connection.rollback();
return [];
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
},
async unlock(migrations) {
for await (const migration of migrations) {
@@ -247,17 +411,6 @@ export const createMysqlStorage = ({ table = defaultTable, connection }: MysqlSt
};
};
export const { initializeStorage } = createMysqlStorage({
table: process.env['MYSQL_TABLE'],
connection: process.env['MYSQL_URL'] ?? {
host: process.env['MYSQL_HOST'],
port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : undefined,
user: process.env['MYSQL_USER'],
password: process.env['MYSQL_PASSWORD'],
database: process.env['MYSQL_DATABASE'],
},
});
export const createMysqlLoader = ({ connection }: MysqlLoaderOptions): LoaderPlugin => {
return {
loadableExtensions: ['.sql'],
@@ -276,7 +429,16 @@ export const createMysqlLoader = ({ connection }: MysqlLoaderOptions): LoaderPlu
};
};
export const { loadableExtensions, loadMigration } = createMysqlLoader({
export const generateMigration: GenerateMigrationFunction = async (name) => {
return {
filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`,
content: `-- Migration: ${name}
`,
};
};
const storage = createMysqlStorage({
table: process.env['MYSQL_TABLE'],
connection: process.env['MYSQL_URL'] ?? {
host: process.env['MYSQL_HOST'],
port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : undefined,
@@ -286,13 +448,22 @@ export const { loadableExtensions, loadMigration } = createMysqlLoader({
},
});
export const generateMigration: GenerateMigrationFunction = async (name) => {
return {
filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`,
content: `-- Migration: ${name}
`,
};
};
const loader = createMysqlLoader({
connection: process.env['MYSQL_URL'] ?? {
host: process.env['MYSQL_HOST'],
port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : undefined,
user: process.env['MYSQL_USER'],
password: process.env['MYSQL_PASSWORD'],
database: process.env['MYSQL_DATABASE'],
},
});
// eslint-disable-next-line prefer-destructuring
export const initializeStorage: () => Promise<Storage> = storage.initializeStorage;
// eslint-disable-next-line prefer-destructuring
export const loadableExtensions: string[] = loader.loadableExtensions;
// eslint-disable-next-line prefer-destructuring
export const loadMigration: (migration: MigrationMetadata) => Awaitable<MigrationFunction> = loader.loadMigration;
const defaultExport: EmigrateStorage & LoaderPlugin & GeneratorPlugin = {
initializeStorage,
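The table-initialization fallback in the hunk above (when `GET_LOCK` is not won, poll until the table appears or a deadline passes) reduces to a generic wait loop. A minimal standalone sketch with hypothetical names, not code from the package:

```typescript
import { setTimeout as sleep } from 'node:timers/promises';

// Generic poll-until-true helper mirroring the fallback above:
// re-check the condition every `intervalMs` until `maxWaitMs` elapses.
async function waitFor(check: () => boolean, maxWaitMs: number, intervalMs: number): Promise<boolean> {
  const start = Date.now();
  while (Date.now() - start < maxWaitMs) {
    if (check()) {
      return true;
    }
    await sleep(intervalMs);
  }
  return false; // deadline passed; the caller decides whether to throw
}

let checks = 0;
const ok = await waitFor(() => ++checks >= 3, 1_000, 10);
console.log(ok); // true
```

In the real code the condition is `isHistoryTableExisting(connection, table)` and a `false` result becomes the "Timeout waiting for table" error.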


@@ -0,0 +1,49 @@
/* eslint @typescript-eslint/naming-convention:0, import/no-extraneous-dependencies: 0 */
import process from 'node:process';
import { GenericContainer, type StartedTestContainer } from 'testcontainers';
let container: StartedTestContainer | undefined;
export const startDatabase = async (): Promise<{ port: number; host: string }> => {
if (process.env['CI']) {
const config = {
port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : 3306,
// eslint-disable-next-line @typescript-eslint/prefer-nullish-coalescing
host: process.env['MYSQL_HOST'] || 'localhost',
};
console.log(`Connecting to MySQL from environment variables: ${JSON.stringify(config)}`);
return config;
}
if (!container) {
console.log('Starting MySQL container...');
const containerSetup = new GenericContainer('mysql:8.2')
.withEnvironment({
MYSQL_ROOT_PASSWORD: 'admin',
MYSQL_USER: 'emigrate',
MYSQL_PASSWORD: 'emigrate',
MYSQL_DATABASE: 'emigrate',
})
.withTmpFs({ '/var/lib/mysql': 'rw' })
.withCommand(['--sql-mode=NO_ENGINE_SUBSTITUTION', '--default-authentication-plugin=mysql_native_password'])
.withExposedPorts(3306)
.withReuse();
container = await containerSetup.start();
console.log('MySQL container started');
}
return { port: container.getMappedPort(3306), host: container.getHost() };
};
export const stopDatabase = async (): Promise<void> => {
if (container) {
console.log('Stopping MySQL container...');
await container.stop();
console.log('MySQL container stopped');
container = undefined;
}
};


@@ -1,8 +1,3 @@
{
"extends": "@emigrate/tsconfig/build.json",
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
"extends": "@emigrate/tsconfig/build.json"
}


@@ -1,5 +1,69 @@
# @emigrate/plugin-generate-js
## 0.3.8
### Patch Changes
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
## 0.3.7
### Patch Changes
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.3.6
### Patch Changes
- Updated dependencies [db656c2]
- @emigrate/plugin-tools@0.9.6
- @emigrate/types@0.12.1
## 0.3.5
### Patch Changes
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
- @emigrate/plugin-tools@0.9.5
## 0.3.4
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
- @emigrate/plugin-tools@0.9.4
## 0.3.3
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
- @emigrate/plugin-tools@0.9.3
## 0.3.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
- @emigrate/plugin-tools@0.9.2
## 0.3.1
### Patch Changes
- 3a8b06b: Don't use the `bun` key in `exports` as that would mean we have to include both built files and source files in each package, which is a bit wasteful. Maybe reconsider in the future if we can package only source files.
- Updated dependencies [3a8b06b]
- @emigrate/plugin-tools@0.9.1
## 0.3.0
### Minor Changes


@@ -7,7 +7,13 @@ This package contains an Emigrate plugin for generating migration files using Ja
Install the package:
```bash
npm install --save-dev @emigrate/plugin-generate-js
npm install @emigrate/cli @emigrate/plugin-generate-js
# or
pnpm add @emigrate/cli @emigrate/plugin-generate-js
# or
yarn add @emigrate/cli @emigrate/plugin-generate-js
# or
bun add @emigrate/cli @emigrate/plugin-generate-js
```
Use the plugin with the `emigrate new` command:


@@ -1,6 +1,6 @@
{
"name": "@emigrate/plugin-generate-js",
"version": "0.3.0",
"version": "0.3.8",
"publishConfig": {
"access": "public"
},
@@ -10,7 +10,6 @@
"type": "module",
"exports": {
".": {
"bun": "./src/index.ts",
"import": "./dist/index.js",
"types": "./dist/index.d.ts"
}


@@ -1,8 +1,3 @@
{
"extends": "@emigrate/tsconfig/build.json",
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
"extends": "@emigrate/tsconfig/build.json"
}


@@ -1,5 +1,62 @@
# @emigrate/plugin-tools
## 0.9.8
### Patch Changes
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- @emigrate/types@0.12.2
## 0.9.7
### Patch Changes
- ca154fa: Minimize package size by excluding *.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/types@0.12.2
## 0.9.6
### Patch Changes
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/types@0.12.1
## 0.9.5
### Patch Changes
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
## 0.9.4
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
## 0.9.3
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
## 0.9.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
## 0.9.1
### Patch Changes
- 3a8b06b: Don't use the `bun` key in `exports` as that would mean we have to include both built files and source files in each package, which is a bit wasteful. Maybe reconsider in the future if we can package only source files.
## 0.9.0
### Minor Changes


@@ -1,8 +1,9 @@
{
"name": "@emigrate/plugin-tools",
"version": "0.9.0",
"version": "0.9.8",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "",
"main": "dist/index.js",
@@ -10,13 +11,13 @@
"type": "module",
"exports": {
".": {
"bun": "./src/index.ts",
"import": "./dist/index.js",
"types": "./dist/index.d.ts"
}
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo"
],
"scripts": {
"build": "tsc --pretty",


@@ -204,7 +204,7 @@ const load = async <T>(
*
* @returns A timestamp string in the format YYYYMMDDHHmmssmmm
*/
export const getTimestampPrefix = () => new Date().toISOString().replaceAll(/[-:ZT.]/g, '');
export const getTimestampPrefix = (): string => new Date().toISOString().replaceAll(/[-:ZT.]/g, '');
/**
* A utility function to sanitize a migration name so that it can be used as a filename
@@ -212,7 +212,7 @@ export const getTimestampPrefix = () => new Date().toISOString().replaceAll(/[-:
* @param name A migration name to sanitize
* @returns A sanitized migration name that can be used as a filename
*/
export const sanitizeMigrationName = (name: string) =>
export const sanitizeMigrationName = (name: string): string =>
name
.replaceAll(/[\W/\\:|*?'"<>_]+/g, '_')
.trim()


@@ -1,8 +1,3 @@
{
"extends": "@emigrate/tsconfig/build.json",
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
"extends": "@emigrate/tsconfig/build.json"
}


@@ -1,5 +1,79 @@
# @emigrate/postgres
## 0.3.2
### Patch Changes
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
## 0.3.1
### Patch Changes
- ca154fa: Minimize package size by excluding *.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.3.0
### Minor Changes
- 4442604: Automatically create the database if it doesn't exist, and the user has the permissions to do so
## 0.2.6
### Patch Changes
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/plugin-tools@0.9.6
- @emigrate/types@0.12.1
## 0.2.5
### Patch Changes
- f8a5cc7: Make sure the storage initialization crashes when a database connection can't be established
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
- @emigrate/plugin-tools@0.9.5
## 0.2.4
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
- @emigrate/plugin-tools@0.9.4
## 0.2.3
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
- @emigrate/plugin-tools@0.9.3
## 0.2.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
- @emigrate/plugin-tools@0.9.2
## 0.2.1
### Patch Changes
- 3a8b06b: Don't use the `bun` key in `exports` as that would mean we have to include both built files and source files in each package, which is a bit wasteful. Maybe reconsider in the future if we can package only source files.
- Updated dependencies [3a8b06b]
- @emigrate/plugin-tools@0.9.1
## 0.2.0
### Minor Changes


@@ -17,7 +17,13 @@ This plugin is actually three different Emigrate plugins in one:
Install the plugin in your project, alongside the Emigrate CLI:
```bash
npm install --save-dev @emigrate/cli @emigrate/postgres
npm install @emigrate/cli @emigrate/postgres
# or
pnpm add @emigrate/cli @emigrate/postgres
# or
yarn add @emigrate/cli @emigrate/postgres
# or
bun add @emigrate/cli @emigrate/postgres
```
## Usage


@@ -1,8 +1,9 @@
{
"name": "@emigrate/postgres",
"version": "0.2.0",
"version": "0.3.2",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "A PostgreSQL plugin for Emigrate. Uses a PostgreSQL database for storing migration history. Can load and generate .sql migration files.",
"main": "dist/index.js",
@@ -10,13 +11,13 @@
"type": "module",
"exports": {
".": {
"bun": "./src/index.ts",
"import": "./dist/index.js",
"types": "./dist/index.d.ts"
}
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo"
],
"scripts": {
"build": "tsc --pretty",


@@ -11,6 +11,8 @@ import {
type GeneratorPlugin,
type SerializedError,
type MigrationHistoryEntry,
type Awaitable,
type MigrationFunction,
} from '@emigrate/types';
const defaultTable = 'migrations';
@@ -32,12 +34,12 @@ export type PostgresLoaderOptions = {
connection: ConnectionOptions | string;
};
const getPool = (connection: ConnectionOptions | string) => {
if (typeof connection === 'string') {
return postgres(connection);
}
const getPool = async (connection: ConnectionOptions | string): Promise<Sql> => {
const sql = typeof connection === 'string' ? postgres(connection) : postgres(connection);
return postgres(connection);
await sql`SELECT 1`;
return sql;
};
const lockMigration = async (sql: Sql, table: string, migration: MigrationMetadata) => {
@@ -92,6 +94,64 @@ const deleteMigration = async (sql: Sql, table: string, migration: MigrationMeta
return result.count === 1;
};
const getDatabaseName = (config: ConnectionOptions | string) => {
if (typeof config === 'string') {
const uri = new URL(config);
return uri.pathname.replace(/^\//u, '');
}
return config.database ?? '';
};
const setDatabaseName = <T extends ConnectionOptions | string>(config: T, databaseName: string): T => {
if (typeof config === 'string') {
const uri = new URL(config);
uri.pathname = `/${databaseName}`;
return uri.toString() as T;
}
if (typeof config === 'object') {
return {
...config,
database: databaseName,
};
}
throw new Error('Invalid connection config');
};
const initializeDatabase = async (config: ConnectionOptions | string) => {
let sql: Sql | undefined;
try {
sql = await getPool(config);
await sql.end();
} catch (error) {
await sql?.end();
// The error code 3D000 means that the database does not exist, but the user might have the permissions to create it
if (error && typeof error === 'object' && 'code' in error && error.code === '3D000') {
const databaseName = getDatabaseName(config);
const postgresConfig = setDatabaseName(config, 'postgres');
const postgresSql = await getPool(postgresConfig);
try {
await postgresSql`CREATE DATABASE ${postgresSql(databaseName)}`;
// Any database creation error here will be propagated
} finally {
await postgresSql.end();
}
} else {
// In this case we don't know how to handle the error, so we rethrow it
throw error;
}
}
};
const initializeTable = async (sql: Sql, table: string) => {
const [row] = await sql<Array<{ exists: 1 }>>`
SELECT 1 as exists
@@ -122,7 +182,9 @@ export const createPostgresStorage = ({
}: PostgresStorageOptions): EmigrateStorage => {
return {
async initializeStorage() {
const sql = getPool(connection);
await initializeDatabase(connection);
const sql = await getPool(connection);
try {
await initializeTable(sql, table);
@@ -195,23 +257,12 @@ export const createPostgresStorage = ({
};
};
export const { initializeStorage } = createPostgresStorage({
table: process.env['POSTGRES_TABLE'],
connection: process.env['POSTGRES_URL'] ?? {
host: process.env['POSTGRES_HOST'],
port: process.env['POSTGRES_PORT'] ? Number.parseInt(process.env['POSTGRES_PORT'], 10) : undefined,
user: process.env['POSTGRES_USER'],
password: process.env['POSTGRES_PASSWORD'],
database: process.env['POSTGRES_DB'],
},
});
export const createPostgresLoader = ({ connection }: PostgresLoaderOptions): LoaderPlugin => {
return {
loadableExtensions: ['.sql'],
async loadMigration(migration) {
return async () => {
const sql = getPool(connection);
const sql = await getPool(connection);
try {
// @ts-expect-error The "simple" option is not documented, but it exists
@@ -224,7 +275,16 @@ export const createPostgresLoader = ({ connection }: PostgresLoaderOptions): Loa
};
};
export const { loadableExtensions, loadMigration } = createPostgresLoader({
export const generateMigration: GenerateMigrationFunction = async (name) => {
return {
filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`,
content: `-- Migration: ${name}
`,
};
};
const storage = createPostgresStorage({
table: process.env['POSTGRES_TABLE'],
connection: process.env['POSTGRES_URL'] ?? {
host: process.env['POSTGRES_HOST'],
port: process.env['POSTGRES_PORT'] ? Number.parseInt(process.env['POSTGRES_PORT'], 10) : undefined,
@@ -234,13 +294,22 @@ export const { loadableExtensions, loadMigration } = createPostgresLoader({
},
});
export const generateMigration: GenerateMigrationFunction = async (name) => {
return {
filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`,
content: `-- Migration: ${name}
`,
};
};
const loader = createPostgresLoader({
connection: process.env['POSTGRES_URL'] ?? {
host: process.env['POSTGRES_HOST'],
port: process.env['POSTGRES_PORT'] ? Number.parseInt(process.env['POSTGRES_PORT'], 10) : undefined,
user: process.env['POSTGRES_USER'],
password: process.env['POSTGRES_PASSWORD'],
database: process.env['POSTGRES_DB'],
},
});
// eslint-disable-next-line prefer-destructuring
export const initializeStorage: () => Promise<Storage> = storage.initializeStorage;
// eslint-disable-next-line prefer-destructuring
export const loadableExtensions: string[] = loader.loadableExtensions;
// eslint-disable-next-line prefer-destructuring
export const loadMigration: (migration: MigrationMetadata) => Awaitable<MigrationFunction> = loader.loadMigration;
const defaultExport: EmigrateStorage & LoaderPlugin & GeneratorPlugin = {
initializeStorage,


@@ -1,8 +1,3 @@
{
"extends": "@emigrate/tsconfig/build.json",
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
"extends": "@emigrate/tsconfig/build.json"
}


@@ -1,5 +1,83 @@
# @emigrate/reporter-pino
## 0.6.5
### Patch Changes
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- @emigrate/types@0.12.2
## 0.6.4
### Patch Changes
- ca154fa: Minimize package size by excluding *.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/types@0.12.2
## 0.6.3
### Patch Changes
- 081ab34: Make sure Pino outputs logs in Bun environments
## 0.6.2
### Patch Changes
- 1065322: Show correct status for migrations for the "list" and "new" commands
## 0.6.1
### Patch Changes
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/types@0.12.1
## 0.6.0
### Minor Changes
- 86e0d52: Adapt to the new Reporter interface, i.e. the removal of the "remove" command related methods
### Patch Changes
- ef45be9: Show number of skipped migrations correctly in the command output
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
## 0.5.0
### Minor Changes
- a4da353: Handle the new onAbort method
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
## 0.4.3
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
## 0.4.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
## 0.4.1
### Patch Changes
- 3a8b06b: Don't use the `bun` key in `exports` as that would mean we have to include both built files and source files in each package, which is a bit wasteful. Maybe reconsider in the future if we can package only source files.
## 0.4.0
### Minor Changes


@@ -8,7 +8,13 @@ Which is great both in production environments and for piping the output to othe
Install the reporter in your project, alongside the Emigrate CLI:
```bash
npm install --save-dev @emigrate/cli @emigrate/reporter-pino
npm install @emigrate/cli @emigrate/reporter-pino
# or
pnpm add @emigrate/cli @emigrate/reporter-pino
# or
yarn add @emigrate/cli @emigrate/reporter-pino
# or
bun add @emigrate/cli @emigrate/reporter-pino
```
## Usage


@@ -1,8 +1,9 @@
{
"name": "@emigrate/reporter-pino",
"version": "0.4.0",
"version": "0.6.5",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "A Pino reporter for Emigrate for logging the migration process.",
"main": "dist/index.js",
@@ -10,13 +11,13 @@
"type": "module",
"exports": {
".": {
"bun": "./src/index.ts",
"import": "./dist/index.js",
"types": "./dist/index.d.ts"
}
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo"
],
"scripts": {
"build": "tsc --pretty",
@@ -40,7 +41,9 @@
"pino": "8.16.2"
},
"devDependencies": {
"@emigrate/tsconfig": "workspace:*"
"@emigrate/tsconfig": "workspace:*",
"@types/bun": "1.0.5",
"bun-types": "1.0.26"
},
"volta": {
"extends": "../../package.json"


@@ -52,11 +52,16 @@ class PinoReporter implements Required<EmigrateReporter> {
scope: command,
version,
},
transport: process.isBun ? { target: 'pino/file', options: { destination: 1 } } : undefined,
});
this.#logger.info({ parameters }, `Emigrate "${command}" initialized${parameters.dry ? ' (dry-run)' : ''}`);
}
onAbort(reason: Error): Awaitable<void> {
this.#logger.error({ reason }, `Emigrate "${this.#command}" shutting down`);
}
onCollectedMigrations(migrations: MigrationMetadata[]): Awaitable<void> {
this.#migrations = migrations;
}
@@ -65,29 +70,40 @@ class PinoReporter implements Required<EmigrateReporter> {
const migrations = this.#migrations ?? [];
if (migrations.length === 0) {
this.#logger.info('No pending migrations found');
this.#logger.info('No migrations found');
return;
}
const statusText = this.#command === 'list' ? 'migrations are pending' : 'pending migrations to run';
if (migrations.length === lockedMigrations.length) {
this.#logger.info(
{ migrationCount: lockedMigrations.length },
`${lockedMigrations.length} pending migrations to run`,
);
this.#logger.info({ migrationCount: lockedMigrations.length }, `${lockedMigrations.length} ${statusText}`);
return;
}
const nonLockedMigrations = migrations.filter(
(migration) => !lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name),
);
const failedMigrations = nonLockedMigrations.filter(
(migration) => 'status' in migration && migration.status === 'failed',
);
const unlockableCount = this.#command === 'up' ? nonLockedMigrations.length - failedMigrations.length : 0;
let skippedCount = 0;
let failedCount = 0;
for (const migration of migrations) {
const isLocked = lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name);
if (isLocked) {
continue;
}
if ('status' in migration) {
if (migration.status === 'failed') {
failedCount += 1;
} else if (migration.status === 'skipped') {
skippedCount += 1;
}
}
}
const parts = [
`${lockedMigrations.length} of ${migrations.length} pending migrations to run`,
unlockableCount > 0 ? `(${unlockableCount} locked)` : '',
failedMigrations.length > 0 ? `(${failedMigrations.length} failed)` : '',
`${lockedMigrations.length} of ${migrations.length} ${statusText}`,
skippedCount > 0 ? `(${skippedCount} skipped)` : '',
failedCount > 0 ? `(${failedCount} failed)` : '',
].filter(Boolean);
this.#logger.info({ migrationCount: lockedMigrations.length }, parts.join(' '));
@@ -100,27 +116,28 @@ class PinoReporter implements Required<EmigrateReporter> {
);
}
onMigrationRemoveStart(migration: MigrationMetadata): Awaitable<void> {
this.#logger.debug({ migration: migration.relativeFilePath }, `Removing migration: ${migration.name}`);
}
onMigrationRemoveSuccess(migration: MigrationMetadataFinished): Awaitable<void> {
this.#logger.info({ migration: migration.relativeFilePath }, `Successfully removed migration: ${migration.name}`);
}
onMigrationRemoveError(migration: MigrationMetadataFinished, error: Error): Awaitable<void> {
this.#logger.error(
{ migration: migration.relativeFilePath, [this.errorKey]: error },
`Failed to remove migration: ${migration.name}`,
);
}
onMigrationStart(migration: MigrationMetadata): Awaitable<void> {
this.#logger.info({ migration: migration.relativeFilePath }, `${migration.name} (running)`);
let status = 'running';
if (this.#command === 'remove') {
status = 'removing';
} else if (this.#command === 'new') {
status = 'creating';
}
this.#logger.info({ migration: migration.relativeFilePath }, `${migration.name} (${status})`);
}
onMigrationSuccess(migration: MigrationMetadataFinished): Awaitable<void> {
this.#logger.info({ migration: migration.relativeFilePath }, `${migration.name} (${migration.status})`);
let status = 'done';
if (this.#command === 'remove') {
status = 'removed';
} else if (this.#command === 'new') {
status = 'created';
}
this.#logger.info({ migration: migration.relativeFilePath }, `${migration.name} (${status})`);
}
onMigrationError(migration: MigrationMetadataFinished, error: Error): Awaitable<void> {
@@ -170,16 +187,15 @@ class PinoReporter implements Required<EmigrateReporter> {
}
}
const result =
this.#command === 'remove'
? { removed: done, failed, skipped, pending, total }
: { done, failed, skipped, pending, total };
if (error) {
this.#logger.error(
{ result: { failed, done, skipped, pending, total }, [this.errorKey]: error },
`Emigrate "${this.#command}" failed`,
);
this.#logger.error({ result, [this.errorKey]: error }, `Emigrate "${this.#command}" failed`);
} else {
this.#logger.info(
{ result: { failed, done, skipped, pending, total } },
`Emigrate "${this.#command}" finished successfully`,
);
this.#logger.info({ result }, `Emigrate "${this.#command}" finished successfully`);
}
}
}
@@ -188,6 +204,8 @@ export const createPinoReporter = (options: PinoReporterOptions = {}): EmigrateR
return new PinoReporter(options);
};
export default createPinoReporter({
const defaultExport: EmigrateReporter = createPinoReporter({
level: process.env['LOG_LEVEL'],
});
export default defaultExport;


@@ -1,8 +1,3 @@
{
"extends": "@emigrate/tsconfig/build.json",
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
"extends": "@emigrate/tsconfig/build.json"
}


@@ -1,5 +1,55 @@
# @emigrate/storage-fs
## 0.4.7
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/types@0.12.2
## 0.4.6
### Patch Changes
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/types@0.12.1
## 0.4.5
### Patch Changes
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
## 0.4.4
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
## 0.4.3
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
## 0.4.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
## 0.4.1
### Patch Changes
- 3a8b06b: Don't use the `bun` key in `exports` as that would mean we have to include both built files and source files in each package, which is a bit wasteful. Maybe reconsider in the future if we can package only source files.
## 0.4.0
### Minor Changes


@@ -7,7 +7,13 @@ A file system storage plugin for Emigrate, suitable for simple migration setups.
Install the storage plugin in your project, alongside the Emigrate CLI:
```bash
npm install --save-dev @emigrate/cli @emigrate/storage-fs
npm install @emigrate/cli @emigrate/storage-fs
# or
pnpm add @emigrate/cli @emigrate/storage-fs
# or
yarn add @emigrate/cli @emigrate/storage-fs
# or
bun add @emigrate/cli @emigrate/storage-fs
```
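For context on the README change above: after installing, the storage plugin is wired up through Emigrate's config file. The sketch below is a hypothetical minimal setup — the `emigrate.config.js` filename and the `directory`/`storage` fields follow Emigrate's documented config conventions, but the default-export import and the `filename` option are assumptions to verify against the `@emigrate/storage-fs` docs:

```javascript
// emigrate.config.js — hypothetical minimal setup (verify option names against the docs)
import storageFs from '@emigrate/storage-fs';

export default {
  // directory containing the migration files
  directory: 'migrations',
  // keep the migration history in a local file instead of a database
  storage: storageFs({ filename: '.migrations.json' }),
};
```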
## Usage


@ -1,8 +1,9 @@
{
"name": "@emigrate/storage-fs",
"version": "0.4.0",
"version": "0.4.7",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "A storage plugin for Emigrate for storing the migration history in a file",
"main": "dist/index.js",
@@ -10,13 +11,13 @@
"type": "module",
"exports": {
".": {
"bun": "./src/index.ts",
"import": "./dist/index.js",
"types": "./dist/index.d.ts"
}
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo"
],
"scripts": {
"build": "tsc --pretty",

Some files were not shown because too many files have changed in this diff.