Compare commits

...

150 commits

Author SHA1 Message Date
52844d7a09 ci(mysql): add @emigrate/mysql integration tests to GitHub Actions
Some checks failed
Deploy to GitHub Pages / build (push) Failing after 2m38s
Deploy to GitHub Pages / deploy (push) Has been skipped
Integration Tests / Emigrate MySQL integration tests (push) Failing after 4m0s
Release / Release (push) Failing after 12s
CI / Build and Test (push) Has been cancelled
2025-04-25 09:48:34 +02:00
github-actions[bot]
fa3fb20dc5 chore(release): version packages 2025-04-24 16:06:29 +02:00
26240f49ff fix(mysql): make sure migrations are run in order when run concurrently
We now lock either all of the migrations to run or none of them,
so they cannot be run out of order when multiple instances of Emigrate run concurrently.
2025-04-24 15:57:44 +02:00
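A minimal sketch of the all-or-nothing locking idea described in the commit above, using mysql2 inside a transaction; the table and column names are illustrative assumptions and not necessarily the actual @emigrate/mysql schema:

```ts
import type { Connection, RowDataPacket } from 'mysql2/promise';

// Lock every pending migration in one transaction, or none of them, so that
// two concurrent Emigrate runs can never interleave migrations out of order.
export async function lockAllOrNone(connection: Connection, names: string[]): Promise<boolean> {
  if (names.length === 0) {
    return true;
  }

  await connection.beginTransaction();

  try {
    // Lock the relevant history rows; concurrent instances block here.
    const [rows] = await connection.query<RowDataPacket[]>(
      'SELECT name, status FROM migrations WHERE name IN (?) FOR UPDATE',
      [names],
    );

    if (rows.some((row) => row.status === 'locked')) {
      // Another instance already holds some of these migrations: take none.
      await connection.rollback();
      return false;
    }

    await connection.query('UPDATE migrations SET status = ? WHERE name IN (?)', ['locked', names]);
    await connection.commit();
    return true;
  } catch (error) {
    await connection.rollback();
    throw error;
  }
}
```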
6eb60177c5 fix: use another changesets-action version 2024-08-09 16:03:34 +02:00
b3b603b2fc feat: make aggregated GitHub releases instead of one per package
And also publish packages with unreleased changes tagged with `next` to NPM
2024-08-09 15:49:22 +02:00
bb9d674cd7 chore: turn off Turbo's UI as it messes with the terminal and is not as intuitive as it seems 2024-06-27 16:05:45 +02:00
c151031d41 chore(deps): upgrade Turbo and opt out from telemetry 2024-06-27 16:05:45 +02:00
dependabot[bot]
48181d88b7 chore(deps): bump turbo from 1.10.16 to 2.0.5
Bumps [turbo](https://github.com/vercel/turbo) from 1.10.16 to 2.0.5.
- [Release notes](https://github.com/vercel/turbo/releases)
- [Changelog](https://github.com/vercel/turbo/blob/main/release.md)
- [Commits](https://github.com/vercel/turbo/compare/v1.10.16...v2.0.5)

---
updated-dependencies:
- dependency-name: turbo
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-06-27 16:05:45 +02:00
d779286084 chore(deps): upgrade TypeScript to v5.5 and enable isolatedDeclarations 2024-06-27 15:38:50 +02:00
ef848a0553 chore(deps): re-add the specific PNPM version for the deploy workflow 2024-06-27 13:27:34 +02:00
4d12402595 chore(deps): make sure the correct PNPM version is used (everywhere) 2024-06-27 11:55:33 +02:00
be5c4d28b6 chore(deps): make sure the correct PNPM version is used 2024-06-27 11:47:40 +02:00
2cefa2508b chore(deps): upgrade PNPM to v9.4.0 2024-06-27 11:12:21 +02:00
0ff9f60d59 chore(deps): upgrade all action dependencies
Closes #70, #128, #135, #145
2024-06-27 10:59:47 +02:00
github-actions[bot]
31693ddb3c chore(release): version packages 2024-06-25 09:21:37 +02:00
57498db248 fix(mysql): close database connections gracefully when using Bun 2024-06-25 08:22:56 +02:00
github-actions[bot]
cf620a191d chore(release): version packages 2024-05-30 10:16:07 +02:00
ca154fadeb fix: exclude tsbuildinfo files from published packages for smaller bundles 2024-05-30 10:12:37 +02:00
github-actions[bot]
f300f147fa chore(release): version packages 2024-05-29 16:23:49 +02:00
44426042cf feat(mysql,postgres): automatically create the database if it doesn't exist (fixes #147) 2024-05-29 16:19:32 +02:00
aef2d7c861 fix(mysql): handle table initialization better in clustered database environments
Running CREATE TABLE IF NOT EXISTS yields more locks than first checking whether the table exists with a SELECT and only then running CREATE TABLE.
This makes more sense as the table usually already exists, so we optimize for the happy path.
2024-05-29 15:10:59 +02:00
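A rough sketch of the optimization described above, assuming a `migrations` history table (the actual table name and columns used by @emigrate/mysql may differ):

```ts
import type { Connection, RowDataPacket } from 'mysql2/promise';

// Cheap read first: in the common case the table already exists, so we avoid
// the heavier locking that CREATE TABLE IF NOT EXISTS takes in a cluster.
export async function ensureHistoryTable(connection: Connection, table = 'migrations'): Promise<void> {
  const [rows] = await connection.query<RowDataPacket[]>(
    'SELECT 1 FROM information_schema.tables WHERE table_schema = DATABASE() AND table_name = ? LIMIT 1',
    [table],
  );

  if (rows.length > 0) {
    return; // Happy path: nothing to do.
  }

  // Only fall back to DDL when the table is actually missing.
  await connection.query(
    `CREATE TABLE IF NOT EXISTS \`${table}\` (
       name VARCHAR(255) NOT NULL PRIMARY KEY,
       status VARCHAR(32) NOT NULL,
       date DATETIME NOT NULL
     )`,
  );
}
```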
github-actions[bot]
e396266f3d chore(release): version packages 2024-04-04 14:46:54 +02:00
081ab34cb4 fix(reporter-pino): make sure the Pino reporter outputs logs in Bun environments 2024-04-04 14:43:38 +02:00
dependabot[bot]
520fdd94ef chore(deps): bump changesets/action from 1.4.5 to 1.4.6
Bumps [changesets/action](https://github.com/changesets/action) from 1.4.5 to 1.4.6.
- [Release notes](https://github.com/changesets/action/releases)
- [Changelog](https://github.com/changesets/action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/changesets/action/compare/v1.4.5...v1.4.6)

---
updated-dependencies:
- dependency-name: changesets/action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-15 10:23:19 +01:00
github-actions[bot]
d1bd8fc74f chore(release): version packages 2024-03-15 09:40:25 +01:00
41522094dd fix(cli): handle the case where the config is returned as an object with a nested default property 2024-02-19 10:59:02 +01:00
dependabot[bot]
6763f338ce chore(deps): bump the commitlint group with 2 updates
Bumps the commitlint group with 2 updates: [@commitlint/cli](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/cli) and [@commitlint/config-conventional](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/config-conventional).


Updates `@commitlint/cli` from 18.4.3 to 18.6.1
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/cli/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.6.1/@commitlint/cli)

Updates `@commitlint/config-conventional` from 18.4.3 to 18.6.1
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/config-conventional/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.6.1/@commitlint/config-conventional)

---
updated-dependencies:
- dependency-name: "@commitlint/cli"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: commitlint
- dependency-name: "@commitlint/config-conventional"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: commitlint
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-14 11:55:46 +01:00
github-actions[bot]
6c4e441eff chore(release): version packages 2024-02-13 13:00:15 +01:00
57a099169e fix(cli): cleanup AbortSignal event listeners to avoid MaxListenersExceededWarning 2024-02-12 20:59:26 +01:00
github-actions[bot]
ae9e8b1b04 chore(release): version packages 2024-02-12 13:56:28 +01:00
1065322435 fix(pino): show correct statuses for the "list" and "new" commands 2024-02-12 13:47:55 +01:00
17feb2d2c2 fix(mysql): only unreference connections in a Bun environment as it has problems with Node for some reason 2024-02-12 13:35:18 +01:00
dependabot[bot]
98e3ed5c1b chore(deps): bump pnpm/action-setup from 2.4.0 to 3.0.0
Bumps [pnpm/action-setup](https://github.com/pnpm/action-setup) from 2.4.0 to 3.0.0.
- [Release notes](https://github.com/pnpm/action-setup/releases)
- [Commits](https://github.com/pnpm/action-setup/compare/v2.4.0...v3.0.0)

---
updated-dependencies:
- dependency-name: pnpm/action-setup
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-12 09:05:01 +01:00
1d33d65135 docs(cli): change URL path from /commands/ to /cli/ 2024-02-09 14:48:51 +01:00
0c597fd7a8 docs(cli): add a main page for Emigrate's CLI 2024-02-09 14:48:51 +01:00
github-actions[bot]
0360d0b82f chore(release): version packages 2024-02-09 14:05:35 +01:00
c838ffb7f3 fix(typescript): load config written in TypeScript without the typescript package when using Bun, Deno or tsx 2024-02-09 14:00:24 +01:00
198aa545eb fix(mysql): unreference all connections so that the process can exit cleanly
In a NodeJS environment it will just work as before, but in a Bun environment it will make the "forced exit" error message disappear and remove the 10 s waiting period when migrations are done.
2024-02-09 13:13:27 +01:00
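A sketch of what "unreferencing" a connection means in practice; how the driver exposes its underlying socket varies by library and version, so treat the `stream` property below as an assumption:

```ts
import type { Socket } from 'node:net';

// Once a socket is unreferenced it no longer keeps the event loop alive,
// so Node.js (and Bun) can exit as soon as the migrations are done, even if
// the connection has not been explicitly closed yet.
export function unrefConnection(connection: { stream?: Socket }): void {
  connection.stream?.unref();
}
```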
e7ec75d9e1 docs(faq): add note on using Emigrate for existing databases 2024-02-06 09:29:53 +01:00
b62c692846 docs(reporters): add "json" reporter and rename "default" to "pretty" 2024-02-06 09:22:35 +01:00
18382ce961 feat(reporters): add built-in "json" reporter and rename "default" to "pretty" 2024-02-06 09:22:35 +01:00
github-actions[bot]
4e8ac5294d chore(release): version packages 2024-02-05 15:51:38 +01:00
61cbcbd691 fix(cli): force exiting after 10 seconds should not change the exit code
If all migrations have been run successfully we want the exit code to be 0 even though we had to force exit the process.
This is because on some platforms (e.g. Bun) not all handles are cleaned up the same way as in NodeJS, so let's be forgiving.
2024-02-05 15:48:55 +01:00
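A minimal sketch of the behaviour described above (the 10-second delay is taken from the commit message; the names are illustrative, not Emigrate's actual code):

```ts
import { setTimeout } from 'node:timers';

const FORCE_EXIT_AFTER_MS = 10_000;

// If everything succeeded, the forced exit must still report success: keep the
// exit code that the migration run produced instead of turning a timeout on
// lingering handles into a failure.
export function scheduleForceExit(exitCode: number): void {
  const timer = setTimeout(() => {
    process.exit(exitCode);
  }, FORCE_EXIT_AFTER_MS);

  // Don't let the watchdog timer itself keep the process alive.
  timer.unref();
}
```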
github-actions[bot]
f720aae83d chore(release): version packages 2024-02-05 15:14:33 +01:00
543b7f6f77 fix(bun): import setTimeout/setInterval from "node:timers" for .unref() to correctly work 2024-02-05 15:12:30 +01:00
db656c2310 chore: enable NPM provenance 2024-02-05 15:08:47 +01:00
github-actions[bot]
ff89dd4f86 chore(release): version packages 2024-02-05 14:54:05 +01:00
f8a5cc728d fix(storage): make sure the storage initialization crashes when db connection can't be established 2024-02-05 14:50:17 +01:00
f6761fe434 chore: add missing docs changeset 2024-02-05 14:29:30 +01:00
ef45be9233 fix(reporters): show number of skipped migrations correctly in command output 2024-02-05 14:17:30 +01:00
69bd88afdb chore: allow many parameters in test files 2024-01-26 16:09:49 +01:00
0faebbe647 docs(cli): document the relative file path support for the "remove" command 2024-01-26 16:09:49 +01:00
2f6b4d23e0 fix(reporter-default): don't dim decimal points in durations in the default reporter 2024-01-26 16:09:49 +01:00
1f139fd975 feat(remove): rework the "remove" command to be more similar to "up" and "list"
The old reporter methods related to the "remove" command are no longer used; instead the shared `onMigrationStart`, `onMigrationSuccess` and `onMigrationError` methods are used.
Some preparation has also been made to support removing multiple migrations at once in the future, similar to how the `--from` and `--to` CLI options work for the "up" command.
2024-01-26 16:09:49 +01:00
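For illustration, after this change a reporter only needs the shared lifecycle methods named above to cover the "remove" command as well; the simplified signatures below are assumptions, not the exact EmigrateReporter interface:

```ts
// Hypothetical, simplified shape for illustration only.
type MigrationMetadata = { name: string; relativeFilePath: string };

const consoleReporter = {
  onMigrationStart(migration: MigrationMetadata): void {
    console.log(`→ ${migration.relativeFilePath}`);
  },
  onMigrationSuccess(migration: MigrationMetadata): void {
    console.log(`✔ ${migration.relativeFilePath}`);
  },
  onMigrationError(migration: MigrationMetadata, error: Error): void {
    console.error(`✖ ${migration.relativeFilePath}:`, error.message);
  },
};

export default consoleReporter;
```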
86e0d52e5c feat(reporter-pino): adapt to the new Reporter interface 2024-01-26 16:09:49 +01:00
94ad9feae9 feat(types): simplify the EmigrateReporter interface by removing the "remove" specific methods 2024-01-26 16:09:49 +01:00
f2d4bb346e fix(cli): make sure errors passed to the storage are serialized correctly 2024-01-26 16:09:49 +01:00
f1b9098750 fix(migrations): don't include folders when collecting migrations
It should be possible to have folders inside your migrations folder
2024-01-26 09:26:49 +01:00
9109238b86 feat(cli): improve the "up" commands --from and --to options
The given values can be either migration names or relative paths to migration files.
The given migration must exist, to avoid accidentally running migrations that weren't intended to run.
2024-01-26 09:13:03 +01:00
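A sketch of the lexicographic filtering this implies (illustrative only; the real implementation also resolves relative file paths to migration names):

```ts
// Keep the migration equal to `from` and everything lexicographically after it,
// and the migration equal to `to` and everything lexicographically before it.
export function filterByRange(migrations: string[], from?: string, to?: string): string[] {
  for (const boundary of [from, to]) {
    if (boundary && !migrations.includes(boundary)) {
      // The boundary must exist, to avoid silently running unintended migrations.
      throw new Error(`Migration not found: ${boundary}`);
    }
  }

  return [...migrations]
    .sort()
    .filter((name) => (!from || name >= from) && (!to || name <= to));
}
```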
github-actions[bot]
986456b038 chore(release): version packages 2024-01-23 11:44:05 +01:00
b56b6daf73 fix(cli): handle migration history entries without file extensions correctly
...even when the migration file names contain periods.
2024-01-23 11:36:47 +01:00
github-actions[bot]
ea327bbc49 chore(release): version packages 2024-01-22 13:49:54 +01:00
121492b303 fix(cli): sort migrations lexicographically for real 2024-01-22 13:48:09 +01:00
github-actions[bot]
bddb2d6b14 chore(release): version packages 2024-01-22 11:32:48 +01:00
a4da353d5a feat(cli): add graceful process abort
Using an AbortSignal and Promise.race we abandon running migrations that take longer than the given abortRespite period to complete after the process has been aborted
2024-01-22 11:30:06 +01:00
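A condensed sketch of the AbortSignal + Promise.race approach mentioned above (the function name and return value are illustrative):

```ts
import { setTimeout as delay } from 'node:timers/promises';

// Race a running migration against the abort signal: once the signal fires,
// the migration gets `abortRespiteMs` more milliseconds to settle before we
// stop waiting for it and report it as abandoned.
export async function raceWithRespite<T>(
  migration: Promise<T>,
  signal: AbortSignal,
  abortRespiteMs: number,
): Promise<T | 'abandoned'> {
  const abandoned = new Promise<'abandoned'>((resolve) => {
    const onAbort = () => {
      void delay(abortRespiteMs).then(() => resolve('abandoned'));
    };

    if (signal.aborted) {
      onAbort();
    } else {
      // { once: true } keeps listeners from piling up across migrations
      // (cf. the MaxListenersExceededWarning fix earlier in this list).
      signal.addEventListener('abort', onAbort, { once: true });
    }
  });

  return Promise.race([migration, abandoned]);
}
```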
ce15648251 feat(types): add type for the onAbort Reporter method 2024-01-22 11:30:06 +01:00
github-actions[bot]
576dfbb124 chore(release): version packages 2024-01-19 13:48:24 +01:00
49d8925778 fix(docs): remove access control from package config 2024-01-19 13:43:59 +01:00
98adcda37e fix(reporters): use better wording in the header in the default reporter
Also show the number of skipped migrations
2024-01-19 13:43:59 +01:00
cbc35bd646 chore: start writing changesets for the documentation 2024-01-19 13:43:59 +01:00
e739e453d7 docs: add Baseline guide 2024-01-19 13:43:59 +01:00
f515c8a854 feat(cli): add --no-execution option to the "up" command
...which can be used to log manually run migrations as successful or for baselining a database.
2024-01-19 13:43:59 +01:00
e71c318ea5 test(up): structure the up tests in a better way 2024-01-19 13:43:59 +01:00
9ef0fa2776 feat(cli): add --from and --to options to limit what migrations to run 2024-01-19 13:43:59 +01:00
02c142e39a feat(up): add --limit option to limit the number of migrations to run 2024-01-19 13:43:59 +01:00
bf4d596980 fix(cli): clarify which options that takes parameters 2024-01-19 13:43:59 +01:00
github-actions[bot]
424d3e9903 chore(release): version packages 2024-01-18 15:25:51 +01:00
73a8a42e5f fix(history): support a migration history with entries without file extensions (.js is assumed in such case) 2024-01-18 15:18:35 +01:00
github-actions[bot]
114979f154 chore(release): version packages 2024-01-18 14:52:48 +01:00
dependabot[bot]
b083e88bac chore(deps): bump cosmiconfig from 8.3.6 to 9.0.0
Bumps [cosmiconfig](https://github.com/cosmiconfig/cosmiconfig) from 8.3.6 to 9.0.0.
- [Release notes](https://github.com/cosmiconfig/cosmiconfig/releases)
- [Changelog](https://github.com/cosmiconfig/cosmiconfig/blob/main/CHANGELOG.md)
- [Commits](https://github.com/cosmiconfig/cosmiconfig/compare/cosmiconfig-v8.3.6...v9.0.0)

---
updated-dependencies:
- dependency-name: cosmiconfig
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-18 14:50:14 +01:00
cbc3193626 chore(deps): downgrade Turbo to 1.10.16 as it did work in the CI env 2024-01-18 11:13:28 +01:00
1b8439a530 ci: and does this make any difference? 2024-01-18 11:09:12 +01:00
891402c7d4 ci: does this make the release flow work? 2024-01-18 11:02:07 +01:00
github-actions[bot]
9130af7b12 chore(release): version packages 2024-01-18 10:50:05 +01:00
83dc618c2e fix(cli): remove --enable-source-maps flag 2024-01-18 10:46:04 +01:00
dependabot[bot]
a6e096bcbc chore(deps): bump turbo from 1.10.16 to 1.11.3
Bumps [turbo](https://github.com/vercel/turbo) from 1.10.16 to 1.11.3.
- [Release notes](https://github.com/vercel/turbo/releases)
- [Changelog](https://github.com/vercel/turbo/blob/main/release.md)
- [Commits](https://github.com/vercel/turbo/compare/v1.10.16...v1.11.3)

---
updated-dependencies:
- dependency-name: turbo
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-16 14:06:06 +01:00
dependabot[bot]
9bfd0e44c3 chore(deps): bump typescript from 5.2.2 to 5.3.3
Bumps [typescript](https://github.com/Microsoft/TypeScript) from 5.2.2 to 5.3.3.
- [Release notes](https://github.com/Microsoft/TypeScript/releases)
- [Commits](https://github.com/Microsoft/TypeScript/compare/v5.2.2...v5.3.3)

---
updated-dependencies:
- dependency-name: typescript
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-15 10:21:31 +01:00
dependabot[bot]
af83bf6d7f chore(deps): bump tsx from 4.6.2 to 4.7.0
Bumps [tsx](https://github.com/privatenumber/tsx) from 4.6.2 to 4.7.0.
- [Release notes](https://github.com/privatenumber/tsx/releases)
- [Changelog](https://github.com/privatenumber/tsx/blob/develop/release.config.cjs)
- [Commits](https://github.com/privatenumber/tsx/compare/v4.6.2...v4.7.0)

---
updated-dependencies:
- dependency-name: tsx
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-15 09:57:03 +01:00
dependabot[bot]
a5264ab3d4 chore(deps): bump lint-staged from 15.1.0 to 15.2.0
Bumps [lint-staged](https://github.com/okonet/lint-staged) from 15.1.0 to 15.2.0.
- [Release notes](https://github.com/okonet/lint-staged/releases)
- [Changelog](https://github.com/lint-staged/lint-staged/blob/master/CHANGELOG.md)
- [Commits](https://github.com/okonet/lint-staged/compare/v15.1.0...v15.2.0)

---
updated-dependencies:
- dependency-name: lint-staged
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-15 09:49:09 +01:00
dependabot[bot]
0cce84743d chore(deps): bump actions/checkout from 3 to 4
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-11 06:44:34 +01:00
a130248687 docs: update loader plugin intro after adding TypeScript support 2024-01-08 11:06:59 +01:00
github-actions[bot]
3c54917c35 chore(release): version packages 2023-12-28 09:20:03 +01:00
9a605a85f1 feat: add support for TypeScript migration files
And add a guide to the documentation on how to set it up for NodeJS
2023-12-20 15:27:03 +01:00
github-actions[bot]
59eb90b8cb chore(release): version packages 2023-12-20 11:24:17 +01:00
9f91bdcfa0 feat(cli): add the --import option for importing modules/packages before commands are run
Can for instance be used to load environment variables using Dotenv
2023-12-20 11:08:27 +01:00
e6e4433018 feat(cli): rename extension short option from -e to -x
BREAKING CHANGE: if you've been using the `-e` short option you should change it to `-x` or use the long option name `--extension`
2023-12-20 09:27:43 +01:00
f9a16d87a1 feat: add color option to CLI and configuration file
The option is used to force enable/disable color output and is passed to the reporter which should respect it
2023-12-20 09:11:01 +01:00
7bae76f496 docs: include Deno usage instructions in the documentation 2023-12-19 15:40:05 +01:00
github-actions[bot]
e8e35215be chore(release): version packages 2023-12-19 14:51:40 +01:00
a6c6e6dc78 fix(types): forgot about the bun key in one package 2023-12-19 14:49:29 +01:00
github-actions[bot]
e67ce0de1e chore(release): version packages 2023-12-19 14:41:04 +01:00
beb6cf7719 chore(deps): upgrade ansis package 2023-12-19 14:34:54 +01:00
3a8b06b3b1 fix: revert usage of bun key in package.json exports 2023-12-19 14:29:42 +01:00
github-actions[bot]
747f9dbddb chore(release): version packages 2023-12-19 14:09:15 +01:00
ce6946cac4 feat: support for Bun 2023-12-19 14:06:00 +01:00
github-actions[bot]
c284cc48d1 chore(release): version packages 2023-12-19 13:33:11 +01:00
17c4723bb8 feat(postgres): implement the first version of the PostgreSQL plugin 2023-12-19 13:27:57 +01:00
3d34b8ba13 chore(deps): remove unused @manypkg/cli package 2023-12-19 10:33:05 +01:00
f8e13f0d66 docs: add some basic documentation for the storage plugin api 2023-12-19 10:29:58 +01:00
ca6834d95f refactor: use a custom Link component to be able to use absolute URLs everywhere
...that supports any `base` property
2023-12-19 09:57:23 +01:00
bdf831b008 docs: use correct hrefs for all link elements 2023-12-18 16:36:05 +01:00
58316ba6f8 docs: and yet another 2023-12-18 16:30:34 +01:00
e186c1fbce docs: a few more links to fix 2023-12-18 16:28:46 +01:00
03ec8f2599 docs: make all links relative to support mounting anywhere 2023-12-18 16:25:14 +01:00
3bffb98750 ci: and now then? 2023-12-18 16:02:44 +01:00
319901c7ac ci: how about now? 2023-12-18 15:57:51 +01:00
bd6aea8a42 ci: do we get the correct values? 2023-12-18 15:53:44 +01:00
afa20f5bef ci: is this the way? 2023-12-18 15:48:21 +01:00
65be64329d ci: this is the way 2023-12-18 15:47:11 +01:00
9c0cbb0d53 ci: set site and base correctly 2023-12-18 15:42:29 +01:00
e245d6f18a ci: only set site and base during deployment 2023-12-18 15:38:42 +01:00
f63fa9d864 ci: add GitHub Pages deployment 2023-12-18 15:32:53 +01:00
665f0ad9cf docs: fix some faulty links 2023-12-18 15:32:53 +01:00
5911331889 docs(generators): add some documentation for generator plugins 2023-12-18 15:32:53 +01:00
13e370362a docs: fix some links 2023-12-18 15:32:53 +01:00
7da778c767 docs(reporters): write some documentation for the reporters 2023-12-18 15:32:53 +01:00
1843bf893d docs: remove staggering of cards 2023-12-18 15:32:53 +01:00
43f4df5f37 docs(style): use Tailwind to customize the docs colors 2023-12-18 15:32:53 +01:00
bf52bd0d3c docs(commands): use more idiomatic commands per package manager 2023-12-18 15:32:53 +01:00
418737f97d docs: add logo 2023-12-18 15:32:53 +01:00
445fe69e60 docs: update all command docs with package manager variants 2023-12-18 15:32:53 +01:00
1fc24269f4 docs: add a basic FAQ section 2023-12-18 15:32:53 +01:00
2a82897ba8 docs: add documentation for the commands: up, list, new and remove 2023-12-18 15:32:53 +01:00
c460ae7459 docs: split the getting started guide into two separate pages in the "introduction" section 2023-12-18 15:32:53 +01:00
99d189aeb9 docs: move all plugin types under the same "Plugins" category 2023-12-18 15:32:53 +01:00
d5c6e9b1db docs: first commit for the docs 2023-12-18 15:32:53 +01:00
987374dbd5
chore(release): version packages (#43)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2023-12-15 14:43:10 +01:00
cae6d11d53
feat(types): move Emigrate types to separate package and improve types (#41)
* feat(types): move Emigrate types to separate package

Also refactor the types to use discriminated unions for easier error handling.
Errors passed to storage plugins should now be serialized, and storage plugins are expected to return already-serialized errors on failed history entries.

* fix(mysql): handle the new type changes

* fix(storage-fs): handle the new type changes

* feat(cli): better error handling and types

Adapt to the new types from the @emigrate/types package, such as the discriminated union types and the serializing and deserializing of errors
2023-12-15 13:03:35 +01:00
afe56594c5
chore(release): version packages (#40)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2023-12-15 12:55:48 +01:00
dependabot[bot]
499d6685e6
chore(deps): bump tsx from 4.1.2 to 4.6.2 (#25)
Bumps [tsx](https://github.com/privatenumber/tsx) from 4.1.2 to 4.6.2.
- [Release notes](https://github.com/privatenumber/tsx/releases)
- [Changelog](https://github.com/privatenumber/tsx/blob/develop/release.config.cjs)
- [Commits](https://github.com/privatenumber/tsx/compare/v4.1.2...v4.6.2)

---
updated-dependencies:
- dependency-name: tsx
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-15 11:55:00 +01:00
dependabot[bot]
98e167972b
chore(deps): bump the commitlint group with 2 updates (#32)
Bumps the commitlint group with 2 updates: [@commitlint/cli](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/cli) and [@commitlint/config-conventional](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/config-conventional).


Updates `@commitlint/cli` from 18.4.2 to 18.4.3
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/cli/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.4.3/@commitlint/cli)

Updates `@commitlint/config-conventional` from 18.4.2 to 18.4.3
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/config-conventional/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.4.3/@commitlint/config-conventional)

---
updated-dependencies:
- dependency-name: "@commitlint/cli"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: commitlint
- dependency-name: "@commitlint/config-conventional"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: commitlint
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-15 08:58:10 +01:00
dependabot[bot]
210858540d
chore(deps): bump prettier from 3.1.0 to 3.1.1 (#33)
Bumps [prettier](https://github.com/prettier/prettier) from 3.1.0 to 3.1.1.
- [Release notes](https://github.com/prettier/prettier/releases)
- [Changelog](https://github.com/prettier/prettier/blob/main/CHANGELOG.md)
- [Commits](https://github.com/prettier/prettier/compare/3.1.0...3.1.1)

---
updated-dependencies:
- dependency-name: prettier
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-15 08:50:41 +01:00
1434be5d5e
feat(reporter): print Emigrate CLI's version number and relative paths to migrations (#39)
* feat(reporter-default): print CLI version number

* feat(reporter-default): print relative paths to migrations instead of only the file names

This makes the output clickable in most shells

* feat(reporter-pino): include the Emigrate CLI version in each log
2023-12-14 13:45:02 +01:00
480796e95b
chore(release): version packages (#36)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2023-12-14 13:29:02 +01:00
bad4e252f3
feat(reporters): pass the CLI's version number to reporters (#38) 2023-12-14 13:11:55 +01:00
dependabot[bot]
bf34cc427a
chore(deps): bump @types/node from 20.9.2 to 20.10.4 (#29)
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 20.9.2 to 20.10.4.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

---
updated-dependencies:
- dependency-name: "@types/node"
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-14 12:54:50 +01:00
2b9a16d6fd
fix(cli): make main command non-strict to let other commands handle their options (#37) 2023-12-14 11:47:36 +01:00
960ce08674
feat(cli): add --help and --version options to main command (#35) 2023-12-14 10:49:54 +01:00
dependabot[bot]
dac43ce95d
chore(deps): bump @changesets/cli from 2.26.2 to 2.27.1 (#34)
Bumps [@changesets/cli](https://github.com/changesets/changesets) from 2.26.2 to 2.27.1.
- [Release notes](https://github.com/changesets/changesets/releases)
- [Changelog](https://github.com/changesets/changesets/blob/main/docs/modifying-changelog-format.md)
- [Commits](https://github.com/changesets/changesets/compare/@changesets/cli@2.26.2...@changesets/cli@2.27.1)

---
updated-dependencies:
- dependency-name: "@changesets/cli"
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-13 15:59:51 +01:00
27b319fa96 chore: group commitlint packages for dependabot 2023-12-12 16:20:44 +01:00
140 changed files with 18260 additions and 4621 deletions

View file

@@ -5,7 +5,7 @@ end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[*.md]
[*.{md,mdx}]
trim_trailing_whitespace = false
[*.{ts,js,tsx,json,yml}]

View file

@@ -10,6 +10,10 @@ updates:
versioning-strategy: increase
commit-message:
prefix: 'chore(deps)'
groups:
commitlint:
patterns:
- '@commitlint/*'
reviewers:
- 'aboviq/maintainers'

View file

@@ -13,6 +13,7 @@ jobs:
env:
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
TURBO_TEAM: ${{ secrets.TURBO_TEAM }}
DO_NOT_TRACK: 1
steps:
- name: Check out code
@@ -20,14 +21,12 @@ jobs:
with:
fetch-depth: 2
- uses: pnpm/action-setup@v2.4.0
with:
version: 8.3.1
- uses: pnpm/action-setup@v4.0.0
- name: Setup Node.js environment
uses: actions/setup-node@v4
with:
node-version: 20.9.0
node-version: 22.15.0
cache: 'pnpm'
- name: Install dependencies

47
.github/workflows/deploy.yaml vendored Normal file
View file

@@ -0,0 +1,47 @@
name: Deploy to GitHub Pages
on:
# Trigger the workflow every time you push to the `main` branch
# Using a different branch name? Replace `main` with your branch's name
push:
branches: [main]
# Allows you to run this workflow manually from the Actions tab on GitHub.
workflow_dispatch:
# Allow this job to clone the repo and create a page deployment
permissions:
actions: read
contents: read
pages: write
id-token: write
env:
ASTRO_SITE: ${{ vars.ASTRO_SITE }}
ASTRO_BASE: ${{ vars.ASTRO_BASE }}
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout your repository using git
uses: actions/checkout@v4
- name: Show vars
run: |
echo $ASTRO_SITE
echo $ASTRO_BASE
- name: Install, build, and upload your site output
uses: withastro/action@v2
with:
path: ./docs # The root location of your Astro project inside the repository. (optional)
package-manager: pnpm@9.4.0 # The Node package manager that should be used to install dependencies and build your site. Automatically detected based on your lockfile. (optional)
deploy:
needs: build
runs-on: ubuntu-latest
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
steps:
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v4

62
.github/workflows/integration.yaml vendored Normal file
View file

@@ -0,0 +1,62 @@
name: Integration Tests
on:
push:
branches: ['main', 'changeset-release/main']
pull_request:
jobs:
mysql_integration:
name: Emigrate MySQL integration tests
timeout-minutes: 15
runs-on: ubuntu-latest
env:
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
TURBO_TEAM: ${{ secrets.TURBO_TEAM }}
DO_NOT_TRACK: 1
services:
mysql:
image: mysql:8.0
env:
MYSQL_ROOT_PASSWORD: root
MYSQL_DATABASE: emigrate
MYSQL_USER: emigrate
MYSQL_PASSWORD: emigrate
ports:
- 3306:3306
options: --health-cmd="mysqladmin ping -h localhost" --health-interval=10s --health-timeout=5s --health-retries=5
steps:
- name: Check out code
uses: actions/checkout@v4
with:
fetch-depth: 2
- uses: pnpm/action-setup@v4.0.0
- name: Setup Node.js environment
uses: actions/setup-node@v4
with:
node-version: 22.15.0
cache: 'pnpm'
- name: Install dependencies
run: pnpm install
- name: Wait for MySQL to be ready
run: |
for i in {1..30}; do
nc -z localhost 3306 && echo "MySQL is up!" && break
echo "Waiting for MySQL..."
sleep 2
done
- name: Build package
run: pnpm build --filter @emigrate/mysql
- name: Integration Tests
env:
MYSQL_HOST: '127.0.0.1'
MYSQL_PORT: 3306
run: pnpm --filter @emigrate/mysql integration

View file

@@ -15,31 +15,58 @@ jobs:
contents: write
packages: write
pull-requests: write
actions: read
id-token: write
steps:
- name: Checkout Repo
uses: actions/checkout@v4
with:
token: ${{ secrets.PAT_GITHUB_TOKEN }}
persist-credentials: false
fetch-depth: 0
- uses: pnpm/action-setup@v2.4.0
with:
version: 8.3.1
- uses: pnpm/action-setup@v4.0.0
- name: Setup Node.js environment
uses: actions/setup-node@v4
with:
node-version: 20.9.0
node-version: 22.15.0
cache: 'pnpm'
- name: Install Dependencies
run: pnpm install
- name: Create Release Pull Request
uses: changesets/action@v1
id: changesets
uses: aboviq/changesets-action@v1.5.2
with:
publish: pnpm run release
commit: 'chore(release): version packages'
title: 'chore(release): version packages'
createGithubReleases: aggregate
env:
GITHUB_TOKEN: ${{ secrets.PAT_GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Release to @next tag on npm
if: github.ref_name == 'main' && steps.changesets.outputs.published != 'true'
run: |
git checkout main
CHANGESET_FILE=$(git diff-tree --no-commit-id --name-only HEAD -r ".changeset/*-*-*.md")
if [ -z "$CHANGESET_FILE" ]; then
echo "No changesets found, skipping release to @next tag"
exit 0
fi
AFFECTED_PACKAGES=$(sed -n '/---/,/---/p' "$CHANGESET_FILE" | sed '/---/d')
if [ -z "$AFFECTED_PACKAGES" ]; then
echo "No packages affected by changesets, skipping release to @next tag"
exit 0
fi
pnpm changeset version --snapshot next
pnpm changeset publish --tag next
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
GITHUB_TOKEN: ${{ secrets.PAT_GITHUB_TOKEN }}

View file

@@ -4,6 +4,8 @@
It's effectively a successor of [klei-migrate](https://www.npmjs.com/package/klei-migrate) and [Immigration](https://www.npmjs.com/package/immigration).
📖 Read the [documentation](https://emigrate.dev) for more information!
## Features
- Database agnostic
@@ -28,15 +30,79 @@ It's effectively a successor of [klei-migrate](https://www.npmjs.com/package/kle
Install the Emigrate CLI in your project:
```bash
npm install --save-dev @emigrate/cli
npm install @emigrate/cli
# or
pnpm add @emigrate/cli
# or
yarn add @emigrate/cli
# or
bun add @emigrate/cli
```
## Usage
```text
Usage: emigrate up [options]
Run all pending migrations
Options:
-h, --help Show this help message and exit
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before running the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-s, --storage <name> The storage to use for where to store the migration history (required)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-r, --reporter <name> The reporter to use for reporting the migration progress
-l, --limit <count> Limit the number of migrations to run
-f, --from <name/path> Start running migrations from the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those after it lexicographically will be run
-t, --to <name/path> Skip migrations after the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those before it lexicographically will be run
--dry List the pending migrations that would be run without actually running them
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
--no-execution Mark the migrations as executed and successful without actually running them,
which is useful if you want to mark migrations as successful after running them manually
--abort-respite <sec> The number of seconds to wait before abandoning running migrations after the command has been aborted (default: 10)
Examples:
emigrate up --directory src/migrations -s fs
emigrate up -d ./migrations --storage @emigrate/mysql
emigrate up -d src/migrations -s postgres -r json --dry
emigrate up -d ./migrations -s mysql --import dotenv/config
emigrate up --limit 1
emigrate up --to 20231122120529381_some_migration_file.js
emigrate up --to 20231122120529381_some_migration_file.js --no-execution
```
### Examples
Create a new migration:
```bash
emigrate new -d migrations -e .js create some fancy table
npx emigrate new -d migrations create some fancy table
# or
pnpm emigrate new -d migrations create some fancy table
# or
yarn emigrate new -d migrations create some fancy table
# or
bunx --bun emigrate new -d migrations create some fancy table
```
Will create a new empty JavaScript migration file with the name "YYYYMMDDHHmmssuuu_create_some_fancy_table.js" in the `migrations` directory.

21
docs/.gitignore vendored Normal file
View file

@@ -0,0 +1,21 @@
# build output
dist/
# generated types
.astro/
# dependencies
node_modules/
# logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
# environment variables
.env
.env.production
# macOS-specific files
.DS_Store

4
docs/.vscode/extensions.json vendored Normal file
View file

@@ -0,0 +1,4 @@
{
"recommendations": ["astro-build.astro-vscode"],
"unwantedRecommendations": []
}

11
docs/.vscode/launch.json vendored Normal file
View file

@@ -0,0 +1,11 @@
{
"version": "0.2.0",
"configurations": [
{
"command": "./node_modules/.bin/astro dev",
"name": "Development server",
"request": "launch",
"type": "node-terminal"
}
]
}

43
docs/CHANGELOG.md Normal file
View file

@@ -0,0 +1,43 @@
# @emigrate/docs
## 1.0.0
### Major Changes
- 1d33d65: Rename the URL path "/commands/" to "/cli/" to make it more clear that those pages are the documentation for the CLI. This change is a BREAKING CHANGE because it changes the URL path of the pages.
### Minor Changes
- 0c597fd: Add a separate page for the Emigrate CLI itself, with all the commands as sub pages
## 0.4.0
### Minor Changes
- b62c692: Add documentation for the built-in "json" reporter
- b62c692: The "default" reporter is now named "pretty"
- e7ec75d: Add note in FAQ on using Emigrate for existing databases
### Patch Changes
- c838ffb: Add note on how to write Emigrate's config using TypeScript in a production environment without having `typescript` installed.
## 0.3.0
### Minor Changes
- f6761fe: Document the changes to the "remove" command, specifically that it also accepts relative file paths now
- 9109238: Document the changes to the "up" command's `--from` and `--to` options, specifically that they can take relative file paths and that the given migration must exist.
## 0.2.0
### Minor Changes
- a4da353: Document the --abort-respite CLI option and the corresponding abortRespite config
## 0.1.0
### Minor Changes
- cbc35bd: Add first version of the [Baseline guide](https://emigrate.dev/guides/baseline)
- cbc35bd: Document the new --limit, --from and --to options for the ["up" command](https://emigrate.dev/cli/up/)

54
docs/README.md Normal file
View file

@@ -0,0 +1,54 @@
# Starlight Starter Kit: Basics
[![Built with Starlight](https://astro.badg.es/v2/built-with-starlight/tiny.svg)](https://starlight.astro.build)
```
npm create astro@latest -- --template starlight
```
[![Open in StackBlitz](https://developer.stackblitz.com/img/open_in_stackblitz.svg)](https://stackblitz.com/github/withastro/starlight/tree/main/examples/basics)
[![Open with CodeSandbox](https://assets.codesandbox.io/github/button-edit-lime.svg)](https://codesandbox.io/p/sandbox/github/withastro/starlight/tree/main/examples/basics)
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fwithastro%2Fstarlight%2Ftree%2Fmain%2Fexamples%2Fbasics&project-name=my-starlight-docs&repository-name=my-starlight-docs)
> 🧑‍🚀 **Seasoned astronaut?** Delete this file. Have fun!
## 🚀 Project Structure
Inside of your Astro + Starlight project, you'll see the following folders and files:
```
.
├── public/
├── src/
│   ├── assets/
│   ├── content/
│   │   ├── docs/
│   │   └── config.ts
│   └── env.d.ts
├── astro.config.mjs
├── package.json
└── tsconfig.json
```
Starlight looks for `.md` or `.mdx` files in the `src/content/docs/` directory. Each file is exposed as a route based on its file name.
Images can be added to `src/assets/` and embedded in Markdown with a relative link.
Static assets, like favicons, can be placed in the `public/` directory.
## 🧞 Commands
All commands are run from the root of the project, from a terminal:
| Command | Action |
| :------------------------ | :----------------------------------------------- |
| `npm install` | Installs dependencies |
| `npm run dev` | Starts local dev server at `localhost:4321` |
| `npm run build` | Build your production site to `./dist/` |
| `npm run preview` | Preview your build locally, before deploying |
| `npm run astro ...` | Run CLI commands like `astro add`, `astro check` |
| `npm run astro -- --help` | Get help using the Astro CLI |
## 👀 Want to learn more?
Check out [Starlight's docs](https://starlight.astro.build/), read [the Astro documentation](https://docs.astro.build), or jump into the [Astro Discord server](https://astro.build/chat).

232
docs/astro.config.mjs Normal file
View file

@@ -0,0 +1,232 @@
import process from 'node:process';
import { defineConfig } from 'astro/config';
import starlight from '@astrojs/starlight';
import tailwind from '@astrojs/tailwind';
const base = process.env.ASTRO_BASE || '';
// https://astro.build/config
export default defineConfig({
site: process.env.ASTRO_SITE ?? 'http://localhost:4321',
base: base || undefined,
integrations: [
starlight({
title: 'Emigrate',
favicon: '/favicon.ico',
customCss: ['./src/tailwind.css'],
head: [
{
tag: 'link',
attrs: {
rel: 'apple-touch-icon',
type: 'image/png',
href: `${base}/apple-touch-icon.png`,
sizes: '180x180',
},
},
{
tag: 'link',
attrs: {
rel: 'icon',
type: 'image/png',
href: `${base}/favicon-32x32.png`,
sizes: '32x32',
},
},
{
tag: 'link',
attrs: {
rel: 'icon',
type: 'image/png',
href: `${base}/favicon-16x16.png`,
sizes: '16x16',
},
},
{
tag: 'link',
attrs: {
rel: 'manifest',
href: `${base}/site.webmanifest`,
},
},
],
social: {
github: 'https://github.com/aboviq/emigrate',
},
editLink: {
baseUrl: 'https://github.com/aboviq/emigrate/edit/main/docs/',
},
components: {
PageTitle: './src/components/PageTitle.astro',
},
sidebar: [
{
label: 'Introduction',
items: [
{
label: "What's Emigrate?",
link: '/intro/whats-emigrate/',
},
{
label: 'Quick Start',
link: '/intro/quick-start/',
},
{
label: 'FAQ',
link: '/intro/faq/',
},
],
},
{
label: 'Command Line Interface',
items: [
{
label: 'Introduction',
link: '/cli/',
},
{
label: 'Commands',
items: [
{
label: 'emigrate up',
link: '/cli/up/',
},
{
label: 'emigrate list',
link: '/cli/list/',
},
{
label: 'emigrate new',
link: '/cli/new/',
},
{
label: 'emigrate remove',
link: '/cli/remove/',
},
],
},
],
},
{
label: 'Guides',
items: [
{
label: 'Using TypeScript',
link: '/guides/typescript/',
},
{
label: 'Baseline existing database',
link: '/guides/baseline/',
},
],
},
{
label: 'Plugins',
items: [
{
label: 'Plugins Introduction',
link: '/plugins/',
},
{
label: 'Storage Plugins',
collapsed: true,
items: [
{
label: 'Storage Plugins',
link: '/plugins/storage/',
},
{
label: 'File System',
link: '/plugins/storage/file-system/',
},
{
label: 'PostgreSQL',
link: '/plugins/storage/postgres/',
},
{
label: 'MySQL',
link: '/plugins/storage/mysql/',
},
],
},
{
label: 'Loader Plugins',
collapsed: true,
items: [
{
label: 'Loader Plugins',
link: '/plugins/loaders/',
},
{
label: 'Default Loader',
link: '/plugins/loaders/default/',
},
{
label: 'PostgreSQL Loader',
link: '/plugins/loaders/postgres/',
},
{
label: 'MySQL Loader',
link: '/plugins/loaders/mysql/',
},
],
},
{
label: 'Reporters',
collapsed: true,
items: [
{
label: 'Reporters',
link: '/plugins/reporters/',
},
{
label: 'Pretty Reporter (default)',
link: '/plugins/reporters/pretty/',
},
{
label: 'JSON Reporter',
link: '/plugins/reporters/json/',
},
{
label: 'Pino Reporter',
link: '/plugins/reporters/pino/',
},
],
},
{
label: 'Generator Plugins',
collapsed: true,
items: [
{
label: 'Generator Plugins',
link: '/plugins/generators/',
},
{
label: 'JavaScript Generator',
link: '/plugins/generators/js/',
},
{
label: 'PostgreSQL Generator',
link: '/plugins/generators/postgres/',
},
{
label: 'MySQL Generator',
link: '/plugins/generators/mysql/',
},
],
},
],
},
{
label: 'Reference',
autogenerate: {
directory: 'reference',
},
},
],
}),
tailwind({
applyBaseStyles: false,
}),
],
});

26
docs/package.json Normal file
View file

@@ -0,0 +1,26 @@
{
"name": "@emigrate/docs",
"private": true,
"type": "module",
"version": "1.0.0",
"scripts": {
"dev": "astro dev",
"start": "astro dev",
"build": "astro check && astro build",
"preview": "astro preview",
"astro": "astro"
},
"dependencies": {
"@astrojs/check": "^0.7.0",
"@astrojs/starlight": "^0.15.0",
"@astrojs/starlight-tailwind": "2.0.1",
"@astrojs/tailwind": "^5.0.3",
"astro": "^4.0.1",
"sharp": "^0.32.5",
"tailwindcss": "^3.3.6"
},
"volta": {
"extends": "../package.json"
},
"packageManager": "pnpm@9.4.0"
}

Binary image files added (not shown): six files, including docs/public/favicon.ico (sizes: 44 KiB, 252 KiB, 39 KiB, 948 B, 2.4 KiB, 15 KiB).

View file

@@ -0,0 +1,11 @@
{
"name": "Emigrate",
"short_name": "Emigrate",
"icons": [
{ "src": "/android-chrome-192x192.png", "sizes": "192x192", "type": "image/png" },
{ "src": "/android-chrome-512x512.png", "sizes": "512x512", "type": "image/png" }
],
"theme_color": "#ffffff",
"background_color": "#ffffff",
"display": "standalone"
}

Binary image file added (not shown), 57 KiB.

View file

@@ -0,0 +1,11 @@
---
interface Props {
href: string;
}
const base = import.meta.env.BASE_URL === '/' || !import.meta.env.BASE_URL ? '' : import.meta.env.BASE_URL;
const site = import.meta.env.SITE ?? '';
const { href } = Astro.props;
---
<a href={`${site}${base}${href}`}><slot /></a>

View file

@@ -0,0 +1,27 @@
---
import type { Props } from '@astrojs/starlight/props';
const { title } = Astro.props.entry.data;
const isCode = title.startsWith('`') && title.endsWith('`');
---
<h1 id="_top">{isCode ? <code>{title.slice(1, -1)}</code> : title}</h1>
<style>
h1 {
margin-top: 1rem;
font-size: var(--sl-text-h1);
line-height: var(--sl-line-height-headings);
font-weight: 600;
color: var(--sl-color-white);
}
code {
font-family: var(--sl-font-system-mono);
background-color: var(--sl-color-bg-inline-code);
border: 1px solid var(--sl-color-gray-3);
color: var(--sl-color-gray-1);
padding: 0.25rem 0.5rem;
border-radius: 0.25rem;
}
</style>

View file

@@ -0,0 +1,7 @@
import { defineCollection } from 'astro:content';
import { docsSchema, i18nSchema } from '@astrojs/starlight/schema';
export const collections = {
docs: defineCollection({ schema: docsSchema() }),
i18n: defineCollection({ type: 'data', schema: i18nSchema() }),
};

View file

@@ -0,0 +1,73 @@
---
title: "CLI Introduction"
description: "Some basic information about the Emigrate CLI."
---
import { Tabs, TabItem, LinkCard } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate comes with a CLI that you can use to manage your migrations. The CLI is a powerful tool that allows you to create, run, and manage migrations.
### Installing the Emigrate CLI
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/cli
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/cli
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/cli
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/cli
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
</TabItem>
</Tabs>
### Existing commands
<LinkCard
href="up/"
title="emigrate up"
description="The command for executing migrations, or showing pending migrations in dry run mode."
/>
<LinkCard
href="list/"
title="emigrate list"
description="The command for listing all migrations and their status."
/>
<LinkCard
href="new/"
title="emigrate new"
description="The command for creating new migration files."
/>
<LinkCard
href="remove/"
title="emigrate remove"
description="The command for removing migrations from the migration history."
/>

View file

@@ -0,0 +1,106 @@
---
title: "`emigrate list`"
description: "List migrations and their status."
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The `list` command is used to list _all_ migrations, i.e. both already run migrations and migrations that haven't been run yet.
Emigrate takes all migration files in the given directory that haven't been run yet and all migrations from the migration history.
It then sorts the migrations by filename in ascending order and outputs them and their respective status one by one.
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate list [options]
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate list [options]
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate list [options]
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate list [options]
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate list [options]
```
</TabItem>
</Tabs>
## Options
### `-h`, `--help`
Show command help and exit
### `-d`, `--directory <path>`
The directory where the migration files are located. The given path should be absolute or relative to the current working directory.
### `-i`, `--import <module>`
A module to import before listing the migrations. This option can be specified multiple times.
Can for instance be used to load environment variables using [dotenv](https://github.com/motdotla/dotenv) with `--import dotenv/config`.
### `-s`, `--storage <name>`
The <Link href="/plugins/storage/">storage plugin</Link> to use, which is responsible for where to store the migration history.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/storage-`
- `emigrate-storage-`
- `@emigrate/plugin-storage-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-storage-somedb` package, you can specify either `emigrate-storage-somedb` or just `somedb` as the name.
In case you have both a `emigrate-storage-somedb` and a `somedb` package installed, the `emigrate-storage-somedb` package will be used.
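To make the lookup order concrete, here is an illustrative sketch of prefix-based resolution (the prefix list is taken from the text above; the function itself is an assumption, not Emigrate's actual loader):

```ts
const STORAGE_PREFIXES = [
  '@emigrate/storage-',
  'emigrate-storage-',
  '@emigrate/plugin-storage-',
  '@emigrate/',
];

// Try each prefixed package name in order, then the name as given.
export async function loadStorageModule(name: string): Promise<unknown> {
  const candidates = [...STORAGE_PREFIXES.map((prefix) => `${prefix}${name}`), name];

  for (const candidate of candidates) {
    try {
      return await import(candidate);
    } catch {
      // Not installed under this name, try the next candidate.
    }
  }

  throw new Error(`Could not load storage plugin "${name}"`);
}
```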
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/reporter-`
- `emigrate-reporter-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output, option is passed to the reporter which should respect it.

View file

@@ -0,0 +1,121 @@
---
title: "`emigrate new`"
description: "Create new migration."
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The `new` command is used to create a new migration file in the given directory.
The migration file can be based on a template, generated by a <Link href="/plugins/generators/">generator plugin</Link>, or just be an empty file.
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate new [options] <name>
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new [options] <name>
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new [options] <name>
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new [options] <name>
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate new [options] <name>
```
</TabItem>
</Tabs>
## Arguments
### `<name>`
The name of the migration. The name will be used to generate the migration filename.
The name can be provided as a single argument or as multiple arguments.
If multiple arguments are provided, the arguments will be joined with a space before being passed to the filename generator (the default filename generator replaces all whitespace and non-path safe characters with underscores).
## Options
### `-h`, `--help`
Show command help and exit
### `-d`, `--directory <path>`
The directory where the migration files are located. The given path should be absolute or relative to the current working directory.
### `-t`, `--template <path>`
The template file to use for generating the migration file. The given path should be absolute or relative to the current working directory.
The template can contain a `{{name}}` placeholder which will be replaced with the migration name provided to the command. The generated file will have the same extension as the template file, unless the [`--extension`](#-x---extension-ext) option is used.
### `-x`, `--extension <ext>`
The extension to use for the migration file. Unless the [`--template`](#-t---template-path) option is also specified the file will be empty.
If both the `--template` and `--extension` options are specified, the extension will override the template file extension.
**Example:** `--extension .sql` will generate a file with the `.sql` extension.
### `-p`, `--plugin <name>`
The <Link href="/plugins/generators">generator plugin</Link> to use. The generator plugin is responsible for generating the migration filename and its contents.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/plugin-`
- `emigrate-plugin-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-plugin-someplugin` package, you can specify either `emigrate-plugin-someplugin` or just `someplugin` as the name.
In case you have both a `emigrate-plugin-someplugin` and a `someplugin` package installed, the `emigrate-plugin-someplugin` package will be used.
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/reporter-`
- `emigrate-reporter-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output, option is passed to the reporter which should respect it.

View file

@@ -0,0 +1,115 @@
---
title: "`emigrate remove`"
description: "Remove a migration from the history."
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The `remove` command is used to remove a migration from the history. This is useful if you want to retry a migration that has failed.
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate remove [options] <name/path>
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate remove [options] <name/path>
```
</TabItem>
</Tabs>
## Arguments
### `<name/path>`
The name of the migration file to remove, including the extension, e.g. `20200101000000_some_migration.js`, or a relative file path to a migration file to remove, e.g: `migrations/20200101000000_some_migration.js`.
Using relative file paths is useful in terminals that support autocomplete, and also when you copy and use the relative migration file path from the output of the <Link href="/cli/list/">`list`</Link> command.
## Options
### `-h`, `--help`
Show command help and exit
### `-d`, `--directory <path>`
The directory where the migration files are located. The given path should be absolute or relative to the current working directory.
### `-f`, `--force`
Force removal of the migration history entry even if the migration file does not exist or it's in a non-failed state.
### `-i`, `--import <module>`
A module to import before removing the migration. This option can be specified multiple times.
Can for instance be used to load environment variables using [dotenv](https://github.com/motdotla/dotenv) with `--import dotenv/config`.
### `-s`, `--storage <name>`
The <Link href="/plugins/storage/">storage plugin</Link> to use, which is responsible for where to store the migration history.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/storage-`
- `emigrate-storage-`
- `@emigrate/plugin-storage-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-storage-somedb` package, you can specify either `emigrate-storage-somedb` or just `somedb` as the name.
In case you have both a `emigrate-storage-somedb` and a `somedb` package installed, the `emigrate-storage-somedb` package will be used.
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for reporting the removal progress.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/reporter-`
- `emigrate-reporter-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output. The option is passed to the reporter, which should respect it.


@ -0,0 +1,183 @@
---
title: "`emigrate up`"
description: "Run migrations"
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The `up` command is used to either list or run all pending migrations, i.e. migrations that haven't been run yet.
Emigrate takes all migration files in the given directory and compares them to the migration history so that it knows which migrations are pending.
It then sorts the pending migrations by filename in ascending order and runs them one by one.
If any of the migrations fail, the command will be aborted and the rest of the migrations will not be run.
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up [options]
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up [options]
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up [options]
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate up [options]
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate up [options]
```
</TabItem>
</Tabs>
## Options
### `-h`, `--help`
Show command help and exit
### `--dry`
List the pending migrations that would be run without actually running them
### `-l, --limit <count>`
**type:** `number`
Limit the number of migrations to run. Can be combined with `--dry` which will show "pending" for the migrations that would be run if not in dry-run mode,
and "skipped" for the migrations that also haven't been run but won't because of the set limit.
### `-d`, `--directory <path>`
The directory where the migration files are located. The given path should be absolute or relative to the current working directory.
### `-f`, `--from <name/path>`
The name of the migration to start from. This can be used to run only a subset of the pending migrations.
The given migration needs to exist and is compared in lexicographical order with all migrations; the migration with the same name and those lexicographically after it will be migrated.
It's okay to use an already executed migration as the "from" migration, it won't be executed again.
The given migration name must exist and cannot be just a prefix, to avoid accidentally running migrations that you didn't intend to run.
The given name can also be a relative path to a migration file, which makes it easier to use with terminals that support tab completion
or when copying the output from Emigrate and using it directly as the value of the `--from` option.
Relative paths are resolved relative to the current working directory.
Can be combined with `--dry` which will show "pending" for the migrations that would be run if not in dry-run mode,
and "skipped" for the migrations that also haven't been run but won't because of the set "from".
### `-t`, `--to <name/path>`
The name of the migration to end at. This can be used to run only a subset of the pending migrations.
The given migration name needs to exist and is compared in lexicographical order with all migrations; the migration with the same name and those lexicographically before it will be migrated.
It's okay to use an already executed migration as the "to" migration, it won't be executed again.
The given migration name must exist and cannot be just a prefix, to avoid accidentally running migrations that you didn't intend to run.
The given name can also be a relative path to a migration file, which makes it easier to use with terminals that support tab completion
or when copying the output from Emigrate and using it directly as the value of the `--to` option.
Relative paths are resolved relative to the current working directory.
Can be combined with `--dry` which will show "pending" for the migrations that would be run if not in dry-run mode,
and "skipped" for the migrations that also haven't been run but won't because of the set "to".
### `-i`, `--import <module>`
A module to import before running the migrations. This option can be specified multiple times.
Can for instance be used to load environment variables using [dotenv](https://github.com/motdotla/dotenv) with `--import dotenv/config`,
or for running migrations in NodeJS written in TypeScript with [tsx](https://github.com/privatenumber/tsx) (`--import tsx`), see the <Link href="/guides/typescript/">TypeScript guide</Link> for more information.
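The option can be repeated, for example to load environment variables and enable TypeScript support at the same time:
```bash
# Load dotenv first, then enable TypeScript migrations via tsx
npx emigrate up --import dotenv/config --import tsx
```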
### `-s`, `--storage <name>`
The <Link href="/plugins/storage/">storage plugin</Link> to use, which is responsible for where to store the migration history.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/storage-`
- `emigrate-storage-`
- `@emigrate/plugin-storage-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-storage-somedb` package, you can specify either `emigrate-storage-somedb` or just `somedb` as the name.
In case you have both a `emigrate-storage-somedb` and a `somedb` package installed, the `emigrate-storage-somedb` package will be used.
### `-p`, `--plugin <name>`
The <Link href="/plugins/loaders/">loader plugin(s)</Link> to use. Can be specified multiple times to use multiple plugins.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/plugin-`
- `emigrate-plugin-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-plugin-someplugin` package, you can specify either `emigrate-plugin-someplugin` or just `someplugin` as the name.
In case you have both a `emigrate-plugin-someplugin` and a `someplugin` package installed, the `emigrate-plugin-someplugin` package will be used.
### `-r`, `--reporter <name>`
**type:** `"pretty" | "json" | string`
**default:** `"pretty"`
The <Link href="/plugins/reporters/">reporter</Link> to use for reporting the migration progress.
The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:
- `@emigrate/reporter-`
- `emigrate-reporter-`
- `@emigrate/`
And then try to load the module/package with the given name.
For example, if you want to use the `emigrate-reporter-somereporter` package, you can specify either `emigrate-reporter-somereporter` or just `somereporter` as the name.
### `--color`, `--no-color`
Force enable/disable colored output. The option is passed to the reporter, which should respect it.
### `--no-execution`
Mark the migrations as executed and successful without actually running them.
This is useful if you want to mark migrations as successful after running them manually.
:::tip
See the <Link href="/guides/baseline/">Baseline guide</Link> for example usage of the `--no-execution` option
:::
### `--abort-respite`
**type:** `number`
**default:** `10`
Customize the number of seconds to wait before abandoning a running migration when the process is about to shutdown, for instance when the user presses `Ctrl+C` or when the container is being stopped (if running inside a container).
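For example, to give running migrations up to 30 seconds to finish on shutdown:
```bash
npx emigrate up --abort-respite 30
```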


@ -0,0 +1,255 @@
---
title: Baseline
description: A guide on how to baseline an existing database at a specific version
---
import { Tabs, TabItem, LinkCard } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
A common scenario is to have an existing database that you want to start managing with Emigrate. This is called baselining.
## Baselining an existing database schema
Let's assume you have a PostgreSQL database with the following schema:
```sql
CREATE TABLE public.users (
id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL,
email VARCHAR(255) NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE public.posts (
id SERIAL PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES public.users(id),
title VARCHAR(255) NOT NULL,
body TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```
<LinkCard
href="../../plugins/storage/postgres/"
title="PostgreSQL Storage Plugin"
description="See how to configure the PostgreSQL storage plugin here..."
/>
<LinkCard
href="../../plugins/storage/"
title="Storage Plugins"
description="Learn more about storage plugins here..."
/>
### Create a baseline migration
You can baseline this database by first creating a baseline migration (here we name it "baseline"):
<Tabs>
<TabItem label="npm">
```bash
npx emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin postgres baseline
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate new --plugin postgres baseline
```
</TabItem>
</Tabs>
This will generate an empty migration file in your migration directory:
```sql title="migrations/20240118123456789_baseline.sql"
-- Migration: baseline
```
You can then add the SQL statements for your database schema to this migration file:
```sql title="migrations/20240118123456789_baseline.sql"
-- Migration: baseline
CREATE TABLE public.users (
id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL,
email VARCHAR(255) NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE public.posts (
id SERIAL PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES public.users(id),
title VARCHAR(255) NOT NULL,
body TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```
### Log the baseline migration
For new environments this baseline migration will automatically be run when you run <Link href="/cli/up/">`emigrate up`</Link>.
For any existing environments you will need to run `emigrate up` with the <Link href="/cli/up/#--no-execution">`--no-execution`</Link> flag to prevent the migration from being executed and only log the migration:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up --no-execution
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up --no-execution
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up --no-execution
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate up --no-execution
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate up --no-execution
```
</TabItem>
</Tabs>
In case you have already added more migration files to your migration directory you can limit the "up" command to just log the baseline migration by specifying the <Link href="/cli/up/#-t---to-name">`--to`</Link> option:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate up --no-execution --to 20240118123456789_baseline.sql
```
</TabItem>
</Tabs>
### Verify the baseline migration status
You can verify the status of the baseline migration by running the <Link href="/cli/list/">`emigrate list`</Link> command:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate list
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate list
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate list
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate list
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
```bash
deno task emigrate list
```
</TabItem>
</Tabs>
Which should output something like this:
```txt title="emigrate list"
Emigrate list v0.14.1 /your/project/path
✔ migrations/20240118123456789_baseline.sql (done)
1 done (1 total)
```
### Happy migrating!
You can now start adding new migrations to your migration directory and run <Link href="/cli/up/">`emigrate up`</Link> to apply them to your database.
Running `emigrate up` should be part of your CD pipeline to ensure that your database schema is always up to date in each environment.


@ -0,0 +1,136 @@
---
title: Using TypeScript
description: A guide on how to support migration files written in TypeScript
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
:::tip[Using Bun or Deno?]
If you are using [Bun](https://bun.sh) or [Deno](https://deno.land) you are already good to go as they both support TypeScript out of the box!
:::
If you're using NodeJS you have at least the following two options for running migration files written in TypeScript.
## Using `tsx`
If you want to write and run migration files written in TypeScript, an easy way is to install the [`tsx`](https://github.com/privatenumber/tsx) package.
### Installing `tsx`
<Tabs>
<TabItem label="npm">
```bash
npm install tsx
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add tsx
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add tsx
```
</TabItem>
</Tabs>
:::note
You must install `tsx` as an ordinary dependency, not as a dev dependency,
in case you are pruning your development dependencies before deploying your application (which you should).
:::
### Loading TypeScript migrations
After installing `tsx` you can load it in two ways.
#### Via CLI
Using the <Link href="/cli/up/#-i---import-module">`--import`</Link> flag you can load `tsx` before running your migration files.
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up --import tsx
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up --import tsx
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up --import tsx
```
</TabItem>
</Tabs>
:::note
This method is necessary if you want to write your configuration file in TypeScript without having `typescript` installed in your production environment, as `tsx` must be loaded before the configuration file is loaded.
:::
#### Via configuration file
You can also import `tsx` directly in your configuration file (this only works if the configuration file itself is not written in TypeScript).
```js title="emigrate.config.js" {1}
import 'tsx';
export default {
// ...
};
```
Then you can run your migration files as usual:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up
```
</TabItem>
</Tabs>
## Building TypeScript migrations
If you don't want to include `tsx` (or a similar tool) as a dependency in your production environment, then
you can build your TypeScript migration files using the [`tsc`](https://www.typescriptlang.org/docs/handbook/compiler-options.html) compiler or
some other tool that is already part of your build process for transpiling your TypeScript code to JavaScript.
Assume that you have all of your migrations in a `src/migrations` directory and you have built them to a `dist/migrations` directory.
Then you can run your migration files by pointing to the `dist/migrations` directory:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate up -d dist/migrations
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate up -d dist/migrations
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate up -d dist/migrations
```
</TabItem>
</Tabs>
:::note
If you're mixing languages for your migration files, e.g. you have both `.sql` and `.ts` files in `src/migrations`, make sure that they are all copied to the destination directory if they are not handled by the TypeScript build process.
:::
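One way to handle that, as a rough sketch assuming the `src/migrations` and `dist/migrations` layout above, is to copy the non-TypeScript migration files as an extra build step:
```bash
# Transpile the TypeScript migrations, then copy any plain SQL migrations as-is
tsc
cp src/migrations/*.sql dist/migrations/
```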


@ -0,0 +1,39 @@
---
title: Effortless database changes with Emigrate
description: Adapt any and all of your databases to your needs at any scale. Modern, flexible, and easy to use.
template: splash
hero:
tagline: Adapt any and all of your databases to new business needs.<br>Emigrate is a modern migration tool that's flexible, scalable, and easy to use.
image:
file: ../../assets/emigrate.png
actions:
- text: Quick Start
link: intro/quick-start/
icon: right-arrow
variant: primary
- text: View on GitHub
link: https://github.com/aboviq/emigrate
icon: external
---
import { Card, CardGrid } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
## Key features
<CardGrid>
<Card title="Migrate any database" icon="bars">
Emigrate is a database agnostic migration tool.<br />
You are in full control over what your migrations do.
</Card>
<Card title="Supports any architecture" icon="random">
Migrate many databases from one repository or many repositories to one database. There's no need to synchronize deployments.
</Card>
<Card title="SQL, JavaScript, TypeScript, etc." icon="document">
Write your migration files using the <Link href="/plugins/loaders/">language of your choice</Link>.
And mix and match them as you need. E.g. `SQL` for database schema changes, and `JavaScript` for data transformation.
</Card>
<Card title="Customize to your setup" icon="puzzle">
Emigrate is designed to be flexible and customizable to suit any environment and setup using its <Link href="/plugins/">plugin system</Link>.
</Card>
</CardGrid>


@ -0,0 +1,25 @@
---
title: "FAQ"
description: "Frequently asked questions about Emigrate."
---
import Link from '@components/Link.astro';
## Why no `down` migrations?
> Always forward never backwards.
Many other migration tools support `down` (or undo) migrations, but in all the years we have been
doing migrations we have never needed to roll back a migration in production;
in that case we would just write a new migration to fix the problem.
In our experience the only time `down` migrations are useful is in development,
and in that case you can just revert the migration manually and fix the `up` migration.
The benefit of this is that you don't have to worry about writing `down` migrations, and you can focus on writing the `up` migrations.
This way you will only ever have to write `down` migrations when they are really necessary instead of for every migration
(which makes it the exception rather than the rule, which is closer to the truth).
## Can I use Emigrate with my existing database?
Yes, you can use Emigrate with an existing database. See the <Link href="/guides/baseline/">Baseline guide</Link> for more information.


@ -0,0 +1,280 @@
---
title: Quick Start
description: Get going with Emigrate quickly
---
import { Tabs, TabItem, LinkCard } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
:::note
The following guide will be even simpler soon with the release of an initialization command.
But for now, this is the way to go.
:::
<LinkCard
href="../whats-emigrate/"
title="What's Emigrate?"
description="Learn more about Emigrate and what it can do for you."
/>
### Install the Emigrate CLI
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/cli
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/cli
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/cli
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/cli
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
</TabItem>
</Tabs>
### Pick a storage plugin
Emigrate uses a <Link href="/plugins/storage/">storage plugin</Link> to store the migration history.
Install the plugin you want to use, for example the <Link href="/plugins/storage/postgres/">PostgreSQL Storage</Link>:
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/postgres
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/postgres
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/postgres
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {4}
{
"dependencies": {
"@emigrate/cli": "*",
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
### Create your first migration
<LinkCard
href="../../guides/baseline/"
title="Baseline your database"
description="Learn how to create a baseline of your existing database."
/>
Create a new migration file in your project using:
<Tabs>
<TabItem label="npm">
```bash title="Create a new migration file"
npx emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="pnpm">
```bash title="Create a new migration file"
pnpm emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="yarn">
```bash title="Create a new migration file"
yarn emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="bun">
```bash title="Create a new migration file"
bunx --bun emigrate new --plugin postgres create users table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash title="Create a new migration file"
deno task emigrate new --plugin postgres create users table
```
</TabItem>
</Tabs>
```txt title="emigrate new"
Emigrate new v0.10.0 /your/project/path
✔ migrations/20231215125421364_create_users_table.sql (done) 3ms
1 created
```
:::note
The `postgres` plugin is used here to generate a migration file with the `.sql` extension.
Otherwise the file would have the `.js` extension by default.
:::
:::tip[Did you know?]
You can avoid typing `--plugin postgres` by configuring Emigrate using an `emigrate.config.js` file.
See <Link href="/reference/configuration/">Configuration</Link> for more information.
:::
#### Fill the migration file
Open the migration file in your editor and fill it with your SQL query:
```sql title="migrations/20231215125421364_create_users_table.sql" {2-7}
-- Migration: create users table
CREATE TABLE users (
id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL,
email VARCHAR(255) NOT NULL
);
```
:::note
Unlike with Liquibase, there's no magic in the first-line comment; it's just a comment and can be removed.
:::
### Show migration status
To show both pending and already applied migrations (or previously failed), use the `list` command:
<Tabs>
<TabItem label="npm">
```bash title="Show all migrations"
npx emigrate list --storage postgres
```
</TabItem>
<TabItem label="pnpm">
```bash title="Show all migrations"
pnpm emigrate list --storage postgres
```
</TabItem>
<TabItem label="yarn">
```bash title="Show all migrations"
yarn emigrate list --storage postgres
```
</TabItem>
<TabItem label="bun">
```bash title="Show all migrations"
bunx --bun emigrate list --storage postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash title="Show all migrations"
deno task emigrate list --storage postgres
```
</TabItem>
</Tabs>
```txt title="emigrate list"
Emigrate list v0.10.0 /your/project/path
✔ migrations/20231211090830577_another_table.sql (done)
migrations/20231215125421364_create_users_table.sql (pending)
1 done | 1 pending (2 total)
```
### Running the migrations
A good way to test your configuration is to run the migrations in dry mode:
<Tabs>
<TabItem label="npm">
```bash title="Show pending migrations"
npx emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="pnpm">
```bash title="Show pending migrations"
pnpm emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="yarn">
```bash title="Show pending migrations"
yarn emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="bun">
```bash title="Show pending migrations"
bunx --bun emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash title="Show pending migrations"
deno task emigrate up --storage postgres --plugin postgres --dry
```
</TabItem>
</Tabs>
:::note
This will connect to the database using some default values. For ways to configure the connection, see <Link href="/reference/configuration/">Configuration</Link>.
:::
:::caution
Without the `--dry` flag this will run the migration and change your database!
Be sure to configure the connection correctly and use the `--dry` flag to test your configuration.
:::
:::tip[Did you know?]
In the example above the `@emigrate/postgres` plugin is used twice, once for the `--storage` option as a <Link href="/plugins/storage/">Storage Plugin</Link>
and once for the `--plugin` option as a <Link href="/plugins/loaders/">Loader Plugin</Link> to be able to read `.sql` files.
:::


@ -0,0 +1,20 @@
---
title: What's Emigrate?
description: An introduction to Emigrate, the modern database agnostic migration tool.
---
import Link from '@components/Link.astro';
Emigrate is written in [TypeScript](https://www.typescriptlang.org) and is a migration tool for any database or data.
* It's database agnostic - you can use it with any database, or even with non-database data.
* It can be run on multiple platforms - currently [NodeJS](https://nodejs.org), [Bun](https://bun.sh) and [Deno](https://deno.com) are supported, but more platforms are planned.
* It's the successor of [klei-migrate](https://github.com/klei/migrate) and is designed to be compatible with [Immigration](https://github.com/blakeembrey/node-immigration) and many of its storage plugins, as well as [Migrate](https://github.com/tj/node-migrate).
* It supports migration files written using <Link href="/plugins/loaders/default/">CommonJS or ES Modules out of the box</Link>, with any of the following extensions: `.js`, `.cjs` or `.mjs`, and supports [async functions](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function), [Promises](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise) or using the [NodeJS Callback Pattern](https://nodejs.org/en/learn/asynchronous-work/javascript-asynchronous-programming-and-callbacks#handling-errors-in-callbacks).
* Other languages can be used by using a <Link href="/plugins/loaders/">Loader Plugin</Link>.
* The migration history can be stored anywhere using a <Link href="/plugins/storage/">Storage Plugin</Link>.
* The output can be customized using <Link href="/plugins/reporters/">Reporters</Link>.
:::tip[Did you know?]
Thanks to <Link href="/plugins/">the plugin system</Link> you can even write migrations in plain SQL! So no need for Java-based tools like Liquibase or Flyway.
:::


@ -0,0 +1,26 @@
---
title: "Generator Plugins"
---
import { LinkCard, CardGrid } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
A generator plugin is a plugin that generates new migration files.
They are responsible for generating both the new file's name and its contents.
:::tip[Did you know?]
Many of the <Link href="/plugins/loaders/">Loader Plugins</Link> include a generator plugin as well.
The generator is responsible for generating migration files in a specific format and the loader is responsible for loading the same format.
:::
## Available Generator Plugins
<CardGrid>
<LinkCard title="JavaScript generator" href="js/" description="A generator that generates .js migration files (using ESM and default export)" />
<LinkCard title="PostgreSQL generator" href="postgres/" description="A generator that generates .sql migration files" />
<LinkCard title="MySQL generator" href="mysql/" description="A generator that generates .sql migration files" />
</CardGrid>
:::note
Instead of having to install a generator plugin, you can also use the much simpler <Link href="/cli/new/#-t---template-path">`--template`</Link> option to specify a custom template file for new migrations.
:::
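As a sketch of that alternative (the template file path here is hypothetical), a custom template can be passed directly to the `new` command:
```bash
npx emigrate new --template templates/migration-template.sql create users table
```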


@ -0,0 +1,82 @@
---
title: "JavaScript Generator"
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
A <Link href="/plugins/generators/">generator plugin</Link> for generating new migration files in JavaScript.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/plugin-generate-js
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/plugin-generate-js
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/plugin-generate-js
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/plugin-generate-js
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/plugin-generate-js": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin generate-js create some fancy table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate new --plugin generate-js create some fancy table
```
</TabItem>
</Tabs>
For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.


@ -0,0 +1,82 @@
---
title: "MySQL Generator"
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The MySQL generator creates new migration files with the `.sql` extension. In the same package you can find the <Link href="/plugins/loaders/mysql/">MySQL Loader</Link> and the <Link href="/plugins/storage/mysql/">MySQL Storage</Link>.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/mysql
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/mysql
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/mysql
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/mysql": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin mysql create some fancy table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate new --plugin mysql create some fancy table
```
</TabItem>
</Tabs>
For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.


@ -0,0 +1,82 @@
---
title: "PostgreSQL Generator"
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The PostgreSQL generator creates new migration files with the `.sql` extension. In the same package you can find the <Link href="/plugins/loaders/postgres/">PostgreSQL Loader</Link> and the <Link href="/plugins/storage/postgres/">PostgreSQL Storage</Link>.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/postgres
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/postgres
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/postgres
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
<Tabs>
<TabItem label="npm">
```bash
npx emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate new --plugin postgres create some fancy table
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate new --plugin postgres create some fancy table
```
</TabItem>
</Tabs>
For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.


@ -0,0 +1,35 @@
---
title: The Plugin System
---
import { LinkCard } from '@astrojs/starlight/components';
Emigrate uses a plugin system to allow you to extend and customize the functionality so that it fits your needs.
## The types of plugins
Emigrate supports different types of plugins that all have different purposes.
<LinkCard
href="storage/"
title="Storage Plugins"
description="The most important plugin type. A storage plugin is responsible for storing and handling the migration history state."
/>
<LinkCard
href="loaders/"
title="Loader Plugins"
description="A loader plugin is responsible for loading migration files with a specific format and transforming them into a JavaScript function."
/>
<LinkCard
href="reporters/"
title="Reporters"
description="A reporter is responsible for the output of the migration process. Use a different reporter if you want the output suitable for log shipping or to be machine readable."
/>
<LinkCard
href="generators/"
title="Generator Plugins"
description="A generator plugin generates new migration files. Usually included in loader plugin packages so the same package can be used for both creating and loading migrations in a certain format."
/>


@ -0,0 +1,236 @@
---
title: Default Loader Plugin
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The default loader plugin is responsible for importing migration files written in JavaScript or TypeScript.
Migration files can be written using either CommonJS or ES Modules.
## Supported extensions
The default loader plugin supports the following extensions:
* `.js` - either CommonJS or ES Modules depending on your package.json's [`type` field](https://nodejs.org/api/packages.html#type)
* `.cjs` - CommonJS
* `.mjs` - ES Modules
* `.ts` - either CommonJS or ES Modules written in TypeScript
* `.cts` - CommonJS written in TypeScript
* `.mts` - ES Modules written in TypeScript
:::note
To enable TypeScript support in NodeJS you also need to follow the <Link href="/guides/typescript/">TypeScript setup guide</Link>.
:::
## Supported exports
The default loader plugin supports the following exports:
### ES Modules
#### Default export
Exporting a function as the default export.
<Tabs>
<TabItem label="Async/Await">
```js
import { database } from 'some-database';
export default async function someMigration() {
await database.query(...);
await database.query(...);
}
```
</TabItem>
<TabItem label="Promises">
```js
import { database } from 'some-database';
export default function someMigration() {
// Remember to return the promise
return database.query(...)
.then(() => {
return database.query(...);
})
}
```
</TabItem>
<TabItem label="Callback">
```js
import { database } from 'some-database';
export default function someMigration(done) {
database.query(..., (err) => {
if (err) {
return done(err);
}
database.query(..., (err) => {
if (err) {
return done(err);
}
done();
});
});
}
```
</TabItem>
</Tabs>
#### Named export
Exporting a function named `up`.
<Tabs>
<TabItem label="Async/Await">
```js
import { database } from 'some-database';
export const up = async () => {
await database.query(...);
await database.query(...);
};
```
</TabItem>
<TabItem label="Promises">
```js
import { database } from 'some-database';
export const up = () => {
// Remember to return the promise
return database.query(...)
.then(() => {
return database.query(...);
})
};
```
</TabItem>
<TabItem label="Callback">
```js
import { database } from 'some-database';
export const up = (done) => {
database.query(..., (err) => {
if (err) {
return done(err);
}
database.query(..., (err) => {
if (err) {
return done(err);
}
done();
});
});
}
```
</TabItem>
</Tabs>
### CommonJS
#### `module.exports`
Exporting a function as the module.
<Tabs>
<TabItem label="Async/Await">
```js
const { database } = require('some-database');
module.exports = async function someMigration() {
await database.query(...);
await database.query(...);
}
```
</TabItem>
<TabItem label="Promises">
```js
const { database } = require('some-database');
module.exports = function someMigration() {
// Remember to return the promise
return database.query(...)
.then(() => {
return database.query(...);
})
}
```
</TabItem>
<TabItem label="Callback">
```js
const { database } = require('some-database');
module.exports = function someMigration(done) {
database.query(..., (err) => {
if (err) {
return done(err);
}
database.query(..., (err) => {
if (err) {
return done(err);
}
done();
});
});
}
```
</TabItem>
</Tabs>
#### `exports.up`
Exporting an `up` function.
<Tabs>
<TabItem label="Async/Await">
```js
const { database } = require('some-database');
exports.up = async () => {
await database.query(...);
await database.query(...);
};
```
</TabItem>
<TabItem label="Promises">
```js
const { database } = require('some-database');
exports.up = () => {
// Remember to return the promise
return database.query(...)
.then(() => {
return database.query(...);
})
};
```
</TabItem>
<TabItem label="Callback">
```js
const { database } = require('some-database');
exports.up = (done) => {
database.query(..., (err) => {
if (err) {
return done(err);
}
database.query(..., (err) => {
if (err) {
return done(err);
}
done();
});
});
}
```
</TabItem>
</Tabs>


@ -0,0 +1,34 @@
---
title: Loader Plugins
---
import { LinkCard, CardGrid } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Loader plugins are used to transform any file type into a JavaScript function that will be called when the migration file is executed.
Out of the box, Emigrate supports the following file extensions: `.js`, `.cjs`, `.mjs`, `.ts`, `.cts` and `.mts`. And both CommonJS and ES Modules are supported. See the <Link href="/plugins/loaders/default/">Default Loader</Link> for more information.
## Using a loader plugin
You can specify a loader plugin via the `--plugin` (or `-p` for short) option:
```bash
npx emigrate up --plugin mysql
```
Or set it up in your configuration file, see <Link href="/reference/configuration/#plugins">Plugin configuration</Link> for more information.
:::tip[Did you know?]
You can specify multiple loader plugins at the same time, which is needed when you mix file types in your migrations folder.
For example, you can use the `postgres` or `mysql` loader for `.sql` files and a `yaml` loader for `.yml` files.
The <Link href="/plugins/loaders/default/">default loader</Link> will be used for all other file types, and doesn't need to be specified.
:::
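For example, mixing loaders could look like this (the YAML loader package name is hypothetical and only used for illustration):
```bash
# Use the MySQL loader for .sql files and a YAML loader for .yml files,
# while .js/.ts files are still handled by the default loader:
npx emigrate up --plugin mysql --plugin emigrate-plugin-yaml
```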
## Available Loader Plugins
<CardGrid>
<LinkCard title="Default Loader" href="default/" description="The loader responsible for loading .js, .cjs, .mjs, .ts, .cts and .mts files" />
<LinkCard title="PostgreSQL Loader" href="postgres/" description="Can load and execute .sql files against a PostgreSQL database" />
<LinkCard title="MySQL Loader" href="mysql/" description="Can load and execute .sql files against a MySQL database" />
</CardGrid>


@ -0,0 +1,132 @@
---
title: MySQL Loader Plugin
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The MySQL loader plugin transforms `.sql` files into JavaScript functions that Emigrate can use to execute the migrations. In the same package you can find the <Link href="/plugins/generators/mysql/">MySQL Generator</Link> and the <Link href="/plugins/storage/mysql/">MySQL Storage</Link>.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/mysql
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/mysql
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/mysql
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/mysql": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration
The MySQL loader plugin can be configured either using environment variables or by configuring the plugin directly in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link>.
### Configuration file
```js title="emigrate.config.js" {1,4-8}
import { createMysqlLoader } from '@emigrate/mysql';
export default {
plugins: [
createMysqlLoader({
connection: { ... },
}),
],
};
```
#### Options
##### `connection` (required)
**type:** `object | string`
The connection options to use for connecting to the MySQL database when the SQL statements from the migration files are executed. This can either be a connection URI or an object with connection options.
For a list of supported connection options, see the [mysql documentation](https://github.com/mysqljs/mysql#connection-options).
### Environment variables
The following environment variables are supported:
| Variable | Description | Default |
| ---------------- | --------------------------------------------------------------------------------------------------- | -------------- |
| `MYSQL_URL` | The full URI for connecting to a MySQL database, e.g: `"mysql://user:pass@127.0.0.1:3306/database"` | |
| `MYSQL_HOST` | The host on which the MySQL server instance is running | `"localhost"` |
| `MYSQL_USER` | The MySQL user account to use for the authentication | |
| `MYSQL_PASSWORD` | The MySQL user password to use for the authentication | |
| `MYSQL_PORT` | The network port on which the MySQL server is listening | `3306` |
| `MYSQL_DATABASE` | The MySQL database to use for the connection | |
:::note
The `MYSQL_URL` environment variable takes precedence over the other environment variables. If `MYSQL_URL` is set, the other environment variables are ignored.
:::
The environment variables are used when the plugin is specified using the `--plugin` command line option:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate list --plugin mysql
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate list --plugin mysql
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate list --plugin mysql
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate list --plugin mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate list --plugin mysql
```
</TabItem>
</Tabs>
Or when specifying the plugin in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link> as a string:
```js title="emigrate.config.js" {2}
export default {
plugins: ['mysql'],
};
```
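When the plugin is referenced by name like this, the connection details have to come from the environment. A minimal sketch (with placeholder credentials, and assuming the MySQL storage from the same package reads the same variables):
```bash
MYSQL_URL="mysql://user:pass@localhost:3306/my_database" \
  npx emigrate up --plugin mysql --storage mysql
```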


@ -0,0 +1,132 @@
---
title: PostgreSQL Loader Plugin
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The PostgreSQL loader plugin transforms `.sql` files into JavaScript functions that Emigrate can use to execute the migrations. In the same package you can find the <Link href="/plugins/generators/postgres/">PostgreSQL Generator</Link> and the <Link href="/plugins/storage/postgres/">PostgreSQL Storage</Link>.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/postgres
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/postgres
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/postgres
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration
The PostgreSQL loader plugin can be configured either using environment variables or by configuring the plugin directly in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link>.
### Configuration file
```js title="emigrate.config.js" {1,4-8}
import { createPostgresLoader } from '@emigrate/postgres';
export default {
plugins: [
createPostgresLoader({
connection: { ... },
}),
],
};
```
#### Options
##### `connection` (required)
**type:** `object | string`
The connection options to use for connecting to the PostgreSQL database when the SQL statements from the migration files are executed. This can either be a connection URI or an object with connection options.
For a list of supported connection options, see the [postgres documentation](https://github.com/porsager/postgres#connection).
### Environment variables
The following environment variables are supported:
| Variable | Description | Default |
| ------------------- | ----------------------------------------------------------------------------------------------------------- | ------------- |
| `POSTGRES_URL`      | The full URI for connecting to a PostgreSQL database, e.g: `"postgres://user:pass@127.0.0.1:5432/database"`  | |
| `POSTGRES_HOST` | The host on which the PostgreSQL server instance is running | `"localhost"` |
| `POSTGRES_USER` | The PostgreSQL user account to use for the authentication | |
| `POSTGRES_PASSWORD` | The PostgreSQL user password to use for the authentication | |
| `POSTGRES_PORT` | The network port on which the PostgreSQL server is listening | `5432` |
| `POSTGRES_DB` | The PostgreSQL database to use for the connection | |
:::note
The `POSTGRES_URL` environment variable takes precedence over the other environment variables. If `POSTGRES_URL` is set, the other environment variables are ignored.
:::
The environment variables are used when the plugin is specified using the `--plugin` command line option:
<Tabs>
<TabItem label="npm">
```bash
npx emigrate list --plugin postgres
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate list --plugin postgres
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate list --plugin postgres
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate list --plugin postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate list --plugin postgres
```
</TabItem>
</Tabs>
Or when specifying the plugin in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link> as a string:
```js title="emigrate.config.js" {2}
export default {
plugins: ['postgres'],
};
```
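When the plugin is referenced by name like this, the connection details have to come from the environment. A minimal sketch (with placeholder credentials, and assuming the PostgreSQL storage from the same package reads the same variables):
```bash
POSTGRES_URL="postgres://user:pass@localhost:5432/my_database" \
  npx emigrate up --plugin postgres --storage postgres
```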


@ -0,0 +1,26 @@
---
title: Reporters
---
import { LinkCard, CardGrid } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
A reporter is a plugin that's responsible for printing the migration progress and results to the console.
## Using a reporter
You can specify a reporter via the `--reporter` (or `-r` for short) option:
```bash
npx emigrate list --reporter pino
```
Or set it up in your configuration file, see <Link href="/reference/configuration/#reporter">Reporter configuration</Link> for more information.
## Available Reporters
<CardGrid>
<LinkCard title="Pretty Reporter" description="The default reporter" href="pretty/" />
<LinkCard title="JSON Reporter" description="A built-in reporter for outputing a JSON object" href="json/" />
<LinkCard title="Pino Reporter" description="A reporter package for outputting new line delimited JSON" href="pino/" />
</CardGrid>


@ -0,0 +1,102 @@
---
title: JSON Reporter
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
An Emigrate reporter that outputs a JSON object.
The reporter is included by default and does not need to be installed separately.
## Usage
### Via CLI
<Tabs>
<TabItem label="npm">
```bash
npx emigrate <command> --reporter json
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter json
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter json
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter json
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate <command> --reporter json
```
</TabItem>
</Tabs>
See for instance the <Link href="/cli/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
### Via configuration file
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'json',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'json',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.
## Example output
```json
{
"command": "up",
"version": "0.17.2",
"numberTotalMigrations": 1,
"numberDoneMigrations": 0,
"numberSkippedMigrations": 0,
"numberFailedMigrations": 0,
"numberPendingMigrations": 1,
"success": true,
"startTime": 1707206599968,
"endTime": 1707206600005,
"migrations": [
{
"name": "/your/project/migrations/20240206075446123_some_other_table.sql",
"status": "pending",
"duration": 0
}
]
}
```


@ -0,0 +1,125 @@
---
title: Pino Reporter
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate's reporter that uses [Pino](https://getpino.io/#/) as the logger.
This is useful in production environments where you want all logs as JSON, which is suitable for log aggregators/shippers.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/reporter-pino
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/reporter-pino
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/reporter-pino
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/reporter-pino
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/reporter-pino": "*"
}
}
```
</TabItem>
</Tabs>
## Usage
:::tip
The `@emigrate/reporter-` prefix is optional when using this reporter.
:::
### Via CLI
<Tabs>
<TabItem label="npm">
```bash
npx emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter pino
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate <command> --reporter pino
```
</TabItem>
</Tabs>
See for instance the <Link href="/cli/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
### Via configuration file
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'pino',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'pino',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.
## Example output
```json
{"level":30,"time":1702907697803,"scope":"list","version":"0.10.0","name":"emigrate","parameters":{"cwd":"/your/project/dir","dry":false,"directory":"migration-folder"},"msg":"Emigrate \"list\" initialized"}
{"level":30,"time":1702907697836,"scope":"list","version":"0.10.0","name":"emigrate","migrationCount":1,"msg":"1 pending migrations to run"}
{"level":30,"time":1702907697836,"scope":"list","version":"0.10.0","name":"emigrate","migration":"migration-folder/20231218135441244_create_some_table.sql","msg":"20231218135441244_create_some_table.sql (pending)"}
{"level":30,"time":1702907697836,"scope":"list","version":"0.10.0","name":"emigrate","result":{"failed":0,"done":0,"skipped":0,"pending":1,"total":1},"msg":"Emigrate \"list\" finished successfully"}
```


@ -0,0 +1,90 @@
---
title: Pretty Reporter (default)
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate's default reporter. It detects whether the current terminal is an interactive shell (or whether it's a CI environment); in non-interactive shells and CI environments _no_ animations will be shown.
The reporter is included by default and does not need to be installed separately.
## Usage
By default, Emigrate uses the "pretty" reporter, but it can also be explicitly set by using the <Link href="/cli/up/#-r---reporter-name">`--reporter`</Link> flag.
<Tabs>
<TabItem label="npm">
```bash
npx emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate <command> --reporter pretty
```
</TabItem>
</Tabs>
Or by setting it in the configuration file.
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'pretty',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'pretty',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.
## Example output
```bash
Emigrate up v0.17.2 /your/working/directory (dry run)
1 pending migrations to run
migration-folder/20231218135441244_create_some_table.sql (pending)
1 pending (1 total)
```


@ -0,0 +1,68 @@
---
title: File System Storage
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The File System Storage is a storage driver that stores the migration history in a `.json` file on the local file system.
:::caution
This is suitable for simple setups, but for more advanced setups, for instance where the application is deployed on multiple servers, you should use a database storage plugin instead.
:::
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/storage-fs
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/storage-fs
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/storage-fs
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/storage-fs
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/storage-fs": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration
The File System Storage can be configured easily in your <Link href="/reference/configuration/">`emigrate.config.js` file</Link>:
```js {1,4-6}
import storageFs from '@emigrate/storage-fs';
export default {
storage: storageFs({
filename: './migrations.json',
}),
};
```
### Options
#### `filename`
**type:** `string`
A path, either relative to the project root or absolute. It doesn't need to have the `.json` extension, but its contents will be in JSON format.
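For example, an absolute path works just as well as a relative one (the path below is only a placeholder):
```js
import storageFs from '@emigrate/storage-fs';

export default {
  storage: storageFs({
    // Hypothetical absolute path, any writable location will do
    filename: '/var/lib/my-app/migration-history.json',
  }),
};
```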


@ -0,0 +1,35 @@
---
title: Storage Plugins
---
import { LinkCard, CardGrid } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Storage plugins are used for storing and reading the migration history state.
Usually you'll want to store the migration history in the same database that your migration files target.
## Using a storage plugin
You can specify a storage plugin via the `--storage` (or `-s` for short) option:
```bash
npx emigrate list --storage postgres
```
Or set it up in your configuration file, see <Link href="/reference/configuration/#storage">Storage configuration</Link> for more information.
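For example, a minimal configuration file that selects the PostgreSQL storage plugin by name:
```js title="emigrate.config.js" {2}
export default {
  storage: 'postgres',
};
```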
## Available storage plugins
<CardGrid>
<LinkCard title="File System" href="file-system/" description="The most basic storage plugin - for simple setups" />
<LinkCard title="PostgreSQL" href="postgres/" description="A storage plugin that uses a PostgreSQL database for storing the migration history state" />
<LinkCard title="MySQL" href="mysql/" description="A storage plugin that uses a MySQL database for storing the migration history state" />
</CardGrid>
:::note
More storage plugins are coming soon!
:::
:::tip[Is your database missing?]
Writing a storage plugin is easy! Check out the <Link href="/reference/storage-plugin-api/">Storage Plugin API</Link> for more information.
:::


@ -0,0 +1,107 @@
---
title: MySQL Storage
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The MySQL storage plugin uses a MySQL database to store the migration history (*duh*). In the same package you can find the <Link href="/plugins/loaders/mysql/">MySQL Loader</Link> and the <Link href="/plugins/generators/mysql/">MySQL Generator</Link>.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/mysql
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/mysql
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/mysql
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/mysql
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/mysql": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration
The MySQL storage can be configured either through environment variables or by setting up the plugin directly in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link>.
### Configuration file
```js title="emigrate.config.js" {1,4-7}
import { createMysqlStorage } from '@emigrate/mysql';
export default {
storage: createMysqlStorage({
table: 'migrations',
connection: { ... },
}),
};
```
#### Options
##### `table`
**type:** `string`
**default:** `"migrations"`
The name of the table to use for storing the migrations.
##### `connection` (required)
**type:** `object | string`
The connection options to use for connecting to the MySQL database. This can either be a connection URI or an object with connection options.
For a list of supported connection options, see the [mysql documentation](https://github.com/mysqljs/mysql#connection-options).
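For example, a connection URI (with placeholder credentials) can be used instead of an options object:
```js title="emigrate.config.js"
import { createMysqlStorage } from '@emigrate/mysql';

export default {
  storage: createMysqlStorage({
    // Placeholder credentials, replace with your own connection URI
    connection: 'mysql://user:pass@127.0.0.1:3306/database',
  }),
};
```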
### Environment variables
The following environment variables are supported:
| Variable | Description | Default |
| ---------------- | --------------------------------------------------------------------------------------------------- | -------------- |
| `MYSQL_TABLE` | The name of the table to use for storing the migrations | `"migrations"` |
| `MYSQL_URL` | The full URI for connecting to a MySQL database, e.g: `"mysql://user:pass@127.0.0.1:3306/database"` | |
| `MYSQL_HOST` | The host on which the MySQL server instance is running | `"localhost"` |
| `MYSQL_USER` | The MySQL user account to use for the authentication | |
| `MYSQL_PASSWORD` | The MySQL user password to use for the authentication | |
| `MYSQL_PORT` | The network port on which the MySQL server is listening | `3306` |
| `MYSQL_DATABASE` | The MySQL database to use for the connection | |
:::note
The `MYSQL_URL` environment variable takes precedence over the other environment variables. If `MYSQL_URL` is set, the other environment variables are ignored, except for `MYSQL_TABLE`.
:::
The environment variables are used when the storage plugin is specified using the `--storage` command line option:
```bash
npx emigrate list --storage mysql
```
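For example, the whole connection can be provided through the environment (the values below are placeholders):
```bash
MYSQL_URL="mysql://user:pass@127.0.0.1:3306/database" npx emigrate list --storage mysql
```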
Or when specifying the storage in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link> as a string:
```js title="emigrate.config.js" {2}
export default {
storage: 'mysql',
};
```


@ -0,0 +1,107 @@
---
title: PostgreSQL Storage
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
The PostgreSQL storage plugin uses a PostgreSQL database to store the migration history (*duh*). In the same package you can find the <Link href="/plugins/loaders/postgres/">PostgreSQL Loader</Link> and the <Link href="/plugins/generators/postgres/">PostgreSQL Generator</Link>.
## Installation
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/postgres
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/postgres
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/postgres
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/postgres
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"dependencies": {
"@emigrate/postgres": "*"
}
}
```
</TabItem>
</Tabs>
## Configuration
The PostgreSQL storage can be configured either through environment variables or by setting up the plugin directly in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link>.
### Configuration file
```js title="emigrate.config.js" {1,4-7}
import { createPostgresStorage } from '@emigrate/postgres';
export default {
storage: createPostgresStorage({
table: 'migrations',
connection: { ... },
}),
};
```
#### Options
##### `table`
**type:** `string`
**default:** `"migrations"`
The name of the table to use for storing the migrations.
##### `connection` (required)
**type:** `object | string`
The connection options to use for connecting to the PostgreSQL database. This can either be a connection URI or an object with connection options.
For a list of supported connection options, see the [postgres documentation](https://github.com/porsager/postgres#connection).
### Environment variables
The following environment variables are supported:
| Variable | Description | Default |
| ------------------- | ----------------------------------------------------------------------------------------------------------- | -------------- |
| `POSTGRES_TABLE` | The name of the table to use for storing the migrations | `"migrations"` |
| `POSTGRES_URL`      | The full URI for connecting to a PostgreSQL database, e.g: `"postgres://user:pass@127.0.0.1:5432/database"`  |                |
| `POSTGRES_HOST` | The host on which the PostgreSQL server instance is running | `"localhost"` |
| `POSTGRES_USER` | The PostgreSQL user account to use for the authentication | |
| `POSTGRES_PASSWORD` | The PostgreSQL user password to use for the authentication | |
| `POSTGRES_PORT` | The network port on which the PostgreSQL server is listening | `5432` |
| `POSTGRES_DB` | The PostgreSQL database to use for the connection | |
:::note
The `POSTGRES_URL` environment variable takes precedence over the other environment variables. If `POSTGRES_URL` is set, the other environment variables are ignored, except for `POSTGRES_TABLE`.
:::
The environment variables are used when the storage plugin is specified using the `--storage` command line option:
```bash
npx emigrate list --storage postgres
```
Or when specifying the storage in the <Link href="/reference/configuration/">`emigrate.config.js` file</Link> as a string:
```js title="emigrate.config.js" {2}
export default {
storage: 'postgres',
};
```


@ -0,0 +1,175 @@
---
title: Configuration Reference
description: How to configure Emigrate to your needs
sidebar:
order: 1
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate can be configured using a configuration file, and it uses [Cosmiconfig](https://github.com/cosmiconfig/cosmiconfig) under the hood
so you can use a variety of formats and locations for your configuration file.
## Configure Emigrate
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
directory: 'migrations',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
directory: 'migrations',
};
export default config;
```
</TabItem>
</Tabs>
You can specify the following options:
### `directory`
**type:** `string`
Set the directory where your migrations are located, relative to the project root. This option is required by all Emigrate commands.
### `reporter`
**type:** `"pretty" | "json" | string | EmigrateReporter | Promise<EmigrateReporter> | (() => Promise<EmigrateReporter>)`
**default:** `"pretty"` - the default reporter
Set the reporter to use for the different commands. Specifying a <Link href="/plugins/reporters/">reporter</Link> is most useful in a CI or production environment where you either ship logs or want to have a machine-readable format.
```js title="emigrate.config.js" {2}
export default {
reporter: 'json',
};
```
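The non-string forms let you provide a reporter implementation directly, for example by lazily importing a reporter package. A sketch, assuming the package's default export implements the `EmigrateReporter` interface:
```js title="emigrate.config.js"
export default {
  // Lazily load the reporter, assuming its default export is the reporter implementation
  reporter: async () => (await import('@emigrate/reporter-pino')).default,
};
```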
If you want to use different reporters for different commands, you can use an object:
```js title="emigrate.config.js" {2-4}
export default {
up: {
reporter: 'json',
},
new: {
reporter: 'pretty', // Not really necessary, as it's the default
},
};
```
:::note
Commands that are not specified will use the default reporter.
:::
:::tip[Did you know?]
The default reporter automatically detects if the current environment is an interactive terminal or not, and will only render animations and similar if it is.
:::
### `color`
**type:** `boolean | undefined`
**default:** `undefined`
Set whether to force colors in the output or not. This option is passed to the reporter, which should respect it.
```js title="emigrate.config.js" {2}
export default {
color: false,
};
```
### `storage`
**type:** `string | EmigrateStorage | Promise<EmigrateStorage> | (() => Promise<EmigrateStorage>)`
Set the <Link href="/plugins/storage/">storage plugin</Link> to use for storing and reading the migration history. This option is required by all Emigrate commands except `new` which doesn't use it.
```js title="emigrate.config.js" {2}
export default {
storage: 'mysql',
};
```
:::note
Each storage plugin can have its own configuration options, see the corresponding <Link href="/plugins/storage/#available-storage-plugins">Storage Plugin</Link> section for more information.
:::
### `plugins`
**type:** `Array<string | EmigratePlugin | Promise<EmigratePlugin> | (() => Promise<EmigratePlugin>)>`
Set the plugins to use for the different commands. There are different types of plugins, and some are only useful for specific commands.
In short:
* <Link href="/plugins/loaders/">Loader Plugins</Link> - are used for transforming non-JavaScript files into JavaScript files that can be executed by Node.js. These are only used by the `up` command.
* <Link href="/plugins/generators/">Generator Plugins</Link> - are used for generating new migration files. These are only used by the `new` command.
```js title="emigrate.config.js" {2}
export default {
plugins: ['typescript'],
};
```
:::tip[Did you know?]
The same package can expose multiple plugins, so you can specify the plugin only once and it can be used as both a loader and a generator (and storage, in the case of <Link href="/plugins/storage/mysql/">MySQL</Link> for instance).
:::
### `template`
**type:** `string`
Set the path to a template file to use when creating new migrations. This option is only used by the `new` command.
```js title="emigrate.config.js" {2-4}
export default {
new: {
template: 'path/to/template.js',
},
};
```
The migration file will use the template file's extension, unless the [extension](#extension) option is set.
### `extension`
**type:** `string`
Set the extension to use for new migrations. This option is only used by the `new` command.
```js title="emigrate.config.js" {2-4}
export default {
new: {
extension: '.ts',
},
};
```
Will create new migration files with the `.ts` extension.
### `abortRespite`
**type:** `number`
**default:** `10`
Customize the number of seconds to wait before abandoning a running migration when the process is about to shut down, for instance when the user presses `Ctrl+C` or when the container it runs in is being stopped.
```js title="emigrate.config.js" {2}
export default {
abortRespite: 10,
};
```


@ -0,0 +1,93 @@
---
title: Migration Types
---
import Link from '@components/Link.astro';
These are common TypeScript types for migration files used by <Link href="/plugins/">plugins</Link>.
## `MigrationMetadata`
```ts
type MigrationMetadata = {
/**
* The name of the migration file
*
* @example 20210901123456000_create_users_table.js
*/
name: string;
/**
* The directory where the migration file is located, relative to the current working directory
*
* @example migrations
*/
directory: string;
/**
* The full absolute path to the migration file
*
* @example /home/user/project/migrations/20210901123456000_create_users_table.js
*/
filePath: string;
/**
* The relative path to the migration file, relative to the current working directory
*
* @example migrations/20210901123456000_create_users_table.js
*/
relativeFilePath: string;
/**
* The current working directory (the same as process.cwd())
*/
cwd: string;
/**
* The extension of the migration file, with a leading period
*
* @example .js
*/
extension: string;
};
```
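For illustration, a `MigrationMetadata` value for the migration file used in the examples above would look like this (all values are taken from the field examples):
```ts
const example: MigrationMetadata = {
  name: '20210901123456000_create_users_table.js',
  directory: 'migrations',
  filePath: '/home/user/project/migrations/20210901123456000_create_users_table.js',
  relativeFilePath: 'migrations/20210901123456000_create_users_table.js',
  cwd: '/home/user/project',
  extension: '.js',
};
```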
## `MigrationMetadataFinished`
```ts
type MigrationMetadataFinished = MigrationMetadata & {
status: 'done' | 'failed' | 'skipped' | 'pending';
/**
* The duration of the migration in milliseconds
*/
duration: number;
/**
* The error that occurred during the migration, if `status` is `"failed"`
*/
error?: Error;
};
```
## `MigrationHistoryEntry`
```ts
type MigrationHistoryEntry = {
/**
* The name of the migration.
*
* @example 20210901123456000_create_users_table.js
*/
name: string;
/**
* The date when the migration was executed.
*/
date: Date;
/**
* The status of the migration.
*
* As an entry is only added to the history after the migration has finished, this will always be either `"done"` or `"failed"`.
*/
status: 'done' | 'failed';
/**
* The error that occurred during the migration, if `status` is `"failed"`
*
* This should be a plain object, as it is serialized when passed to the storage plugin's `onError` method.
*/
error?: Record<string, unknown>;
};
```


@ -0,0 +1,86 @@
---
title: Storage Plugin API
---
import Link from '@components/Link.astro';
When writing a storage plugin, you will need to implement the following interface:
```ts
type EmigrateStorage = {
initializeStorage(): Promise<Storage>;
};
```
Where `Storage` is the following interface:
```ts
type Storage = {
/**
* Acquire a lock on the given migrations.
*
* To best support concurrent migrations (e.g. when multiple services are deployed at the same time and want to migrate the same database)
* the plugin should try to lock all migrations at once (i.e. in a transaction) and ignore migrations that are already locked (or done).
* The successfully locked migrations should be returned and are the migrations that will be executed.
*
* If one of the migrations to lock is in a failed state, the plugin should throw an error to abort the migration attempt.
*
* @returns The migrations that were successfully locked.
*/
lock(migrations: MigrationMetadata[]): Promise<MigrationMetadata[]>;
/**
* The unlock method is called after all migrations have been executed or when the process is interrupted (e.g. by a SIGTERM or SIGINT signal).
*
* Depending on the plugin implementation, the unlock method is usually a no-op for already succeeded or failed migrations.
*
* @param migrations The previously successfully locked migrations that should now be unlocked.
*/
unlock(migrations: MigrationMetadata[]): Promise<void>;
/**
* Remove a migration from the history.
*
* This is used to remove a migration from the history which is needed for failed migrations to be re-executed.
*
* @param migration The migration that should be removed from the history.
*/
remove(migration: MigrationMetadata): Promise<void>;
/**
* Get the history of previously executed migrations.
*
* For failed migrations, the error property should be set.
* Emigrate will not sort the history entries, so the plugin should return the entries in the order they were executed.
* The order doesn't affect the execution of migrations, but it does affect the order in which the history is displayed in the CLI.
* Migrations that have not yet been executed will always be run in alphabetical order.
*
* The history has two purposes:
* 1. To determine which migrations have already been executed.
* 2. To list the migration history in the CLI.
*/
getHistory(): AsyncIterable<MigrationHistoryEntry>;
/**
* Called when a migration has been successfully executed.
*
* @param migration The migration that should be marked as done.
*/
onSuccess(migration: MigrationMetadataFinished): Promise<void>;
/**
* Called when a migration has failed.
*
* The passed error will be serialized so it's easy to store in the history.
* If the original Error instance is needed it's available as the `error` property on the finished migration.
*
* @param migration The migration that should be marked as failed.
* @param error The error that caused the migration to fail. Serialized for easy storage.
*/
onError(migration: MigrationMetadataFinished, error: SerializedError): Promise<void>;
/**
* Called when the command is finished or aborted (e.g. by a SIGTERM or SIGINT signal).
*
* Use this to clean up any resources like database connections or file handles.
*/
end(): Promise<void>;
};
```
See the <Link href="/reference/migration-types/">Migration Types</Link> page for more information about the `MigrationMetadata`, `MigrationMetadataFinished`, and `MigrationHistoryEntry` types.
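To make the interface concrete, below is a minimal sketch of an in-memory storage plugin. It is only an illustration (the history is lost when the process exits), it assumes the metadata and history types are exported from `@emigrate/types`, and it types the serialized error loosely since the exact `SerializedError` shape isn't shown here:
```ts
import {
  type MigrationMetadata,
  type MigrationMetadataFinished,
  type MigrationHistoryEntry,
} from '@emigrate/types';

export const createInMemoryStorage = () => {
  // The full history, keyed by migration name
  const history = new Map<string, MigrationHistoryEntry>();

  return {
    async initializeStorage() {
      return {
        async lock(migrations: MigrationMetadata[]) {
          for (const migration of migrations) {
            if (history.get(migration.name)?.status === 'failed') {
              // Per the contract above: abort if a migration to lock has previously failed
              throw new Error(`Migration "${migration.name}" is in a failed state`);
            }
          }

          // "Lock" every migration that hasn't been executed yet
          return migrations.filter((migration) => !history.has(migration.name));
        },
        async unlock(_migrations: MigrationMetadata[]) {
          // Nothing is actually locked in memory, so this is a no-op
        },
        async remove(migration: MigrationMetadata) {
          history.delete(migration.name);
        },
        async *getHistory() {
          // Entries are yielded in insertion order, i.e. execution order
          yield* history.values();
        },
        async onSuccess(migration: MigrationMetadataFinished) {
          history.set(migration.name, { name: migration.name, date: new Date(), status: 'done' });
        },
        async onError(migration: MigrationMetadataFinished, error: Record<string, unknown>) {
          history.set(migration.name, { name: migration.name, date: new Date(), status: 'failed', error });
        },
        async end() {
          // No connections or file handles to clean up
        },
      };
    },
  };
};
```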

docs/src/env.d.ts

@ -0,0 +1,3 @@
/* eslint-disable @typescript-eslint/triple-slash-reference */
/// <reference path="../.astro/types.d.ts" />
/// <reference types="astro/client" />

docs/src/tailwind.css

@ -0,0 +1,3 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

docs/tailwind.config.mjs

@ -0,0 +1,17 @@
import starlightPlugin from '@astrojs/starlight-tailwind';
import colors from 'tailwindcss/colors';
/** @type {import('tailwindcss').Config} */
// eslint-disable-next-line import/no-anonymous-default-export
export default {
content: ['./src/**/*.{astro,html,js,jsx,md,mdx,svelte,ts,tsx,vue}'],
theme: {
extend: {
colors: {
accent: colors.orange,
gray: colors.slate,
},
},
},
plugins: [starlightPlugin()],
};

docs/tsconfig.json

@ -0,0 +1,9 @@
{
"extends": "astro/tsconfigs/strictest",
"compilerOptions": {
"baseUrl": ".",
"paths": {
"@components/*": ["src/components/*"]
}
}
}


@ -10,7 +10,7 @@
"build:watch": "turbo run build:watch",
"checks": "turbo run checks",
"release": "run-s build && changeset publish",
"format": "prettier --write \"**/*.{ts,tsx,md,json,js}\"",
"format": "prettier --write \"**/*.{ts,tsx,md,json,js,mjs}\"",
"lint": "turbo run lint",
"test": "turbo run test",
"test:watch": "turbo run test:watch",
@ -37,9 +37,10 @@
"bugs": "https://github.com/aboviq/emigrate/issues",
"license": "MIT",
"volta": {
"node": "20.9.0",
"pnpm": "8.10.2"
"node": "22.15.0",
"pnpm": "9.4.0"
},
"packageManager": "pnpm@9.4.0",
"engines": {
"node": ">=18"
},
@ -61,27 +62,31 @@
},
"overrides": [
{
"files": "packages/**/*.test.ts",
"files": [
"packages/**/*.test.ts",
"packages/**/*.integration.ts"
],
"rules": {
"@typescript-eslint/no-floating-promises": 0
"@typescript-eslint/no-floating-promises": 0,
"max-params": 0
}
}
]
},
"dependencies": {
"@changesets/cli": "2.26.2",
"@commitlint/cli": "18.4.2",
"@commitlint/config-conventional": "18.4.2",
"@manypkg/cli": "0.21.0",
"@types/node": "20.9.2",
"@changesets/cli": "2.27.1",
"@commitlint/cli": "18.6.1",
"@commitlint/config-conventional": "18.6.1",
"@types/node": "20.10.4",
"glob": "10.3.10",
"husky": "8.0.3",
"lint-staged": "15.1.0",
"lint-staged": "15.2.0",
"npm-run-all": "4.1.5",
"prettier": "3.1.0",
"tsx": "4.1.2",
"turbo": "1.10.16",
"typescript": "5.2.2",
"prettier": "3.1.1",
"testcontainers": "10.24.2",
"tsx": "4.15.7",
"turbo": "2.0.5",
"typescript": "5.5.2",
"xo": "0.56.0"
}
}


@ -1,5 +1,220 @@
# @emigrate/cli
## 0.18.4
### Patch Changes
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
## 0.18.3
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.18.2
### Patch Changes
- 4152209: Handle the case where the config is returned as an object with a nested `default` property
## 0.18.1
### Patch Changes
- 57a0991: Clean up AbortSignal listeners when they are no longer needed to avoid MaxListenersExceededWarning when migrating many migrations at once
## 0.18.0
### Minor Changes
- c838ffb: Make it possible to write the Emigrate configuration file in TypeScript and load it using `tsx` in a NodeJS environment by importing packages provided using the `--import` CLI option before loading the configuration file. This makes it possible to run Emigrate in production with a configuration file written in TypeScript without having the `typescript` package installed.
- 18382ce: Add a built-in "json" reporter for outputting a single JSON object
- 18382ce: Rename the "default" reporter to "pretty" and make it possible to specify it using the `--reporter` CLI option or in the configuration file
### Patch Changes
- c838ffb: Don't use the `typescript` package for loading an Emigrate configuration file written in TypeScript in a Bun or Deno environment
## 0.17.2
### Patch Changes
- 61cbcbd: Force exiting after 10 seconds should not change the exit code, i.e. if all migrations have run successfully the exit code should be 0
## 0.17.1
### Patch Changes
- 543b7f6: Use setTimeout/setInterval from "node:timers" so that .unref() correctly works with Bun
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/plugin-tools@0.9.6
- @emigrate/types@0.12.1
## 0.17.0
### Minor Changes
- 0faebbe: Add support for passing the relative path to a migration file to remove from the history using the "remove" command
- 9109238: When the `--from` or `--to` CLI options are used the given migration name (or path to migration file) must exist. This is a BREAKING CHANGE from before. The reasoning is that by forcing the migrations to exist you avoid accidentally running migrations you don't intend to, because a simple typo could have the effect that many unwanted migrations are executed, so it's better to show an error in that case.
- 1f139fd: Completely rework how the "remove" command is run; this makes it more similar to the "up" and "list" commands, as it now also uses the `onMigrationStart`, `onMigrationSuccess` and `onMigrationError` reporter methods when reporting the command progress. It's also in preparation for adding `--from` and `--to` CLI options for the "remove" command, similar to how the same options work for the "up" command.
- 9109238: Add support for passing relative paths to migration files as the `--from` and `--to` CLI options. This is very useful from terminals that support autocomplete for file paths. It also makes it possible to copy the path to a migration file from Emigrate's output and use it as either `--from` or `--to` directly.
### Patch Changes
- f1b9098: Only include files when collecting migrations, i.e. it should be possible to have folders inside your migrations folder.
- 2f6b4d2: Don't dim decimal points in durations in the default reporter
- f2d4bb3: Set Emigrate error instance names from their respective constructor's name for consistency and correct error deserialization.
- ef45be9: Show number of skipped migrations correctly in the command output
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
- @emigrate/plugin-tools@0.9.5
## 0.16.2
### Patch Changes
- b56b6da: Handle migration history entries without file extensions for migration files with periods in their names that are not part of the file extension. Previously Emigrate would attempt to re-run these migrations, but now it will correctly ignore them. E.g. if the migration history contains an entry for "migration.file.name" and the migration file is named "migration.file.name.js", it will not be re-run.
## 0.16.1
### Patch Changes
- 121492b: Sort migration files lexicographically correctly by using the default Array.sort implementation
## 0.16.0
### Minor Changes
- a4da353: Handle process interruptions gracefully, e.g. due to receiving a SIGINT or SIGTERM signal. If a migration is currently running when the process is about to shut down, it will have a maximum of 10 more seconds to finish before being deserted (there's no way to cancel a promise sadly, and many database queries are not easy to abort either). The 10 second respite length can be customized using the --abort-respite CLI option or the abortRespite config.
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
- @emigrate/plugin-tools@0.9.4
## 0.15.0
### Minor Changes
- f515c8a: Add support for the --no-execution option to the "up" command to be able to log migrations as successful without actually running them. Can for instance be used for baselining a database or logging manually run migrations as successful.
- 9ef0fa2: Add --from and --to CLI options to control which migrations to include or skip when executing migrations.
- 02c142e: Add --limit option to the "up" command, for limiting the number of migrations to run
### Patch Changes
- bf4d596: Clarify which cli options that needs parameters
- 98adcda: Use better wording in the header in the console output from the default reporter
## 0.14.1
### Patch Changes
- 73a8a42: Support stored migration histories that have only stored the migration file names without file extensions and assume they are .js files in that case. This is to be compatible with a migration history generated by Immigration.
## 0.14.0
### Minor Changes
- b083e88: Upgrade cosmiconfig to 9.0.0
## 0.13.1
### Patch Changes
- 83dc618: Remove the --enable-source-maps flag from the shebang for better NodeJS compatibility
## 0.13.0
### Minor Changes
- 9a605a8: Add support for loading TypeScript migration files in the default loader
- 9a605a8: Add a guide for running migration files written in TypeScript to the documentation
## 0.12.0
### Minor Changes
- 9f91bdc: Add support for the `--import` option to import modules/packages before any command is run. This can for instance be used to load environment variables using the [dotenv](https://github.com/motdotla/dotenv) package with `--import dotenv/config`.
- f9a16d8: Add `color` option to the CLI and configuration file, which is used to force enable/disable color output from the reporter (the option is passed to the chosen reporter which should respect it)
- e6e4433: BREAKING CHANGE: Rename the `extension` short CLI option from `-e` to `-x` in preparation for an upcoming option that will take its place
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
- @emigrate/plugin-tools@0.9.3
## 0.11.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
- @emigrate/plugin-tools@0.9.2
## 0.11.1
### Patch Changes
- Updated dependencies [3a8b06b]
- @emigrate/plugin-tools@0.9.1
## 0.11.0
### Minor Changes
- ce6946c: Emigrate supports Bun, make use of the `bun` key in package.json `exports`
### Patch Changes
- Updated dependencies [ce6946c]
- @emigrate/plugin-tools@0.9.0
- @emigrate/types@0.9.0
## 0.10.0
### Minor Changes
- cae6d11: Make Emigrate Error instances deserializable using the serialize-error package, and also switch to its serializeError method
- cae6d11: Adapt to the new discriminating union types in @emigrate/types
### Patch Changes
- cae6d11: Shutdown the storage correctly in case of directory or file reading errors
- Updated dependencies [cae6d11]
- Updated dependencies [cae6d11]
- Updated dependencies [cae6d11]
- @emigrate/types@0.8.0
- @emigrate/plugin-tools@0.8.0
## 0.9.0
### Minor Changes
- 1434be5: The default reporter now prints the relative path instead of only the migration file name when logging migrations. Thanks to this, most shells support opening the corresponding migration file by clicking it.
- 1434be5: Print Emigrate CLI version when using the default reporter
## 0.8.0
### Minor Changes
- bad4e25: Pass the Emigrate CLI's version number to reporters
- 960ce08: Add --help and --version options to main command
### Patch Changes
- Updated dependencies [bad4e25]
- @emigrate/plugin-tools@0.7.0
## 0.7.0
### Minor Changes


@ -2,20 +2,104 @@
Emigrate is a tool for managing database migrations. It is designed to be simple yet able to support advanced setups, and to be modular and extensible.
📖 Read the [documentation](https://emigrate.dev) for more information!
## Installation
Install the Emigrate CLI in your project:
```bash
npm install --save-dev @emigrate/cli
npm install @emigrate/cli
# or
pnpm add @emigrate/cli
# or
yarn add @emigrate/cli
# or
bun add @emigrate/cli
```
## Usage
```text
Usage: emigrate <options>/<command>
Options:
-h, --help Show this help message and exit
-v, --version Print version number and exit
Commands:
up Run all pending migrations (or do a dry run)
new Create a new migration file
list List all migrations and their status
remove Remove entries from the migration history
```
### `emigrate up`
```text
Usage: emigrate up [options]
Run all pending migrations
Options:
-h, --help Show this help message and exit
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before running the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-s, --storage <name> The storage to use for where to store the migration history (required)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-r, --reporter <name> The reporter to use for reporting the migration progress
-l, --limit <count> Limit the number of migrations to run
-f, --from <name/path> Start running migrations from the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those after it lexicographically will be run
-t, --to <name/path> Skip migrations after the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those before it lexicographically will be run
--dry List the pending migrations that would be run without actually running them
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
--no-execution Mark the migrations as executed and successful without actually running them,
which is useful if you want to mark migrations as successful after running them manually
--abort-respite <sec> The number of seconds to wait before abandoning running migrations after the command has been aborted (default: 10)
Examples:
emigrate up --directory src/migrations -s fs
emigrate up -d ./migrations --storage @emigrate/mysql
emigrate up -d src/migrations -s postgres -r json --dry
emigrate up -d ./migrations -s mysql --import dotenv/config
emigrate up --limit 1
emigrate up --to 20231122120529381_some_migration_file.js
emigrate up --to 20231122120529381_some_migration_file.js --no-execution
```
### Examples
Create a new migration:
```bash
emigrate new -d migrations -e .js create some fancy table
npx emigrate new -d migrations create some fancy table
# or
pnpm emigrate new -d migrations create some fancy table
# or
yarn emigrate new -d migrations create some fancy table
# or
bunx --bun emigrate new -d migrations create some fancy table
```
This will create a new empty JavaScript migration file with the name "YYYYMMDDHHmmssuuu_create_some_fancy_table.js" in the `migrations` directory.


@ -1,8 +1,9 @@
{
"name": "@emigrate/cli",
"version": "0.7.0",
"version": "0.18.4",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "",
"type": "module",
@ -18,7 +19,8 @@
"emigrate": "dist/cli.js"
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo"
],
"scripts": {
"build": "tsc --pretty",
@ -35,7 +37,9 @@
"immigration"
],
"devDependencies": {
"@emigrate/tsconfig": "workspace:*"
"@emigrate/tsconfig": "workspace:*",
"@types/bun": "1.0.5",
"bun-types": "1.0.26"
},
"author": "Aboviq AB <dev@aboviq.com> (https://www.aboviq.com)",
"homepage": "https://github.com/aboviq/emigrate/tree/main/packages/cli#readme",
@ -44,13 +48,16 @@
"license": "MIT",
"dependencies": {
"@emigrate/plugin-tools": "workspace:*",
"ansis": "2.0.2",
"cosmiconfig": "8.3.6",
"@emigrate/types": "workspace:*",
"ansis": "2.0.3",
"cosmiconfig": "9.0.0",
"elegant-spinner": "3.0.0",
"figures": "6.0.1",
"import-from-esm": "1.3.3",
"is-interactive": "2.0.0",
"log-update": "6.0.0",
"pretty-ms": "8.0.0"
"pretty-ms": "8.0.0",
"serialize-error": "11.0.3"
},
"volta": {
"extends": "../../package.json"


@ -0,0 +1,5 @@
export async function* arrayMapAsync<T, U>(iterable: AsyncIterable<T>, mapper: (item: T) => U): AsyncIterable<U> {
for await (const item of iterable) {
yield mapper(item);
}
}


@ -1,13 +1,29 @@
#!/usr/bin/env node --enable-source-maps
#!/usr/bin/env node
import process from 'node:process';
import { parseArgs } from 'node:util';
import { ShowUsageError } from './errors.js';
import { setTimeout } from 'node:timers';
import importFromEsm from 'import-from-esm';
import { CommandAbortError, ShowUsageError } from './errors.js';
import { getConfig } from './get-config.js';
import { DEFAULT_RESPITE_SECONDS } from './defaults.js';
type Action = (args: string[]) => Promise<void>;
type Action = (args: string[], abortSignal: AbortSignal) => Promise<void>;
const up: Action = async (args) => {
const config = await getConfig('up');
const useColors = (values: { color?: boolean; 'no-color'?: boolean }) => {
if (values['no-color']) {
return false;
}
return values.color;
};
const importAll = async (cwd: string, modules: string[]) => {
for await (const module of modules) {
await importFromEsm(cwd, module);
}
};
const up: Action = async (args, abortSignal) => {
const { values } = parseArgs({
args,
options: {
@ -19,6 +35,12 @@ const up: Action = async (args) => {
type: 'string',
short: 'd',
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
reporter: {
type: 'string',
short: 'r',
@ -27,6 +49,18 @@ const up: Action = async (args) => {
type: 'string',
short: 's',
},
limit: {
type: 'string',
short: 'l',
},
from: {
type: 'string',
short: 'f',
},
to: {
type: 'string',
short: 't',
},
dry: {
type: 'boolean',
},
@ -36,6 +70,18 @@ const up: Action = async (args) => {
multiple: true,
default: [],
},
color: {
type: 'boolean',
},
'no-execution': {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
'abort-respite': {
type: 'string',
},
},
allowPositionals: false,
});
@ -47,17 +93,46 @@ Run all pending migrations
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-s, --storage The storage to use for where to store the migration history (required)
-p, --plugin The plugin(s) to use (can be specified multiple times)
-r, --reporter The reporter to use for reporting the migration progress
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before running the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-s, --storage <name> The storage to use for where to store the migration history (required)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-r, --reporter <name> The reporter to use for reporting the migration progress (default: pretty)
-l, --limit <count> Limit the number of migrations to run
-f, --from <name/path> Start running migrations from the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those after it lexicographically will be run
-t, --to <name/path> Skip migrations after the given migration name or relative file path to a migration file,
the given name or path needs to exist. The same migration and those before it lexicographically will be run
--dry List the pending migrations that would be run without actually running them
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
--no-execution Mark the migrations as executed and successful without actually running them,
which is useful if you want to mark migrations as successful after running them manually
--abort-respite <sec> The number of seconds to wait before abandoning running migrations after the command has been aborted (default: ${DEFAULT_RESPITE_SECONDS})
Examples:
emigrate up --directory src/migrations -s fs
emigrate up -d ./migrations --storage @emigrate/storage-mysql
emigrate up -d ./migrations --storage @emigrate/mysql
emigrate up -d src/migrations -s postgres -r json --dry
emigrate up -d ./migrations -s mysql --import dotenv/config
emigrate up --limit 1
emigrate up --to 20231122120529381_some_migration_file.js
emigrate up --to 20231122120529381_some_migration_file.js --no-execution
`;
if (values.help) {
@ -66,12 +141,65 @@ Examples:
return;
}
const { directory = config.directory, storage = config.storage, reporter = config.reporter, dry } = values;
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('up', forceImportTypeScriptAsIs);
const {
directory = config.directory,
storage = config.storage,
reporter = config.reporter,
dry,
from,
to,
limit: limitString,
'abort-respite': abortRespiteString,
'no-execution': noExecution,
} = values;
const plugins = [...(config.plugins ?? []), ...(values.plugin ?? [])];
const limit = limitString === undefined ? undefined : Number.parseInt(limitString, 10);
const abortRespite = abortRespiteString === undefined ? config.abortRespite : Number.parseInt(abortRespiteString, 10);
if (Number.isNaN(limit)) {
console.error('Invalid limit value, expected an integer but was:', limitString);
console.log(usage);
process.exitCode = 1;
return;
}
if (Number.isNaN(abortRespite)) {
console.error(
'Invalid abortRespite value, expected an integer but was:',
abortRespiteString ?? config.abortRespite,
);
console.log(usage);
process.exitCode = 1;
return;
}
try {
const { default: upCommand } = await import('./commands/up.js');
process.exitCode = await upCommand({ storage, reporter, directory, plugins, dry });
process.exitCode = await upCommand({
storage,
reporter,
directory,
plugins,
cwd,
dry,
limit,
from,
to,
noExecution,
abortSignal,
abortRespite: (abortRespite ?? DEFAULT_RESPITE_SECONDS) * 1000,
color: useColors(values),
});
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@ -85,7 +213,6 @@ Examples:
};
const newMigration: Action = async (args) => {
const config = await getConfig('new');
const { values, positionals } = parseArgs({
args,
options: {
@ -107,7 +234,7 @@ const newMigration: Action = async (args) => {
},
extension: {
type: 'string',
short: 'e',
short: 'x',
},
plugin: {
type: 'string',
@ -115,6 +242,18 @@ const newMigration: Action = async (args) => {
multiple: true,
default: [],
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
color: {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
},
allowPositionals: true,
});
@ -130,22 +269,34 @@ Arguments:
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-r, --reporter The reporter to use for reporting the migration file creation progress
-p, --plugin The plugin(s) to use (can be specified multiple times)
-t, --template A template file to use as contents for the new migration file
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before creating the migration (can be specified multiple times)
For example if you want to use Dotenv to load environment variables or when using TypeScript
-r, --reporter <name> The reporter to use for reporting the migration file creation progress (default: pretty)
-p, --plugin <name> The plugin(s) to use (can be specified multiple times)
-t, --template <path> A template file to use as contents for the new migration file
(if the extension option is not provided the template file's extension will be used)
-e, --extension The extension to use for the new migration file
-x, --extension <ext> The extension to use for the new migration file
(if no template or plugin is provided an empty migration file will be created with the given extension)
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
One of the --template, --extension or the --plugin options must be specified
Examples:
emigrate new -d src/migrations -t migration-template.js create users table
emigrate new --directory ./migrations --plugin @emigrate/plugin-generate-sql create_users_table
emigrate new -d ./migrations -e .sql create_users_table
emigrate new -d ./migrations -t .migration-template -e .sql "drop some table"
emigrate new --directory ./migrations --plugin @emigrate/postgres create_users_table
emigrate new -d ./migrations -x .sql create_users_table
emigrate new -d ./migrations -t .migration-template -x .sql "drop some table"
`;
if (values.help) {
@ -154,6 +305,15 @@ Examples:
return;
}
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('new', forceImportTypeScriptAsIs);
const {
directory = config.directory,
template = config.template,
@ -165,7 +325,7 @@ Examples:
try {
const { default: newCommand } = await import('./commands/new.js');
await newCommand({ directory, template, plugins, extension, reporter }, name);
await newCommand({ directory, template, plugins, extension, reporter, cwd, color: useColors(values) }, name);
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@ -179,7 +339,6 @@ Examples:
};
const list: Action = async (args) => {
const config = await getConfig('list');
const { values } = parseArgs({
args,
options: {
@ -191,6 +350,12 @@ const list: Action = async (args) => {
type: 'string',
short: 'd',
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
reporter: {
type: 'string',
short: 'r',
@ -199,6 +364,12 @@ const list: Action = async (args) => {
type: 'string',
short: 's',
},
color: {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
},
allowPositionals: false,
});
@ -210,9 +381,19 @@ List all migrations and their status. This command does not run any migrations.
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-r, --reporter The reporter to use for reporting the migrations
-s, --storage The storage to use to get the migration history (required)
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before listing the migrations (can be specified multiple times)
For example if you want to use Dotenv to load environment variables
-r, --reporter <name> The reporter to use for reporting the migrations (default: pretty)
-s, --storage <name> The storage to use to get the migration history (required)
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
Examples:
@ -226,11 +407,20 @@ Examples:
return;
}
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('list', forceImportTypeScriptAsIs);
const { directory = config.directory, storage = config.storage, reporter = config.reporter } = values;
try {
const { default: listCommand } = await import('./commands/list.js');
process.exitCode = await listCommand({ directory, storage, reporter });
process.exitCode = await listCommand({ directory, storage, reporter, cwd, color: useColors(values) });
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@ -244,7 +434,6 @@ Examples:
};
const remove: Action = async (args) => {
const config = await getConfig('remove');
const { values, positionals } = parseArgs({
args,
options: {
@ -256,6 +445,12 @@ const remove: Action = async (args) => {
type: 'string',
short: 'd',
},
import: {
type: 'string',
short: 'i',
multiple: true,
default: [],
},
force: {
type: 'boolean',
short: 'f',
@ -268,32 +463,50 @@ const remove: Action = async (args) => {
type: 'string',
short: 's',
},
color: {
type: 'boolean',
},
'no-color': {
type: 'boolean',
},
},
allowPositionals: true,
});
const usage = `Usage: emigrate remove [options] <name>
const usage = `Usage: emigrate remove [options] <name/path>
Remove entries from the migration history.
This is useful if you want to retry a migration that has failed.
Arguments:
name The name of the migration file to remove from the history (required)
name/path The name of or relative path to the migration file to remove from the history (required)
Options:
-h, --help Show this help message and exit
-d, --directory The directory where the migration files are located (required)
-r, --reporter The reporter to use for reporting the removal process
-s, --storage The storage to use to get the migration history (required)
-f, --force Force removal of the migration history entry even if the migration file does not exist
or it's in a non-failed state
-d, --directory <path> The directory where the migration files are located (required)
-i, --import <module> Additional modules/packages to import before removing the migration (can be specified multiple times)
For example if you want to use Dotenv to load environment variables
-r, --reporter <name> The reporter to use for reporting the removal process (default: pretty)
-s, --storage <name> The storage to use to get the migration history (required)
-f, --force Force removal of the migration history entry even if the migration is not in a failed state
--color Force color output (this option is passed to the reporter)
--no-color Disable color output (this option is passed to the reporter)
Examples:
emigrate remove -d migrations -s fs 20231122120529381_some_migration_file.js
emigrate remove --directory ./migrations --storage postgres 20231122120529381_some_migration_file.sql
emigrate remove -i dotenv/config -d ./migrations -s postgres 20231122120529381_some_migration_file.sql
emigrate remove -i dotenv/config -d ./migrations -s postgres migrations/20231122120529381_some_migration_file.sql
`;
if (values.help) {
@ -302,11 +515,23 @@ Examples:
return;
}
const cwd = process.cwd();
if (values.import) {
await importAll(cwd, values.import);
}
const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
const config = await getConfig('remove', forceImportTypeScriptAsIs);
const { directory = config.directory, storage = config.storage, reporter = config.reporter, force } = values;
try {
const { default: removeCommand } = await import('./commands/remove.js');
process.exitCode = await removeCommand({ directory, storage, reporter, force }, positionals[0] ?? '');
process.exitCode = await removeCommand(
{ directory, storage, reporter, force, cwd, color: useColors(values) },
positionals[0] ?? '',
);
} catch (error) {
if (error instanceof ShowUsageError) {
console.error(error.message, '\n');
@ -326,17 +551,29 @@ const commands: Record<string, Action> = {
new: newMigration,
};
const command = process.argv[2]?.toLowerCase();
const action = command ? commands[command] : undefined;
const main: Action = async (args, abortSignal) => {
const { values, positionals } = parseArgs({
args,
options: {
help: {
type: 'boolean',
short: 'h',
},
version: {
type: 'boolean',
short: 'v',
},
},
allowPositionals: true,
strict: false,
});
if (!action) {
if (command) {
console.error(`Unknown command: ${command}\n`);
} else {
console.error('No command specified\n');
}
const usage = `Usage: emigrate <options>/<command>
console.log(`Usage: emigrate <command>
Options:
-h, --help Show this help message and exit
-v, --version Print version number and exit
Commands:
@ -344,21 +581,65 @@ Commands:
new Create a new migration file
list List all migrations and their status
remove Remove entries from the migration history
`);
process.exit(1);
}
`;
try {
await action(process.argv.slice(3));
} catch (error) {
const command = positionals[0]?.toLowerCase();
const action = command ? commands[command] : undefined;
if (!action) {
if (command) {
console.error(`Unknown command: ${command}\n`);
} else if (values.version) {
const { version } = await import('./get-package-info.js');
console.log(version);
process.exitCode = 0;
return;
} else if (!values.help) {
console.error('No command specified\n');
}
console.log(usage);
process.exitCode = 1;
return;
}
try {
await action(process.argv.slice(3), abortSignal);
} catch (error) {
if (error instanceof Error) {
console.error(error.message);
console.error(error);
if (error.cause instanceof Error) {
console.error(error.cause.stack);
console.error(error.cause);
}
} else {
console.error(error);
}
process.exitCode = 1;
}
}
};
const controller = new AbortController();
process.on('SIGINT', () => {
controller.abort(CommandAbortError.fromSignal('SIGINT'));
});
process.on('SIGTERM', () => {
controller.abort(CommandAbortError.fromSignal('SIGTERM'));
});
process.on('uncaughtException', (error) => {
controller.abort(CommandAbortError.fromReason('Uncaught exception', error));
});
process.on('unhandledRejection', (error) => {
controller.abort(CommandAbortError.fromReason('Unhandled rejection', error));
});
await main(process.argv.slice(2), controller.signal);
setTimeout(() => {
console.error('Process did not exit within 10 seconds, forcing exit');
process.exit(process.exitCode);
}, 10_000).unref();


@ -0,0 +1,99 @@
import { describe, it } from 'node:test';
import assert from 'node:assert';
import { collectMigrations } from './collect-migrations.js';
import { toEntries, toEntry, toMigration, toMigrations } from './test-utils.js';
import { arrayFromAsync } from './array-from-async.js';
import { MigrationHistoryError } from './errors.js';
describe('collect-migrations', () => {
it('returns all migrations from the history and all pending migrations', async () => {
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* toEntries(['migration1.js', 'migration2.js']);
},
};
const getMigrations = async () => toMigrations(cwd, directory, ['migration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{
...toMigration(cwd, directory, 'migration1.js'),
duration: 0,
status: 'done',
},
{
...toMigration(cwd, directory, 'migration2.js'),
duration: 0,
status: 'done',
},
toMigration(cwd, directory, 'migration3.js'),
]);
});
it('includes any errors from the history', async () => {
const entry = toEntry('migration1.js', 'failed');
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* [entry];
},
};
const getMigrations = async () => toMigrations(cwd, directory, ['migration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{
...toMigration(cwd, directory, 'migration1.js'),
duration: 0,
status: 'failed',
error: MigrationHistoryError.fromHistoryEntry(entry),
},
toMigration(cwd, directory, 'migration2.js'),
toMigration(cwd, directory, 'migration3.js'),
]);
});
it('can handle a migration history without file extensions', async () => {
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* toEntries(['migration1']);
},
};
const getMigrations = async () => toMigrations(cwd, directory, ['migration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{ ...toMigration(cwd, directory, 'migration1.js'), duration: 0, status: 'done' },
toMigration(cwd, directory, 'migration2.js'),
toMigration(cwd, directory, 'migration3.js'),
]);
});
it('can handle a migration history without file extensions even if the migration name contains periods', async () => {
const cwd = '/cwd';
const directory = 'directory';
const history = {
async *[Symbol.asyncIterator]() {
yield* toEntries(['mig.ration1']);
},
};
const getMigrations = async () =>
toMigrations(cwd, directory, ['mig.ration1.js', 'migration2.js', 'migration3.js']);
const result = await arrayFromAsync(collectMigrations(cwd, directory, history, getMigrations));
assert.deepStrictEqual(result, [
{ ...toMigration(cwd, directory, 'mig.ration1.js'), duration: 0, status: 'done' },
toMigration(cwd, directory, 'migration2.js'),
toMigration(cwd, directory, 'migration3.js'),
]);
});
});


@ -1,30 +1,28 @@
import {
type MigrationHistoryEntry,
type MigrationMetadata,
type MigrationMetadataFinished,
} from '@emigrate/plugin-tools/types';
import { type MigrationHistoryEntry, type MigrationMetadata, type MigrationMetadataFinished } from '@emigrate/types';
import { toMigrationMetadata } from './to-migration-metadata.js';
import { getMigrations as getMigrationsOriginal } from './get-migrations.js';
import { getMigrations as getMigrationsOriginal, type GetMigrationsFunction } from './get-migrations.js';
export async function* collectMigrations(
cwd: string,
directory: string,
history: AsyncIterable<MigrationHistoryEntry>,
getMigrations = getMigrationsOriginal,
getMigrations: GetMigrationsFunction = getMigrationsOriginal,
): AsyncIterable<MigrationMetadata | MigrationMetadataFinished> {
const allMigrations = await getMigrations(cwd, directory);
const seen = new Set<string>();
for await (const entry of history) {
const index = allMigrations.findIndex((migrationFile) => migrationFile.name === entry.name);
const migration = allMigrations.find((migrationFile) => {
return migrationFile.name === entry.name || migrationFile.name === `${entry.name}.js`;
});
if (index === -1) {
if (!migration) {
continue;
}
yield toMigrationMetadata(entry, { cwd, directory });
yield toMigrationMetadata({ ...entry, name: migration.name }, { cwd, directory });
seen.add(entry.name);
seen.add(migration.name);
}
yield* allMigrations.filter((migration) => !seen.has(migration.name));


@ -1,59 +1,80 @@
import process from 'node:process';
import { getOrLoadReporter, getOrLoadStorage } from '@emigrate/plugin-tools';
import { BadOptionError, MissingOptionError, StorageInitError } from '../errors.js';
import { BadOptionError, MissingOptionError, StorageInitError, toError } from '../errors.js';
import { type Config } from '../types.js';
import { exec } from '../exec.js';
import { migrationRunner } from '../migration-runner.js';
import { arrayFromAsync } from '../array-from-async.js';
import { collectMigrations } from '../collect-migrations.js';
import { version } from '../get-package-info.js';
import { getStandardReporter } from '../reporters/get.js';
const lazyDefaultReporter = async () => import('../reporters/default.js');
type ExtraFlags = {
cwd: string;
};
export default async function listCommand({ directory, reporter: reporterConfig, storage: storageConfig }: Config) {
export default async function listCommand({
directory,
reporter: reporterConfig,
storage: storageConfig,
color,
cwd,
}: Config & ExtraFlags): Promise<number> {
if (!directory) {
throw new MissingOptionError('directory');
throw MissingOptionError.fromOption('directory');
}
const cwd = process.cwd();
const storagePlugin = await getOrLoadStorage([storageConfig]);
if (!storagePlugin) {
throw new BadOptionError('storage', 'No storage found, please specify a storage using the storage option');
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
}
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw new BadOptionError(
throw BadOptionError.fromOption(
'reporter',
'No reporter found, please specify an existing reporter using the reporter option',
);
}
await reporter.onInit?.({ command: 'list', cwd, dry: false, directory });
await reporter.onInit?.({ command: 'list', version, cwd, dry: false, directory, color });
const [storage, storageError] = await exec(async () => storagePlugin.initializeStorage());
if (storageError) {
await reporter.onFinished?.([], new StorageInitError('Could not initialize storage', { cause: storageError }));
await reporter.onFinished?.([], StorageInitError.fromError(storageError));
return 1;
}
try {
const collectedMigrations = collectMigrations(cwd, directory, storage.getHistory());
const error = await migrationRunner({
dry: true,
reporter,
storage,
migrations: await arrayFromAsync(collectedMigrations),
migrations: collectedMigrations,
async validate() {
// No-op
},
async execute() {
throw new Error('Unexpected execute call');
},
async onSuccess() {
throw new Error('Unexpected onSuccess call');
},
async onError() {
throw new Error('Unexpected onError call');
},
});
return error ? 1 : 0;
} catch (error) {
await reporter.onFinished?.([], toError(error));
return 1;
} finally {
await storage.end();
}
}


@ -1,49 +1,61 @@
import process from 'node:process';
import { hrtime } from 'node:process';
import fs from 'node:fs/promises';
import path from 'node:path';
import { getTimestampPrefix, sanitizeMigrationName, getOrLoadPlugin, getOrLoadReporter } from '@emigrate/plugin-tools';
import { type MigrationMetadata } from '@emigrate/plugin-tools/types';
import { BadOptionError, MissingArgumentsError, MissingOptionError, UnexpectedError } from '../errors.js';
import { type MigrationMetadataFinished, type MigrationMetadata, isFailedMigration } from '@emigrate/types';
import {
BadOptionError,
EmigrateError,
MissingArgumentsError,
MissingOptionError,
UnexpectedError,
toError,
} from '../errors.js';
import { type Config } from '../types.js';
import { withLeadingPeriod } from '../with-leading-period.js';
import { version } from '../get-package-info.js';
import { getDuration } from '../get-duration.js';
import { getStandardReporter } from '../reporters/get.js';
const lazyDefaultReporter = async () => import('../reporters/default.js');
type ExtraFlags = {
cwd: string;
};
export default async function newCommand(
{ directory, template, reporter: reporterConfig, plugins = [], extension }: Config,
{ directory, template, reporter: reporterConfig, plugins = [], cwd, extension, color }: Config & ExtraFlags,
name: string,
) {
): Promise<void> {
if (!directory) {
throw new MissingOptionError('directory');
throw MissingOptionError.fromOption('directory');
}
if (!name) {
throw new MissingArgumentsError('name');
throw MissingArgumentsError.fromArgument('name');
}
if (!extension && !template && plugins.length === 0) {
throw new MissingOptionError(['extension', 'template', 'plugin']);
throw MissingOptionError.fromOption(['extension', 'template', 'plugin']);
}
const cwd = process.cwd();
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw new BadOptionError(
throw BadOptionError.fromOption(
'reporter',
'No reporter found, please specify an existing reporter using the reporter option',
);
}
await reporter.onInit?.({ command: 'new', cwd, dry: false, directory });
await reporter.onInit?.({ command: 'new', version, cwd, dry: false, directory, color });
const start = hrtime();
let filename: string | undefined;
let content: string | undefined;
if (template) {
const fs = await import('node:fs/promises');
const templatePath = path.resolve(process.cwd(), template);
const templatePath = path.resolve(cwd, template);
const fileExtension = path.extname(templatePath);
try {
@ -81,13 +93,13 @@ export default async function newCommand(
}
if (!filename || content === undefined) {
throw new BadOptionError(
throw BadOptionError.fromOption(
'plugin',
'No generator plugin found, please specify a generator plugin using the plugin option',
);
}
const directoryPath = path.resolve(process.cwd(), directory);
const directoryPath = path.resolve(cwd, directory);
const filePath = path.resolve(directoryPath, filename);
const migration: MigrationMetadata = {
@ -101,19 +113,31 @@ export default async function newCommand(
await reporter.onNewMigration?.(migration, content);
let saveError: Error | undefined;
const finishedMigrations: MigrationMetadataFinished[] = [];
try {
await createDirectory(directoryPath);
await saveFile(filePath, content);
const duration = getDuration(start);
finishedMigrations.push({ ...migration, status: 'done', duration });
} catch (error) {
saveError = error instanceof Error ? error : new Error(String(error));
const duration = getDuration(start);
const errorInstance = toError(error);
finishedMigrations.push({ ...migration, status: 'failed', duration, error: errorInstance });
}
await reporter.onFinished?.(
[{ ...migration, status: saveError ? 'failed' : 'done', error: saveError, duration: 0 }],
saveError,
);
// eslint-disable-next-line unicorn/no-array-callback-reference
const firstFailed = finishedMigrations.find(isFailedMigration);
const firstError =
firstFailed?.error instanceof EmigrateError
? firstFailed.error
: firstFailed
? new UnexpectedError(`Failed to create migration file: ${firstFailed.relativeFilePath}`, {
cause: firstFailed?.error,
})
: undefined;
await reporter.onFinished?.(finishedMigrations, firstError);
}
async function createDirectory(directoryPath: string) {


@ -0,0 +1,305 @@
import { describe, it } from 'node:test';
import assert from 'node:assert';
import { type EmigrateReporter, type Storage, type Plugin, type MigrationMetadataFinished } from '@emigrate/types';
import { deserializeError } from 'serialize-error';
import { version } from '../get-package-info.js';
import {
BadOptionError,
MigrationNotRunError,
MigrationRemovalError,
OptionNeededError,
StorageInitError,
} from '../errors.js';
import {
assertErrorEqualEnough,
getErrorCause,
getMockedReporter,
getMockedStorage,
toEntry,
toMigrations,
type Mocked,
} from '../test-utils.js';
import removeCommand from './remove.js';
describe('remove', () => {
it("returns 1 and finishes with an error when the storage couldn't be initialized", async () => {
const { reporter, run } = getRemoveCommand([]);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFailed(reporter, StorageInitError.fromError(new Error('No storage configured')));
});
it('returns 1 and finishes with an error when the given migration has not been executed', async () => {
const storage = getMockedStorage(['some_other_migration.js']);
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[
{
name: 'some_migration.js',
status: 'failed',
error: new MigrationNotRunError('Migration "some_migration.js" is not in the migration history'),
},
],
new MigrationNotRunError('Migration "some_migration.js" is not in the migration history'),
);
});
it('returns 1 and finishes with an error when the given migration is not in a failed state in the history', async () => {
const storage = getMockedStorage(['1_old_migration.js', '2_some_migration.js', '3_new_migration.js']);
const { reporter, run } = getRemoveCommand(['2_some_migration.js'], storage);
const exitCode = await run('2_some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[
{
name: '2_some_migration.js',
status: 'failed',
error: OptionNeededError.fromOption(
'force',
'The migration "2_some_migration.js" is not in a failed state. Use the "force" option to force its removal',
),
},
],
OptionNeededError.fromOption(
'force',
'The migration "2_some_migration.js" is not in a failed state. Use the "force" option to force its removal',
),
);
});
it('returns 1 and finishes with an error when the given migration does not exist at all', async () => {
const storage = getMockedStorage(['some_migration.js']);
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_other_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[],
BadOptionError.fromOption('name', 'The migration: "migrations/some_other_migration.js" was not found'),
);
});
it('returns 0, removes the migration from the history and finishes without an error when the given migration is in a failed state', async () => {
const storage = getMockedStorage([toEntry('some_migration.js', 'failed')]);
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 0, 'Exit code');
assertPreconditionsFulfilled(reporter, storage, [{ name: 'some_migration.js', status: 'done', started: true }]);
});
it('returns 0, removes the migration from the history and finishes without an error when the given migration is not in a failed state but "force" is true', async () => {
const storage = getMockedStorage(['1_old_migration.js', '2_some_migration.js', '3_new_migration.js']);
const { reporter, run } = getRemoveCommand(['2_some_migration.js'], storage);
const exitCode = await run('2_some_migration.js', { force: true });
assert.strictEqual(exitCode, 0, 'Exit code');
assertPreconditionsFulfilled(reporter, storage, [{ name: '2_some_migration.js', status: 'done', started: true }]);
});
it('returns 1 and finishes with an error when the removal of the migration crashes', async () => {
const storage = getMockedStorage([toEntry('some_migration.js', 'failed')]);
storage.remove.mock.mockImplementation(async () => {
throw new Error('Some error');
});
const { reporter, run } = getRemoveCommand(['some_migration.js'], storage);
const exitCode = await run('some_migration.js');
assert.strictEqual(exitCode, 1, 'Exit code');
assertPreconditionsFulfilled(
reporter,
storage,
[
{
name: 'some_migration.js',
status: 'failed',
error: new Error('Some error'),
started: true,
},
],
new MigrationRemovalError('Failed to remove migration: migrations/some_migration.js', {
cause: new Error('Some error'),
}),
);
});
});
function getRemoveCommand(migrationFiles: string[], storage?: Mocked<Storage>, plugins?: Plugin[]) {
const reporter = getMockedReporter();
const run = async (
name: string,
options?: Omit<Parameters<typeof removeCommand>[0], 'cwd' | 'directory' | 'storage' | 'reporter' | 'plugins'>,
) => {
return removeCommand(
{
cwd: '/emigrate',
directory: 'migrations',
storage: {
async initializeStorage() {
if (!storage) {
throw new Error('No storage configured');
}
return storage;
},
},
reporter,
plugins: plugins ?? [],
async getMigrations(cwd, directory) {
return toMigrations(cwd, directory, migrationFiles);
},
...options,
},
name,
);
};
return {
reporter,
storage,
run,
};
}
function assertPreconditionsFailed(reporter: Mocked<Required<EmigrateReporter>>, finishedError?: Error) {
assert.strictEqual(reporter.onInit.mock.calls.length, 1);
assert.deepStrictEqual(reporter.onInit.mock.calls[0]?.arguments, [
{
command: 'remove',
cwd: '/emigrate',
version,
dry: false,
color: undefined,
directory: 'migrations',
},
]);
assert.strictEqual(reporter.onCollectedMigrations.mock.calls.length, 0, 'Collected call');
assert.strictEqual(reporter.onLockedMigrations.mock.calls.length, 0, 'Locked call');
assert.strictEqual(reporter.onMigrationStart.mock.calls.length, 0, 'Started migrations');
assert.strictEqual(reporter.onMigrationSuccess.mock.calls.length, 0, 'Successful migrations');
assert.strictEqual(reporter.onMigrationError.mock.calls.length, 0, 'Failed migrations');
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
// hackety hack:
if (finishedError) {
finishedError.stack = error?.stack;
}
assert.deepStrictEqual(error, finishedError, 'Finished error');
const cause = getErrorCause(error);
const expectedCause = finishedError?.cause;
assert.deepStrictEqual(
cause,
expectedCause ? deserializeError(expectedCause) : expectedCause,
'Finished error cause',
);
assert.strictEqual(entries?.length, 0, 'Finished entries length');
}
function assertPreconditionsFulfilled(
reporter: Mocked<Required<EmigrateReporter>>,
storage: Mocked<Storage>,
expected: Array<{ name: string; status: MigrationMetadataFinished['status']; started?: boolean; error?: Error }>,
finishedError?: Error,
) {
assert.strictEqual(reporter.onInit.mock.calls.length, 1);
assert.deepStrictEqual(reporter.onInit.mock.calls[0]?.arguments, [
{
command: 'remove',
cwd: '/emigrate',
version,
dry: false,
color: undefined,
directory: 'migrations',
},
]);
let started = 0;
let done = 0;
let failed = 0;
let skipped = 0;
let pending = 0;
let failedAndStarted = 0;
const failedEntries: typeof expected = [];
const successfulEntries: typeof expected = [];
for (const entry of expected) {
if (entry.started) {
started++;
}
// eslint-disable-next-line default-case
switch (entry.status) {
case 'done': {
done++;
if (entry.started) {
successfulEntries.push(entry);
}
break;
}
case 'failed': {
failed++;
failedEntries.push(entry);
if (entry.started) {
failedAndStarted++;
}
break;
}
case 'skipped': {
skipped++;
break;
}
case 'pending': {
pending++;
break;
}
}
}
assert.strictEqual(reporter.onCollectedMigrations.mock.calls.length, 1, 'Collected call');
assert.strictEqual(storage.lock.mock.calls.length, 0, 'Storage lock never called');
assert.strictEqual(storage.unlock.mock.calls.length, 0, 'Storage unlock never called');
assert.strictEqual(reporter.onLockedMigrations.mock.calls.length, 0, 'Locked call');
assert.strictEqual(reporter.onMigrationStart.mock.calls.length, started, 'Started migrations');
assert.strictEqual(reporter.onMigrationSuccess.mock.calls.length, successfulEntries.length, 'Successful migrations');
assert.strictEqual(storage.remove.mock.calls.length, started, 'Storage remove called');
assert.strictEqual(reporter.onMigrationError.mock.calls.length, failedEntries.length, 'Failed migrations');
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
assertErrorEqualEnough(error, finishedError, 'Finished error');
assert.strictEqual(entries?.length, expected.length, 'Finished entries length');
assert.deepStrictEqual(
entries.map((entry) => `${entry.name} (${entry.status})`),
expected.map((entry) => `${entry.name} (${entry.status})`),
'Finished entries',
);
assert.strictEqual(storage.end.mock.calls.length, 1, 'Storage end called once');
}


@ -1,123 +1,151 @@
import process from 'node:process';
import path from 'node:path';
import { getOrLoadReporter, getOrLoadStorage } from '@emigrate/plugin-tools';
import { type MigrationHistoryEntry, type MigrationMetadataFinished } from '@emigrate/plugin-tools/types';
import { type MigrationMetadata, isFinishedMigration } from '@emigrate/types';
import {
BadOptionError,
MigrationNotRunError,
MigrationRemovalError,
MissingArgumentsError,
MissingOptionError,
OptionNeededError,
StorageInitError,
toError,
} from '../errors.js';
import { type Config } from '../types.js';
import { getMigration } from '../get-migration.js';
import { getDuration } from '../get-duration.js';
import { exec } from '../exec.js';
import { version } from '../get-package-info.js';
import { collectMigrations } from '../collect-migrations.js';
import { migrationRunner } from '../migration-runner.js';
import { arrayMapAsync } from '../array-map-async.js';
import { type GetMigrationsFunction } from '../get-migrations.js';
import { getStandardReporter } from '../reporters/get.js';
type ExtraFlags = {
cwd: string;
force?: boolean;
getMigrations?: GetMigrationsFunction;
};
const lazyDefaultReporter = async () => import('../reporters/default.js');
type RemovableMigrationMetadata = MigrationMetadata & { originalStatus?: 'done' | 'failed' };
export default async function removeCommand(
{ directory, reporter: reporterConfig, storage: storageConfig, force }: Config & ExtraFlags,
{
directory,
reporter: reporterConfig,
storage: storageConfig,
color,
cwd,
force = false,
getMigrations,
}: Config & ExtraFlags,
name: string,
) {
): Promise<number> {
if (!directory) {
throw new MissingOptionError('directory');
throw MissingOptionError.fromOption('directory');
}
if (!name) {
throw new MissingArgumentsError('name');
throw MissingArgumentsError.fromArgument('name');
}
const cwd = process.cwd();
const storagePlugin = await getOrLoadStorage([storageConfig]);
if (!storagePlugin) {
throw new BadOptionError('storage', 'No storage found, please specify a storage using the storage option');
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
}
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw new BadOptionError(
throw BadOptionError.fromOption(
'reporter',
'No reporter found, please specify an existing reporter using the reporter option',
);
}
await reporter.onInit?.({ command: 'remove', version, cwd, dry: false, directory, color });
const [storage, storageError] = await exec(async () => storagePlugin.initializeStorage());
if (storageError) {
await reporter.onFinished?.([], new StorageInitError('Could not initialize storage', { cause: storageError }));
await reporter.onFinished?.([], StorageInitError.fromError(storageError));
return 1;
}
await reporter.onInit?.({ command: 'remove', cwd, dry: false, directory });
const migrationFile = await getMigration(cwd, directory, name, !force);
const finishedMigrations: MigrationMetadataFinished[] = [];
let historyEntry: MigrationHistoryEntry | undefined;
let removalError: Error | undefined;
for await (const migrationHistoryEntry of storage.getHistory()) {
if (migrationHistoryEntry.name !== migrationFile.name) {
continue;
}
if (migrationHistoryEntry.status === 'done' && !force) {
removalError = new OptionNeededError(
'force',
`The migration "${migrationFile.name}" is not in a failed state. Use the "force" option to force its removal`,
);
} else {
historyEntry = migrationHistoryEntry;
}
}
await reporter.onMigrationRemoveStart?.(migrationFile);
const start = process.hrtime();
if (historyEntry) {
try {
await storage.remove(migrationFile);
const collectedMigrations = arrayMapAsync(
collectMigrations(cwd, directory, storage.getHistory(), getMigrations),
(migration) => {
if (isFinishedMigration(migration)) {
if (migration.status === 'failed') {
const { status, duration, error, ...pendingMigration } = migration;
const removableMigration: RemovableMigrationMetadata = { ...pendingMigration, originalStatus: status };
const duration = getDuration(start);
const finishedMigration: MigrationMetadataFinished = { ...migrationFile, status: 'done', duration };
await reporter.onMigrationRemoveSuccess?.(finishedMigration);
finishedMigrations.push(finishedMigration);
} catch (error) {
removalError = error instanceof Error ? error : new Error(String(error));
return removableMigration;
}
} else if (!removalError) {
removalError = new MigrationNotRunError(
`Migration "${migrationFile.name}" is not in the migration history`,
migrationFile,
if (migration.status === 'done') {
const { status, duration, ...pendingMigration } = migration;
const removableMigration: RemovableMigrationMetadata = { ...pendingMigration, originalStatus: status };
return removableMigration;
}
throw new Error(`Unexpected migration status: ${migration.status}`);
}
return migration as RemovableMigrationMetadata;
},
);
if (!name.includes(path.sep)) {
name = path.join(directory, name);
}
const error = await migrationRunner({
dry: false,
lock: false,
name,
reporter,
storage,
migrations: collectedMigrations,
migrationFilter(migration) {
return migration.relativeFilePath === name;
},
async validate(migration) {
if (migration.originalStatus === 'done' && !force) {
throw OptionNeededError.fromOption(
'force',
`The migration "${migration.name}" is not in a failed state. Use the "force" option to force its removal`,
);
}
if (removalError) {
const duration = getDuration(start);
const finishedMigration: MigrationMetadataFinished = {
...migrationFile,
status: 'failed',
error: removalError,
duration,
};
await reporter.onMigrationRemoveError?.(finishedMigration, removalError);
finishedMigrations.push(finishedMigration);
if (!migration.originalStatus) {
throw MigrationNotRunError.fromMetadata(migration);
}
},
async execute(migration) {
try {
await storage.remove(migration);
} catch (error) {
throw MigrationRemovalError.fromMetadata(migration, toError(error));
}
},
async onSuccess() {
// No-op
},
async onError() {
// No-op
},
});
await reporter.onFinished?.(finishedMigrations, removalError);
return error ? 1 : 0;
} catch (error) {
await reporter.onFinished?.([], toError(error));
return 1;
} finally {
await storage.end();
return removalError ? 1 : 0;
}
}

File diff suppressed because it is too large


@ -1,67 +1,84 @@
import process from 'node:process';
import path from 'node:path';
import { getOrLoadPlugins, getOrLoadReporter, getOrLoadStorage } from '@emigrate/plugin-tools';
import { isFinishedMigration, type LoaderPlugin } from '@emigrate/plugin-tools/types';
import { BadOptionError, MigrationLoadError, MissingOptionError, StorageInitError } from '../errors.js';
import { isFinishedMigration, type LoaderPlugin } from '@emigrate/types';
import {
BadOptionError,
MigrationLoadError,
MissingOptionError,
StorageInitError,
toError,
toSerializedError,
} from '../errors.js';
import { type Config } from '../types.js';
import { withLeadingPeriod } from '../with-leading-period.js';
import { type GetMigrationsFunction } from '../get-migrations.js';
import { exec } from '../exec.js';
import { migrationRunner } from '../migration-runner.js';
import { filterAsync } from '../filter-async.js';
import { collectMigrations } from '../collect-migrations.js';
import { arrayFromAsync } from '../array-from-async.js';
import { version } from '../get-package-info.js';
import { getStandardReporter } from '../reporters/get.js';
type ExtraFlags = {
cwd?: string;
cwd: string;
dry?: boolean;
limit?: number;
from?: string;
to?: string;
noExecution?: boolean;
getMigrations?: GetMigrationsFunction;
abortSignal?: AbortSignal;
abortRespite?: number;
};
const lazyDefaultReporter = async () => import('../reporters/default.js');
const lazyPluginLoaderJs = async () => import('../plugin-loader-js.js');
export default async function upCommand({
storage: storageConfig,
reporter: reporterConfig,
directory,
color,
limit,
from,
to,
noExecution,
abortSignal,
abortRespite,
dry = false,
plugins = [],
cwd = process.cwd(),
cwd,
getMigrations,
}: Config & ExtraFlags): Promise<number> {
if (!directory) {
throw new MissingOptionError('directory');
throw MissingOptionError.fromOption('directory');
}
const storagePlugin = await getOrLoadStorage([storageConfig]);
if (!storagePlugin) {
throw new BadOptionError('storage', 'No storage found, please specify a storage using the storage option');
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
}
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) {
throw new BadOptionError(
throw BadOptionError.fromOption(
'reporter',
'No reporter found, please specify an existing reporter using the reporter option',
);
}
await reporter.onInit?.({ command: 'up', cwd, dry, directory });
await reporter.onInit?.({ command: 'up', version, cwd, dry, directory, color });
const [storage, storageError] = await exec(async () => storagePlugin.initializeStorage());
if (storageError) {
await reporter.onFinished?.([], new StorageInitError('Could not initialize storage', { cause: storageError }));
await reporter.onFinished?.([], StorageInitError.fromError(storageError));
return 1;
}
const collectedMigrations = filterAsync(
collectMigrations(cwd, directory, storage.getHistory(), getMigrations),
(migration) => !isFinishedMigration(migration) || migration.status === 'failed',
);
try {
const collectedMigrations = collectMigrations(cwd, directory, storage.getHistory(), getMigrations);
const loaderPlugins = await getOrLoadPlugins('loader', [lazyPluginLoaderJs, ...plugins]);
@ -79,31 +96,69 @@ export default async function upCommand({
return loaderByExtension.get(extension);
};
if (from && !from.includes(path.sep)) {
from = path.join(directory, from);
}
if (to && !to.includes(path.sep)) {
to = path.join(directory, to);
}
const error = await migrationRunner({
dry,
limit,
from,
to,
abortSignal,
abortRespite,
reporter,
storage,
migrations: await arrayFromAsync(collectedMigrations),
migrations: collectedMigrations,
migrationFilter(migration) {
return !isFinishedMigration(migration) || migration.status === 'failed';
},
async validate(migration) {
if (noExecution) {
return;
}
const loader = getLoaderByExtension(migration.extension);
if (!loader) {
throw new BadOptionError('plugin', `No loader plugin found for file extension: ${migration.extension}`);
throw BadOptionError.fromOption(
'plugin',
`No loader plugin found for file extension: ${migration.extension}`,
);
}
},
async execute(migration) {
if (noExecution) {
return;
}
const loader = getLoaderByExtension(migration.extension)!;
const [migrationFunction, loadError] = await exec(async () => loader.loadMigration(migration));
if (loadError) {
throw new MigrationLoadError(`Failed to load migration file: ${migration.relativeFilePath}`, migration, {
cause: loadError,
});
throw MigrationLoadError.fromMetadata(migration, loadError);
}
await migrationFunction();
},
async onSuccess(migration) {
await storage.onSuccess(migration);
},
async onError(migration, error) {
await storage.onError(migration, toSerializedError(error));
},
});
return error ? 1 : 0;
} catch (error) {
await reporter.onFinished?.([], toError(error));
return 1;
} finally {
await storage.end();
}
}


@ -0,0 +1,2 @@
// eslint-disable-next-line @typescript-eslint/naming-convention
export const DEFAULT_RESPITE_SECONDS = 10;

packages/cli/src/deno.d.ts

@ -0,0 +1,6 @@
declare global {
// eslint-disable-next-line @typescript-eslint/naming-convention
const Deno: any;
}
export {};


@ -1,98 +1,198 @@
import { type MigrationHistoryEntry, type MigrationMetadata } from '@emigrate/plugin-tools/types';
import {
type SerializedError,
type MigrationMetadata,
type FailedMigrationMetadata,
type FailedMigrationHistoryEntry,
} from '@emigrate/types';
import { serializeError, errorConstructors, deserializeError } from 'serialize-error';
const formatter = new Intl.ListFormat('en', { style: 'long', type: 'disjunction' });
export const toError = (error: unknown) => (error instanceof Error ? error : new Error(String(error)));
export const toError = (error: unknown): Error => (error instanceof Error ? error : new Error(String(error)));
export const toSerializedError = (error: unknown) => {
const errorInstance = toError(error);
return serializeError(errorInstance) as unknown as SerializedError;
};
export class EmigrateError extends Error {
constructor(
public code: string,
message: string,
message: string | undefined,
options?: ErrorOptions,
public code?: string,
) {
super(message, options);
this.name = this.constructor.name;
}
}
export class ShowUsageError extends EmigrateError {}
export class MissingOptionError extends ShowUsageError {
constructor(public option: string | string[]) {
super('ERR_MISSING_OPT', `Missing required option: ${Array.isArray(option) ? formatter.format(option) : option}`);
static fromOption(option: string | string[]): MissingOptionError {
return new MissingOptionError(
`Missing required option: ${Array.isArray(option) ? formatter.format(option) : option}`,
undefined,
option,
);
}
constructor(
message: string | undefined,
options?: ErrorOptions,
public option: string | string[] = '',
) {
super(message, options, 'ERR_MISSING_OPT');
}
}
export class MissingArgumentsError extends ShowUsageError {
constructor(public argument: string) {
super('ERR_MISSING_ARGS', `Missing required argument: ${argument}`);
static fromArgument(argument: string): MissingArgumentsError {
return new MissingArgumentsError(`Missing required argument: ${argument}`, undefined, argument);
}
constructor(
message: string | undefined,
options?: ErrorOptions,
public argument = '',
) {
super(message, options, 'ERR_MISSING_ARGS');
}
}
export class OptionNeededError extends ShowUsageError {
static fromOption(option: string, message: string): OptionNeededError {
return new OptionNeededError(message, undefined, option);
}
constructor(
public option: string,
message: string,
message: string | undefined,
options?: ErrorOptions,
public option = '',
) {
super('ERR_OPT_NEEDED', message);
super(message, options, 'ERR_OPT_NEEDED');
}
}
export class BadOptionError extends ShowUsageError {
static fromOption(option: string, message: string): BadOptionError {
return new BadOptionError(message, undefined, option);
}
constructor(
public option: string,
message: string,
message: string | undefined,
options?: ErrorOptions,
public option = '',
) {
super('ERR_BAD_OPT', message);
super(message, options, 'ERR_BAD_OPT');
}
}
export class UnexpectedError extends EmigrateError {
constructor(message: string, options?: ErrorOptions) {
super('ERR_UNEXPECTED', message, options);
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_UNEXPECTED');
}
}
export class MigrationHistoryError extends EmigrateError {
constructor(
message: string,
public entry: MigrationHistoryEntry,
) {
super('ERR_MIGRATION_HISTORY', message, { cause: entry.error });
static fromHistoryEntry(entry: FailedMigrationHistoryEntry): MigrationHistoryError {
return new MigrationHistoryError(`Migration ${entry.name} is in a failed state, it should be fixed and removed`, {
cause: deserializeError(entry.error),
});
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_MIGRATION_HISTORY');
}
}
export class MigrationLoadError extends EmigrateError {
constructor(
message: string,
public metadata: MigrationMetadata,
options?: ErrorOptions,
) {
super('ERR_MIGRATION_LOAD', message, options);
static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationLoadError {
return new MigrationLoadError(`Failed to load migration file: ${metadata.relativeFilePath}`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_MIGRATION_LOAD');
}
}
export class MigrationRunError extends EmigrateError {
constructor(
message: string,
public metadata: MigrationMetadata,
options?: ErrorOptions,
) {
super('ERR_MIGRATION_RUN', message, options);
static fromMetadata(metadata: FailedMigrationMetadata): MigrationRunError {
return new MigrationRunError(`Failed to run migration: ${metadata.relativeFilePath}`, { cause: metadata.error });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_MIGRATION_RUN');
}
}
export class MigrationNotRunError extends EmigrateError {
constructor(
message: string,
public metadata: MigrationMetadata,
options?: ErrorOptions,
) {
super('ERR_MIGRATION_NOT_RUN', message, options);
static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationNotRunError {
return new MigrationNotRunError(`Migration "${metadata.name}" is not in the migration history`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_MIGRATION_NOT_RUN');
}
}
export class MigrationRemovalError extends EmigrateError {
static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationRemovalError {
return new MigrationRemovalError(`Failed to remove migration: ${metadata.relativeFilePath}`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_MIGRATION_REMOVE');
}
}
export class StorageInitError extends EmigrateError {
constructor(message: string, options?: ErrorOptions) {
super('ERR_STORAGE_INIT', message, options);
static fromError(error: Error): StorageInitError {
return new StorageInitError('Could not initialize storage', { cause: error });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_STORAGE_INIT');
}
}
export class CommandAbortError extends EmigrateError {
static fromSignal(signal: NodeJS.Signals): CommandAbortError {
return new CommandAbortError(`Command aborted due to signal: ${signal}`);
}
static fromReason(reason: string, cause?: unknown): CommandAbortError {
return new CommandAbortError(`Command aborted: ${reason}`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_COMMAND_ABORT');
}
}
export class ExecutionDesertedError extends EmigrateError {
static fromReason(reason: string, cause?: Error): ExecutionDesertedError {
return new ExecutionDesertedError(`Execution deserted: ${reason}`, { cause });
}
constructor(message: string | undefined, options?: ErrorOptions) {
super(message, options, 'ERR_EXECUTION_DESERTED');
}
}
errorConstructors.set('EmigrateError', EmigrateError as ErrorConstructor);
errorConstructors.set('ShowUsageError', ShowUsageError as ErrorConstructor);
errorConstructors.set('MissingOptionError', MissingOptionError as unknown as ErrorConstructor);
errorConstructors.set('MissingArgumentsError', MissingArgumentsError as unknown as ErrorConstructor);
errorConstructors.set('OptionNeededError', OptionNeededError as unknown as ErrorConstructor);
errorConstructors.set('BadOptionError', BadOptionError as unknown as ErrorConstructor);
errorConstructors.set('UnexpectedError', UnexpectedError as ErrorConstructor);
errorConstructors.set('MigrationHistoryError', MigrationHistoryError as unknown as ErrorConstructor);
errorConstructors.set('MigrationLoadError', MigrationLoadError as unknown as ErrorConstructor);
errorConstructors.set('MigrationRunError', MigrationRunError as unknown as ErrorConstructor);
errorConstructors.set('MigrationNotRunError', MigrationNotRunError as unknown as ErrorConstructor);
errorConstructors.set('MigrationRemovalError', MigrationRemovalError as unknown as ErrorConstructor);
errorConstructors.set('StorageInitError', StorageInitError as unknown as ErrorConstructor);
errorConstructors.set('CommandAbortError', CommandAbortError as unknown as ErrorConstructor);
errorConstructors.set('ExecutionDesertedError', ExecutionDesertedError as unknown as ErrorConstructor);
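
The errorConstructors.set(...) registrations above let serialize-error revive persisted errors as instances of the matching Emigrate error class (which is also why every constructor now accepts an undefined message and defaults its extra fields). A minimal round-trip sketch, assuming only the exports shown in this file; the migration name is made up:

import { deserializeError } from 'serialize-error';
import { MigrationHistoryError, toSerializedError } from './errors.js';

// Serialize the error roughly the way a storage plugin would persist it for a
// failed migration...
const serialized = toSerializedError(
  new MigrationHistoryError('Migration 0001_init.js is in a failed state, it should be fixed and removed'),
);

// ...and revive it later: because MigrationHistoryError is registered in
// errorConstructors, deserializeError reconstructs the original class.
const revived = deserializeError(serialized);
console.log(revived instanceof MigrationHistoryError); // expected: true
console.log((revived as MigrationHistoryError).code); // expected: 'ERR_MIGRATION_HISTORY'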


@ -1,22 +1,85 @@
import { toError } from './errors.js';
import { setTimeout } from 'node:timers';
import prettyMs from 'pretty-ms';
import { ExecutionDesertedError, toError } from './errors.js';
import { DEFAULT_RESPITE_SECONDS } from './defaults.js';
type Fn<Args extends any[], Result> = (...args: Args) => Result;
type Result<T> = [value: T, error: undefined] | [value: undefined, error: Error];
type ExecOptions = {
abortSignal?: AbortSignal;
abortRespite?: number;
};
/**
* Execute a function and return a result tuple
*
* This is a helper function to make it easier to handle errors without the extra nesting of try/catch.
* If an abort signal is provided, the returned tuple will contain an ExecutionDesertedError when the signal is aborted
* and the given function has not yet settled within the given respite time (DEFAULT_RESPITE_SECONDS, i.e. 10 seconds, by default)
*
* @param fn The function to execute
* @param options Options for the execution
*/
export const exec = async <Args extends any[], Return extends Promise<any>>(
fn: Fn<Args, Return>,
...args: Args
export const exec = async <Return extends Promise<any>>(
fn: () => Return,
options: ExecOptions = {},
): Promise<Result<Awaited<Return>>> => {
try {
const result = await fn(...args);
const aborter = options.abortSignal ? getAborter(options.abortSignal, options.abortRespite) : undefined;
const result = await Promise.race(aborter ? [aborter, fn()] : [fn()]);
aborter?.cancel();
return [result, undefined];
} catch (error) {
return [undefined, toError(error)];
}
};
/**
* Returns a promise that rejects a given amount of time after the given signal is aborted
*
* @param signal The abort signal to listen to
* @param respite The time in milliseconds to wait before rejecting
*/
const getAborter = (
signal: AbortSignal,
respite = DEFAULT_RESPITE_SECONDS * 1000,
): PromiseLike<never> & { cancel: () => void } => {
const cleanups: Array<() => void> = [];
const aborter = new Promise<never>((_, reject) => {
const abortListener = () => {
const timer = setTimeout(
reject,
respite,
ExecutionDesertedError.fromReason(`Deserted after ${prettyMs(respite)}`, toError(signal.reason)),
);
timer.unref();
cleanups.push(() => {
clearTimeout(timer);
});
};
if (signal.aborted) {
abortListener();
return;
}
signal.addEventListener('abort', abortListener, { once: true });
cleanups.push(() => {
signal.removeEventListener('abort', abortListener);
});
});
const cancel = () => {
for (const cleanup of cleanups) {
cleanup();
}
cleanups.length = 0;
};
return Object.assign(aborter, { cancel });
};
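
A minimal usage sketch of the result-tuple and abort-signal behaviour above; the SIGINT wiring, the timings, and the slow step are illustrative assumptions, not taken from the CLI:

import process from 'node:process';
import { setTimeout as sleep } from 'node:timers/promises';
import { exec } from './exec.js';

const controller = new AbortController();

// Abort on Ctrl+C; if the work below has not settled within the respite time,
// exec() resolves with an ExecutionDesertedError in the error slot of the tuple.
process.once('SIGINT', () => {
  controller.abort(new Error('SIGINT received'));
});

const [result, error] = await exec(
  async () => {
    await sleep(5000); // stand-in for a slow migration step
    return 'done';
  },
  { abortSignal: controller.signal, abortRespite: 2000 }, // respite in milliseconds, as in getAborter above
);

if (error) {
  console.error('Deserted or failed:', error.message);
} else {
  console.log('Result:', result);
}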


@ -1,13 +0,0 @@
export function filterAsync<T, S extends T>(
iterable: AsyncIterable<T>,
filter: (item: T) => item is S,
): AsyncIterable<S>;
export function filterAsync<T>(iterable: AsyncIterable<T>, filter: (item: T) => unknown): AsyncIterable<T>;
export async function* filterAsync<T>(iterable: AsyncIterable<T>, filter: (item: T) => unknown): AsyncIterable<T> {
for await (const item of iterable) {
if (filter(item)) {
yield item;
}
}
}


@ -1,11 +1,28 @@
import { cosmiconfig } from 'cosmiconfig';
import process from 'node:process';
import { cosmiconfig, defaultLoaders } from 'cosmiconfig';
import { type Config, type EmigrateConfig } from './types.js';
const commands = ['up', 'list', 'new', 'remove'] as const;
type Command = (typeof commands)[number];
const canImportTypeScriptAsIs = Boolean(process.isBun) || typeof Deno !== 'undefined';
export const getConfig = async (command: Command): Promise<Config> => {
const explorer = cosmiconfig('emigrate');
const getEmigrateConfig = (config: any): EmigrateConfig => {
if ('default' in config && typeof config.default === 'object' && config.default !== null) {
return config.default as EmigrateConfig;
}
if (typeof config === 'object' && config !== null) {
return config as EmigrateConfig;
}
return {};
};
export const getConfig = async (command: Command, forceImportTypeScriptAsIs = false): Promise<Config> => {
const explorer = cosmiconfig('emigrate', {
// eslint-disable-next-line @typescript-eslint/naming-convention
loaders: forceImportTypeScriptAsIs || canImportTypeScriptAsIs ? { '.ts': defaultLoaders['.js'] } : undefined,
});
const result = await explorer.search();
@ -13,7 +30,7 @@ export const getConfig = async (command: Command): Promise<Config> => {
return {};
}
const config = result.config as EmigrateConfig;
const config = getEmigrateConfig(result.config);
const commandConfig = config[command];
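
With the loaders option above, a TypeScript config file is imported as-is under Bun or Deno (or when forceImportTypeScriptAsIs is set), and getEmigrateConfig unwraps its default export. A hypothetical emigrate.config.ts; the shorthand values ('mysql', 'pino') and the per-command block are illustrative assumptions, not taken from this diff:

// emigrate.config.ts (hypothetical example)
const config = {
  directory: 'migrations',
  storage: 'mysql', // assumed shorthand, resolved by getOrLoadStorage
  up: {
    reporter: 'pino', // assumed shorthand; getConfig merges config[command] on top of the shared options
  },
};

export default config;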


@ -1,6 +1,6 @@
import process from 'node:process';
export const getDuration = (start: [number, number]) => {
export const getDuration = (start: [number, number]): number => {
const [seconds, nanoseconds] = process.hrtime(start);
return seconds * 1000 + nanoseconds / 1_000_000;
};


@ -1,6 +1,6 @@
import path from 'node:path';
import fs from 'node:fs/promises';
import { type MigrationMetadata } from '@emigrate/plugin-tools/types';
import { type MigrationMetadata } from '@emigrate/types';
import { withLeadingPeriod } from './with-leading-period.js';
import { OptionNeededError } from './errors.js';
@ -12,7 +12,7 @@ const checkMigrationFile = async (name: string, filePath: string) => {
throw new Error('Not a file');
}
} catch {
throw new OptionNeededError(
throw OptionNeededError.fromOption(
'force',
`The given migration name "${name}" does not exist or is not a file. Use the "force" option to ignore this error`,
);


@ -0,0 +1,190 @@
import fs from 'node:fs/promises';
import { afterEach, beforeEach, describe, it, mock } from 'node:test';
import assert from 'node:assert';
import { getMigrations } from './get-migrations.js';
const originalOpendir = fs.opendir;
const opendirMock = mock.fn(originalOpendir);
describe('get-migrations', () => {
beforeEach(() => {
fs.opendir = opendirMock;
});
afterEach(() => {
opendirMock.mock.restore();
fs.opendir = originalOpendir;
});
it('should skip files with leading periods', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: '.foo.js', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should skip files with leading underscores', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: '_foo.js', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should skip files without file extensions', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: 'foo', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should skip non-files', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: 'foo.js', isFile: () => false },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
it('should sort them in lexicographical order', async () => {
opendirMock.mock.mockImplementation(async function* () {
yield* [
{ name: 'foo.js', isFile: () => true },
{ name: 'bar_data.js', isFile: () => true },
{ name: 'bar.js', isFile: () => true },
{ name: 'baz.js', isFile: () => true },
];
});
const migrations = await getMigrations('/cwd/', 'directory');
assert.deepStrictEqual(migrations, [
{
name: 'bar.js',
filePath: '/cwd/directory/bar.js',
relativeFilePath: 'directory/bar.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'bar_data.js',
filePath: '/cwd/directory/bar_data.js',
relativeFilePath: 'directory/bar_data.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'baz.js',
filePath: '/cwd/directory/baz.js',
relativeFilePath: 'directory/baz.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
{
name: 'foo.js',
filePath: '/cwd/directory/foo.js',
relativeFilePath: 'directory/foo.js',
extension: '.js',
directory: 'directory',
cwd: '/cwd/',
},
]);
});
});


@ -1,20 +1,36 @@
import path from 'node:path';
import fs from 'node:fs/promises';
import { type MigrationMetadata } from '@emigrate/plugin-tools/types';
import { type MigrationMetadata } from '@emigrate/types';
import { withLeadingPeriod } from './with-leading-period.js';
import { BadOptionError } from './errors.js';
import { arrayFromAsync } from './array-from-async.js';
export type GetMigrationsFunction = typeof getMigrations;
export const getMigrations = async (cwd: string, directory: string): Promise<MigrationMetadata[]> => {
const allFilesInMigrationDirectory = await fs.readdir(path.resolve(cwd, directory), {
withFileTypes: true,
});
async function* tryReadDirectory(directoryPath: string): AsyncIterable<string> {
try {
for await (const entry of await fs.opendir(directoryPath)) {
if (
entry.isFile() &&
!entry.name.startsWith('.') &&
!entry.name.startsWith('_') &&
path.extname(entry.name) !== ''
) {
yield entry.name;
}
}
} catch {
throw BadOptionError.fromOption('directory', `Couldn't read directory: ${directoryPath}`);
}
}
const migrationFiles: MigrationMetadata[] = allFilesInMigrationDirectory
.filter((file) => file.isFile() && !file.name.startsWith('.') && !file.name.startsWith('_'))
.sort((a, b) => a.name.localeCompare(b.name))
.map(({ name }) => {
const filePath = path.resolve(cwd, directory, name);
export const getMigrations = async (cwd: string, directory: string): Promise<MigrationMetadata[]> => {
const directoryPath = path.resolve(cwd, directory);
const allFilesInMigrationDirectory = await arrayFromAsync(tryReadDirectory(directoryPath));
return allFilesInMigrationDirectory.sort().map((name) => {
const filePath = path.join(directoryPath, name);
return {
name,
@ -25,6 +41,4 @@ export const getMigrations = async (cwd: string, directory: string): Promise<Mig
cwd,
};
});
return migrationFiles;
};


@ -0,0 +1,34 @@
import fs from 'node:fs/promises';
import { fileURLToPath } from 'node:url';
import { UnexpectedError } from './errors.js';
type PackageInfo = {
version: string;
};
const getPackageInfo = async () => {
const packageInfoPath = fileURLToPath(new URL('../package.json', import.meta.url));
try {
const content = await fs.readFile(packageInfoPath, 'utf8');
const packageJson: unknown = JSON.parse(content);
if (
typeof packageJson === 'object' &&
packageJson &&
'version' in packageJson &&
typeof packageJson.version === 'string'
) {
return packageJson as PackageInfo;
}
} catch {
// ignore
}
throw new UnexpectedError(`Could not read package info from: ${packageInfoPath}`);
};
const packageInfo = await getPackageInfo();
// eslint-disable-next-line prefer-destructuring
export const version: string = packageInfo.version;


@ -1,5 +1,5 @@
export * from './types.js';
export const emigrate = () => {
export const emigrate = (): void => {
// console.log('Done!');
};


@ -1,62 +1,129 @@
import process from 'node:process';
import { hrtime } from 'node:process';
import {
isFinishedMigration,
isFailedMigration,
type EmigrateReporter,
type MigrationMetadata,
type MigrationMetadataFinished,
type Storage,
} from '@emigrate/plugin-tools/types';
import { toError, EmigrateError, MigrationRunError } from './errors.js';
type FailedMigrationMetadata,
type SuccessfulMigrationMetadata,
} from '@emigrate/types';
import { toError, EmigrateError, MigrationRunError, BadOptionError } from './errors.js';
import { exec } from './exec.js';
import { getDuration } from './get-duration.js';
type MigrationRunnerParameters = {
type MigrationRunnerParameters<T extends MigrationMetadata | MigrationMetadataFinished> = {
dry: boolean;
lock?: boolean;
limit?: number;
name?: string;
from?: string;
to?: string;
abortSignal?: AbortSignal;
abortRespite?: number;
reporter: EmigrateReporter;
storage: Storage;
migrations: Array<MigrationMetadata | MigrationMetadataFinished>;
validate: (migration: MigrationMetadata) => Promise<void>;
execute: (migration: MigrationMetadata) => Promise<void>;
migrations: AsyncIterable<T>;
migrationFilter?: (migration: T) => boolean;
validate: (migration: T) => Promise<void>;
execute: (migration: T) => Promise<void>;
onSuccess: (migration: SuccessfulMigrationMetadata) => Promise<void>;
onError: (migration: FailedMigrationMetadata, error: Error) => Promise<void>;
};
export const migrationRunner = async ({
export const migrationRunner = async <T extends MigrationMetadata | MigrationMetadataFinished>({
dry,
lock = true,
limit,
name,
from,
to,
abortSignal,
abortRespite,
reporter,
storage,
migrations,
validate,
execute,
}: MigrationRunnerParameters): Promise<Error | undefined> => {
await reporter.onCollectedMigrations?.(migrations);
const finishedMigrations: MigrationMetadataFinished[] = [];
const migrationsToRun: MigrationMetadata[] = [];
onSuccess,
onError,
migrationFilter = () => true,
}: MigrationRunnerParameters<T>): Promise<Error | undefined> => {
const validatedMigrations: Array<MigrationMetadata | MigrationMetadataFinished> = [];
const migrationsToLock: MigrationMetadata[] = [];
let skip = false;
abortSignal?.addEventListener(
'abort',
() => {
skip = true;
reporter.onAbort?.(toError(abortSignal.reason))?.then(
() => {
/* noop */
},
() => {
/* noop */
},
);
},
{ once: true },
);
let nameFound = false;
let fromFound = false;
let toFound = false;
for await (const migration of migrations) {
if (name && migration.relativeFilePath === name) {
nameFound = true;
}
if (from && migration.relativeFilePath === from) {
fromFound = true;
}
if (to && migration.relativeFilePath === to) {
toFound = true;
}
if (!migrationFilter(migration)) {
continue;
}
if (isFinishedMigration(migration)) {
skip ||= migration.status === 'failed' || migration.status === 'skipped';
finishedMigrations.push(migration);
} else if (skip) {
finishedMigrations.push({
validatedMigrations.push(migration);
} else if (
skip ||
Boolean(from && migration.relativeFilePath < from) ||
Boolean(to && migration.relativeFilePath > to) ||
(limit && migrationsToLock.length >= limit)
) {
validatedMigrations.push({
...migration,
status: dry ? 'pending' : 'skipped',
duration: 0,
status: 'skipped',
});
} else {
try {
await validate(migration);
migrationsToRun.push(migration);
migrationsToLock.push(migration);
validatedMigrations.push(migration);
} catch (error) {
for await (const migration of migrationsToRun) {
finishedMigrations.push({ ...migration, status: 'skipped', duration: 0 });
for (const migration of migrationsToLock) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
migrationsToRun.length = 0;
migrationsToLock.length = 0;
finishedMigrations.push({
validatedMigrations.push({
...migration,
status: 'failed',
duration: 0,
@ -68,50 +135,103 @@ export const migrationRunner = async ({
}
}
const [lockedMigrations, lockError] = dry ? [migrationsToRun] : await exec(async () => storage.lock(migrationsToRun));
await reporter.onCollectedMigrations?.(validatedMigrations);
if (lockError) {
for await (const migration of migrationsToRun) {
finishedMigrations.push({ ...migration, duration: 0, status: 'skipped' });
let optionError: Error | undefined;
if (name && !nameFound) {
optionError = BadOptionError.fromOption('name', `The migration: "${name}" was not found`);
} else if (from && !fromFound) {
optionError = BadOptionError.fromOption('from', `The "from" migration: "${from}" was not found`);
} else if (to && !toFound) {
optionError = BadOptionError.fromOption('to', `The "to" migration: "${to}" was not found`);
}
migrationsToRun.length = 0;
if (optionError) {
dry = true;
skip = true;
for (const migration of migrationsToLock) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
migrationsToLock.length = 0;
}
const [lockedMigrations, lockError] =
dry || !lock
? [migrationsToLock]
: await exec(async () => storage.lock(migrationsToLock), { abortSignal, abortRespite });
if (lockError) {
for (const migration of migrationsToLock) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
migrationsToLock.length = 0;
skip = true;
} else {
} else if (lock) {
for (const migration of migrationsToLock) {
const isLocked = lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name);
if (!isLocked) {
const validatedIndex = validatedMigrations.indexOf(migration);
validatedMigrations[validatedIndex] = {
...migration,
status: 'skipped',
};
}
}
await reporter.onLockedMigrations?.(lockedMigrations);
}
for await (const finishedMigration of finishedMigrations) {
switch (finishedMigration.status) {
const finishedMigrations: MigrationMetadataFinished[] = [];
for await (const migration of validatedMigrations) {
if (isFinishedMigration(migration)) {
switch (migration.status) {
case 'failed': {
await reporter.onMigrationError?.(finishedMigration, finishedMigration.error!);
await reporter.onMigrationError?.(migration, migration.error);
break;
}
case 'pending': {
await reporter.onMigrationSkip?.(finishedMigration);
await reporter.onMigrationSkip?.(migration);
break;
}
case 'skipped': {
await reporter.onMigrationSkip?.(finishedMigration);
await reporter.onMigrationSkip?.(migration);
break;
}
default: {
await reporter.onMigrationSuccess?.(finishedMigration);
await reporter.onMigrationSuccess?.(migration);
break;
}
}
finishedMigrations.push(migration);
continue;
}
for await (const migration of lockedMigrations ?? []) {
if (dry || skip) {
const finishedMigration: MigrationMetadataFinished = {
...migration,
status: dry ? 'pending' : 'skipped',
duration: 0,
};
await reporter.onMigrationSkip?.(finishedMigration);
@ -122,44 +242,54 @@ export const migrationRunner = async ({
await reporter.onMigrationStart?.(migration);
const start = process.hrtime();
const start = hrtime();
const [, migrationError] = await exec(async () => execute(migration));
const [, migrationError] = await exec(async () => execute(migration as T), { abortSignal, abortRespite });
const duration = getDuration(start);
const finishedMigration: MigrationMetadataFinished = {
if (migrationError) {
const finishedMigration: FailedMigrationMetadata = {
...migration,
status: migrationError ? 'failed' : 'done',
status: 'failed',
duration,
error: migrationError,
};
finishedMigrations.push(finishedMigration);
if (migrationError) {
await storage.onError(finishedMigration, migrationError);
await onError(finishedMigration, migrationError);
await reporter.onMigrationError?.(finishedMigration, migrationError);
finishedMigrations.push(finishedMigration);
skip = true;
} else {
await storage.onSuccess(finishedMigration);
const finishedMigration: SuccessfulMigrationMetadata = {
...migration,
status: 'done',
duration,
};
await onSuccess(finishedMigration);
await reporter.onMigrationSuccess?.(finishedMigration);
finishedMigrations.push(finishedMigration);
}
}
const [, unlockError] = dry ? [] : await exec(async () => storage.unlock(lockedMigrations ?? []));
const [, unlockError] =
dry || !lock ? [] : await exec(async () => storage.unlock(lockedMigrations ?? []), { abortSignal, abortRespite });
const firstFailed = finishedMigrations.find((migration) => migration.status === 'failed');
// eslint-disable-next-line unicorn/no-array-callback-reference
const firstFailed = finishedMigrations.find(isFailedMigration);
const firstError =
firstFailed?.error instanceof EmigrateError
? firstFailed.error
: firstFailed
? new MigrationRunError(`Failed to run migration: ${firstFailed.relativeFilePath}`, firstFailed, {
cause: firstFailed?.error,
})
? MigrationRunError.fromMetadata(firstFailed)
: undefined;
const error = unlockError ?? firstError ?? lockError;
const error =
optionError ??
unlockError ??
firstError ??
lockError ??
(abortSignal?.aborted ? toError(abortSignal.reason) : undefined);
await reporter.onFinished?.(finishedMigrations, error);
await storage.end();
return error;
};

View file

@ -1,5 +1,5 @@
import { promisify } from 'node:util';
import { type LoaderPlugin } from '@emigrate/plugin-tools/types';
import { type LoaderPlugin } from '@emigrate/types';
// eslint-disable-next-line @typescript-eslint/ban-types
const promisifyIfNeeded = <T extends Function>(fn: T) => {
@ -17,7 +17,7 @@ const promisifyIfNeeded = <T extends Function>(fn: T) => {
};
const loaderJs: LoaderPlugin = {
loadableExtensions: ['.js', '.cjs', '.mjs'],
loadableExtensions: ['.js', '.cjs', '.mjs', '.ts', '.cts', '.mts'],
async loadMigration(migration) {
const migrationModule: unknown = await import(migration.filePath);

View file

@ -1,5 +1,5 @@
import path from 'node:path';
import { black, blueBright, bold, cyan, dim, faint, gray, green, red, redBright, yellow } from 'ansis';
import { setInterval } from 'node:timers';
import { black, blueBright, bold, cyan, dim, faint, gray, green, red, redBright, yellow, yellowBright } from 'ansis';
import logUpdate from 'log-update';
import elegantSpinner from 'elegant-spinner';
import figures from 'figures';
@ -11,10 +11,10 @@ import {
type EmigrateReporter,
type ReporterInitParameters,
type Awaitable,
} from '@emigrate/plugin-tools/types';
import { EmigrateError } from '../errors.js';
} from '@emigrate/types';
type Status = ReturnType<typeof getMigrationStatus>;
type Command = ReporterInitParameters['command'];
const interactive = isInteractive();
const spinner = interactive ? elegantSpinner() : () => figures.pointerSmall;
@ -22,21 +22,26 @@ const spinner = interactive ? elegantSpinner() : () => figures.pointerSmall;
const formatDuration = (duration: number): string => {
const pretty = prettyMs(duration);
return yellow(pretty.replaceAll(/([^\s\d]+)/g, dim('$1')));
return yellow(pretty.replaceAll(/([^\s\d.]+)/g, dim('$1')));
};
const getTitle = ({ command, directory, dry, cwd }: ReporterInitParameters) => {
return `${black.bgBlueBright(' Emigrate ').trim()} ${blueBright.bold(command)} ${gray(cwd + path.sep)}${directory}${
const getTitle = ({ command, version, dry, cwd }: ReporterInitParameters) => {
return `${black.bgBlueBright` Emigrate `.trim()} ${blueBright.bold(command)} ${blueBright`v${version}`} ${gray(cwd)}${
dry ? yellow` (dry run)` : ''
}`;
};
const getMigrationStatus = (
command: Command,
migration: MigrationMetadata | MigrationMetadataFinished,
activeMigration?: MigrationMetadata,
) => {
if ('status' in migration) {
return migration.status;
return command === 'remove' && migration.status === 'done' ? 'removed' : migration.status;
}
if (command === 'remove' && migration.name === activeMigration?.name) {
return 'removing';
}
return migration.name === activeMigration?.name ? 'running' : 'pending';
@ -44,6 +49,10 @@ const getMigrationStatus = (
const getIcon = (status: Status) => {
switch (status) {
case 'removing': {
return cyan(spinner());
}
case 'running': {
return cyan(spinner());
}
@ -52,6 +61,10 @@ const getIcon = (status: Status) => {
return gray(figures.pointerSmall);
}
case 'removed': {
return green(figures.tick);
}
case 'done': {
return green(figures.tick);
}
@ -91,19 +104,19 @@ const getName = (name: string, status?: Status) => {
};
const getMigrationText = (
command: Command,
migration: MigrationMetadata | MigrationMetadataFinished,
activeMigration?: MigrationMetadata,
) => {
const pathWithoutName = migration.relativeFilePath.slice(0, -migration.name.length);
const nameWithoutExtension = migration.name.slice(0, -migration.extension.length);
const status = getMigrationStatus(migration, activeMigration);
const status = getMigrationStatus(command, migration, activeMigration);
const parts = [' ', getIcon(status)];
parts.push(`${getName(nameWithoutExtension, status)}${dim(migration.extension)}`);
parts.push(`${dim(pathWithoutName)}${getName(nameWithoutExtension, status)}${dim(migration.extension)}`);
if ('status' in migration) {
parts.push(gray(`(${migration.status})`));
} else if (migration.name === activeMigration?.name) {
parts.push(gray`(running)`);
if ('status' in migration || migration.name === activeMigration?.name) {
parts.push(gray`(${status})`);
}
if ('duration' in migration && migration.duration) {
@ -147,15 +160,14 @@ const getError = (error?: ErrorLike, indent = ' ') => {
others[property] = error[property as keyof ErrorLike];
}
const codeString = typeof others['code'] === 'string' ? others['code'] : undefined;
const codeString =
typeof others['code'] === 'string' || typeof others['code'] === 'number' ? others['code'] : undefined;
const code = codeString ? ` [${codeString}]` : '';
const errorTitle = error.name
? `${error.name}${codeString && !error.name.includes(codeString) ? code : ''}: ${error.message}`
: error.message;
const errorTitle = error.name ? `${error.name}${code}: ${error.message}` : error.message;
const parts = [`${indent}${bold.red(errorTitle)}`, ...stack.map((line) => `${indent} ${dim(line.trim())}`)];
if (properties.length > 0 && !(error instanceof EmigrateError)) {
if (properties.length > 0) {
parts.push(`${indent} ${JSON.stringify(others, undefined, 2).split('\n').join(`\n${indent} `)}`);
}
@ -167,6 +179,20 @@ const getError = (error?: ErrorLike, indent = ' ') => {
return parts.join('\n');
};
const getAbortMessage = (reason?: Error) => {
if (!reason) {
return '';
}
const parts = [` ${red.bold(reason.message)}`];
if (isErrorLike(reason.cause)) {
parts.push(getError(reason.cause, ' '));
}
return parts.join('\n');
};
const getSummary = (
command: ReporterInitParameters['command'],
migrations: Array<MigrationMetadata | MigrationMetadataFinished> = [],
@ -234,26 +260,39 @@ const getHeaderMessage = (
}
if (migrations.length === 0) {
return ' No pending migrations found';
return ' No migrations found';
}
const statusText = command === 'list' ? 'migrations are pending' : 'pending migrations to run';
if (migrations.length === lockedMigrations.length) {
return ` ${bold(migrations.length.toString())} ${dim('pending migrations to run')}`;
return ` ${bold(migrations.length.toString())} ${dim(statusText)}`;
}
const nonLockedMigrations = migrations.filter(
(migration) => !lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name),
);
const failedMigrations = nonLockedMigrations.filter(
(migration) => 'status' in migration && migration.status === 'failed',
);
const unlockableCount = command === 'up' ? nonLockedMigrations.length - failedMigrations.length : 0;
let skippedCount = 0;
let failedCount = 0;
for (const migration of migrations) {
const isLocked = lockedMigrations.some((lockedMigration) => lockedMigration.name === migration.name);
if (isLocked) {
continue;
}
if ('status' in migration) {
if (migration.status === 'failed') {
failedCount += 1;
} else if (migration.status === 'skipped') {
skippedCount += 1;
}
}
}
const parts = [
bold(`${lockedMigrations.length} of ${migrations.length}`),
dim`pending migrations to run`,
unlockableCount > 0 ? yellow(`(${unlockableCount} locked)`) : '',
failedMigrations.length > 0 ? redBright(`(${failedMigrations.length} failed)`) : '',
dim(statusText),
skippedCount > 0 ? yellowBright(`(${skippedCount} skipped)`) : '',
failedCount > 0 ? redBright(`(${failedCount} failed)`) : '',
].filter(Boolean);
return ` ${parts.join(' ')}`;
@ -266,6 +305,7 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
#error: Error | undefined;
#parameters!: ReporterInitParameters;
#interval: NodeJS.Timeout | undefined;
#abortReason: Error | undefined;
onInit(parameters: ReporterInitParameters): void | PromiseLike<void> {
this.#parameters = parameters;
@ -273,6 +313,10 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
this.#start();
}
onAbort(reason: Error): void | PromiseLike<void> {
this.#abortReason = reason;
}
onCollectedMigrations(migrations: MigrationMetadata[]): void | PromiseLike<void> {
this.#migrations = migrations;
}
@ -285,19 +329,6 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
this.#migrations = [migration];
}
onMigrationRemoveStart(migration: MigrationMetadata): Awaitable<void> {
this.#migrations = [migration];
this.#activeMigration = migration;
}
onMigrationRemoveSuccess(migration: MigrationMetadataFinished): Awaitable<void> {
this.#finishMigration(migration);
}
onMigrationRemoveError(migration: MigrationMetadataFinished, _error: Error): Awaitable<void> {
this.#finishMigration(migration);
}
onMigrationStart(migration: MigrationMetadata): void | PromiseLike<void> {
this.#activeMigration = migration;
}
@ -342,7 +373,10 @@ class DefaultFancyReporter implements Required<EmigrateReporter> {
const parts = [
getTitle(this.#parameters),
getHeaderMessage(this.#parameters.command, this.#migrations, this.#lockedMigrations),
this.#migrations?.map((migration) => getMigrationText(migration, this.#activeMigration)).join('\n') ?? '',
this.#migrations
?.map((migration) => getMigrationText(this.#parameters.command, migration, this.#activeMigration))
.join('\n') ?? '',
getAbortMessage(this.#abortReason),
getSummary(this.#parameters.command, this.#migrations),
getError(this.#error),
];
@ -388,6 +422,12 @@ class DefaultReporter implements Required<EmigrateReporter> {
console.log('');
}
onAbort(reason: Error): void | PromiseLike<void> {
console.log('');
console.error(getAbortMessage(reason));
console.log('');
}
onCollectedMigrations(migrations: MigrationMetadata[]): void | PromiseLike<void> {
this.#migrations = migrations;
}
@ -400,35 +440,23 @@ class DefaultReporter implements Required<EmigrateReporter> {
}
onNewMigration(migration: MigrationMetadata, _content: string): Awaitable<void> {
console.log(getMigrationText(migration));
}
onMigrationRemoveStart(migration: MigrationMetadata): Awaitable<void> {
console.log(getMigrationText(migration));
}
onMigrationRemoveSuccess(migration: MigrationMetadataFinished): Awaitable<void> {
console.log(getMigrationText(migration));
}
onMigrationRemoveError(migration: MigrationMetadataFinished, _error: Error): Awaitable<void> {
console.error(getMigrationText(migration));
console.log(getMigrationText(this.#parameters.command, migration));
}
onMigrationStart(migration: MigrationMetadata): void | PromiseLike<void> {
console.log(getMigrationText(migration, migration));
console.log(getMigrationText(this.#parameters.command, migration, migration));
}
onMigrationSuccess(migration: MigrationMetadataFinished): void | PromiseLike<void> {
console.log(getMigrationText(migration));
console.log(getMigrationText(this.#parameters.command, migration));
}
onMigrationError(migration: MigrationMetadataFinished, _error: Error): void | PromiseLike<void> {
console.error(getMigrationText(migration));
console.error(getMigrationText(this.#parameters.command, migration));
}
onMigrationSkip(migration: MigrationMetadataFinished): void | PromiseLike<void> {
console.log(getMigrationText(migration));
console.log(getMigrationText(this.#parameters.command, migration));
}
onFinished(migrations: MigrationMetadataFinished[], error?: Error | undefined): void | PromiseLike<void> {
@ -443,6 +471,6 @@ class DefaultReporter implements Required<EmigrateReporter> {
}
}
const reporterDefault = interactive ? new DefaultFancyReporter() : new DefaultReporter();
const reporterDefault: EmigrateReporter = interactive ? new DefaultFancyReporter() : new DefaultReporter();
export default reporterDefault;

View file

@ -0,0 +1,15 @@
import type { EmigrateReporter } from '@emigrate/types';
import { type Config } from '../types.js';
import * as reporters from './index.js';
export const getStandardReporter = (reporter?: Config['reporter']): EmigrateReporter | undefined => {
if (!reporter) {
return reporters.pretty;
}
if (typeof reporter === 'string' && reporter in reporters) {
return reporters[reporter as keyof typeof reporters];
}
return undefined;
};

View file

@ -0,0 +1,2 @@
export { default as pretty } from './default.js';
export { default as json } from './json.js';

View file

@ -0,0 +1,60 @@
import { type ReporterInitParameters, type EmigrateReporter, type MigrationMetadataFinished } from '@emigrate/types';
import { toSerializedError } from '../errors.js';
class JsonReporter implements EmigrateReporter {
#parameters!: ReporterInitParameters;
#startTime!: number;
onInit(parameters: ReporterInitParameters): void {
this.#startTime = Date.now();
this.#parameters = parameters;
}
onFinished(migrations: MigrationMetadataFinished[], error?: Error | undefined): void {
const { command, version } = this.#parameters;
let numberDoneMigrations = 0;
let numberSkippedMigrations = 0;
let numberFailedMigrations = 0;
let numberPendingMigrations = 0;
for (const migration of migrations) {
// eslint-disable-next-line unicorn/prefer-switch
if (migration.status === 'done') {
numberDoneMigrations++;
} else if (migration.status === 'skipped') {
numberSkippedMigrations++;
} else if (migration.status === 'failed') {
numberFailedMigrations++;
} else {
numberPendingMigrations++;
}
}
const result = {
command,
version,
numberTotalMigrations: migrations.length,
numberDoneMigrations,
numberSkippedMigrations,
numberFailedMigrations,
numberPendingMigrations,
success: !error,
startTime: this.#startTime,
endTime: Date.now(),
error: error ? toSerializedError(error) : undefined,
migrations: migrations.map((migration) => ({
name: migration.filePath,
status: migration.status,
duration: 'duration' in migration ? migration.duration : 0,
error: 'error' in migration ? toSerializedError(migration.error) : undefined,
})),
};
console.log(JSON.stringify(result, undefined, 2));
}
}
const jsonReporter: EmigrateReporter = new JsonReporter();
export default jsonReporter;

View file

@ -0,0 +1,134 @@
import { mock, type Mock } from 'node:test';
import path from 'node:path';
import assert from 'node:assert';
import {
type SerializedError,
type EmigrateReporter,
type FailedMigrationHistoryEntry,
type MigrationHistoryEntry,
type MigrationMetadata,
type NonFailedMigrationHistoryEntry,
type Storage,
} from '@emigrate/types';
import { toSerializedError } from './errors.js';
export type Mocked<T> = {
// @ts-expect-error - This is a mock
[K in keyof T]: Mock<T[K]>;
};
export async function noop(): Promise<void> {
// noop
}
export function getErrorCause(error: Error | undefined): Error | SerializedError | undefined {
if (error?.cause instanceof Error) {
return error.cause;
}
if (typeof error?.cause === 'object' && error.cause !== null) {
return error.cause as unknown as SerializedError;
}
return undefined;
}
export function getMockedStorage(historyEntries: Array<string | MigrationHistoryEntry>): Mocked<Storage> {
return {
lock: mock.fn(async (migrations) => migrations),
unlock: mock.fn(async () => {
// void
}),
getHistory: mock.fn(async function* () {
yield* toEntries(historyEntries);
}),
remove: mock.fn(),
onSuccess: mock.fn(),
onError: mock.fn(),
end: mock.fn(),
};
}
export function getMockedReporter(): Mocked<Required<EmigrateReporter>> {
return {
onFinished: mock.fn(noop),
onInit: mock.fn(noop),
onAbort: mock.fn(noop),
onCollectedMigrations: mock.fn(noop),
onLockedMigrations: mock.fn(noop),
onNewMigration: mock.fn(noop),
onMigrationStart: mock.fn(noop),
onMigrationSuccess: mock.fn(noop),
onMigrationError: mock.fn(noop),
onMigrationSkip: mock.fn(noop),
};
}
export function toMigration(cwd: string, directory: string, name: string): MigrationMetadata {
return {
name,
filePath: `${cwd}/${directory}/${name}`,
relativeFilePath: `${directory}/${name}`,
extension: path.extname(name),
directory,
cwd,
};
}
export function toMigrations(cwd: string, directory: string, names: string[]): MigrationMetadata[] {
return names.map((name) => toMigration(cwd, directory, name));
}
export function toEntry(name: MigrationHistoryEntry): MigrationHistoryEntry;
export function toEntry<S extends MigrationHistoryEntry['status']>(
name: string,
status?: S,
): S extends 'failed' ? FailedMigrationHistoryEntry : NonFailedMigrationHistoryEntry;
export function toEntry(name: string | MigrationHistoryEntry, status?: 'done' | 'failed'): MigrationHistoryEntry {
if (typeof name !== 'string') {
return name.status === 'failed' ? name : name;
}
if (status === 'failed') {
return {
name,
status,
date: new Date(),
error: { name: 'Error', message: 'Failed' },
};
}
return {
name,
status: status ?? 'done',
date: new Date(),
};
}
export function toEntries(
names: Array<string | MigrationHistoryEntry>,
status?: MigrationHistoryEntry['status'],
): MigrationHistoryEntry[] {
return names.map((name) => (typeof name === 'string' ? toEntry(name, status) : name));
}
export function assertErrorEqualEnough(actual?: Error | SerializedError, expected?: Error, message?: string): void {
if (expected === undefined) {
assert.strictEqual(actual, undefined);
return;
}
const {
cause: actualCause,
stack: actualStack,
...actualError
} = actual instanceof Error ? toSerializedError(actual) : actual ?? {};
const { cause: expectedCause, stack: expectedStack, ...expectedError } = toSerializedError(expected);
// @ts-expect-error Ignore
const { stack: actualCauseStack, ...actualCauseRest } = actualCause ?? {};
// @ts-expect-error Ignore
const { stack: expectedCauseStack, ...expectedCauseRest } = expectedCause ?? {};
assert.deepStrictEqual(actualError, expectedError, message);
assert.deepStrictEqual(actualCauseRest, expectedCauseRest, message ? `${message} (cause)` : undefined);
}

View file

@ -1,5 +1,5 @@
import path from 'node:path';
import { type MigrationHistoryEntry, type MigrationMetadataFinished } from '@emigrate/plugin-tools/types';
import { type MigrationHistoryEntry, type MigrationMetadataFinished } from '@emigrate/types';
import { withLeadingPeriod } from './with-leading-period.js';
import { MigrationHistoryError } from './errors.js';
@ -8,7 +8,22 @@ export const toMigrationMetadata = (
{ cwd, directory }: { cwd: string; directory: string },
): MigrationMetadataFinished => {
const filePath = path.resolve(cwd, directory, entry.name);
const finishedMigration: MigrationMetadataFinished = {
if (entry.status === 'failed') {
return {
name: entry.name,
status: entry.status,
filePath,
relativeFilePath: path.relative(cwd, filePath),
extension: withLeadingPeriod(path.extname(entry.name)),
directory,
cwd,
duration: 0,
error: MigrationHistoryError.fromHistoryEntry(entry),
};
}
return {
name: entry.name,
status: entry.status,
filePath,
@ -18,13 +33,4 @@ export const toMigrationMetadata = (
cwd,
duration: 0,
};
if (entry.status === 'failed') {
finishedMigration.error = new MigrationHistoryError(
`Migration ${entry.name} is in a failed state, it should be fixed and removed`,
entry,
);
}
return finishedMigration;
};

View file

@ -1,4 +1,7 @@
import { type EmigrateStorage, type Awaitable, type Plugin, type EmigrateReporter } from '@emigrate/plugin-tools/types';
import { type EmigrateStorage, type Awaitable, type Plugin, type EmigrateReporter } from '@emigrate/types';
import type * as reporters from './reporters/index.js';
export type StandardReporter = keyof typeof reporters;
export type EmigratePlugin = Plugin;
@ -6,11 +9,13 @@ type StringOrModule<T> = string | T | (() => Awaitable<T>) | (() => Awaitable<{
export type Config = {
storage?: StringOrModule<EmigrateStorage>;
reporter?: StringOrModule<EmigrateReporter>;
reporter?: StandardReporter | StringOrModule<EmigrateReporter>;
plugins?: Array<StringOrModule<EmigratePlugin>>;
directory?: string;
template?: string;
extension?: string;
color?: boolean;
abortRespite?: number;
};
export type EmigrateConfig = Config & {

View file

@ -1 +1 @@
export const withLeadingPeriod = (string: string) => (string.startsWith('.') ? string : `.${string}`);
export const withLeadingPeriod = (string: string): string => (string.startsWith('.') ? string : `.${string}`);

View file

@ -1,8 +1,3 @@
{
"extends": "@emigrate/tsconfig/build.json",
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
"extends": "@emigrate/tsconfig/build.json"
}

View file

@ -1,5 +1,132 @@
# @emigrate/mysql
## 0.3.3
### Patch Changes
- 26240f4: Make sure multiple running instances of Emigrate using @emigrate/mysql can be initialized concurrently without issues creating the history table (for instance in a Kubernetes environment and/or with a Percona cluster).
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- 26240f4: Either lock all of the migrations to run or none of them, to make sure they run in order when multiple instances of Emigrate run concurrently (for instance in a Kubernetes environment); a rough sketch of the idea follows below
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
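
A minimal sketch of the all-or-nothing locking idea from the entry above, assuming a hypothetical `migration_locks` table and a `mysql2/promise` connection; the plugin's actual implementation may differ:

```ts
import type { ResultSetHeader } from 'mysql2';
import { type Connection } from 'mysql2/promise';

type Migration = { name: string };

// Hypothetical sketch: claim every pending migration inside one transaction and
// give up the whole batch if any single claim fails, so migrations are never
// executed out of order by competing Emigrate instances.
async function lockAllOrNone(connection: Connection, migrations: Migration[]): Promise<Migration[]> {
  if (migrations.length === 0) {
    return [];
  }

  await connection.beginTransaction();

  try {
    for (const migration of migrations) {
      // The "migration_locks" table and its unique "name" column are assumptions.
      const [result] = await connection.execute<ResultSetHeader>(
        'INSERT IGNORE INTO migration_locks (name) VALUES (?)',
        [migration.name],
      );

      if (result.affectedRows !== 1) {
        // Another instance already holds this migration: release every claim
        // made so far and run nothing.
        await connection.rollback();
        return [];
      }
    }

    await connection.commit();
    return migrations;
  } catch (error) {
    await connection.rollback();
    throw error;
  }
}
```

Returning either the full list or an empty list mirrors the behaviour described above: a competing instance simply skips the batch instead of running a subset out of order.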
## 0.3.2
### Patch Changes
- 57498db: Unreference all connections when running under Bun, so that the process isn't kept open unnecessarily long
## 0.3.1
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.3.0
### Minor Changes
- 4442604: Automatically create the database if it doesn't exist and the user has permission to do so
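
For illustration only, creating the database up front could look roughly like the sketch below; the option names and the use of the `??` identifier placeholder are assumptions about how one might do this with `mysql2`, not the plugin's exact code:

```ts
import { createConnection, type Connection } from 'mysql2/promise';

type DatabaseOptions = { host: string; user: string; password: string; database: string };

// Hypothetical sketch: connect without selecting a database, create the target
// database if it is missing (requires CREATE privileges), then reconnect
// against it.
async function connectAndEnsureDatabase(options: DatabaseOptions): Promise<Connection> {
  const admin = await createConnection({ host: options.host, user: options.user, password: options.password });

  try {
    // `??` is mysql2's client-side identifier escaping, available in query().
    await admin.query('CREATE DATABASE IF NOT EXISTS ??', [options.database]);
  } finally {
    await admin.end();
  }

  return createConnection(options);
}
```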
### Patch Changes
- aef2d7c: Avoid "CREATE TABLE IF NOT EXISTS" as it takes unnecessarily heavy locks in a clustered database when run concurrently
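
A rough illustration of the happy-path check described above; the table name and column layout are placeholders, not the plugin's real schema:

```ts
import type { RowDataPacket } from 'mysql2';
import { type Connection } from 'mysql2/promise';

// Hypothetical sketch: check information_schema first and only fall back to
// CREATE TABLE when the history table is actually missing, since CREATE TABLE
// IF NOT EXISTS takes heavier metadata locks on every startup in a cluster.
async function ensureHistoryTable(connection: Connection, database: string, table = 'migrations'): Promise<void> {
  const [rows] = await connection.execute<RowDataPacket[]>(
    'SELECT 1 FROM information_schema.tables WHERE table_schema = ? AND table_name = ? LIMIT 1',
    [database, table],
  );

  if (rows.length > 0) {
    return; // Happy path: the table already exists, so nothing needs locking.
  }

  // Placeholder column layout for illustration only.
  await connection.query(
    `CREATE TABLE IF NOT EXISTS \`${table}\` (
      name VARCHAR(255) NOT NULL PRIMARY KEY,
      status VARCHAR(32) NOT NULL,
      date DATETIME NOT NULL
    )`,
  );
}
```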
## 0.2.8
### Patch Changes
- 17feb2d: Only unreference connections in a Bun environment, as doing so crashes Node for some reason, without even throwing an error
## 0.2.7
### Patch Changes
- 198aa54: Unreference all connections automatically so that they don't prevent the process from exiting. This is especially needed in Bun environments, as Bun seems to handle sockets differently from Node.js in this regard.
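
The 0.2.8 and 0.2.7 entries above boil down to something like the following sketch; note that reaching into `connection.connection.stream` is not documented public API of `mysql2`, so treat the property access as an assumption:

```ts
import { type Connection } from 'mysql2/promise';

// Hypothetical sketch: under Bun, unref the underlying socket so idle
// connections don't keep the process alive; skip it under Node.js, where the
// same call has reportedly caused silent crashes (see the 0.2.8 entry above).
function unrefConnectionIfBun(connection: Connection): void {
  const isBun = typeof process.versions.bun === 'string';

  if (!isBun) {
    return;
  }

  // The nested `connection.stream` (a net.Socket) is an implementation detail.
  const socket = (connection as unknown as { connection?: { stream?: { unref?: () => void } } }).connection?.stream;
  socket?.unref?.();
}
```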
## 0.2.6
### Patch Changes
- db656c2: Enable NPM provenance
- Updated dependencies [db656c2]
- @emigrate/plugin-tools@0.9.6
- @emigrate/types@0.12.1
## 0.2.5
### Patch Changes
- f8a5cc7: Make sure the storage initialization crashes when a database connection can't be established (see the sketch below)
- Updated dependencies [94ad9fe]
- @emigrate/types@0.12.0
- @emigrate/plugin-tools@0.9.5
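
As a hypothetical illustration of the fail-fast behaviour mentioned in the f8a5cc7 entry above, awaiting the first connection (and a ping) during initialization surfaces connectivity problems immediately:

```ts
import { createConnection, type Connection } from 'mysql2/promise';

// Hypothetical sketch: resolve the connection eagerly so a wrong host or bad
// credentials makes the storage initialization reject up front instead of
// failing later, mid-migration.
async function connectOrFail(uri: string): Promise<Connection> {
  const connection = await createConnection(uri); // rejects if the server is unreachable
  await connection.ping(); // extra sanity check; throws on a dead connection
  return connection;
}
```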
## 0.2.4
### Patch Changes
- Updated dependencies [ce15648]
- @emigrate/types@0.11.0
- @emigrate/plugin-tools@0.9.4
## 0.2.3
### Patch Changes
- Updated dependencies [f9a16d8]
- @emigrate/types@0.10.0
- @emigrate/plugin-tools@0.9.3
## 0.2.2
### Patch Changes
- Updated dependencies [a6c6e6d]
- @emigrate/types@0.9.1
- @emigrate/plugin-tools@0.9.2
## 0.2.1
### Patch Changes
- 3a8b06b: Don't use the `bun` key in `exports` as that would mean we have to include both built files and source files in each package, which is a bit wasteful. Maybe reconsider in the future if we can package only source files.
- Updated dependencies [3a8b06b]
- @emigrate/plugin-tools@0.9.1
## 0.2.0
### Minor Changes
- ce6946c: Emigrate supports Bun; make use of the `bun` key in package.json `exports`
### Patch Changes
- Updated dependencies [ce6946c]
- @emigrate/plugin-tools@0.9.0
- @emigrate/types@0.9.0
## 0.1.3
### Patch Changes
- Updated dependencies [cae6d11]
- Updated dependencies [cae6d11]
- Updated dependencies [cae6d11]
- @emigrate/types@0.8.0
- @emigrate/plugin-tools@0.8.0
## 0.1.2
### Patch Changes
- Updated dependencies [bad4e25]
- @emigrate/plugin-tools@0.7.0
## 0.1.1
### Patch Changes

View file

@ -1,6 +1,6 @@
# @emigrate/storage-mysql
# @emigrate/mysql
A MySQL plugin for Emigrate. Uses a MySQL database for storing migration history. Can load and generate .sql migration files.
A MySQL plugin for Emigrate. Uses a MySQL database for storing the migration history. Can load and generate .sql migration files.
The table used for storing the migration history is compatible with the [immigration-mysql](https://github.com/joakimbeng/immigration-mysql) package, so you can use this together with the [@emigrate/cli](../cli) as a drop-in replacement for that package.
@ -17,7 +17,13 @@ This plugin is actually three different Emigrate plugins in one:
Install the plugin in your project, alongside the Emigrate CLI:
```bash
npm install --save-dev @emigrate/cli @emigrate/mysql
npm install @emigrate/cli @emigrate/mysql
# or
pnpm add @emigrate/cli @emigrate/mysql
# or
yarn add @emigrate/cli @emigrate/mysql
# or
bun add @emigrate/cli @emigrate/mysql
```
## Usage

View file

@ -1,8 +1,9 @@
{
"name": "@emigrate/mysql",
"version": "0.1.1",
"version": "0.3.3",
"publishConfig": {
"access": "public"
"access": "public",
"provenance": true
},
"description": "A MySQL plugin for Emigrate. Uses a MySQL database for storing migration history. Can load and generate .sql migration files.",
"main": "dist/index.js",
@ -15,12 +16,17 @@
}
},
"files": [
"dist"
"dist",
"!dist/*.tsbuildinfo",
"!dist/**/*.test.js",
"!dist/tests/*"
],
"scripts": {
"build": "tsc --pretty",
"build:watch": "tsc --pretty --watch",
"lint": "xo --cwd=../.. $(pwd)"
"lint": "xo --cwd=../.. $(pwd)",
"integration": "glob -c \"node --import tsx --test-reporter spec --test\" \"./src/**/*.integration.ts\"",
"integration:watch": "glob -c \"node --watch --import tsx --test-reporter spec --test\" \"./src/**/*.integration.ts\""
},
"keywords": [
"emigrate",
@ -38,10 +44,13 @@
"license": "MIT",
"dependencies": {
"@emigrate/plugin-tools": "workspace:*",
"@emigrate/types": "workspace:*",
"mysql2": "3.6.5"
},
"devDependencies": {
"@emigrate/tsconfig": "workspace:*"
"@emigrate/tsconfig": "workspace:*",
"@types/bun": "1.1.2",
"bun-types": "1.1.8"
},
"volta": {
"extends": "../../package.json"

Some files were not shown because too many files have changed in this diff.