Compare commits


43 commits

Author SHA1 Message Date
52844d7a09 ci(mysql): add @emigrate/mysql integration tests to GitHub Actions
Some checks failed
Deploy to GitHub Pages / build (push) Failing after 2m38s
Deploy to GitHub Pages / deploy (push) Has been skipped
Integration Tests / Emigrate MySQL integration tests (push) Failing after 4m0s
Release / Release (push) Failing after 12s
CI / Build and Test (push) Has been cancelled
2025-04-25 09:48:34 +02:00
github-actions[bot]
fa3fb20dc5 chore(release): version packages 2025-04-24 16:06:29 +02:00
26240f49ff fix(mysql): make sure migrations are run in order when run concurrently
Now we either lock all of the migrations to run or none of them,
so they can't end up out of order when multiple instances of Emigrate run concurrently.
2025-04-24 15:57:44 +02:00
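To illustrate the all-or-none locking idea behind 26240f49ff, here is a minimal TypeScript sketch (not the actual @emigrate/mysql implementation; the `migrations` history table and its columns are assumptions): every pending migration is claimed inside one transaction, and if any of them is already claimed by another instance, nothing is locked.

```ts
// Illustrative sketch only — table/column names are assumptions, not the real schema.
import { createConnection } from 'mysql2/promise';
import type { ResultSetHeader } from 'mysql2';

export async function lockAllOrNone(names: string[]): Promise<boolean> {
  const connection = await createConnection({ host: '127.0.0.1', user: 'emigrate', database: 'emigrate' });

  try {
    await connection.beginTransaction();

    for (const name of names) {
      // INSERT IGNORE affects 0 rows when another instance already holds the lock row.
      const [result] = await connection.execute<ResultSetHeader>(
        "INSERT IGNORE INTO migrations (name, status) VALUES (?, 'locked')",
        [name],
      );

      if (result.affectedRows === 0) {
        // Someone else got there first — release everything so we lock all or none.
        await connection.rollback();
        return false;
      }
    }

    await connection.commit();
    return true;
  } finally {
    await connection.end();
  }
}
```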
6eb60177c5 fix: use another changesets-action version 2024-08-09 16:03:34 +02:00
b3b603b2fc feat: make aggregated GitHub releases instead of one per package
And also publish packages with unreleased changes tagged with `next` to NPM
2024-08-09 15:49:22 +02:00
bb9d674cd7 chore: turn off Turbo's UI as it messes with the terminal and is not as intuitive as it seems 2024-06-27 16:05:45 +02:00
c151031d41 chore(deps): upgrade Turbo and opt out from telemetry 2024-06-27 16:05:45 +02:00
dependabot[bot]
48181d88b7 chore(deps): bump turbo from 1.10.16 to 2.0.5
Bumps [turbo](https://github.com/vercel/turbo) from 1.10.16 to 2.0.5.
- [Release notes](https://github.com/vercel/turbo/releases)
- [Changelog](https://github.com/vercel/turbo/blob/main/release.md)
- [Commits](https://github.com/vercel/turbo/compare/v1.10.16...v2.0.5)

---
updated-dependencies:
- dependency-name: turbo
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-06-27 16:05:45 +02:00
d779286084 chore(deps): upgrade TypeScript to v5.5 and enable isolatedDeclarations 2024-06-27 15:38:50 +02:00
ef848a0553 chore(deps): re-add the specific PNPM version for the deploy workflow 2024-06-27 13:27:34 +02:00
4d12402595 chore(deps): make sure the correct PNPM version is used (everywhere) 2024-06-27 11:55:33 +02:00
be5c4d28b6 chore(deps): make sure the correct PNPM version is used 2024-06-27 11:47:40 +02:00
2cefa2508b chore(deps): upgrade PNPM to v9.4.0 2024-06-27 11:12:21 +02:00
0ff9f60d59 chore(deps): upgrade all action dependencies
Closes #70, #128, #135, #145
2024-06-27 10:59:47 +02:00
github-actions[bot]
31693ddb3c chore(release): version packages 2024-06-25 09:21:37 +02:00
57498db248 fix(mysql): close database connections gracefully when using Bun 2024-06-25 08:22:56 +02:00
github-actions[bot]
cf620a191d chore(release): version packages 2024-05-30 10:16:07 +02:00
ca154fadeb fix: exclude tsbuildinfo files from published packages for smaller bundles 2024-05-30 10:12:37 +02:00
github-actions[bot]
f300f147fa chore(release): version packages 2024-05-29 16:23:49 +02:00
44426042cf feat(mysql,postgres): automatically create the database if it doesn't exist (fixes #147) 2024-05-29 16:19:32 +02:00
aef2d7c861 fix(mysql): handle table initialization better in clustered database environments
Running CREATE TABLE IF NOT EXISTS takes more locks than first checking whether the table exists with a SELECT and only then running CREATE TABLE.
Since the table usually already exists, this optimizes for the happy path.
2024-05-29 15:10:59 +02:00
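The idea behind aef2d7c861, sketched below in TypeScript (assumed table name `migrations`; not the exact @emigrate/mysql code): check `information_schema` with a plain SELECT first, and only fall back to `CREATE TABLE IF NOT EXISTS` when the table is actually missing, which keeps the common path cheap in clustered setups.

```ts
// Illustrative sketch only — the real initialization logic and schema differ.
import { createConnection } from 'mysql2/promise';

export async function ensureHistoryTable(): Promise<void> {
  const connection = await createConnection({ host: '127.0.0.1', user: 'emigrate', database: 'emigrate' });

  try {
    // Happy path: a plain SELECT takes far fewer locks than CREATE TABLE IF NOT EXISTS.
    const [rows] = await connection.execute(
      'SELECT 1 FROM information_schema.tables WHERE table_schema = DATABASE() AND table_name = ?',
      ['migrations'],
    );

    if (Array.isArray(rows) && rows.length > 0) {
      return; // The table already exists, which is the usual case.
    }

    // Only now pay for CREATE TABLE (still guarded with IF NOT EXISTS against races).
    await connection.query(`
      CREATE TABLE IF NOT EXISTS migrations (
        name VARCHAR(255) NOT NULL PRIMARY KEY,
        status VARCHAR(32) NOT NULL,
        date TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
      )
    `);
  } finally {
    await connection.end();
  }
}
```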
github-actions[bot]
e396266f3d chore(release): version packages 2024-04-04 14:46:54 +02:00
081ab34cb4 fix(reporter-pino): make sure the Pino reporter outputs logs in Bun environments 2024-04-04 14:43:38 +02:00
dependabot[bot]
520fdd94ef chore(deps): bump changesets/action from 1.4.5 to 1.4.6
Bumps [changesets/action](https://github.com/changesets/action) from 1.4.5 to 1.4.6.
- [Release notes](https://github.com/changesets/action/releases)
- [Changelog](https://github.com/changesets/action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/changesets/action/compare/v1.4.5...v1.4.6)

---
updated-dependencies:
- dependency-name: changesets/action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-15 10:23:19 +01:00
github-actions[bot]
d1bd8fc74f chore(release): version packages 2024-03-15 09:40:25 +01:00
41522094dd fix(cli): handle the case where the config is returned as an object with a nested default property 2024-02-19 10:59:02 +01:00
dependabot[bot]
6763f338ce chore(deps): bump the commitlint group with 2 updates
Bumps the commitlint group with 2 updates: [@commitlint/cli](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/cli) and [@commitlint/config-conventional](https://github.com/conventional-changelog/commitlint/tree/HEAD/@commitlint/config-conventional).


Updates `@commitlint/cli` from 18.4.3 to 18.6.1
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/cli/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.6.1/@commitlint/cli)

Updates `@commitlint/config-conventional` from 18.4.3 to 18.6.1
- [Release notes](https://github.com/conventional-changelog/commitlint/releases)
- [Changelog](https://github.com/conventional-changelog/commitlint/blob/master/@commitlint/config-conventional/CHANGELOG.md)
- [Commits](https://github.com/conventional-changelog/commitlint/commits/v18.6.1/@commitlint/config-conventional)

---
updated-dependencies:
- dependency-name: "@commitlint/cli"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: commitlint
- dependency-name: "@commitlint/config-conventional"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: commitlint
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-14 11:55:46 +01:00
github-actions[bot]
6c4e441eff chore(release): version packages 2024-02-13 13:00:15 +01:00
57a099169e fix(cli): cleanup AbortSignal event listeners to avoid MaxListenersExceededWarning 2024-02-12 20:59:26 +01:00
github-actions[bot]
ae9e8b1b04 chore(release): version packages 2024-02-12 13:56:28 +01:00
1065322435 fix(pino): show correct statuses for the "list" and "new" commands 2024-02-12 13:47:55 +01:00
17feb2d2c2 fix(mysql): only unreference connections in a Bun environment as it has problems with Node for some reason 2024-02-12 13:35:18 +01:00
dependabot[bot]
98e3ed5c1b chore(deps): bump pnpm/action-setup from 2.4.0 to 3.0.0
Bumps [pnpm/action-setup](https://github.com/pnpm/action-setup) from 2.4.0 to 3.0.0.
- [Release notes](https://github.com/pnpm/action-setup/releases)
- [Commits](https://github.com/pnpm/action-setup/compare/v2.4.0...v3.0.0)

---
updated-dependencies:
- dependency-name: pnpm/action-setup
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-12 09:05:01 +01:00
1d33d65135 docs(cli): change URL path from /commands/ to /cli/ 2024-02-09 14:48:51 +01:00
0c597fd7a8 docs(cli): add a main page for Emigrate's CLI 2024-02-09 14:48:51 +01:00
github-actions[bot]
0360d0b82f chore(release): version packages 2024-02-09 14:05:35 +01:00
c838ffb7f3 fix(typescript): load config written in TypeScript without the typescript package when using Bun, Deno or tsx 2024-02-09 14:00:24 +01:00
198aa545eb fix(mysql): unreference all connections so that the process can exit cleanly
In a NodeJS environment it will just work as before, but in a Bun environment it will make the "forced exit" error message disappear and remove the 10 s waiting period when migrations are done.
2024-02-09 13:13:27 +01:00
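A minimal sketch of what "unreference all connections" means in practice (the `connection.stream` property is an assumption about mysql2 internals and may vary between versions; the real code also handles pools and the Bun-only restriction added later in 17feb2d2c2):

```ts
// Illustrative sketch only — not the actual @emigrate/mysql implementation.
import type { Socket } from 'node:net';

type HasUnderlyingSocket = { connection?: { stream?: Socket } };

export function unreferenceConnection(connection: HasUnderlyingSocket): void {
  // Only needed in Bun; Node exits cleanly without it (see 17feb2d2c2).
  if (!('Bun' in globalThis)) {
    return;
  }

  // unref() tells the runtime this socket alone should not keep the process alive,
  // so a finished migration run can exit without the 10 second forced-exit timeout.
  connection.connection?.stream?.unref();
}
```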
e7ec75d9e1 docs(faq): add note on using Emigrate for existing databases 2024-02-06 09:29:53 +01:00
b62c692846 docs(reporters): add "json" reporter and rename "default" to "pretty" 2024-02-06 09:22:35 +01:00
18382ce961 feat(reporters): add built-in "json" reporter and rename "default" to "pretty" 2024-02-06 09:22:35 +01:00
github-actions[bot]
4e8ac5294d chore(release): version packages 2024-02-05 15:51:38 +01:00
61cbcbd691 fix(cli): force exiting after 10 seconds should not change the exit code
If all migrations have been run successfully we want the exit code to be 0 even though we had to force exit the process.
This is because on some platforms (e.g. Bun) handles aren't cleaned up the same way as in NodeJS, so let's be forgiving.
2024-02-05 15:48:55 +01:00
85 changed files with 8681 additions and 5254 deletions

View file

@@ -13,6 +13,7 @@ jobs:
     env:
       TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
       TURBO_TEAM: ${{ secrets.TURBO_TEAM }}
+      DO_NOT_TRACK: 1
     steps:
       - name: Check out code
@@ -20,14 +21,12 @@ jobs:
         with:
           fetch-depth: 2
-      - uses: pnpm/action-setup@v2.4.0
-        with:
-          version: 8.3.1
+      - uses: pnpm/action-setup@v4.0.0
       - name: Setup Node.js environment
         uses: actions/setup-node@v4
         with:
-          node-version: 20.9.0
+          node-version: 22.15.0
           cache: 'pnpm'
       - name: Install dependencies

View file

@@ -10,6 +10,7 @@ on:
 # Allow this job to clone the repo and create a page deployment
 permissions:
+  actions: read
   contents: read
   pages: write
   id-token: write
@@ -29,11 +30,10 @@ jobs:
           echo $ASTRO_SITE
           echo $ASTRO_BASE
       - name: Install, build, and upload your site output
-        uses: withastro/action@v1
+        uses: withastro/action@v2
         with:
           path: ./docs # The root location of your Astro project inside the repository. (optional)
-          node-version: 20 # The specific version of Node that should be used to build your site. Defaults to 18. (optional)
-          package-manager: pnpm@8.10.2 # The Node package manager that should be used to install dependencies and build your site. Automatically detected based on your lockfile. (optional)
+          package-manager: pnpm@9.4.0 # The Node package manager that should be used to install dependencies and build your site. Automatically detected based on your lockfile. (optional)
   deploy:
     needs: build
@@ -44,4 +44,4 @@ jobs:
     steps:
       - name: Deploy to GitHub Pages
         id: deployment
-        uses: actions/deploy-pages@v1
+        uses: actions/deploy-pages@v4

.github/workflows/integration.yaml (new file, +62 lines)
View file

@@ -0,0 +1,62 @@
name: Integration Tests

on:
  push:
    branches: ['main', 'changeset-release/main']
  pull_request:

jobs:
  mysql_integration:
    name: Emigrate MySQL integration tests
    timeout-minutes: 15
    runs-on: ubuntu-latest
    env:
      TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
      TURBO_TEAM: ${{ secrets.TURBO_TEAM }}
      DO_NOT_TRACK: 1
    services:
      mysql:
        image: mysql:8.0
        env:
          MYSQL_ROOT_PASSWORD: root
          MYSQL_DATABASE: emigrate
          MYSQL_USER: emigrate
          MYSQL_PASSWORD: emigrate
        ports:
          - 3306:3306
        options: --health-cmd="mysqladmin ping -h localhost" --health-interval=10s --health-timeout=5s --health-retries=5
    steps:
      - name: Check out code
        uses: actions/checkout@v4
        with:
          fetch-depth: 2
      - uses: pnpm/action-setup@v4.0.0
      - name: Setup Node.js environment
        uses: actions/setup-node@v4
        with:
          node-version: 22.15.0
          cache: 'pnpm'
      - name: Install dependencies
        run: pnpm install
      - name: Wait for MySQL to be ready
        run: |
          for i in {1..30}; do
            nc -z localhost 3306 && echo "MySQL is up!" && break
            echo "Waiting for MySQL..."
            sleep 2
          done
      - name: Build package
        run: pnpm build --filter @emigrate/mysql
      - name: Integration Tests
        env:
          MYSQL_HOST: '127.0.0.1'
          MYSQL_PORT: 3306
        run: pnpm --filter @emigrate/mysql integration

View file

@@ -25,25 +25,48 @@ jobs:
           persist-credentials: false
           fetch-depth: 0
-      - uses: pnpm/action-setup@v2.4.0
-        with:
-          version: 8.3.1
+      - uses: pnpm/action-setup@v4.0.0
       - name: Setup Node.js environment
         uses: actions/setup-node@v4
         with:
-          node-version: 20.9.0
+          node-version: 22.15.0
           cache: 'pnpm'
       - name: Install Dependencies
         run: pnpm install
       - name: Create Release Pull Request
-        uses: changesets/action@v1.4.5
+        id: changesets
+        uses: aboviq/changesets-action@v1.5.2
         with:
           publish: pnpm run release
           commit: 'chore(release): version packages'
           title: 'chore(release): version packages'
+          createGithubReleases: aggregate
         env:
           GITHUB_TOKEN: ${{ secrets.PAT_GITHUB_TOKEN }}
           NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
+      - name: Release to @next tag on npm
+        if: github.ref_name == 'main' && steps.changesets.outputs.published != 'true'
+        run: |
+          git checkout main
+          CHANGESET_FILE=$(git diff-tree --no-commit-id --name-only HEAD -r ".changeset/*-*-*.md")
+          if [ -z "$CHANGESET_FILE" ]; then
+            echo "No changesets found, skipping release to @next tag"
+            exit 0
+          fi
+          AFFECTED_PACKAGES=$(sed -n '/---/,/---/p' "$CHANGESET_FILE" | sed '/---/d')
+          if [ -z "$AFFECTED_PACKAGES" ]; then
+            echo "No packages affected by changesets, skipping release to @next tag"
+            exit 0
+          fi
+          pnpm changeset version --snapshot next
+          pnpm changeset publish --tag next
+        env:
+          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+          GITHUB_TOKEN: ${{ secrets.PAT_GITHUB_TOKEN }}

View file

@@ -1,5 +1,27 @@
 # @emigrate/docs
+## 1.0.0
+### Major Changes
+- 1d33d65: Rename the URL path "/commands/" to "/cli/" to make it more clear that those pages are the documentation for the CLI. This change is a BREAKING CHANGE because it changes the URL path of the pages.
+### Minor Changes
+- 0c597fd: Add a separate page for the Emigrate CLI itself, with all the commands as sub pages
+## 0.4.0
+### Minor Changes
+- b62c692: Add documentation for the built-in "json" reporter
+- b62c692: The "default" reporter is now named "pretty"
+- e7ec75d: Add note in FAQ on using Emigrate for existing databases
+### Patch Changes
+- c838ffb: Add note on how to write Emigrate's config using TypeScript in a production environment without having `typescript` installed.
 ## 0.3.0
 ### Minor Changes
@@ -18,4 +40,4 @@
 ### Minor Changes
 - cbc35bd: Add first version of the [Baseline guide](https://emigrate.dev/guides/baseline)
-- cbc35bd: Document the new --limit, --from and --to options for the ["up" command](https://emigrate.dev/commands/up/)
+- cbc35bd: Document the new --limit, --from and --to options for the ["up" command](https://emigrate.dev/cli/up/)

View file

@@ -77,24 +77,33 @@ export default defineConfig({
         },
       ],
     },
+    {
+      label: 'Command Line Interface',
+      items: [
+        {
+          label: 'Introduction',
+          link: '/cli/',
+        },
         {
           label: 'Commands',
           items: [
             {
               label: 'emigrate up',
-              link: '/commands/up/',
+              link: '/cli/up/',
             },
             {
               label: 'emigrate list',
-              link: '/commands/list/',
+              link: '/cli/list/',
             },
             {
               label: 'emigrate new',
-              link: '/commands/new/',
+              link: '/cli/new/',
             },
             {
               label: 'emigrate remove',
-              link: '/commands/remove/',
+              link: '/cli/remove/',
+            },
+          ],
         },
       ],
     },
@@ -106,7 +115,7 @@ export default defineConfig({
           link: '/guides/typescript/',
         },
         {
-          label: 'Baseline',
+          label: 'Baseline existing database',
           link: '/guides/baseline/',
         },
       ],
@@ -171,8 +180,12 @@ export default defineConfig({
           link: '/plugins/reporters/',
         },
         {
-          label: 'Default Reporter',
-          link: '/plugins/reporters/default/',
+          label: 'Pretty Reporter (default)',
+          link: '/plugins/reporters/pretty/',
+        },
+        {
+          label: 'JSON Reporter',
+          link: '/plugins/reporters/json/',
         },
         {
           label: 'Pino Reporter',

View file

@@ -2,7 +2,7 @@
   "name": "@emigrate/docs",
   "private": true,
   "type": "module",
-  "version": "0.3.0",
+  "version": "1.0.0",
   "scripts": {
     "dev": "astro dev",
     "start": "astro dev",
@@ -11,6 +11,7 @@
     "astro": "astro"
   },
   "dependencies": {
+    "@astrojs/check": "^0.7.0",
     "@astrojs/starlight": "^0.15.0",
     "@astrojs/starlight-tailwind": "2.0.1",
     "@astrojs/tailwind": "^5.0.3",
@@ -20,5 +21,6 @@
   },
   "volta": {
     "extends": "../package.json"
-  }
+  },
+  "packageManager": "pnpm@9.4.0"
 }

View file

@@ -0,0 +1,73 @@
---
title: "CLI Introduction"
description: "Some basic information about the Emigrate CLI."
---
import { Tabs, TabItem, LinkCard } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate comes with a CLI that you can use to create, run, and manage your migrations.
### Installing the Emigrate CLI
<Tabs>
<TabItem label="npm">
```bash
npm install @emigrate/cli
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm add @emigrate/cli
```
</TabItem>
<TabItem label="yarn">
```bash
yarn add @emigrate/cli
```
</TabItem>
<TabItem label="bun">
```bash
bun add @emigrate/cli
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3,6}
{
"scripts": {
"emigrate": "emigrate"
},
"dependencies": {
"@emigrate/cli": "*"
}
}
```
</TabItem>
</Tabs>
### Existing commands
<LinkCard
href="up/"
title="emigrate up"
description="The command for executing migrations, or showing pending migrations in dry run mode."
/>
<LinkCard
href="list/"
title="emigrate list"
description="The command for listing all migrations and their status."
/>
<LinkCard
href="new/"
title="emigrate new"
description="The command for creating new migration files."
/>
<LinkCard
href="remove/"
title="emigrate remove"
description="The command for removing migrations from the migration history."
/>

View file

@@ -86,6 +86,9 @@ In case you have both a `emigrate-storage-somedb` and a `somedb` package install
 ### `-r`, `--reporter <name>`
+**type:** `"pretty" | "json" | string`
+**default:** `"pretty"`
 The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
 The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:

View file

@@ -101,6 +101,9 @@ In case you have both a `emigrate-plugin-someplugin` and a `someplugin` package
 ### `-r`, `--reporter <name>`
+**type:** `"pretty" | "json" | string`
+**default:** `"pretty"`
 The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
 The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:

View file

@@ -55,7 +55,7 @@ The `remove` command is used to remove a migration from the history. This is use
 The name of the migration file to remove, including the extension, e.g. `20200101000000_some_migration.js`, or a relative file path to a migration file to remove, e.g: `migrations/20200101000000_some_migration.js`.
-Using relative file paths is useful in terminals that support autocomplete, and also when you copy and use the relative migration file path from the output of the <Link href="/commands/list/">`list`</Link> command.
+Using relative file paths is useful in terminals that support autocomplete, and also when you copy and use the relative migration file path from the output of the <Link href="/cli/list/">`list`</Link> command.
 ## Options
@@ -95,6 +95,9 @@ In case you have both a `emigrate-storage-somedb` and a `somedb` package install
 ### `-r`, `--reporter <name>`
+**type:** `"pretty" | "json" | string`
+**default:** `"pretty"`
 The <Link href="/plugins/reporters/">reporter</Link> to use for listing the migrations.
 The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:

View file

@@ -147,6 +147,9 @@ In case you have both a `emigrate-plugin-someplugin` and a `someplugin` package
 ### `-r`, `--reporter <name>`
+**type:** `"pretty" | "json" | string`
+**default:** `"pretty"`
 The <Link href="/plugins/reporters/">reporter</Link> to use for reporting the migration progress.
 The name can be either a path to a module or a package name. For package names Emigrate will automatically prefix the given name with these prefixes in order:

View file

@@ -113,8 +113,8 @@ CREATE TABLE public.posts (
 ### Log the baseline migration
-For new environments this baseline migration will automatically be run when you run <Link href="/commands/up/">`emigrate up`</Link>.
+For new environments this baseline migration will automatically be run when you run <Link href="/cli/up/">`emigrate up`</Link>.
-For any existing environments you will need to run `emigrate up` with the <Link href="/commands/up/#--no-execution">`--no-execution`</Link> flag to prevent the migration from being executed and only log the migration:
+For any existing environments you will need to run `emigrate up` with the <Link href="/cli/up/#--no-execution">`--no-execution`</Link> flag to prevent the migration from being executed and only log the migration:
 <Tabs>
 <TabItem label="npm">
@@ -155,7 +155,7 @@ For any existing environments you will need to run `emigrate up` with the <Link
 </TabItem>
 </Tabs>
-In case you have already added more migration files to your migration directory you can limit the "up" command to just log the baseline migration by specifying the <Link href="/commands/up/#-t---to-name">`--to`</Link> option:
+In case you have already added more migration files to your migration directory you can limit the "up" command to just log the baseline migration by specifying the <Link href="/cli/up/#-t---to-name">`--to`</Link> option:
 <Tabs>
 <TabItem label="npm">
@@ -198,7 +198,7 @@ In case you have already added more migration files to your migration directory
 ### Verify the baseline migration status
-You can verify the status of the baseline migration by running the <Link href="/commands/list/">`emigrate list`</Link> command:
+You can verify the status of the baseline migration by running the <Link href="/cli/list/">`emigrate list`</Link> command:
 <Tabs>
 <TabItem label="npm">
@@ -251,5 +251,5 @@ Emigrate list v0.14.1 /your/project/path
 ### Happy migrating!
-You can now start adding new migrations to your migration directory and run <Link href="/commands/up/">`emigrate up`</Link> to apply them to your database.
+You can now start adding new migrations to your migration directory and run <Link href="/cli/up/">`emigrate up`</Link> to apply them to your database.
 Which should be part of your CD pipeline to ensure that your database schema is always up to date in each environment.

View file

@@ -7,14 +7,14 @@ import { Tabs, TabItem } from '@astrojs/starlight/components';
 import Link from '@components/Link.astro';
 :::tip[Using Bun or Deno?]
-If you are using [Bun](https://bun.sh) or [Deno](https://deno.land) you are already good to go as they both support TypeScript out of the box.
+If you are using [Bun](https://bun.sh) or [Deno](https://deno.land) you are already good to go as they both support TypeScript out of the box!
 :::
-You have at least the two following options to support running TypeScript migration files in NodeJS.
+If you're using NodeJS you have at least the two following options to support running TypeScript migration files in NodeJS.
 ## Using `tsx`
-If you want to be able to write and run migration files written in TypeScript the easiest way is to install the [`tsx`](https://github.com/privatenumber/tsx) package.
+If you want to be able to write and run migration files written in TypeScript an easy way is to install the [`tsx`](https://github.com/privatenumber/tsx) package.
 ### Installing `tsx`
@@ -47,7 +47,7 @@ After installing `tsx` you can load it in two ways.
 #### Via CLI
-Using the <Link href="/commands/up/#-i---import-module">`--import`</Link> flag you can load `tsx` before running your migration files.
+Using the <Link href="/cli/up/#-i---import-module">`--import`</Link> flag you can load `tsx` before running your migration files.
 <Tabs>
 <TabItem label="npm">
@@ -67,9 +67,13 @@ Using the <Link href="/commands/up/#-i---import-module">`--import`</Link> flag y
 </TabItem>
 </Tabs>
+:::note
+This method is necessary if you want to write your configuration file in TypeScript without having `typescript` installed in your production environment, as `tsx` must be loaded before the configuration file is loaded.
+:::
 #### Via configuration file
-You can also directly import `tsx` in your configuration file.
+You can also directly import `tsx` in your configuration file (will only work if you're not using TypeScript for your configuration file).
 ```js title="emigrate.config.js" {1}
 import 'tsx';

View file

@@ -3,11 +3,13 @@ title: "FAQ"
 description: "Frequently asked questions about Emigrate."
 ---
+import Link from '@components/Link.astro';
 ## Why no `down` migrations?
 > Always forward never backwards.
-Many other migration tools support `down` migrations, but in all the years we have been
+Many other migration tools support `down` (or undo) migrations, but in all the years we have been
 doing migrations we have never needed to rollback a migration in production,
 in that case we would just write a new migration to fix the problem.
@@ -17,3 +19,7 @@ and in such case you just revert the migration manually and fix the `up` migrati
 The benefit of this is that you don't have to worry about writing `down` migrations, and you can focus on writing the `up` migrations.
 This way you will only ever have to write `down` migrations when they are really necessary instead of for every migration
 (which makes it the exception rather than the rule, which is closer to the truth).
+## Can I use Emigrate with my existing database?
+Yes, you can use Emigrate with an existing database. See the <Link href="/guides/baseline/">Baseline guide</Link> for more information.

View file

@@ -22,5 +22,5 @@ The generator is responsible for generating migration files in a specific format
 </CardGrid>
 :::note
-Instead of having to install a generator plugin, you can also use the much simpler <Link href="/commands/new/#-t---template-path">`--template`</Link> option to specify a custom template file for new migrations.
+Instead of having to install a generator plugin, you can also use the much simpler <Link href="/cli/new/#-t---template-path">`--template`</Link> option to specify a custom template file for new migrations.
 :::

View file

@@ -79,4 +79,4 @@ A <Link href="/plugins/generators/">generator plugin</Link> for generating new m
 </TabItem>
 </Tabs>
-For more information see <Link href="/commands/new/">the `new` command</Link>'s documentation.
+For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.

View file

@@ -79,4 +79,4 @@ The MySQL generator creates new migration files with the `.sql` extension. In th
 </TabItem>
 </Tabs>
-For more information see <Link href="/commands/new/">the `new` command</Link>'s documentation.
+For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.

View file

@@ -79,4 +79,4 @@ The PostgreSQL generator creates new migration files with the `.sql` extension.
 </TabItem>
 </Tabs>
-For more information see <Link href="/commands/new/">the `new` command</Link>'s documentation.
+For more information see <Link href="/cli/new/">the `new` command</Link>'s documentation.

View file

@@ -1,23 +0,0 @@
---
title: Default Reporter
---
Emigrate's default reporter. The default reporter recognizes if the current terminal is an interactive shell (or if it's a CI environment), if that's the case _no_ animations will be shown.
## Usage
By default, Emigrate uses the default reporter.
## Example output
```bash
Emigrate up v0.10.0 /Users/joakim/dev/@aboviq/test-emigrate (dry run)
1 pending migrations to run
migration-folder/20231218135441244_create_some_table.sql (pending)
1 pending (1 total)
```

View file

@@ -20,6 +20,7 @@ Or set it up in your configuration file, see <Link href="/reference/configuratio
 ## Available Reporters
 <CardGrid>
-  <LinkCard title="Default Reporter" href="default/" />
-  <LinkCard title="Pino Reporter" href="pino/" />
+  <LinkCard title="Pretty Reporter" description="The default reporter" href="pretty/" />
+  <LinkCard title="JSON Reporter" description="A built-in reporter for outputting a JSON object" href="json/" />
+  <LinkCard title="Pino Reporter" description="A reporter package for outputting new line delimited JSON" href="pino/" />
 </CardGrid>

View file

@@ -0,0 +1,102 @@
---
title: JSON Reporter
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
An Emigrate reporter that outputs a JSON object.
The reporter is included by default and does not need to be installed separately.
## Usage
### Via CLI
<Tabs>
<TabItem label="npm">
```bash
npx emigrate <command> --reporter json
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter json
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter json
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter json
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate <command> --reporter json
```
</TabItem>
</Tabs>
See for instance the <Link href="/cli/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
### Via configuration file
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'json',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'json',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.
## Example output
```json
{
"command": "up",
"version": "0.17.2",
"numberTotalMigrations": 1,
"numberDoneMigrations": 0,
"numberSkippedMigrations": 0,
"numberFailedMigrations": 0,
"numberPendingMigrations": 1,
"success": true,
"startTime": 1707206599968,
"endTime": 1707206600005,
"migrations": [
{
"name": "/your/project/migrations/20240206075446123_some_other_table.sql",
"status": "pending",
"duration": 0
}
]
}
```

View file

@@ -87,15 +87,31 @@ The `@emigrate/reporter-` prefix is optional when using this reporter.
 </TabItem>
 </Tabs>
-See for instance the <Link href="/commands/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
+See for instance the <Link href="/cli/up/#-r---reporter-name">Reporter Option</Link> for the `up` command for more information.
 ### Via configuration file
-```js title="emigrate.config.js" {2}
-export default {
+<Tabs>
+<TabItem label="JavaScript">
+```js title="emigrate.config.js"
+/** @type {import('@emigrate/cli').EmigrateConfig} */
+export default {
   reporter: 'pino',
 };
 ```
+</TabItem>
+<TabItem label="TypeScript">
+```ts title="emigrate.config.ts"
+import { type EmigrateConfig } from '@emigrate/cli';
+const config: EmigrateConfig = {
+  reporter: 'pino',
+};
+export default config;
+```
+</TabItem>
+</Tabs>
 See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.

View file

@@ -0,0 +1,90 @@
---
title: Pretty Reporter (default)
---
import { Tabs, TabItem } from '@astrojs/starlight/components';
import Link from '@components/Link.astro';
Emigrate's default reporter. It recognizes if the current terminal is an interactive shell (or if it's a CI environment), if that's the case _no_ animations will be shown.
The reporter is included by default and does not need to be installed separately.
## Usage
By default, Emigrate uses the "pretty" reporter, but it can also be explicitly set by using the <Link href="/cli/up/#-r---reporter-name">`--reporter`</Link> flag.
<Tabs>
<TabItem label="npm">
```bash
npx emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="pnpm">
```bash
pnpm emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="yarn">
```bash
yarn emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="bun">
```bash
bunx --bun emigrate <command> --reporter pretty
```
</TabItem>
<TabItem label="deno">
```json title="package.json" {3}
{
"scripts": {
"emigrate": "emigrate"
}
}
```
```bash
deno task emigrate <command> --reporter pretty
```
</TabItem>
</Tabs>
Or by setting it in the configuration file.
<Tabs>
<TabItem label="JavaScript">
```js title="emigrate.config.js"
/** @type {import('@emigrate/cli').EmigrateConfig} */
export default {
reporter: 'pretty',
};
```
</TabItem>
<TabItem label="TypeScript">
```ts title="emigrate.config.ts"
import { type EmigrateConfig } from '@emigrate/cli';
const config: EmigrateConfig = {
reporter: 'pretty',
};
export default config;
```
</TabItem>
</Tabs>
See <Link href="/reference/configuration/#reporter">Reporter Configuration</Link> for more information.
## Example output
```bash
Emigrate up v0.17.2 /your/working/directory (dry run)
1 pending migrations to run
migration-folder/20231218135441244_create_some_table.sql (pending)
1 pending (1 total)
```

View file

@@ -45,9 +45,9 @@ Set the directory where your migrations are located, relative to the project roo
 ### `reporter`
-**type:** `string | EmigrateReporter | Promise<EmigrateReporter> | (() => Promise<EmigrateReporter>)`
-**default:** `"default"` - the default reporter
+**type:** `"pretty" | "json" | string | EmigrateReporter | Promise<EmigrateReporter> | (() => Promise<EmigrateReporter>)`
+**default:** `"pretty"` - the default reporter
 Set the reporter to use for the different commands. Specifying a <Link href="/plugins/reporters/">reporter</Link> is most useful in a CI or production environment where you either ship logs or want to have a machine-readable format.
@@ -64,6 +64,9 @@ export default {
   up: {
     reporter: 'json',
   },
+  new: {
+    reporter: 'pretty', // Not really necessary, as it's the default
+  },
 };
 ```

View file

@@ -37,9 +37,10 @@
   "bugs": "https://github.com/aboviq/emigrate/issues",
   "license": "MIT",
   "volta": {
-    "node": "20.9.0",
-    "pnpm": "8.10.2"
+    "node": "22.15.0",
+    "pnpm": "9.4.0"
   },
+  "packageManager": "pnpm@9.4.0",
   "engines": {
     "node": ">=18"
   },
@@ -61,7 +62,10 @@
     },
     "overrides": [
       {
-        "files": "packages/**/*.test.ts",
+        "files": [
+          "packages/**/*.test.ts",
+          "packages/**/*.integration.ts"
+        ],
         "rules": {
           "@typescript-eslint/no-floating-promises": 0,
           "max-params": 0
@@ -71,17 +75,18 @@
   },
   "dependencies": {
     "@changesets/cli": "2.27.1",
-    "@commitlint/cli": "18.4.3",
-    "@commitlint/config-conventional": "18.4.3",
+    "@commitlint/cli": "18.6.1",
+    "@commitlint/config-conventional": "18.6.1",
     "@types/node": "20.10.4",
     "glob": "10.3.10",
    "husky": "8.0.3",
     "lint-staged": "15.2.0",
     "npm-run-all": "4.1.5",
     "prettier": "3.1.1",
-    "tsx": "4.7.0",
-    "turbo": "1.10.16",
-    "typescript": "5.3.3",
+    "testcontainers": "10.24.2",
+    "tsx": "4.15.7",
+    "turbo": "2.0.5",
+    "typescript": "5.5.2",
     "xo": "0.56.0"
   }
 }

View file

@@ -1,5 +1,53 @@
 # @emigrate/cli
+## 0.18.4
+### Patch Changes
+- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
+- Updated dependencies [d779286]
+  - @emigrate/plugin-tools@0.9.8
+  - @emigrate/types@0.12.2
+## 0.18.3
+### Patch Changes
+- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
+- Updated dependencies [ca154fa]
+  - @emigrate/plugin-tools@0.9.7
+  - @emigrate/types@0.12.2
+## 0.18.2
+### Patch Changes
+- 4152209: Handle the case where the config is returned as an object with a nested `default` property
+## 0.18.1
+### Patch Changes
+- 57a0991: Cleanup AbortSignal listeners when they are not needed to avoid MaxListenersExceededWarning when migrating many migrations at once
+## 0.18.0
+### Minor Changes
+- c838ffb: Make it possible to write the Emigrate configuration file in TypeScript and load it using `tsx` in a NodeJS environment by importing packages provided using the `--import` CLI option before loading the configuration file. This makes it possible to run Emigrate in production with a configuration file written in TypeScript without having the `typescript` package installed.
+- 18382ce: Add a built-in "json" reporter for outputting a single JSON object
+- 18382ce: Rename the "default" reporter to "pretty" and make it possible to specify it using the `--reporter` CLI option or in the configuration file
+### Patch Changes
+- c838ffb: Don't use the `typescript` package for loading an Emigrate configuration file written in TypeScript in a Bun or Deno environment
+## 0.17.2
+### Patch Changes
+- 61cbcbd: Force exiting after 10 seconds should not change the exit code, i.e. if all migrations have run successfully the exit code should be 0
 ## 0.17.1
 ### Patch Changes

View file

@@ -1,6 +1,6 @@
 {
   "name": "@emigrate/cli",
-  "version": "0.17.1",
+  "version": "0.18.4",
   "publishConfig": {
     "access": "public",
     "provenance": true
@@ -19,7 +19,8 @@
     "emigrate": "dist/cli.js"
   },
   "files": [
-    "dist"
+    "dist",
+    "!dist/*.tsbuildinfo"
   ],
   "scripts": {
     "build": "tsc --pretty",
@@ -36,7 +37,9 @@
     "immigration"
   ],
   "devDependencies": {
-    "@emigrate/tsconfig": "workspace:*"
+    "@emigrate/tsconfig": "workspace:*",
+    "@types/bun": "1.0.5",
+    "bun-types": "1.0.26"
   },
   "author": "Aboviq AB <dev@aboviq.com> (https://www.aboviq.com)",
   "homepage": "https://github.com/aboviq/emigrate/tree/main/packages/cli#readme",

View file

@@ -24,7 +24,6 @@ const importAll = async (cwd: string, modules: string[]) => {
 };
 const up: Action = async (args, abortSignal) => {
-  const config = await getConfig('up');
   const { values } = parseArgs({
     args,
     options: {
@@ -104,7 +103,7 @@ Options:
   -p, --plugin <name>     The plugin(s) to use (can be specified multiple times)
-  -r, --reporter <name>   The reporter to use for reporting the migration progress
+  -r, --reporter <name>   The reporter to use for reporting the migration progress (default: pretty)
   -l, --limit <count>     Limit the number of migrations to run
@@ -143,6 +142,14 @@ Examples:
   }
   const cwd = process.cwd();
+  if (values.import) {
+    await importAll(cwd, values.import);
+  }
+
+  const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
+
+  const config = await getConfig('up', forceImportTypeScriptAsIs);
+
   const {
     directory = config.directory,
     storage = config.storage,
@@ -151,7 +158,6 @@ Examples:
     from,
     to,
     limit: limitString,
-    import: imports = [],
     'abort-respite': abortRespiteString,
     'no-execution': noExecution,
   } = values;
@@ -177,8 +183,6 @@ Examples:
     return;
   }
-  await importAll(cwd, imports);
   try {
     const { default: upCommand } = await import('./commands/up.js');
     process.exitCode = await upCommand({
@@ -209,7 +213,6 @@ Examples:
 };
 const newMigration: Action = async (args) => {
-  const config = await getConfig('new');
   const { values, positionals } = parseArgs({
     args,
     options: {
@@ -239,6 +242,12 @@ const newMigration: Action = async (args) => {
       multiple: true,
       default: [],
     },
+    import: {
+      type: 'string',
+      short: 'i',
+      multiple: true,
+      default: [],
+    },
     color: {
       type: 'boolean',
     },
@@ -260,14 +269,24 @@ Arguments:
 Options:
   -h, --help              Show this help message and exit
   -d, --directory <path>  The directory where the migration files are located (required)
-  -r, --reporter <name>   The reporter to use for reporting the migration file creation progress
+  -i, --import <module>   Additional modules/packages to import before creating the migration (can be specified multiple times)
+                          For example if you want to use Dotenv to load environment variables or when using TypeScript
+  -r, --reporter <name>   The reporter to use for reporting the migration file creation progress (default: pretty)
   -p, --plugin <name>     The plugin(s) to use (can be specified multiple times)
   -t, --template <path>   A template file to use as contents for the new migration file
                           (if the extension option is not provided the template file's extension will be used)
   -x, --extension <ext>   The extension to use for the new migration file
                           (if no template or plugin is provided an empty migration file will be created with the given extension)
   --color                 Force color output (this option is passed to the reporter)
   --no-color              Disable color output (this option is passed to the reporter)
 One of the --template, --extension or the --plugin options must be specified
@@ -287,6 +306,14 @@ Examples:
   }
   const cwd = process.cwd();
+  if (values.import) {
+    await importAll(cwd, values.import);
+  }
+
+  const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
+
+  const config = await getConfig('new', forceImportTypeScriptAsIs);
+
   const {
     directory = config.directory,
     template = config.template,
@@ -312,7 +339,6 @@ Examples:
 };
 const list: Action = async (args) => {
-  const config = await getConfig('list');
   const { values } = parseArgs({
     args,
     options: {
@@ -355,12 +381,18 @@ List all migrations and their status. This command does not run any migrations.
 Options:
   -h, --help              Show this help message and exit
   -d, --directory <path>  The directory where the migration files are located (required)
   -i, --import <module>   Additional modules/packages to import before listing the migrations (can be specified multiple times)
                           For example if you want to use Dotenv to load environment variables
-  -r, --reporter <name>   The reporter to use for reporting the migrations
+  -r, --reporter <name>   The reporter to use for reporting the migrations (default: pretty)
   -s, --storage <name>    The storage to use to get the migration history (required)
   --color                 Force color output (this option is passed to the reporter)
   --no-color              Disable color output (this option is passed to the reporter)
 Examples:
@@ -376,14 +408,15 @@ Examples:
   }
   const cwd = process.cwd();
-  const {
-    directory = config.directory,
-    storage = config.storage,
-    reporter = config.reporter,
-    import: imports = [],
-  } = values;
-  await importAll(cwd, imports);
+  if (values.import) {
+    await importAll(cwd, values.import);
+  }
+
+  const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
+
+  const config = await getConfig('list', forceImportTypeScriptAsIs);
+
+  const { directory = config.directory, storage = config.storage, reporter = config.reporter } = values;
   try {
     const { default: listCommand } = await import('./commands/list.js');
@@ -401,7 +434,6 @@ Examples:
 };
 const remove: Action = async (args) => {
-  const config = await getConfig('remove');
   const { values, positionals } = parseArgs({
     args,
     options: {
@@ -453,13 +485,20 @@ Arguments:
 Options:
   -h, --help              Show this help message and exit
   -d, --directory <path>  The directory where the migration files are located (required)
   -i, --import <module>   Additional modules/packages to import before removing the migration (can be specified multiple times)
                           For example if you want to use Dotenv to load environment variables
-  -r, --reporter <name>   The reporter to use for reporting the removal process
+  -r, --reporter <name>   The reporter to use for reporting the removal process (default: pretty)
   -s, --storage <name>    The storage to use to get the migration history (required)
   -f, --force             Force removal of the migration history entry even if the migration is not in a failed state
   --color                 Force color output (this option is passed to the reporter)
   --no-color              Disable color output (this option is passed to the reporter)
 Examples:
@@ -477,15 +516,15 @@ Examples:
   }
   const cwd = process.cwd();
-  const {
-    directory = config.directory,
-    storage = config.storage,
-    reporter = config.reporter,
-    force,
-    import: imports = [],
-  } = values;
-  await importAll(cwd, imports);
+  if (values.import) {
+    await importAll(cwd, values.import);
+  }
+
+  const forceImportTypeScriptAsIs = values.import?.some((module) => module === 'tsx' || module.startsWith('tsx/'));
+
+  const config = await getConfig('remove', forceImportTypeScriptAsIs);
+
+  const { directory = config.directory, storage = config.storage, reporter = config.reporter, force } = values;
   try {
     const { default: removeCommand } = await import('./commands/remove.js');
@@ -602,5 +641,5 @@ await main(process.argv.slice(2), controller.signal);
 setTimeout(() => {
   console.error('Process did not exit within 10 seconds, forcing exit');
-  process.exit(1);
+  process.exit(process.exitCode);
 }, 10_000).unref();

View file

@@ -1,12 +1,12 @@
 import { type MigrationHistoryEntry, type MigrationMetadata, type MigrationMetadataFinished } from '@emigrate/types';
 import { toMigrationMetadata } from './to-migration-metadata.js';
-import { getMigrations as getMigrationsOriginal } from './get-migrations.js';
+import { getMigrations as getMigrationsOriginal, type GetMigrationsFunction } from './get-migrations.js';
 export async function* collectMigrations(
   cwd: string,
   directory: string,
   history: AsyncIterable<MigrationHistoryEntry>,
-  getMigrations = getMigrationsOriginal,
+  getMigrations: GetMigrationsFunction = getMigrationsOriginal,
 ): AsyncIterable<MigrationMetadata | MigrationMetadataFinished> {
   const allMigrations = await getMigrations(cwd, directory);
   const seen = new Set<string>();

View file

@@ -5,8 +5,7 @@ import { exec } from '../exec.js';
 import { migrationRunner } from '../migration-runner.js';
 import { collectMigrations } from '../collect-migrations.js';
 import { version } from '../get-package-info.js';
+import { getStandardReporter } from '../reporters/get.js';
-const lazyDefaultReporter = async () => import('../reporters/default.js');
 type ExtraFlags = {
   cwd: string;
@@ -18,7 +17,7 @@ export default async function listCommand({
   storage: storageConfig,
   color,
   cwd,
-}: Config & ExtraFlags) {
+}: Config & ExtraFlags): Promise<number> {
   if (!directory) {
     throw MissingOptionError.fromOption('directory');
   }
@@ -29,7 +28,7 @@ export default async function listCommand({
     throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
   }
-  const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
+  const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
   if (!reporter) {
     throw BadOptionError.fromOption(

View file

@@ -15,8 +15,7 @@ import { type Config } from '../types.js';
 import { withLeadingPeriod } from '../with-leading-period.js';
 import { version } from '../get-package-info.js';
 import { getDuration } from '../get-duration.js';
+import { getStandardReporter } from '../reporters/get.js';
-const lazyDefaultReporter = async () => import('../reporters/default.js');
 type ExtraFlags = {
   cwd: string;
@@ -25,7 +24,7 @@ type ExtraFlags = {
 export default async function newCommand(
   { directory, template, reporter: reporterConfig, plugins = [], cwd, extension, color }: Config & ExtraFlags,
   name: string,
-) {
+): Promise<void> {
   if (!directory) {
     throw MissingOptionError.fromOption('directory');
   }
@@ -38,7 +37,7 @@ export default async function newCommand(
     throw MissingOptionError.fromOption(['extension', 'template', 'plugin']);
   }
-  const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]);
+  const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
   if (!reporter) {
     throw BadOptionError.fromOption(

View file

@ -11,6 +11,7 @@ import {
StorageInitError, StorageInitError,
} from '../errors.js'; } from '../errors.js';
import { import {
assertErrorEqualEnough,
getErrorCause, getErrorCause,
getMockedReporter, getMockedReporter,
getMockedStorage, getMockedStorage,
@ -199,6 +200,11 @@ function assertPreconditionsFailed(reporter: Mocked<Required<EmigrateReporter>>,
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped'); assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once'); assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? []; const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
// hack: stacks differ after serialization, so align the expected error's stack with the actual one before comparing:
if (finishedError) {
finishedError.stack = error?.stack;
}
assert.deepStrictEqual(error, finishedError, 'Finished error'); assert.deepStrictEqual(error, finishedError, 'Finished error');
const cause = getErrorCause(error); const cause = getErrorCause(error);
const expectedCause = finishedError?.cause; const expectedCause = finishedError?.cause;
@ -288,14 +294,7 @@ function assertPreconditionsFulfilled(
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped'); assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once'); assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? []; const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
assert.deepStrictEqual(error, finishedError, 'Finished error'); assertErrorEqualEnough(error, finishedError, 'Finished error');
const cause = getErrorCause(error);
const expectedCause = finishedError?.cause;
assert.deepStrictEqual(
cause,
expectedCause ? deserializeError(expectedCause) : expectedCause,
'Finished error cause',
);
assert.strictEqual(entries?.length, expected.length, 'Finished entries length'); assert.strictEqual(entries?.length, expected.length, 'Finished entries length');
assert.deepStrictEqual( assert.deepStrictEqual(
entries.map((entry) => `${entry.name} (${entry.status})`), entries.map((entry) => `${entry.name} (${entry.status})`),

View file

@ -18,6 +18,7 @@ import { collectMigrations } from '../collect-migrations.js';
import { migrationRunner } from '../migration-runner.js'; import { migrationRunner } from '../migration-runner.js';
import { arrayMapAsync } from '../array-map-async.js'; import { arrayMapAsync } from '../array-map-async.js';
import { type GetMigrationsFunction } from '../get-migrations.js'; import { type GetMigrationsFunction } from '../get-migrations.js';
import { getStandardReporter } from '../reporters/get.js';
type ExtraFlags = { type ExtraFlags = {
cwd: string; cwd: string;
@ -27,8 +28,6 @@ type ExtraFlags = {
type RemovableMigrationMetadata = MigrationMetadata & { originalStatus?: 'done' | 'failed' }; type RemovableMigrationMetadata = MigrationMetadata & { originalStatus?: 'done' | 'failed' };
const lazyDefaultReporter = async () => import('../reporters/default.js');
export default async function removeCommand( export default async function removeCommand(
{ {
directory, directory,
@ -40,7 +39,7 @@ export default async function removeCommand(
getMigrations, getMigrations,
}: Config & ExtraFlags, }: Config & ExtraFlags,
name: string, name: string,
) { ): Promise<number> {
if (!directory) { if (!directory) {
throw MissingOptionError.fromOption('directory'); throw MissingOptionError.fromOption('directory');
} }
@ -55,7 +54,7 @@ export default async function removeCommand(
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option'); throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
} }
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]); const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) { if (!reporter) {
throw BadOptionError.fromOption( throw BadOptionError.fromOption(

View file

@ -1,13 +1,6 @@
import { describe, it, mock } from 'node:test'; import { describe, it, mock } from 'node:test';
import assert from 'node:assert'; import assert from 'node:assert';
import { import { type EmigrateReporter, type Storage, type Plugin, type MigrationMetadataFinished } from '@emigrate/types';
type EmigrateReporter,
type Storage,
type Plugin,
type SerializedError,
type MigrationMetadataFinished,
} from '@emigrate/types';
import { deserializeError } from 'serialize-error';
import { version } from '../get-package-info.js'; import { version } from '../get-package-info.js';
import { import {
BadOptionError, BadOptionError,
@ -16,7 +9,6 @@ import {
MigrationHistoryError, MigrationHistoryError,
MigrationRunError, MigrationRunError,
StorageInitError, StorageInitError,
toSerializedError,
} from '../errors.js'; } from '../errors.js';
import { import {
type Mocked, type Mocked,
@ -24,7 +16,7 @@ import {
toMigrations, toMigrations,
getMockedReporter, getMockedReporter,
getMockedStorage, getMockedStorage,
getErrorCause, assertErrorEqualEnough,
} from '../test-utils.js'; } from '../test-utils.js';
import upCommand from './up.js'; import upCommand from './up.js';
@ -930,15 +922,13 @@ function assertPreconditionsFulfilled(
for (const [index, entry] of failedEntries.entries()) { for (const [index, entry] of failedEntries.entries()) {
if (entry.status === 'failed') { if (entry.status === 'failed') {
const error = reporter.onMigrationError.mock.calls[index]?.arguments[1]; const error = reporter.onMigrationError.mock.calls[index]?.arguments[1];
assert.deepStrictEqual(error, entry.error, 'Error'); assertErrorEqualEnough(error, entry.error, 'Error');
const cause = entry.error?.cause;
assert.deepStrictEqual(error?.cause, cause ? deserializeError(cause) : cause, 'Error cause');
if (entry.started) { if (entry.started) {
const [finishedMigration, error] = storage.onError.mock.calls[index]?.arguments ?? []; const [finishedMigration, error] = storage.onError.mock.calls[index]?.arguments ?? [];
assert.strictEqual(finishedMigration?.name, entry.name); assert.strictEqual(finishedMigration?.name, entry.name);
assert.strictEqual(finishedMigration?.status, entry.status); assert.strictEqual(finishedMigration?.status, entry.status);
assertErrorEqualEnough(error, entry.error); assertErrorEqualEnough(error, entry.error, `Entry error (${entry.name})`);
} }
} }
} }
@ -946,15 +936,7 @@ function assertPreconditionsFulfilled(
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, pending + skipped, 'Total pending and skipped'); assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, pending + skipped, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once'); assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? []; const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
assertErrorEqualEnough(error, finishedError); assertErrorEqualEnough(error, finishedError, 'Finished error');
const cause = getErrorCause(error);
const expectedCause = finishedError?.cause;
assert.deepStrictEqual(
cause,
expectedCause ? deserializeError(expectedCause) : expectedCause,
'Finished error cause',
);
assert.strictEqual(entries?.length, expected.length, 'Finished entries length'); assert.strictEqual(entries?.length, expected.length, 'Finished entries length');
assert.deepStrictEqual( assert.deepStrictEqual(
entries.map((entry) => `${entry.name} (${entry.status})`), entries.map((entry) => `${entry.name} (${entry.status})`),
@ -995,33 +977,6 @@ function assertPreconditionsFailed(
assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped'); assert.strictEqual(reporter.onMigrationSkip.mock.calls.length, 0, 'Total pending and skipped');
assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once'); assert.strictEqual(reporter.onFinished.mock.calls.length, 1, 'Finished called once');
const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? []; const [entries, error] = reporter.onFinished.mock.calls[0]?.arguments ?? [];
assert.deepStrictEqual(error, finishedError, 'Finished error'); assertErrorEqualEnough(error, finishedError, 'Finished error');
const cause = getErrorCause(error);
const expectedCause = finishedError?.cause;
assert.deepStrictEqual(
cause,
expectedCause ? deserializeError(expectedCause) : expectedCause,
'Finished error cause',
);
assert.strictEqual(entries?.length, 0, 'Finished entries length'); assert.strictEqual(entries?.length, 0, 'Finished entries length');
} }
function assertErrorEqualEnough(actual?: Error | SerializedError, expected?: Error) {
if (expected === undefined) {
assert.strictEqual(actual, undefined);
return;
}
const {
cause: actualCause,
stack: actualStack,
...actualError
} = actual instanceof Error ? toSerializedError(actual) : actual ?? {};
const { cause: expectedCause, stack: expectedStack, ...expectedError } = toSerializedError(expected);
// @ts-expect-error Ignore
const { stack: actualCauseStack, ...actualCauseRest } = actualCause ?? {};
// @ts-expect-error Ignore
const { stack: expectedCauseStack, ...expectedCauseRest } = expectedCause ?? {};
assert.deepStrictEqual(actualError, expectedError);
assert.deepStrictEqual(actualCauseRest, expectedCauseRest);
}

View file

@ -16,6 +16,7 @@ import { exec } from '../exec.js';
import { migrationRunner } from '../migration-runner.js'; import { migrationRunner } from '../migration-runner.js';
import { collectMigrations } from '../collect-migrations.js'; import { collectMigrations } from '../collect-migrations.js';
import { version } from '../get-package-info.js'; import { version } from '../get-package-info.js';
import { getStandardReporter } from '../reporters/get.js';
type ExtraFlags = { type ExtraFlags = {
cwd: string; cwd: string;
@ -29,7 +30,6 @@ type ExtraFlags = {
abortRespite?: number; abortRespite?: number;
}; };
const lazyDefaultReporter = async () => import('../reporters/default.js');
const lazyPluginLoaderJs = async () => import('../plugin-loader-js.js'); const lazyPluginLoaderJs = async () => import('../plugin-loader-js.js');
export default async function upCommand({ export default async function upCommand({
@ -58,7 +58,7 @@ export default async function upCommand({
throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option'); throw BadOptionError.fromOption('storage', 'No storage found, please specify a storage using the storage option');
} }
const reporter = await getOrLoadReporter([reporterConfig ?? lazyDefaultReporter]); const reporter = getStandardReporter(reporterConfig) ?? (await getOrLoadReporter([reporterConfig]));
if (!reporter) { if (!reporter) {
throw BadOptionError.fromOption( throw BadOptionError.fromOption(

packages/cli/src/deno.d.ts vendored Normal file
View file

@ -0,0 +1,6 @@
declare global {
// eslint-disable-next-line @typescript-eslint/naming-convention
const Deno: any;
}
export {};

View file

@ -8,7 +8,7 @@ import { serializeError, errorConstructors, deserializeError } from 'serialize-e
const formatter = new Intl.ListFormat('en', { style: 'long', type: 'disjunction' }); const formatter = new Intl.ListFormat('en', { style: 'long', type: 'disjunction' });
export const toError = (error: unknown) => (error instanceof Error ? error : new Error(String(error))); export const toError = (error: unknown): Error => (error instanceof Error ? error : new Error(String(error)));
export const toSerializedError = (error: unknown) => { export const toSerializedError = (error: unknown) => {
const errorInstance = toError(error); const errorInstance = toError(error);
@ -30,7 +30,7 @@ export class EmigrateError extends Error {
export class ShowUsageError extends EmigrateError {} export class ShowUsageError extends EmigrateError {}
export class MissingOptionError extends ShowUsageError { export class MissingOptionError extends ShowUsageError {
static fromOption(option: string | string[]) { static fromOption(option: string | string[]): MissingOptionError {
return new MissingOptionError( return new MissingOptionError(
`Missing required option: ${Array.isArray(option) ? formatter.format(option) : option}`, `Missing required option: ${Array.isArray(option) ? formatter.format(option) : option}`,
undefined, undefined,
@ -48,7 +48,7 @@ export class MissingOptionError extends ShowUsageError {
} }
export class MissingArgumentsError extends ShowUsageError { export class MissingArgumentsError extends ShowUsageError {
static fromArgument(argument: string) { static fromArgument(argument: string): MissingArgumentsError {
return new MissingArgumentsError(`Missing required argument: ${argument}`, undefined, argument); return new MissingArgumentsError(`Missing required argument: ${argument}`, undefined, argument);
} }
@ -62,7 +62,7 @@ export class MissingArgumentsError extends ShowUsageError {
} }
export class OptionNeededError extends ShowUsageError { export class OptionNeededError extends ShowUsageError {
static fromOption(option: string, message: string) { static fromOption(option: string, message: string): OptionNeededError {
return new OptionNeededError(message, undefined, option); return new OptionNeededError(message, undefined, option);
} }
@ -76,7 +76,7 @@ export class OptionNeededError extends ShowUsageError {
} }
export class BadOptionError extends ShowUsageError { export class BadOptionError extends ShowUsageError {
static fromOption(option: string, message: string) { static fromOption(option: string, message: string): BadOptionError {
return new BadOptionError(message, undefined, option); return new BadOptionError(message, undefined, option);
} }
@ -96,7 +96,7 @@ export class UnexpectedError extends EmigrateError {
} }
export class MigrationHistoryError extends EmigrateError { export class MigrationHistoryError extends EmigrateError {
static fromHistoryEntry(entry: FailedMigrationHistoryEntry) { static fromHistoryEntry(entry: FailedMigrationHistoryEntry): MigrationHistoryError {
return new MigrationHistoryError(`Migration ${entry.name} is in a failed state, it should be fixed and removed`, { return new MigrationHistoryError(`Migration ${entry.name} is in a failed state, it should be fixed and removed`, {
cause: deserializeError(entry.error), cause: deserializeError(entry.error),
}); });
@ -108,7 +108,7 @@ export class MigrationHistoryError extends EmigrateError {
} }
export class MigrationLoadError extends EmigrateError { export class MigrationLoadError extends EmigrateError {
static fromMetadata(metadata: MigrationMetadata, cause?: Error) { static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationLoadError {
return new MigrationLoadError(`Failed to load migration file: ${metadata.relativeFilePath}`, { cause }); return new MigrationLoadError(`Failed to load migration file: ${metadata.relativeFilePath}`, { cause });
} }
@ -118,7 +118,7 @@ export class MigrationLoadError extends EmigrateError {
} }
export class MigrationRunError extends EmigrateError { export class MigrationRunError extends EmigrateError {
static fromMetadata(metadata: FailedMigrationMetadata) { static fromMetadata(metadata: FailedMigrationMetadata): MigrationRunError {
return new MigrationRunError(`Failed to run migration: ${metadata.relativeFilePath}`, { cause: metadata.error }); return new MigrationRunError(`Failed to run migration: ${metadata.relativeFilePath}`, { cause: metadata.error });
} }
@ -128,7 +128,7 @@ export class MigrationRunError extends EmigrateError {
} }
export class MigrationNotRunError extends EmigrateError { export class MigrationNotRunError extends EmigrateError {
static fromMetadata(metadata: MigrationMetadata, cause?: Error) { static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationNotRunError {
return new MigrationNotRunError(`Migration "${metadata.name}" is not in the migration history`, { cause }); return new MigrationNotRunError(`Migration "${metadata.name}" is not in the migration history`, { cause });
} }
@ -138,7 +138,7 @@ export class MigrationNotRunError extends EmigrateError {
} }
export class MigrationRemovalError extends EmigrateError { export class MigrationRemovalError extends EmigrateError {
static fromMetadata(metadata: MigrationMetadata, cause?: Error) { static fromMetadata(metadata: MigrationMetadata, cause?: Error): MigrationRemovalError {
return new MigrationRemovalError(`Failed to remove migration: ${metadata.relativeFilePath}`, { cause }); return new MigrationRemovalError(`Failed to remove migration: ${metadata.relativeFilePath}`, { cause });
} }
@ -148,7 +148,7 @@ export class MigrationRemovalError extends EmigrateError {
} }
export class StorageInitError extends EmigrateError { export class StorageInitError extends EmigrateError {
static fromError(error: Error) { static fromError(error: Error): StorageInitError {
return new StorageInitError('Could not initialize storage', { cause: error }); return new StorageInitError('Could not initialize storage', { cause: error });
} }
@ -158,11 +158,11 @@ export class StorageInitError extends EmigrateError {
} }
export class CommandAbortError extends EmigrateError { export class CommandAbortError extends EmigrateError {
static fromSignal(signal: NodeJS.Signals) { static fromSignal(signal: NodeJS.Signals): CommandAbortError {
return new CommandAbortError(`Command aborted due to signal: ${signal}`); return new CommandAbortError(`Command aborted due to signal: ${signal}`);
} }
static fromReason(reason: string, cause?: unknown) { static fromReason(reason: string, cause?: unknown): CommandAbortError {
return new CommandAbortError(`Command aborted: ${reason}`, { cause }); return new CommandAbortError(`Command aborted: ${reason}`, { cause });
} }
@ -172,7 +172,7 @@ export class CommandAbortError extends EmigrateError {
} }
export class ExecutionDesertedError extends EmigrateError { export class ExecutionDesertedError extends EmigrateError {
static fromReason(reason: string, cause?: Error) { static fromReason(reason: string, cause?: Error): ExecutionDesertedError {
return new ExecutionDesertedError(`Execution deserted: ${reason}`, { cause }); return new ExecutionDesertedError(`Execution deserted: ${reason}`, { cause });
} }

View file

@ -28,6 +28,8 @@ export const exec = async <Return extends Promise<any>>(
const aborter = options.abortSignal ? getAborter(options.abortSignal, options.abortRespite) : undefined; const aborter = options.abortSignal ? getAborter(options.abortSignal, options.abortRespite) : undefined;
const result = await Promise.race(aborter ? [aborter, fn()] : [fn()]); const result = await Promise.race(aborter ? [aborter, fn()] : [fn()]);
aborter?.cancel();
return [result, undefined]; return [result, undefined];
} catch (error) { } catch (error) {
return [undefined, toError(error)]; return [undefined, toError(error)];
@ -40,27 +42,44 @@ export const exec = async <Return extends Promise<any>>(
* @param signal The abort signal to listen to * @param signal The abort signal to listen to
* @param respite The time in milliseconds to wait before rejecting * @param respite The time in milliseconds to wait before rejecting
*/ */
const getAborter = async (signal: AbortSignal, respite = DEFAULT_RESPITE_SECONDS * 1000): Promise<never> => { const getAborter = (
return new Promise((_, reject) => { signal: AbortSignal,
if (signal.aborted) { respite = DEFAULT_RESPITE_SECONDS * 1000,
setTimeout( ): PromiseLike<never> & { cancel: () => void } => {
const cleanups: Array<() => void> = [];
const aborter = new Promise<never>((_, reject) => {
const abortListener = () => {
const timer = setTimeout(
reject, reject,
respite, respite,
ExecutionDesertedError.fromReason(`Deserted after ${prettyMs(respite)}`, toError(signal.reason)), ExecutionDesertedError.fromReason(`Deserted after ${prettyMs(respite)}`, toError(signal.reason)),
).unref(); );
timer.unref();
cleanups.push(() => {
clearTimeout(timer);
});
};
if (signal.aborted) {
abortListener();
return; return;
} }
signal.addEventListener( signal.addEventListener('abort', abortListener, { once: true });
'abort',
() => { cleanups.push(() => {
setTimeout( signal.removeEventListener('abort', abortListener);
reject,
respite,
ExecutionDesertedError.fromReason(`Deserted after ${prettyMs(respite)}`, toError(signal.reason)),
).unref();
},
{ once: true },
);
}); });
});
const cancel = () => {
for (const cleanup of cleanups) {
cleanup();
}
cleanups.length = 0;
};
return Object.assign(aborter, { cancel });
}; };
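The aborter above is now cancellable: once the raced function settles, exec calls cancel() so the respite timer and abort listener no longer keep the event loop alive or fire late. A minimal standalone sketch of the same pattern (illustrative names, not the package's public API):

// Sketch: a rejection promise tied to an AbortSignal that can be cancelled once it is no longer needed.
const makeAborter = (signal: AbortSignal, respiteMs: number): PromiseLike<never> & { cancel: () => void } => {
  const cleanups: Array<() => void> = [];

  const promise = new Promise<never>((_, reject) => {
    const onAbort = () => {
      // Give in-flight work a grace period before rejecting.
      const timer = setTimeout(() => reject(new Error(`Deserted after ${respiteMs}ms`)), respiteMs);
      timer.unref();
      cleanups.push(() => clearTimeout(timer));
    };

    if (signal.aborted) {
      onAbort();
      return;
    }

    signal.addEventListener('abort', onAbort, { once: true });
    cleanups.push(() => signal.removeEventListener('abort', onAbort));
  });

  const cancel = () => {
    for (const cleanup of cleanups) cleanup();
    cleanups.length = 0;
  };

  return Object.assign(promise, { cancel });
};

// Usage sketch:
//   const aborter = makeAborter(controller.signal, 10_000);
//   const result = await Promise.race([doWork(), aborter]);
//   aborter.cancel();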

View file

@ -1,11 +1,28 @@
import { cosmiconfig } from 'cosmiconfig'; import process from 'node:process';
import { cosmiconfig, defaultLoaders } from 'cosmiconfig';
import { type Config, type EmigrateConfig } from './types.js'; import { type Config, type EmigrateConfig } from './types.js';
const commands = ['up', 'list', 'new', 'remove'] as const; const commands = ['up', 'list', 'new', 'remove'] as const;
type Command = (typeof commands)[number]; type Command = (typeof commands)[number];
const canImportTypeScriptAsIs = Boolean(process.isBun) || typeof Deno !== 'undefined';
export const getConfig = async (command: Command): Promise<Config> => { const getEmigrateConfig = (config: any): EmigrateConfig => {
const explorer = cosmiconfig('emigrate'); if ('default' in config && typeof config.default === 'object' && config.default !== null) {
return config.default as EmigrateConfig;
}
if (typeof config === 'object' && config !== null) {
return config as EmigrateConfig;
}
return {};
};
export const getConfig = async (command: Command, forceImportTypeScriptAsIs = false): Promise<Config> => {
const explorer = cosmiconfig('emigrate', {
// eslint-disable-next-line @typescript-eslint/naming-convention
loaders: forceImportTypeScriptAsIs || canImportTypeScriptAsIs ? { '.ts': defaultLoaders['.js'] } : undefined,
});
const result = await explorer.search(); const result = await explorer.search();
@ -13,7 +30,7 @@ export const getConfig = async (command: Command): Promise<Config> => {
return {}; return {};
} }
const config = result.config as EmigrateConfig; const config = getEmigrateConfig(result.config);
const commandConfig = config[command]; const commandConfig = config[command];
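With the loader tweak above, TypeScript config files are imported as-is under Bun and Deno (or when explicitly forced), and a default export is unwrapped before the per-command section is looked up. A hypothetical emigrate.config.ts showing the shape the loader expects (option values are placeholders):

// emigrate.config.ts (illustrative; option values are placeholders)
const config = {
  directory: 'migrations',
  template: 'migration-template.sql',
  // A section named after a command ('up', 'list', 'new' or 'remove') is read via config[command]:
  new: {
    template: 'other-template.sql',
  },
};

export default config; // the default export is unwrapped by getEmigrateConfig above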

View file

@ -1,6 +1,6 @@
import process from 'node:process'; import process from 'node:process';
export const getDuration = (start: [number, number]) => { export const getDuration = (start: [number, number]): number => {
const [seconds, nanoseconds] = process.hrtime(start); const [seconds, nanoseconds] = process.hrtime(start);
return seconds * 1000 + nanoseconds / 1_000_000; return seconds * 1000 + nanoseconds / 1_000_000;
}; };

View file

@ -39,6 +39,6 @@ export const getMigrations = async (cwd: string, directory: string): Promise<Mig
extension: withLeadingPeriod(path.extname(name)), extension: withLeadingPeriod(path.extname(name)),
directory, directory,
cwd, cwd,
} satisfies MigrationMetadata; };
}); });
}; };

View file

@ -28,4 +28,7 @@ const getPackageInfo = async () => {
throw new UnexpectedError(`Could not read package info from: ${packageInfoPath}`); throw new UnexpectedError(`Could not read package info from: ${packageInfoPath}`);
}; };
export const { version } = await getPackageInfo(); const packageInfo = await getPackageInfo();
// eslint-disable-next-line prefer-destructuring
export const version: string = packageInfo.version;

View file

@ -1,5 +1,5 @@
export * from './types.js'; export * from './types.js';
export const emigrate = () => { export const emigrate = (): void => {
// console.log('Done!'); // console.log('Done!');
}; };

View file

@ -471,6 +471,6 @@ class DefaultReporter implements Required<EmigrateReporter> {
} }
} }
const reporterDefault = interactive ? new DefaultFancyReporter() : new DefaultReporter(); const reporterDefault: EmigrateReporter = interactive ? new DefaultFancyReporter() : new DefaultReporter();
export default reporterDefault; export default reporterDefault;

View file

@ -0,0 +1,15 @@
import type { EmigrateReporter } from '@emigrate/types';
import { type Config } from '../types.js';
import * as reporters from './index.js';
export const getStandardReporter = (reporter?: Config['reporter']): EmigrateReporter | undefined => {
if (!reporter) {
return reporters.pretty;
}
if (typeof reporter === 'string' && reporter in reporters) {
return reporters[reporter as keyof typeof reporters];
}
return undefined;
};
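In short: no configured reporter resolves to the bundled pretty reporter, a known name resolves to the corresponding bundled reporter, and anything else yields undefined so the caller can fall back to loading a reporter module. A quick usage sketch:

import { getStandardReporter } from './reporters/get.js';

getStandardReporter();                     // no reporter configured -> bundled pretty reporter
getStandardReporter('json');               // known name -> bundled JSON reporter
getStandardReporter('@scope/my-reporter'); // unknown -> undefined, caller falls back to getOrLoadReporter()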

View file

@ -0,0 +1,2 @@
export { default as pretty } from './default.js';
export { default as json } from './json.js';

View file

@ -0,0 +1,60 @@
import { type ReporterInitParameters, type EmigrateReporter, type MigrationMetadataFinished } from '@emigrate/types';
import { toSerializedError } from '../errors.js';
class JsonReporter implements EmigrateReporter {
#parameters!: ReporterInitParameters;
#startTime!: number;
onInit(parameters: ReporterInitParameters): void {
this.#startTime = Date.now();
this.#parameters = parameters;
}
onFinished(migrations: MigrationMetadataFinished[], error?: Error | undefined): void {
const { command, version } = this.#parameters;
let numberDoneMigrations = 0;
let numberSkippedMigrations = 0;
let numberFailedMigrations = 0;
let numberPendingMigrations = 0;
for (const migration of migrations) {
// eslint-disable-next-line unicorn/prefer-switch
if (migration.status === 'done') {
numberDoneMigrations++;
} else if (migration.status === 'skipped') {
numberSkippedMigrations++;
} else if (migration.status === 'failed') {
numberFailedMigrations++;
} else {
numberPendingMigrations++;
}
}
const result = {
command,
version,
numberTotalMigrations: migrations.length,
numberDoneMigrations,
numberSkippedMigrations,
numberFailedMigrations,
numberPendingMigrations,
success: !error,
startTime: this.#startTime,
endTime: Date.now(),
error: error ? toSerializedError(error) : undefined,
migrations: migrations.map((migration) => ({
name: migration.filePath,
status: migration.status,
duration: 'duration' in migration ? migration.duration : 0,
error: 'error' in migration ? toSerializedError(migration.error) : undefined,
})),
};
console.log(JSON.stringify(result, undefined, 2));
}
}
const jsonReporter: EmigrateReporter = new JsonReporter();
export default jsonReporter;
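The reporter stays silent during the run and prints a single JSON document in onFinished. An illustrative (made-up) result for a successful up run might look like:

{
  "command": "up",
  "version": "1.2.3",
  "numberTotalMigrations": 2,
  "numberDoneMigrations": 2,
  "numberSkippedMigrations": 0,
  "numberFailedMigrations": 0,
  "numberPendingMigrations": 0,
  "success": true,
  "startTime": 1719475200000,
  "endTime": 1719475201234,
  "migrations": [
    { "name": "/app/migrations/20240101000000000_create_users.sql", "status": "done", "duration": 12 },
    { "name": "/app/migrations/20240102000000000_add_index.sql", "status": "done", "duration": 7 }
  ]
}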

View file

@ -1,5 +1,6 @@
import { mock, type Mock } from 'node:test'; import { mock, type Mock } from 'node:test';
import path from 'node:path'; import path from 'node:path';
import assert from 'node:assert';
import { import {
type SerializedError, type SerializedError,
type EmigrateReporter, type EmigrateReporter,
@ -9,13 +10,14 @@ import {
type NonFailedMigrationHistoryEntry, type NonFailedMigrationHistoryEntry,
type Storage, type Storage,
} from '@emigrate/types'; } from '@emigrate/types';
import { toSerializedError } from './errors.js';
export type Mocked<T> = { export type Mocked<T> = {
// @ts-expect-error - This is a mock // @ts-expect-error - This is a mock
[K in keyof T]: Mock<T[K]>; [K in keyof T]: Mock<T[K]>;
}; };
export async function noop() { export async function noop(): Promise<void> {
// noop // noop
} }
@ -31,8 +33,8 @@ export function getErrorCause(error: Error | undefined): Error | SerializedError
return undefined; return undefined;
} }
export function getMockedStorage(historyEntries: Array<string | MigrationHistoryEntry>) { export function getMockedStorage(historyEntries: Array<string | MigrationHistoryEntry>): Mocked<Storage> {
const storage: Mocked<Storage> = { return {
lock: mock.fn(async (migrations) => migrations), lock: mock.fn(async (migrations) => migrations),
unlock: mock.fn(async () => { unlock: mock.fn(async () => {
// void // void
@ -45,8 +47,6 @@ export function getMockedStorage(historyEntries: Array<string | MigrationHistory
onError: mock.fn(), onError: mock.fn(),
end: mock.fn(), end: mock.fn(),
}; };
return storage;
} }
export function getMockedReporter(): Mocked<Required<EmigrateReporter>> { export function getMockedReporter(): Mocked<Required<EmigrateReporter>> {
@ -112,3 +112,23 @@ export function toEntries(
): MigrationHistoryEntry[] { ): MigrationHistoryEntry[] {
return names.map((name) => (typeof name === 'string' ? toEntry(name, status) : name)); return names.map((name) => (typeof name === 'string' ? toEntry(name, status) : name));
} }
export function assertErrorEqualEnough(actual?: Error | SerializedError, expected?: Error, message?: string): void {
if (expected === undefined) {
assert.strictEqual(actual, undefined);
return;
}
const {
cause: actualCause,
stack: actualStack,
...actualError
} = actual instanceof Error ? toSerializedError(actual) : actual ?? {};
const { cause: expectedCause, stack: expectedStack, ...expectedError } = toSerializedError(expected);
// @ts-expect-error Ignore
const { stack: actualCauseStack, ...actualCauseRest } = actualCause ?? {};
// @ts-expect-error Ignore
const { stack: expectedCauseStack, ...expectedCauseRest } = expectedCause ?? {};
assert.deepStrictEqual(actualError, expectedError, message);
assert.deepStrictEqual(actualCauseRest, expectedCauseRest, message ? `${message} (cause)` : undefined);
}

View file

@ -1,4 +1,7 @@
import { type EmigrateStorage, type Awaitable, type Plugin, type EmigrateReporter } from '@emigrate/types'; import { type EmigrateStorage, type Awaitable, type Plugin, type EmigrateReporter } from '@emigrate/types';
import type * as reporters from './reporters/index.js';
export type StandardReporter = keyof typeof reporters;
export type EmigratePlugin = Plugin; export type EmigratePlugin = Plugin;
@ -6,7 +9,7 @@ type StringOrModule<T> = string | T | (() => Awaitable<T>) | (() => Awaitable<{
export type Config = { export type Config = {
storage?: StringOrModule<EmigrateStorage>; storage?: StringOrModule<EmigrateStorage>;
reporter?: StringOrModule<EmigrateReporter>; reporter?: StandardReporter | StringOrModule<EmigrateReporter>;
plugins?: Array<StringOrModule<EmigratePlugin>>; plugins?: Array<StringOrModule<EmigratePlugin>>;
directory?: string; directory?: string;
template?: string; template?: string;
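Since the reporter option now also accepts a StandardReporter name, a config can pick a bundled reporter without importing anything. A minimal sketch:

// emigrate.config.js (sketch)
export default {
  directory: 'migrations',
  reporter: 'json', // or 'pretty' (the default); any other value is treated as a reporter module to load
};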

View file

@ -1 +1 @@
export const withLeadingPeriod = (string: string) => (string.startsWith('.') ? string : `.${string}`); export const withLeadingPeriod = (string: string): string => (string.startsWith('.') ? string : `.${string}`);

View file

@ -1,8 +1,3 @@
{ {
"extends": "@emigrate/tsconfig/build.json", "extends": "@emigrate/tsconfig/build.json"
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
} }

View file

@ -1,5 +1,53 @@
# @emigrate/mysql # @emigrate/mysql
## 0.3.3
### Patch Changes
- 26240f4: Make sure multiple instances of Emigrate using @emigrate/mysql can be initialized concurrently without issues creating the history table (for instance in a Kubernetes environment and/or with a Percona cluster).
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- 26240f4: Either lock all or none of the migrations to run, to make sure they run in order when multiple instances of Emigrate run concurrently (for instance in a Kubernetes environment)
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
## 0.3.2
### Patch Changes
- 57498db: Unreference all connections when running under Bun so they don't keep the process open unnecessarily long
## 0.3.1
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.3.0
### Minor Changes
- 4442604: Automatically create the database if it doesn't exist and the user has permission to do so
### Patch Changes
- aef2d7c: Avoid "CREATE TABLE IF NOT EXISTS" as it takes too many locks in a clustered database when run concurrently
## 0.2.8
### Patch Changes
- 17feb2d: Only unreference connections in a Bun environment, as doing so crashes Node for some reason without even throwing an error
## 0.2.7
### Patch Changes
- 198aa54: Unreference all connections automatically so that they don't prevent the process from exiting. This is especially needed in Bun environments, since Bun seems to handle sockets differently from Node.js in this regard.
## 0.2.6 ## 0.2.6
### Patch Changes ### Patch Changes

View file

@ -1,6 +1,6 @@
{ {
"name": "@emigrate/mysql", "name": "@emigrate/mysql",
"version": "0.2.6", "version": "0.3.3",
"publishConfig": { "publishConfig": {
"access": "public", "access": "public",
"provenance": true "provenance": true
@ -16,12 +16,17 @@
} }
}, },
"files": [ "files": [
"dist" "dist",
"!dist/*.tsbuildinfo",
"!dist/**/*.test.js",
"!dist/tests/*"
], ],
"scripts": { "scripts": {
"build": "tsc --pretty", "build": "tsc --pretty",
"build:watch": "tsc --pretty --watch", "build:watch": "tsc --pretty --watch",
"lint": "xo --cwd=../.. $(pwd)" "lint": "xo --cwd=../.. $(pwd)",
"integration": "glob -c \"node --import tsx --test-reporter spec --test\" \"./src/**/*.integration.ts\"",
"integration:watch": "glob -c \"node --watch --import tsx --test-reporter spec --test\" \"./src/**/*.integration.ts\""
}, },
"keywords": [ "keywords": [
"emigrate", "emigrate",
@ -43,7 +48,9 @@
"mysql2": "3.6.5" "mysql2": "3.6.5"
}, },
"devDependencies": { "devDependencies": {
"@emigrate/tsconfig": "workspace:*" "@emigrate/tsconfig": "workspace:*",
"@types/bun": "1.1.2",
"bun-types": "1.1.8"
}, },
"volta": { "volta": {
"extends": "../../package.json" "extends": "../../package.json"

View file

@ -0,0 +1,103 @@
import assert from 'node:assert';
import path from 'node:path';
import { before, after, describe, it } from 'node:test';
import type { MigrationMetadata } from '@emigrate/types';
import { startDatabase, stopDatabase } from './tests/database.js';
import { createMysqlStorage } from './index.js';
let db: { port: number; host: string };
const toEnd = new Set<{ end: () => Promise<void> }>();
describe('emigrate-mysql', async () => {
before(
async () => {
db = await startDatabase();
},
{ timeout: 60_000 },
);
after(
async () => {
for (const storage of toEnd) {
// eslint-disable-next-line no-await-in-loop
await storage.end();
}
toEnd.clear();
await stopDatabase();
},
{ timeout: 10_000 },
);
describe('migration locks', async () => {
it('either locks none or all of the given migrations', async () => {
const { initializeStorage } = createMysqlStorage({
table: 'migrations',
connection: {
host: db.host,
user: 'emigrate',
password: 'emigrate',
database: 'emigrate',
port: db.port,
},
});
const [storage1, storage2] = await Promise.all([initializeStorage(), initializeStorage()]);
toEnd.add(storage1);
toEnd.add(storage2);
const migrations = toMigrations('/emigrate', 'migrations', [
'2023-10-01-01-test.js',
'2023-10-01-02-test.js',
'2023-10-01-03-test.js',
'2023-10-01-04-test.js',
'2023-10-01-05-test.js',
'2023-10-01-06-test.js',
'2023-10-01-07-test.js',
'2023-10-01-08-test.js',
'2023-10-01-09-test.js',
'2023-10-01-10-test.js',
'2023-10-01-11-test.js',
'2023-10-01-12-test.js',
'2023-10-01-13-test.js',
'2023-10-01-14-test.js',
'2023-10-01-15-test.js',
'2023-10-01-16-test.js',
'2023-10-01-17-test.js',
'2023-10-01-18-test.js',
'2023-10-01-19-test.js',
'2023-10-01-20-test.js',
]);
const [locked1, locked2] = await Promise.all([storage1.lock(migrations), storage2.lock(migrations)]);
assert.strictEqual(
locked1.length === 0 || locked2.length === 0,
true,
'One of the processes should have no locks',
);
assert.strictEqual(
locked1.length === 20 || locked2.length === 20,
true,
'One of the processes should have all locks',
);
});
});
});
function toMigration(cwd: string, directory: string, name: string): MigrationMetadata {
return {
name,
filePath: `${cwd}/${directory}/${name}`,
relativeFilePath: `${directory}/${name}`,
extension: path.extname(name),
directory,
cwd,
};
}
function toMigrations(cwd: string, directory: string, names: string[]): MigrationMetadata[] {
return names.map((name) => toMigration(cwd, directory, name));
}

View file

@ -1,5 +1,6 @@
import process from 'node:process'; import process from 'node:process';
import fs from 'node:fs/promises'; import fs from 'node:fs/promises';
import { setTimeout } from 'node:timers/promises';
import { import {
createConnection, createConnection,
createPool, createPool,
@ -9,10 +10,13 @@ import {
type Pool, type Pool,
type ResultSetHeader, type ResultSetHeader,
type RowDataPacket, type RowDataPacket,
type Connection,
} from 'mysql2/promise'; } from 'mysql2/promise';
import { getTimestampPrefix, sanitizeMigrationName } from '@emigrate/plugin-tools'; import { getTimestampPrefix, sanitizeMigrationName } from '@emigrate/plugin-tools';
import { import {
type Awaitable,
type MigrationMetadata, type MigrationMetadata,
type MigrationFunction,
type EmigrateStorage, type EmigrateStorage,
type LoaderPlugin, type LoaderPlugin,
type Storage, type Storage,
@ -40,27 +44,39 @@ export type MysqlLoaderOptions = {
connection: ConnectionOptions | string; connection: ConnectionOptions | string;
}; };
const getConnection = async (connection: ConnectionOptions | string) => { const getConnection = async (options: ConnectionOptions | string) => {
if (typeof connection === 'string') { let connection: Connection;
const uri = new URL(connection);
if (typeof options === 'string') {
const uri = new URL(options);
// client side connectTimeout is unstable in mysql2 library // client side connectTimeout is unstable in mysql2 library
// it throws an error you can't catch and crashes node // it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled) // best to leave this at 0 (disabled)
uri.searchParams.set('connectTimeout', '0'); uri.searchParams.set('connectTimeout', '0');
uri.searchParams.set('multipleStatements', 'true'); uri.searchParams.set('multipleStatements', 'true');
uri.searchParams.set('flags', '-FOUND_ROWS');
return createConnection(uri.toString()); connection = await createConnection(uri.toString());
} } else {
connection = await createConnection({
return createConnection({ ...options,
...connection,
// client side connectTimeout is unstable in mysql2 library // client side connectTimeout is unstable in mysql2 library
// it throws an error you can't catch and crashes node // it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled) // best to leave this at 0 (disabled)
connectTimeout: 0, connectTimeout: 0,
multipleStatements: true, multipleStatements: true,
flags: ['-FOUND_ROWS'],
}); });
}
if (process.isBun) {
// @ts-expect-error the connection is not in the types but it's there
// eslint-disable-next-line @typescript-eslint/no-unsafe-call
connection.connection.stream.unref();
}
return connection;
}; };
const getPool = (connection: PoolOptions | string) => { const getPool = (connection: PoolOptions | string) => {
@ -71,6 +87,7 @@ const getPool = (connection: PoolOptions | string) => {
// it throws an error you can't catch and crashes node // it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled) // best to leave this at 0 (disabled)
uri.searchParams.set('connectTimeout', '0'); uri.searchParams.set('connectTimeout', '0');
uri.searchParams.set('flags', '-FOUND_ROWS');
return createPool(uri.toString()); return createPool(uri.toString());
} }
@ -81,6 +98,7 @@ const getPool = (connection: PoolOptions | string) => {
// it throws an error you can't catch and crashes node // it throws an error you can't catch and crashes node
// best to leave this at 0 (disabled) // best to leave this at 0 (disabled)
connectTimeout: 0, connectTimeout: 0,
flags: ['-FOUND_ROWS'],
}); });
}; };
@ -91,8 +109,8 @@ type HistoryEntry = {
error?: SerializedError; error?: SerializedError;
}; };
const lockMigration = async (pool: Pool, table: string, migration: MigrationMetadata) => { const lockMigration = async (connection: Connection, table: string, migration: MigrationMetadata) => {
const [result] = await pool.execute<ResultSetHeader>({ const [result] = await connection.execute<ResultSetHeader>({
sql: ` sql: `
INSERT INTO ${escapeId(table)} (name, status, date) INSERT INTO ${escapeId(table)} (name, status, date)
VALUES (?, ?, NOW()) VALUES (?, ?, NOW())
@ -155,42 +173,186 @@ const deleteMigration = async (pool: Pool, table: string, migration: MigrationMe
return result.affectedRows === 1; return result.affectedRows === 1;
}; };
const initializeTable = async (pool: Pool, table: string) => { const getDatabaseName = (config: ConnectionOptions | string) => {
if (typeof config === 'string') {
const uri = new URL(config);
return uri.pathname.replace(/^\//u, '');
}
return config.database ?? '';
};
const setDatabaseName = <T extends ConnectionOptions | string>(config: T, databaseName: string): T => {
if (typeof config === 'string') {
const uri = new URL(config);
uri.pathname = `/${databaseName}`;
return uri.toString() as T;
}
if (typeof config === 'object') {
return {
...config,
database: databaseName,
};
}
throw new Error('Invalid connection config');
};
const initializeDatabase = async (config: ConnectionOptions | string) => {
let connection: Connection | undefined;
try {
connection = await getConnection(config);
await connection.query('SELECT 1');
await connection.end();
} catch (error) {
await connection?.end();
// The ER_BAD_DB_ERROR error code is thrown when the database does not exist but the user might have the permissions to create it
// Otherwise the error code is ER_DBACCESS_DENIED_ERROR
if (error && typeof error === 'object' && 'code' in error && error.code === 'ER_BAD_DB_ERROR') {
const databaseName = getDatabaseName(config);
const informationSchemaConfig = setDatabaseName(config, 'information_schema');
const informationSchemaConnection = await getConnection(informationSchemaConfig);
try {
await informationSchemaConnection.query(`CREATE DATABASE ${escapeId(databaseName)}`);
// Any database creation error here will be propagated
} finally {
await informationSchemaConnection.end();
}
} else {
// In this case we don't know how to handle the error, so we rethrow it
throw error;
}
}
};
const lockWaitTimeout = 10; // seconds
const isHistoryTableExisting = async (connection: Connection, table: string) => {
const [result] = await connection.execute<RowDataPacket[]>({
sql: `
SELECT
1 as table_exists
FROM
information_schema.tables
WHERE
table_schema = DATABASE()
AND table_name = ?
`,
values: [table],
});
return result[0]?.['table_exists'] === 1;
};
const initializeTable = async (config: ConnectionOptions | string, table: string) => {
const connection = await getConnection(config);
if (await isHistoryTableExisting(connection, table)) {
await connection.end();
return;
}
const lockName = `emigrate_init_table_lock_${table}`;
const [lockResult] = await connection.query<RowDataPacket[]>(`SELECT GET_LOCK(?, ?) AS got_lock`, [
lockName,
lockWaitTimeout,
]);
const didGetLock = lockResult[0]?.['got_lock'] === 1;
if (didGetLock) {
try {
// This table definition is compatible with the one used by the immigration-mysql package // This table definition is compatible with the one used by the immigration-mysql package
await pool.execute(` await connection.execute(`
CREATE TABLE IF NOT EXISTS ${escapeId(table)} ( CREATE TABLE IF NOT EXISTS ${escapeId(table)} (
name varchar(255) not null primary key, name varchar(255) not null primary key,
status varchar(32), status varchar(32),
date datetime not null date datetime not null
) Engine=InnoDB; ) Engine=InnoDB;
`); `);
} finally {
await connection.query(`SELECT RELEASE_LOCK(?)`, [lockName]);
await connection.end();
}
return;
}
// Didn't get the lock, wait to see if the table was created by another process
const maxWait = lockWaitTimeout * 1000; // milliseconds
const checkInterval = 250; // milliseconds
const start = Date.now();
try {
while (Date.now() - start < maxWait) {
// eslint-disable-next-line no-await-in-loop
if (await isHistoryTableExisting(connection, table)) {
return;
}
// eslint-disable-next-line no-await-in-loop
await setTimeout(checkInterval);
}
throw new Error(`Timeout waiting for table ${table} to be created by other process`);
} finally {
await connection.end();
}
}; };
export const createMysqlStorage = ({ table = defaultTable, connection }: MysqlStorageOptions): EmigrateStorage => { export const createMysqlStorage = ({ table = defaultTable, connection }: MysqlStorageOptions): EmigrateStorage => {
return { return {
async initializeStorage() { async initializeStorage() {
await initializeDatabase(connection);
await initializeTable(connection, table);
const pool = getPool(connection); const pool = getPool(connection);
await pool.query('SELECT 1'); if (process.isBun) {
pool.on('connection', (connection) => {
try { // @ts-expect-error stream is not in the types but it's there
await initializeTable(pool, table); // eslint-disable-next-line @typescript-eslint/no-unsafe-call
} catch (error) { connection.stream.unref();
await pool.end(); });
throw error;
} }
const storage: Storage = { const storage: Storage = {
async lock(migrations) { async lock(migrations) {
const connection = await pool.getConnection();
try {
await connection.beginTransaction();
const lockedMigrations: MigrationMetadata[] = []; const lockedMigrations: MigrationMetadata[] = [];
for await (const migration of migrations) { for await (const migration of migrations) {
if (await lockMigration(pool, table, migration)) { if (await lockMigration(connection, table, migration)) {
lockedMigrations.push(migration); lockedMigrations.push(migration);
} }
} }
if (lockedMigrations.length === migrations.length) {
await connection.commit();
return lockedMigrations; return lockedMigrations;
}
await connection.rollback();
return [];
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
}, },
async unlock(migrations) { async unlock(migrations) {
for await (const migration of migrations) { for await (const migration of migrations) {
@ -249,17 +411,6 @@ export const createMysqlStorage = ({ table = defaultTable, connection }: MysqlSt
}; };
}; };
export const { initializeStorage } = createMysqlStorage({
table: process.env['MYSQL_TABLE'],
connection: process.env['MYSQL_URL'] ?? {
host: process.env['MYSQL_HOST'],
port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : undefined,
user: process.env['MYSQL_USER'],
password: process.env['MYSQL_PASSWORD'],
database: process.env['MYSQL_DATABASE'],
},
});
export const createMysqlLoader = ({ connection }: MysqlLoaderOptions): LoaderPlugin => { export const createMysqlLoader = ({ connection }: MysqlLoaderOptions): LoaderPlugin => {
return { return {
loadableExtensions: ['.sql'], loadableExtensions: ['.sql'],
@ -278,7 +429,16 @@ export const createMysqlLoader = ({ connection }: MysqlLoaderOptions): LoaderPlu
}; };
}; };
export const { loadableExtensions, loadMigration } = createMysqlLoader({ export const generateMigration: GenerateMigrationFunction = async (name) => {
return {
filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`,
content: `-- Migration: ${name}
`,
};
};
const storage = createMysqlStorage({
table: process.env['MYSQL_TABLE'],
connection: process.env['MYSQL_URL'] ?? { connection: process.env['MYSQL_URL'] ?? {
host: process.env['MYSQL_HOST'], host: process.env['MYSQL_HOST'],
port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : undefined, port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : undefined,
@ -288,13 +448,22 @@ export const { loadableExtensions, loadMigration } = createMysqlLoader({
}, },
}); });
export const generateMigration: GenerateMigrationFunction = async (name) => { const loader = createMysqlLoader({
return { connection: process.env['MYSQL_URL'] ?? {
filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`, host: process.env['MYSQL_HOST'],
content: `-- Migration: ${name} port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : undefined,
`, user: process.env['MYSQL_USER'],
}; password: process.env['MYSQL_PASSWORD'],
}; database: process.env['MYSQL_DATABASE'],
},
});
// eslint-disable-next-line prefer-destructuring
export const initializeStorage: () => Promise<Storage> = storage.initializeStorage;
// eslint-disable-next-line prefer-destructuring
export const loadableExtensions: string[] = loader.loadableExtensions;
// eslint-disable-next-line prefer-destructuring
export const loadMigration: (migration: MigrationMetadata) => Awaitable<MigrationFunction> = loader.loadMigration;
const defaultExport: EmigrateStorage & LoaderPlugin & GeneratorPlugin = { const defaultExport: EmigrateStorage & LoaderPlugin & GeneratorPlugin = {
initializeStorage, initializeStorage,
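Besides the environment-variable driven exports, the factory can be configured explicitly, for example from an Emigrate config file. A hedged sketch with placeholder connection values:

// emigrate.config.js (sketch; connection values are placeholders)
import { createMysqlStorage } from '@emigrate/mysql';

export default {
  directory: 'migrations',
  storage: createMysqlStorage({
    table: 'migrations',
    connection: {
      host: 'localhost',
      port: 3306,
      user: 'emigrate',
      password: 'emigrate',
      database: 'app',
    },
  }),
};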

View file

@ -0,0 +1,49 @@
/* eslint @typescript-eslint/naming-convention:0, import/no-extraneous-dependencies: 0 */
import process from 'node:process';
import { GenericContainer, type StartedTestContainer } from 'testcontainers';
let container: StartedTestContainer | undefined;
export const startDatabase = async (): Promise<{ port: number; host: string }> => {
if (process.env['CI']) {
const config = {
port: process.env['MYSQL_PORT'] ? Number.parseInt(process.env['MYSQL_PORT'], 10) : 3306,
// eslint-disable-next-line @typescript-eslint/prefer-nullish-coalescing
host: process.env['MYSQL_HOST'] || 'localhost',
};
console.log(`Connecting to MySQL from environment variables: ${JSON.stringify(config)}`);
return config;
}
if (!container) {
console.log('Starting MySQL container...');
const containerSetup = new GenericContainer('mysql:8.2')
.withEnvironment({
MYSQL_ROOT_PASSWORD: 'admin',
MYSQL_USER: 'emigrate',
MYSQL_PASSWORD: 'emigrate',
MYSQL_DATABASE: 'emigrate',
})
.withTmpFs({ '/var/lib/mysql': 'rw' })
.withCommand(['--sql-mode=NO_ENGINE_SUBSTITUTION', '--default-authentication-plugin=mysql_native_password'])
.withExposedPorts(3306)
.withReuse();
container = await containerSetup.start();
console.log('MySQL container started');
}
return { port: container.getMappedPort(3306), host: container.getHost() };
};
export const stopDatabase = async (): Promise<void> => {
if (container) {
console.log('Stopping MySQL container...');
await container.stop();
console.log('MySQL container stopped');
container = undefined;
}
};

View file

@ -1,8 +1,3 @@
{ {
"extends": "@emigrate/tsconfig/build.json", "extends": "@emigrate/tsconfig/build.json"
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
} }

View file

@ -1,5 +1,21 @@
# @emigrate/plugin-generate-js # @emigrate/plugin-generate-js
## 0.3.8
### Patch Changes
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
## 0.3.7
### Patch Changes
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.3.6 ## 0.3.6
### Patch Changes ### Patch Changes

View file

@ -1,6 +1,6 @@
{ {
"name": "@emigrate/plugin-generate-js", "name": "@emigrate/plugin-generate-js",
"version": "0.3.6", "version": "0.3.8",
"publishConfig": { "publishConfig": {
"access": "public" "access": "public"
}, },

View file

@ -1,8 +1,3 @@
{ {
"extends": "@emigrate/tsconfig/build.json", "extends": "@emigrate/tsconfig/build.json"
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
} }

View file

@ -1,5 +1,20 @@
# @emigrate/plugin-tools # @emigrate/plugin-tools
## 0.9.8
### Patch Changes
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- @emigrate/types@0.12.2
## 0.9.7
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/types@0.12.2
## 0.9.6 ## 0.9.6
### Patch Changes ### Patch Changes

View file

@ -1,6 +1,6 @@
{ {
"name": "@emigrate/plugin-tools", "name": "@emigrate/plugin-tools",
"version": "0.9.6", "version": "0.9.8",
"publishConfig": { "publishConfig": {
"access": "public", "access": "public",
"provenance": true "provenance": true
@ -16,7 +16,8 @@
} }
}, },
"files": [ "files": [
"dist" "dist",
"!dist/*.tsbuildinfo"
], ],
"scripts": { "scripts": {
"build": "tsc --pretty", "build": "tsc --pretty",

View file

@ -204,7 +204,7 @@ const load = async <T>(
* *
* @returns A timestamp string in the format YYYYMMDDHHmmssmmm * @returns A timestamp string in the format YYYYMMDDHHmmssmmm
*/ */
export const getTimestampPrefix = () => new Date().toISOString().replaceAll(/[-:ZT.]/g, ''); export const getTimestampPrefix = (): string => new Date().toISOString().replaceAll(/[-:ZT.]/g, '');
/** /**
* A utility function to sanitize a migration name so that it can be used as a filename * A utility function to sanitize a migration name so that it can be used as a filename
@ -212,7 +212,7 @@ export const getTimestampPrefix = () => new Date().toISOString().replaceAll(/[-:
* @param name A migration name to sanitize * @param name A migration name to sanitize
* @returns A sanitized migration name that can be used as a filename * @returns A sanitized migration name that can be used as a filename
*/ */
export const sanitizeMigrationName = (name: string) => export const sanitizeMigrationName = (name: string): string =>
name name
.replaceAll(/[\W/\\:|*?'"<>_]+/g, '_') .replaceAll(/[\W/\\:|*?'"<>_]+/g, '_')
.trim() .trim()

View file

@ -1,8 +1,3 @@
{ {
"extends": "@emigrate/tsconfig/build.json", "extends": "@emigrate/tsconfig/build.json"
"compilerOptions": {
"outDir": "dist"
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
} }

View file

@ -1,5 +1,29 @@
# @emigrate/postgres # @emigrate/postgres
## 0.3.2
### Patch Changes
- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
- Updated dependencies [d779286]
- @emigrate/plugin-tools@0.9.8
- @emigrate/types@0.12.2
## 0.3.1
### Patch Changes
- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
- Updated dependencies [ca154fa]
- @emigrate/plugin-tools@0.9.7
- @emigrate/types@0.12.2
## 0.3.0
### Minor Changes
- 4442604: Automatically create the database if it doesn't exist and the user has permission to do so
## 0.2.6 ## 0.2.6
### Patch Changes ### Patch Changes

View file

@ -1,6 +1,6 @@
{ {
"name": "@emigrate/postgres", "name": "@emigrate/postgres",
"version": "0.2.6", "version": "0.3.2",
"publishConfig": { "publishConfig": {
"access": "public", "access": "public",
"provenance": true "provenance": true
@ -16,7 +16,8 @@
} }
}, },
"files": [ "files": [
"dist" "dist",
"!dist/*.tsbuildinfo"
], ],
"scripts": { "scripts": {
"build": "tsc --pretty", "build": "tsc --pretty",

View file

@@ -11,6 +11,8 @@ import {
   type GeneratorPlugin,
   type SerializedError,
   type MigrationHistoryEntry,
+  type Awaitable,
+  type MigrationFunction,
 } from '@emigrate/types';
 const defaultTable = 'migrations';
@@ -92,6 +94,64 @@ const deleteMigration = async (sql: Sql, table: string, migration: MigrationMeta
   return result.count === 1;
 };
+const getDatabaseName = (config: ConnectionOptions | string) => {
+  if (typeof config === 'string') {
+    const uri = new URL(config);
+    return uri.pathname.replace(/^\//u, '');
+  }
+  return config.database ?? '';
+};
+const setDatabaseName = <T extends ConnectionOptions | string>(config: T, databaseName: string): T => {
+  if (typeof config === 'string') {
+    const uri = new URL(config);
+    uri.pathname = `/${databaseName}`;
+    return uri.toString() as T;
+  }
+  if (typeof config === 'object') {
+    return {
+      ...config,
+      database: databaseName,
+    };
+  }
+  throw new Error('Invalid connection config');
+};
+const initializeDatabase = async (config: ConnectionOptions | string) => {
+  let sql: Sql | undefined;
+  try {
+    sql = await getPool(config);
+    await sql.end();
+  } catch (error) {
+    await sql?.end();
+    // The error code 3D000 means that the database does not exist, but the user might have the permissions to create it
+    if (error && typeof error === 'object' && 'code' in error && error.code === '3D000') {
+      const databaseName = getDatabaseName(config);
+      const postgresConfig = setDatabaseName(config, 'postgres');
+      const postgresSql = await getPool(postgresConfig);
+      try {
+        await postgresSql`CREATE DATABASE ${postgresSql(databaseName)}`;
+        // Any database creation error here will be propagated
+      } finally {
+        await postgresSql.end();
+      }
+    } else {
+      // In this case we don't know how to handle the error, so we rethrow it
+      throw error;
+    }
+  }
+};
 const initializeTable = async (sql: Sql, table: string) => {
   const [row] = await sql<Array<{ exists: 1 }>>`
     SELECT 1 as exists
@@ -122,6 +182,8 @@ export const createPostgresStorage = ({
 }: PostgresStorageOptions): EmigrateStorage => {
   return {
     async initializeStorage() {
+      await initializeDatabase(connection);
       const sql = await getPool(connection);
       try {
@@ -195,17 +257,6 @@
   };
 };
-export const { initializeStorage } = createPostgresStorage({
-  table: process.env['POSTGRES_TABLE'],
-  connection: process.env['POSTGRES_URL'] ?? {
-    host: process.env['POSTGRES_HOST'],
-    port: process.env['POSTGRES_PORT'] ? Number.parseInt(process.env['POSTGRES_PORT'], 10) : undefined,
-    user: process.env['POSTGRES_USER'],
-    password: process.env['POSTGRES_PASSWORD'],
-    database: process.env['POSTGRES_DB'],
-  },
-});
 export const createPostgresLoader = ({ connection }: PostgresLoaderOptions): LoaderPlugin => {
   return {
     loadableExtensions: ['.sql'],
@@ -224,7 +275,16 @@ export const createPostgresLoader = ({ connection }: PostgresLoaderOptions): Loa
   };
 };
-export const { loadableExtensions, loadMigration } = createPostgresLoader({
+export const generateMigration: GenerateMigrationFunction = async (name) => {
+  return {
+    filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`,
+    content: `-- Migration: ${name}
+`,
+  };
+};
+const storage = createPostgresStorage({
+  table: process.env['POSTGRES_TABLE'],
   connection: process.env['POSTGRES_URL'] ?? {
     host: process.env['POSTGRES_HOST'],
     port: process.env['POSTGRES_PORT'] ? Number.parseInt(process.env['POSTGRES_PORT'], 10) : undefined,
@@ -234,13 +294,22 @@ export const { loadableExtensions, loadMigration } = createPostgresLoader({
   },
 });
-export const generateMigration: GenerateMigrationFunction = async (name) => {
-  return {
-    filename: `${getTimestampPrefix()}_${sanitizeMigrationName(name)}.sql`,
-    content: `-- Migration: ${name}
-`,
-  };
-};
+const loader = createPostgresLoader({
+  connection: process.env['POSTGRES_URL'] ?? {
+    host: process.env['POSTGRES_HOST'],
+    port: process.env['POSTGRES_PORT'] ? Number.parseInt(process.env['POSTGRES_PORT'], 10) : undefined,
+    user: process.env['POSTGRES_USER'],
+    password: process.env['POSTGRES_PASSWORD'],
+    database: process.env['POSTGRES_DB'],
+  },
+});
+// eslint-disable-next-line prefer-destructuring
+export const initializeStorage: () => Promise<Storage> = storage.initializeStorage;
+// eslint-disable-next-line prefer-destructuring
+export const loadableExtensions: string[] = loader.loadableExtensions;
+// eslint-disable-next-line prefer-destructuring
+export const loadMigration: (migration: MigrationMetadata) => Awaitable<MigrationFunction> = loader.loadMigration;
 const defaultExport: EmigrateStorage & LoaderPlugin & GeneratorPlugin = {
   initializeStorage,
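The `// eslint-disable-next-line prefer-destructuring` re-exports introduced above follow from the newly enabled `isolatedDeclarations` compiler option: every exported binding needs an explicit type annotation that can be emitted without cross-file type inference, which rules out the previous `export const { … } = factory(…)` style. A small illustrative sketch; the `makeThing` factory and its types are invented for the example and are not part of this codebase:

```ts
type Thing = { run: () => Promise<void> };

declare function makeThing(): { thing: Thing };

// Rejected under isolatedDeclarations: the type of the exported binding
// would have to be inferred from the destructuring pattern.
// export const { thing } = makeThing();

// Accepted: assign to a local first, then export with an explicit annotation.
const created = makeThing();
// eslint-disable-next-line prefer-destructuring
export const thing: Thing = created.thing;
```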

View file

@@ -1,8 +1,3 @@
 {
-  "extends": "@emigrate/tsconfig/build.json",
-  "compilerOptions": {
-    "outDir": "dist"
-  },
-  "include": ["src"],
-  "exclude": ["node_modules", "dist"]
+  "extends": "@emigrate/tsconfig/build.json"
 }

View file

@@ -1,5 +1,32 @@
 # @emigrate/reporter-pino
+## 0.6.5
+### Patch Changes
+- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
+- @emigrate/types@0.12.2
+## 0.6.4
+### Patch Changes
+- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
+- Updated dependencies [ca154fa]
+  - @emigrate/types@0.12.2
+## 0.6.3
+### Patch Changes
+- 081ab34: Make sure Pino outputs logs in Bun environments
+## 0.6.2
+### Patch Changes
+- 1065322: Show correct status for migrations for the "list" and "new" commands
 ## 0.6.1
 ### Patch Changes

View file

@@ -1,6 +1,6 @@
 {
   "name": "@emigrate/reporter-pino",
-  "version": "0.6.1",
+  "version": "0.6.5",
   "publishConfig": {
     "access": "public",
     "provenance": true
@@ -16,7 +16,8 @@
     }
   },
   "files": [
-    "dist"
+    "dist",
+    "!dist/*.tsbuildinfo"
   ],
   "scripts": {
     "build": "tsc --pretty",
@@ -40,7 +41,9 @@
     "pino": "8.16.2"
   },
   "devDependencies": {
-    "@emigrate/tsconfig": "workspace:*"
+    "@emigrate/tsconfig": "workspace:*",
+    "@types/bun": "1.0.5",
+    "bun-types": "1.0.26"
   },
   "volta": {
     "extends": "../../package.json"

View file

@@ -52,6 +52,7 @@ class PinoReporter implements Required<EmigrateReporter> {
         scope: command,
         version,
       },
+      transport: process.isBun ? { target: 'pino/file', options: { destination: 1 } } : undefined,
     });
     this.#logger.info({ parameters }, `Emigrate "${command}" initialized${parameters.dry ? ' (dry-run)' : ''}`);
@@ -116,12 +117,26 @@ class PinoReporter implements Required<EmigrateReporter> {
   }
   onMigrationStart(migration: MigrationMetadata): Awaitable<void> {
-    const status = this.#command === 'up' ? 'running' : 'removing';
+    let status = 'running';
+    if (this.#command === 'remove') {
+      status = 'removing';
+    } else if (this.#command === 'new') {
+      status = 'creating';
+    }
     this.#logger.info({ migration: migration.relativeFilePath }, `${migration.name} (${status})`);
   }
   onMigrationSuccess(migration: MigrationMetadataFinished): Awaitable<void> {
-    const status = this.#command === 'up' ? 'done' : 'removed';
+    let status = 'done';
+    if (this.#command === 'remove') {
+      status = 'removed';
+    } else if (this.#command === 'new') {
+      status = 'created';
+    }
     this.#logger.info({ migration: migration.relativeFilePath }, `${migration.name} (${status})`);
   }
@@ -189,6 +204,8 @@ export const createPinoReporter = (options: PinoReporterOptions = {}): EmigrateR
   return new PinoReporter(options);
 };
-export default createPinoReporter({
+const defaultExport: EmigrateReporter = createPinoReporter({
   level: process.env['LOG_LEVEL'],
 });
+export default defaultExport;
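The added `transport` line is the fix referenced by the 0.6.3 changelog entry ("Make sure Pino outputs logs in Bun environments"): under Bun the reporter routes logs through the built-in `pino/file` target with file descriptor 1 (stdout) instead of relying on Pino's default output path, which the changelog suggests did not reliably produce output under Bun. A standalone sketch of the same pattern outside the reporter class, assuming Bun's type definitions are available so that `process.isBun` type-checks (the package.json diff above adds `@types/bun`); the logger name is arbitrary:

```ts
import { pino } from 'pino';

// When running under Bun (Bun sets process.isBun), write logs to stdout via the
// bundled "pino/file" transport; under Node.js, fall back to Pino's default output.
const logger = pino({
  name: 'example-logger',
  transport: process.isBun ? { target: 'pino/file', options: { destination: 1 } } : undefined,
});

logger.info('hello from %s', process.isBun ? 'Bun' : 'Node.js');
```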

View file

@@ -1,8 +1,3 @@
 {
-  "extends": "@emigrate/tsconfig/build.json",
-  "compilerOptions": {
-    "outDir": "dist"
-  },
-  "include": ["src"],
-  "exclude": ["node_modules", "dist"]
+  "extends": "@emigrate/tsconfig/build.json"
 }

View file

@@ -1,5 +1,13 @@
 # @emigrate/storage-fs
+## 0.4.7
+### Patch Changes
+- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
+- Updated dependencies [ca154fa]
+  - @emigrate/types@0.12.2
 ## 0.4.6
 ### Patch Changes

View file

@@ -1,6 +1,6 @@
 {
   "name": "@emigrate/storage-fs",
-  "version": "0.4.6",
+  "version": "0.4.7",
   "publishConfig": {
     "access": "public",
     "provenance": true
@@ -16,7 +16,8 @@
     }
   },
   "files": [
-    "dist"
+    "dist",
+    "!dist/*.tsbuildinfo"
   ],
   "scripts": {
     "build": "tsc --pretty",

View file

@@ -1,8 +1,3 @@
 {
-  "extends": "@emigrate/tsconfig/build.json",
-  "compilerOptions": {
-    "outDir": "dist"
-  },
-  "include": ["src"],
-  "exclude": ["node_modules", "dist"]
+  "extends": "@emigrate/tsconfig/build.json"
 }

View file

@@ -1,5 +1,11 @@
 # @emigrate/tsconfig
+## 1.0.3
+### Patch Changes
+- d779286: Upgrade TypeScript to v5.5 and enable [isolatedDeclarations](https://devblogs.microsoft.com/typescript/announcing-typescript-5-5/#isolated-declarations)
 ## 1.0.2
 ### Patch Changes

View file

@@ -11,6 +11,7 @@
     "forceConsistentCasingInFileNames": true,
     "inlineSources": false,
     "isolatedModules": true,
+    "isolatedDeclarations": true,
     "incremental": true,
     "module": "NodeNext",
     "moduleResolution": "NodeNext",
@@ -31,5 +32,7 @@
     "strict": true,
     "target": "ES2022",
     "lib": ["ESNext", "DOM", "DOM.Iterable"]
-  }
+  },
+  "include": ["${configDir}/src"],
+  "exclude": ["${configDir}/dist"]
 }

View file

@@ -3,6 +3,7 @@
   "display": "Build",
   "extends": "./base.json",
   "compilerOptions": {
-    "noEmit": false
+    "noEmit": false,
+    "outDir": "${configDir}/dist"
   }
 }

View file

@@ -1,6 +1,6 @@
 {
   "name": "@emigrate/tsconfig",
-  "version": "1.0.2",
+  "version": "1.0.3",
   "publishConfig": {
     "access": "public",
     "provenance": true

View file

@@ -1,5 +1,11 @@
 # @emigrate/types
+## 0.12.2
+### Patch Changes
+- ca154fa: Minimize package size by excluding \*.tsbuildinfo files
 ## 0.12.1
 ### Patch Changes

View file

@@ -1,6 +1,6 @@
 {
   "name": "@emigrate/types",
-  "version": "0.12.1",
+  "version": "0.12.2",
   "publishConfig": {
     "access": "public",
     "provenance": true
@@ -16,7 +16,8 @@
     }
   },
   "files": [
-    "dist"
+    "dist",
+    "!dist/*.tsbuildinfo"
   ],
   "scripts": {
     "build": "tsc --pretty",

View file

@@ -1,8 +1,3 @@
 {
-  "extends": "@emigrate/tsconfig/build.json",
-  "compilerOptions": {
-    "outDir": "dist"
-  },
-  "include": ["src"],
-  "exclude": ["node_modules", "dist"]
+  "extends": "@emigrate/tsconfig/build.json"
 }

pnpm-lock.yaml (generated, 12029 lines changed): file diff suppressed because it is too large.

View file

@@ -1,6 +1,7 @@
 {
   "$schema": "https://turborepo.org/schema.json",
-  "pipeline": {
+  "ui": "stream",
+  "tasks": {
     "build": {
       "dependsOn": ["^build"],
       "inputs": ["src/**/*", "!src/**/*.test.ts", "tsconfig.json", "tsconfig.build.json"],