diff --git a/.cirrus.yml b/.cirrus.yml new file mode 100644 index 000000000..5a6fb8c59 --- /dev/null +++ b/.cirrus.yml @@ -0,0 +1,18 @@ +freebsd_instance: + image_family: freebsd-16-0-snap + +task: + name: FreeBSD + env: + IGNORE_OSVERSION: yes + skip_notifications: true + prerequisites_script: + - pkg update -f + - pkg upgrade -y + - pkg install -y devel/git devel/pkgconf graphics/vips www/node22 www/npm + - pkg-config --modversion vips-cpp + install_script: + - npm install + - npm run build + test_script: + - node --test test/unit/io.js diff --git a/.editorconfig b/.editorconfig index 5760be583..62fb51fed 100644 --- a/.editorconfig +++ b/.editorconfig @@ -7,6 +7,4 @@ indent_size = 2 charset = utf-8 trim_trailing_whitespace = true insert_final_newline = true - -[*.md] -trim_trailing_whitespace = false +max_line_length = 120 diff --git a/CONTRIBUTING.md b/.github/CONTRIBUTING.md similarity index 53% rename from CONTRIBUTING.md rename to .github/CONTRIBUTING.md index 89fbdc1c3..b65d48b98 100644 --- a/CONTRIBUTING.md +++ b/.github/CONTRIBUTING.md @@ -6,53 +6,49 @@ Hello, thank you for your interest in helping! Please create a [new issue](https://github.com/lovell/sharp/issues/new) containing the steps to reproduce the problem. -If you're having installation problems, please include the output of running `npm install --verbose sharp`. - New bugs are assigned a `triage` label whilst under investigation. ## Submit a new feature request -If a [similar request](https://github.com/lovell/sharp/labels/enhancement) exists, it's probably fastest to add a comment to it about your requirement. +If a [similar request](https://github.com/lovell/sharp/labels/enhancement) exists, +it's probably fastest to add a comment to it about your requirement. -Implementation is usually straightforward if _libvips_ [already supports](http://www.vips.ecs.soton.ac.uk/supported/current/doc/html/libvips/ch03.html) the feature you need. 
+Implementation is usually straightforward if libvips +[already supports](https://www.libvips.org/API/current/function-list.html) +the feature you need. ## Submit a Pull Request to fix a bug Thank you! To prevent the problem occurring again, please add unit tests that would have failed. -Please select the `master` branch as the destination for your Pull Request so your fix can be included in the next minor release. +Please select the `main` branch as the destination for your Pull Request so your fix can be included in the next minor release. -Please squash your changes into a single commit using a command like `git rebase -i upstream/master`. +Please squash your changes into a single commit using a command like `git rebase -i upstream/main`. -To test C++ changes, you can compile the module using `npm install` and then run the tests using `npm test`. +To test C++ changes, you can compile the module using `npm run build` and then run the tests using `npm test`. ## Submit a Pull Request with a new feature -Please add JavaScript [unit tests](https://github.com/lovell/sharp/tree/master/test/unit) to cover your new feature. -A test coverage report for the JavaScript code is generated in the `coverage/lcov-report` directory. +Please add JavaScript [unit tests](https://github.com/lovell/sharp/tree/main/test/unit) to cover your new feature. +Please also update the [TypeScript definitions](https://github.com/lovell/sharp/tree/main/lib/index.d.ts), along with the [type definition tests](https://github.com/lovell/sharp/tree/main/test/types/sharp.test-d.ts). Where possible, the functional tests use gradient-based perceptual hashes based on [dHash](http://www.hackerfactor.com/blog/index.php?/archives/529-Kind-of-Like-That.html) to compare expected vs actual images. -You deserve to add your details to the [list of contributors](https://github.com/lovell/sharp/blob/master/package.json#L5). 
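As an aside on the dHash comparison mentioned above: the idea can be sketched in plain JavaScript. The helper names below are illustrative, not the repository's actual test utilities — only the bit-string comparison is shown, assuming each image has already been reduced to a 64-bit '0'/'1' hash.

```javascript
// Sketch of comparing two dHash bit-strings; images are treated as
// visually similar when only a few bits differ. Names are illustrative.
const hammingDistance = (a, b) => {
  let distance = Math.abs(a.length - b.length);
  const shorter = Math.min(a.length, b.length);
  for (let i = 0; i < shorter; i++) {
    if (a[i] !== b[i]) distance++;
  }
  return distance;
};

// A small threshold tolerates minor encoder differences between the
// expected and actual images.
const isSimilar = (expected, actual, threshold = 5) =>
  hammingDistance(expected, actual) <= threshold;
```

A gradient-based hash like this is robust to compression artefacts, which is why it suits comparing encoder output rather than exact pixel equality.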
+You deserve to add your details to the [list of contributors](https://github.com/lovell/sharp/blob/main/package.json#L5). Any change that modifies the existing public API should be added to the relevant work-in-progress branch for inclusion in the next major release. -| Release | WIP branch | -| ------: | :--------- | -| v0.18.0 | ridge | -| v0.19.0 | suit | - Please squash your changes into a single commit using a command like `git rebase -i upstream/`. ### Add a new public method -The API tries to be as fluent as possible. Image processing concepts follow the naming conventions from _libvips_ and, to a lesser extent, _ImageMagick_. +The API tries to be as fluent as possible. +Image processing concepts follow the naming conventions from libvips and, to a lesser extent, ImageMagick. -Most methods have optional parameters and assume sensible defaults. Methods with mandatory parameters often have names like `doSomethingWith(X)`. - -Please ensure backwards compatibility where possible. Methods to modify previously default behaviour often have names like `withoutOptionY()` or `withExtraZ()`. +Most methods have optional parameters and assume sensible defaults. +Please ensure backwards compatibility where possible. Feel free to create a [new issue](https://github.com/lovell/sharp/issues/new) to gather feedback on a potential API change. @@ -60,15 +56,15 @@ Feel free to create a [new issue](https://github.com/lovell/sharp/issues/new) to A method to be removed should be deprecated in the next major version then removed in the following major version. -By way of example, the [bilinearInterpolation method](https://github.com/lovell/sharp/blob/v0.6.0/index.js#L155) present in v0.5.0 was deprecated in v0.6.0 and removed in v0.7.0. +By way of example, the `background()` method present in v0.20.0 was deprecated in v0.21.0 and removed in v0.22.0. 
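A minimal sketch of the one-major-version deprecation policy described above, assuming a simple console warning — sharp's real implementation may differ:

```javascript
// Hypothetical helper that keeps a soon-to-be-removed method working
// for one major version while warning once per process; illustrative only.
const deprecate = (name, replacement, fn) => {
  let warned = false;
  return function (...args) {
    if (!warned) {
      warned = true;
      console.warn(`${name} is deprecated; use ${replacement} instead`);
    }
    return fn.apply(this, args);
  };
};

// Example: the wrapped method still returns its usual result.
const background = deprecate('background()', 'flatten({ background })',
  function (colour) { return { background: colour }; });
```

The wrapper is removed entirely in the following major version, along with the method it guards.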
## Documentation -The public API is documented with [JSDoc](http://usejsdoc.org/) annotated comments. +The public API is documented with [JSDoc](https://jsdoc.app/) annotated comments. These can be converted to Markdown by running: ```sh -npm run docs +npm run docs-build ``` Please include documentation updates in any Pull Request that modifies the public API. @@ -89,20 +85,11 @@ Requires [Valgrind](http://valgrind.org/). npm run test-leak ``` -### Packaging tests - -Tests the installation on a number of Linux-based operating systems. -Requires docker. - -```sh -npm run test-packaging -``` - ## Finally Please feel free to ask any questions via a [new issue](https://github.com/lovell/sharp/issues/new). If you're unable to post details publicly, please -[e-mail](https://github.com/lovell/sharp/blob/master/package.json#L4) +[e-mail](https://github.com/lovell/sharp/blob/main/package.json#L5) for private, paid consulting. diff --git a/.github/FUNDING.yml b/.github/FUNDING.yml new file mode 100644 index 000000000..c7fe901e5 --- /dev/null +++ b/.github/FUNDING.yml @@ -0,0 +1 @@ +open_collective: libvips diff --git a/.github/ISSUE_TEMPLATE/config.yml b/.github/ISSUE_TEMPLATE/config.yml new file mode 100644 index 000000000..3390f2471 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/config.yml @@ -0,0 +1,5 @@ +blank_issues_enabled: false +contact_links: + - name: Documentation + url: https://sharp.pixelplumbing.com/ + about: Installation instructions, complete API documentation with examples, changelog diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md new file mode 100644 index 000000000..d87e090af --- /dev/null +++ b/.github/ISSUE_TEMPLATE/feature_request.md @@ -0,0 +1,28 @@ +--- +name: Feature request +about: Suggest an idea +labels: enhancement + +--- + +## Feature request + +### What are you trying to achieve? 
+ + + +### When you searched for similar feature requests, what did you find that might be related? + + + +### What would you expect the API to look like? + + + +### What alternatives have you considered? + + + +### Please provide sample image(s) that help explain this feature + + diff --git a/.github/ISSUE_TEMPLATE/installation.md b/.github/ISSUE_TEMPLATE/installation.md new file mode 100644 index 000000000..b47d10e9a --- /dev/null +++ b/.github/ISSUE_TEMPLATE/installation.md @@ -0,0 +1,69 @@ +--- +name: Installation +about: Something went wrong during either 'npm install sharp' or 'require("sharp")' +labels: installation + +--- + + + +## Possible install-time or require-time problem + + + +- [ ] I have read and understood all of the [documentation relating to installation](https://sharp.pixelplumbing.com/install). +- [ ] I have searched for known bugs relating to this problem in my choice of package manager. + +You must confirm both of these before continuing. + +### Are you using the latest version of sharp? + + + +- [ ] I am using the latest version of `sharp` as reported by `npm view sharp dist-tags.latest`. + +If you cannot confirm this, please upgrade to the latest version and try again before opening an issue. + +If you are using another package which depends on a version of `sharp` that is not the latest, +please open an issue against that package instead. + +### Are you using a supported runtime? + + + +- [ ] I am using Node.js with a version that satisfies `^18.17.0 || ^20.3.0 || >=21.0.0` +- [ ] I am using Deno +- [ ] I am using Bun + +If you cannot confirm any of these, +please upgrade to the latest version +and try again before opening an issue. + +### Are you using a supported package manager and installing optional dependencies? 
+ + + +- [ ] I am using npm >= 10.1.0 with `--include=optional` +- [ ] I am using yarn >= 3.2.0 +- [ ] I am using pnpm >= 7.1.0 with `--no-optional=false` +- [ ] I am using Deno +- [ ] I am using Bun + +If you cannot confirm any of these, please upgrade to the latest version of your chosen package manager +and ensure you are allowing the installation of optional or multi-platform dependencies before opening an issue. + +### What is the complete error message, including the full stack trace? + + + +### What is the complete output of running `npm install --verbose --foreground-scripts sharp` in an empty directory? + +
+ + + +
+ +### What is the output of running `npx envinfo --binaries --system --npmPackages=sharp --npmGlobalPackages=sharp`? + + diff --git a/.github/ISSUE_TEMPLATE/possible-bug.md b/.github/ISSUE_TEMPLATE/possible-bug.md new file mode 100644 index 000000000..c7e130e17 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/possible-bug.md @@ -0,0 +1,65 @@ +--- +name: Possible bug +about: Installation of sharp was successful but then something unexpected occurred using one of its features +labels: triage + +--- + + + +## Possible bug + +### Is this a possible bug in a feature of sharp, unrelated to installation? + + + +- [ ] Running `npm install sharp` completes without error. +- [ ] Running `node -e "require('sharp')"` completes without error. + +If you cannot confirm both of these, please open an [installation issue](https://github.com/lovell/sharp/issues/new?labels=installation&template=installation.md) instead. + +### Are you using the latest version of sharp? + + + +- [ ] I am using the latest version of `sharp` as reported by `npm view sharp dist-tags.latest`. + +If you cannot confirm this, please upgrade to the latest version and try again before opening an issue. + +If you are using another package which depends on a version of `sharp` that is not the latest, please open an issue against that package instead. + +### What is the output of running `npx envinfo --binaries --system --npmPackages=sharp --npmGlobalPackages=sharp`? + + + +### Does this problem relate to file caching? + +The default behaviour of libvips is to cache input files, which can lead to `EBUSY` or `EPERM` errors on Windows. +Use [`sharp.cache(false)`](https://sharp.pixelplumbing.com/api-utility#cache) to switch this feature off. + +- [ ] Adding `sharp.cache(false)` does not fix this problem. + +### Does this problem relate to images appearing to have been rotated by 90 degrees? + +Images that contain EXIF Orientation metadata are not auto-oriented. By default, EXIF metadata is removed. 
+ +- To auto-orient pixel values use the parameter-less [`rotate()`](https://sharp.pixelplumbing.com/api-operation#rotate) operation. +- To retain EXIF Orientation use [`keepExif()`](https://sharp.pixelplumbing.com/api-output#keepexif). + +- [ ] Using `rotate()` or `keepExif()` does not fix this problem. + +### What are the steps to reproduce? + + + +### What is the expected behaviour? + + + +### Please provide a minimal, standalone code sample, without other dependencies, that demonstrates this problem + + + +### Please provide sample image(s) that help explain this problem + + diff --git a/.github/ISSUE_TEMPLATE/question.md b/.github/ISSUE_TEMPLATE/question.md new file mode 100644 index 000000000..8fbdb5f4d --- /dev/null +++ b/.github/ISSUE_TEMPLATE/question.md @@ -0,0 +1,26 @@ +--- +name: Question +about: For help understanding an existing feature +labels: question + +--- + + + +## Question about an existing feature + +### What are you trying to achieve? + + + +### When you searched for similar issues, what did you find that might be related? + + + +### Please provide a minimal, standalone code sample, without other dependencies, that demonstrates this question + + + +### Please provide sample image(s) that help explain this question + + diff --git a/.github/SECURITY.md b/.github/SECURITY.md new file mode 100644 index 000000000..e17f100b8 --- /dev/null +++ b/.github/SECURITY.md @@ -0,0 +1,18 @@ +# Security Policy + +## Supported Versions + +The latest version of `sharp` as published to npm +and reported by `npm view sharp dist-tags.latest` +is supported with security updates. + +## Reporting a Vulnerability + +Please use +[e-mail](https://github.com/lovell/sharp/blob/main/package.json#L5) +to report a vulnerability. + +You can expect a response within 48 hours +if you are a human reporting a genuine issue. + +Thank you in advance. 
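The EXIF auto-orientation behaviour covered in the bug template above amounts to a fixed mapping from the Orientation tag to a rotation plus an optional mirror. The table below is an illustration of what a parameter-less `rotate()` applies conceptually; it is not taken from sharp's source.

```javascript
// Illustrative mapping from EXIF Orientation values 1-8 to the
// clockwise rotation and horizontal mirror needed to display the
// pixels upright; unknown values fall back to no transform.
const exifOrientationToTransform = (orientation) => ({
  1: { rotate: 0, mirror: false },
  2: { rotate: 0, mirror: true },
  3: { rotate: 180, mirror: false },
  4: { rotate: 180, mirror: true },
  5: { rotate: 90, mirror: true },
  6: { rotate: 90, mirror: false },
  7: { rotate: 270, mirror: true },
  8: { rotate: 270, mirror: false }
}[orientation] ?? { rotate: 0, mirror: false });
```

Orientation 6 (a photo taken with the camera rotated 90° anticlockwise) is by far the most common case reported in rotation issues.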
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml new file mode 100644 index 000000000..cd6838590 --- /dev/null +++ b/.github/workflows/ci.yml @@ -0,0 +1,356 @@ +name: CI +on: + - push + - pull_request +permissions: {} +jobs: + lint: + permissions: + contents: read + runs-on: ubuntu-24.04 + steps: + - uses: actions/checkout@v4 + - uses: actions/setup-node@v5 + with: + node-version: "24" + - run: npm install --ignore-scripts + - run: npm run lint-cpp + - run: npm run lint-js + - run: npm run lint-types + build-native: + permissions: + contents: read + needs: lint + name: "build-${{ matrix.platform }} [Node.js ${{ matrix.nodejs_version_major }}] ${{ matrix.package && '[package]' }}" + runs-on: ${{ matrix.os }} + container: ${{ matrix.container }} + strategy: + fail-fast: false + matrix: + include: + - os: ubuntu-24.04 + container: rockylinux:8 + nodejs_arch: x64 + nodejs_version: "^18.17.0" + nodejs_version_major: 18 + platform: linux-x64 + package: true + - os: ubuntu-24.04 + container: rockylinux:8 + nodejs_arch: x64 + nodejs_version: "^20.3.0" + nodejs_version_major: 20 + platform: linux-x64 + - os: ubuntu-24.04 + container: rockylinux:8 + nodejs_arch: x64 + nodejs_version: "^22.9.0" + nodejs_version_major: 22 + platform: linux-x64 + - os: ubuntu-24.04 + container: node:18-alpine3.17 + nodejs_version_major: 18 + platform: linuxmusl-x64 + package: true + - os: ubuntu-24.04 + container: node:20-alpine3.18 + nodejs_version_major: 20 + platform: linuxmusl-x64 + - os: ubuntu-24.04 + container: node:22-alpine3.20 + nodejs_version_major: 22 + platform: linuxmusl-x64 + - os: ubuntu-24.04-arm + container: arm64v8/rockylinux:8 + nodejs_arch: arm64 + nodejs_version: "^18.17.0" + nodejs_version_major: 18 + platform: linux-arm64 + package: true + - os: ubuntu-24.04-arm + container: arm64v8/rockylinux:8 + nodejs_arch: arm64 + nodejs_version: "^20.3.0" + nodejs_version_major: 20 + platform: linux-arm64 + - os: macos-15-intel + nodejs_arch: x64 + nodejs_version: 
"^18.17.0" + nodejs_version_major: 18 + platform: darwin-x64 + package: true + - os: macos-15-intel + nodejs_arch: x64 + nodejs_version: "^20.3.0" + nodejs_version_major: 20 + platform: darwin-x64 + - os: macos-15-intel + nodejs_arch: x64 + nodejs_version: "^22.9.0" + nodejs_version_major: 22 + platform: darwin-x64 + - os: macos-15 + nodejs_arch: arm64 + nodejs_version: "^18.17.0" + nodejs_version_major: 18 + platform: darwin-arm64 + package: true + - os: macos-15 + nodejs_arch: arm64 + nodejs_version: "^20.3.0" + nodejs_version_major: 20 + platform: darwin-arm64 + - os: macos-15 + nodejs_arch: arm64 + nodejs_version: "^22.9.0" + nodejs_version_major: 22 + platform: darwin-arm64 + - os: windows-2022 + nodejs_arch: x86 + nodejs_version: "18.18.2" # pinned to avoid 18.19.0 and npm 10 + nodejs_version_major: 18 + platform: win32-ia32 + package: true + - os: windows-2022 + nodejs_arch: x86 + nodejs_version: "^20.3.0" + nodejs_version_major: 20 + platform: win32-ia32 + - os: windows-2022 + nodejs_arch: x86 + nodejs_version: "^22.9.0" + nodejs_version_major: 22 + platform: win32-ia32 + - os: windows-2022 + nodejs_arch: x64 + nodejs_version: "^18.17.0" + nodejs_version_major: 18 + platform: win32-x64 + package: true + - os: windows-2022 + nodejs_arch: x64 + nodejs_version: "^20.3.0" + nodejs_version_major: 20 + platform: win32-x64 + - os: windows-2022 + nodejs_arch: x64 + nodejs_version: "^22.9.0" + nodejs_version_major: 22 + platform: win32-x64 + - os: windows-11-arm + nodejs_arch: arm64 + nodejs_version: "^20.3.0" + nodejs_version_major: 20 + platform: win32-arm64 + package: true + - os: windows-11-arm + nodejs_arch: arm64 + nodejs_version: "^22.9.0" + nodejs_version_major: 22 + platform: win32-arm64 + steps: + - name: Dependencies (Rocky Linux glibc) + if: contains(matrix.container, 'rockylinux') + run: | + dnf install -y gcc-toolset-14-gcc-c++ make git python3.12 fontconfig google-noto-sans-fonts + echo "/opt/rh/gcc-toolset-14/root/usr/bin" >> $GITHUB_PATH + - name: 
Dependencies (Linux musl) + if: contains(matrix.container, 'alpine') + run: apk add build-base git python3 font-noto --update-cache + - name: Dependencies (Python 3.12 - macOS, Windows) + if: contains(matrix.os, 'macos') || contains(matrix.os, 'windows') + uses: actions/setup-python@v5 + with: + python-version: "3.12" + - name: Dependencies (Node.js) + if: "!contains(matrix.platform, 'linuxmusl')" + uses: actions/setup-node@v5 + with: + node-version: ${{ matrix.nodejs_version }} + architecture: ${{ matrix.nodejs_arch }} + - uses: actions/checkout@v4 + - run: npm install + - run: npm run build + - run: npm run test-unit + - if: matrix.package + run: npm run package-from-local-build + - uses: actions/upload-artifact@v4 + if: matrix.package + with: + name: ${{ matrix.platform }} + path: npm/${{ matrix.platform }} + retention-days: 1 + if-no-files-found: error + build-linuxmusl-arm64: + permissions: + contents: read + needs: lint + name: "build-linuxmusl-arm64 [Node.js ${{ matrix.nodejs_version_major }}] ${{ matrix.package && '[package]' }}" + runs-on: ubuntu-24.04-arm + container: + image: ${{ matrix.container }} + volumes: + - /opt:/opt:rw,rshared + - /opt:/__e/node20:ro,rshared + strategy: + fail-fast: false + matrix: + include: + - container: node:18-alpine3.17 + nodejs_version_major: 18 + package: true + - container: node:20-alpine3.18 + nodejs_version_major: 20 + steps: + - name: Allow Linux musl containers on ARM64 runners # https://github.com/actions/runner/issues/801#issuecomment-2394425757 + shell: sh + run: | + sed -i "/^ID=/s/alpine/NotpineForGHA/" /etc/os-release + apk add nodejs --update-cache + mkdir /opt/bin + ln -s /usr/bin/node /opt/bin/node + - name: Dependencies + run: apk add build-base git python3 font-noto --update-cache + - uses: actions/checkout@v4 + - run: npm install + - run: npm run build + - run: npm run test-unit + - if: matrix.package + run: npm run package-from-local-build + - uses: actions/upload-artifact@v4 + if: 
matrix.package + with: + name: linuxmusl-arm64 + path: npm/linuxmusl-arm64 + retention-days: 1 + if-no-files-found: error + build-qemu: + permissions: + contents: read + needs: lint + name: "build-${{ matrix.platform }} [Node.js ${{ matrix.nodejs_version_major }}] [package]" + runs-on: ubuntu-24.04 + strategy: + fail-fast: false + matrix: + include: + - platform: linux-arm + base_image: "balenalib/rpi-raspbian:bullseye" + nodejs_arch: armv6l + nodejs_hostname: unofficial-builds.nodejs.org + nodejs_version: "18.17.0" + nodejs_version_major: 18 + - platform: linux-s390x + base_image: "--platform=linux/s390x s390x/debian:bookworm" + nodejs_arch: s390x + nodejs_hostname: nodejs.org + nodejs_version: "18.17.0" + nodejs_version_major: 18 + - platform: linux-ppc64 + base_image: "--platform=linux/ppc64le ppc64le/debian:bookworm" + nodejs_arch: ppc64le + nodejs_hostname: nodejs.org + nodejs_version: "18.17.0" + nodejs_version_major: 18 + - platform: linux-riscv64 + base_image: "--platform=linux/riscv64 riscv64/debian:trixie" + compiler_flags: "-march=rv64gc" + nodejs_arch: riscv64 + nodejs_hostname: unofficial-builds.nodejs.org + nodejs_version: "20.19.5" + nodejs_version_major: 20 + steps: + - uses: actions/checkout@v4 + - uses: uraimo/run-on-arch-action@v3 + with: + arch: none + distro: none + base_image: ${{ matrix.base_image }} + env: | + CFLAGS: "${{ matrix.compiler_flags }}" + CXXFLAGS: "${{ matrix.compiler_flags }}" + run: | + apt-get update + apt-get install -y curl g++ git libatomic1 make python3 xz-utils + mkdir /opt/nodejs + curl --silent https://${{ matrix.nodejs_hostname }}/download/release/v${{ matrix.nodejs_version}}/node-v${{ matrix.nodejs_version}}-linux-${{ matrix.nodejs_arch }}.tar.xz | tar xJC /opt/nodejs --strip-components=1 + export PATH=$PATH:/opt/nodejs/bin + npm install + npm run build + node --test test/unit/io.js + npm run package-from-local-build + - uses: actions/upload-artifact@v4 + with: + name: ${{ matrix.platform }} + path: npm/${{ 
matrix.platform }} + retention-days: 1 + if-no-files-found: error + build-emscripten: + permissions: + contents: read + needs: lint + name: "build-wasm32 [package]" + runs-on: ubuntu-24.04 + container: "emscripten/emsdk:4.0.18" + steps: + - uses: actions/checkout@v4 + - name: Dependencies + run: apt-get update && apt-get install -y pkg-config + - name: Dependencies (Node.js) + uses: actions/setup-node@v5 + with: + node-version: "20" + - run: npm install + - run: emmake npm run build + - name: Verify emscripten versions match + run: | + EMSCRIPTEN_VERSION_LIBVIPS=$(node -p "require('@img/sharp-libvips-dev-wasm32/versions').emscripten") + EMSCRIPTEN_VERSION_SHARP=$(emcc -dumpversion) + echo "libvips built with emscripten $EMSCRIPTEN_VERSION_LIBVIPS" + echo "sharp built with emscripten $EMSCRIPTEN_VERSION_SHARP" + test "$EMSCRIPTEN_VERSION_LIBVIPS" = "$EMSCRIPTEN_VERSION_SHARP" + - run: emmake npm run test-unit + - run: emmake npm run package-from-local-build + - uses: actions/upload-artifact@v4 + with: + name: wasm32 + path: npm/wasm32 + retention-days: 1 + if-no-files-found: error + release: + permissions: + contents: write + id-token: write + runs-on: ubuntu-24.04 + needs: + - build-native + - build-linuxmusl-arm64 + - build-qemu + - build-emscripten + steps: + - uses: actions/checkout@v4 + - uses: actions/download-artifact@v4 + with: + path: npm + - name: Create npm workspace tarball + run: tar -vcaf npm-workspace.tar.xz --directory npm --exclude=from-local-build.js . 
+ - uses: actions/setup-node@v5 + with: + node-version: '24' + - name: Create release notes + run: npm run package-release-notes + - name: Create GitHub release for tag + if: startsWith(github.ref, 'refs/tags/v') + uses: ncipollo/release-action@v1 + with: + artifacts: npm-workspace.tar.xz + artifactContentType: application/x-xz + prerelease: ${{ contains(github.ref, '-rc') }} + makeLatest: ${{ !contains(github.ref, '-rc') }} + bodyFile: release-notes.md + - name: Publish platform-specific npm packages + if: startsWith(github.ref, 'refs/tags/v') + run: cd npm && npm publish --workspaces --tag=${{ contains(github.ref, '-rc') && 'next' || 'latest' }} + - name: Publish sharp npm package + if: startsWith(github.ref, 'refs/tags/v') + run: npm publish --tag=${{ contains(github.ref, '-rc') && 'next' || 'latest' }} diff --git a/.github/workflows/npm.yml b/.github/workflows/npm.yml new file mode 100644 index 000000000..4db44a5b4 --- /dev/null +++ b/.github/workflows/npm.yml @@ -0,0 +1,196 @@ +name: "CI: npm smoke test" + +on: + push: + tags: + - "v**" + +permissions: {} + +jobs: + release-smoke-test: + name: "${{ github.ref_name }} ${{ matrix.name }}" + runs-on: ${{ matrix.runs-on }} + strategy: + fail-fast: false + matrix: + include: + - name: linux-x64-node-npm + runs-on: ubuntu-24.04 + runtime: node + package-manager: npm + - name: linux-x64-node-pnpm + runs-on: ubuntu-24.04 + runtime: node + package-manager: pnpm + - name: linux-x64-node-yarn + runs-on: ubuntu-24.04 + runtime: node + package-manager: yarn + - name: linux-x64-node-yarn-pnp + runs-on: ubuntu-24.04 + runtime: node + package-manager: yarn-pnp + - name: linux-x64-node-yarn-v1 + runs-on: ubuntu-24.04 + runtime: node + package-manager: yarn-v1 + - name: linux-x64-deno + runs-on: ubuntu-24.04 + runtime: deno + - name: linux-x64-bun + runs-on: ubuntu-24.04 + runtime: bun + + - name: darwin-x64-node-npm + runs-on: macos-15-intel + runtime: node + package-manager: npm + - name: darwin-x64-node-pnpm + runs-on: 
macos-15-intel + runtime: node + package-manager: pnpm + - name: darwin-x64-node-yarn + runs-on: macos-15-intel + runtime: node + package-manager: yarn + - name: darwin-x64-node-yarn-pnp + runs-on: macos-15-intel + runtime: node + package-manager: yarn-pnp + - name: darwin-x64-node-yarn-v1 + runs-on: macos-15-intel + runtime: node + package-manager: yarn-v1 + - name: darwin-x64-deno + runs-on: macos-15-intel + runtime: deno + - name: darwin-x64-bun + runs-on: macos-15-intel + runtime: bun + + - name: win32-x64-node-npm + runs-on: windows-2022 + runtime: node + package-manager: npm + - name: win32-x64-node-pnpm + runs-on: windows-2022 + runtime: node + package-manager: pnpm + - name: win32-x64-node-yarn + runs-on: windows-2022 + runtime: node + package-manager: yarn + - name: win32-x64-node-yarn-pnp + runs-on: windows-2022 + runtime: node + package-manager: yarn-pnp + - name: win32-x64-node-yarn-v1 + runs-on: windows-2022 + runtime: node + package-manager: yarn-v1 + - name: win32-x64-deno + runs-on: windows-2022 + runtime: deno + + steps: + - name: Install Node.js + if: ${{ matrix.runtime == 'node' }} + uses: actions/setup-node@v5 + with: + node-version: 20 + - name: Install pnpm + if: ${{ matrix.package-manager == 'pnpm' }} + uses: pnpm/action-setup@v4 + with: + version: 8 + - name: Install Deno + if: ${{ matrix.runtime == 'deno' }} + uses: denoland/setup-deno@v2 + with: + deno-version: v2.x + - name: Install Bun + if: ${{ matrix.runtime == 'bun' }} + uses: oven-sh/setup-bun@v2 + with: + bun-version: latest + + - name: Version + id: version + uses: actions/github-script@v8 + with: + script: | + core.setOutput('semver', context.ref.replace('refs/tags/v','')) + - name: Create package.json + uses: DamianReeves/write-file-action@v1.3 + with: + path: package.json + contents: | + { + "dependencies": { + "sharp": "${{ steps.version.outputs.semver }}" + }, + "type": "module" + } + - name: Create release.mjs + uses: DamianReeves/write-file-action@v1.3 + with: + path: 
release.mjs + contents: | + import { deepStrictEqual } from 'node:assert'; + import sharp from 'sharp'; + deepStrictEqual(['.jpg', '.jpeg', '.jpe', '.jfif'], sharp.format.jpeg.input.fileSuffix); + + - name: Run with Node.js + npm + if: ${{ matrix.package-manager == 'npm' }} + run: | + npm install --ignore-scripts + node release.mjs + + - name: Run with Node.js + pnpm + if: ${{ matrix.package-manager == 'pnpm' }} + run: | + pnpm install --ignore-scripts + node release.mjs + + - name: Run with Node.js + yarn + if: ${{ matrix.package-manager == 'yarn' }} + run: | + corepack enable + yarn set version stable + yarn config set enableImmutableInstalls false + yarn config set enableScripts false + yarn config set nodeLinker node-modules + yarn install + node release.mjs + + - name: Run with Node.js + yarn pnp + if: ${{ matrix.package-manager == 'yarn-pnp' }} + run: | + corepack enable + yarn set version stable + yarn config set enableImmutableInstalls false + yarn config set enableScripts false + yarn config set nodeLinker pnp + yarn install + yarn node release.mjs + + - name: Run with Node.js + yarn v1 + if: ${{ matrix.package-manager == 'yarn-v1' }} + run: | + corepack enable + yarn set version classic + yarn install + node release.mjs + + - name: Run with Deno + if: ${{ matrix.runtime == 'deno' }} + run: | + deno install + deno run --allow-env --allow-ffi --allow-read --allow-sys release.mjs + + - name: Run with Bun + if: ${{ matrix.runtime == 'bun' }} + run: | + bun install --ignore-scripts + bun release.mjs diff --git a/.gitignore b/.gitignore index 94a990fb9..82f73b3f0 100644 --- a/.gitignore +++ b/.gitignore @@ -1,16 +1,19 @@ -build +src/build +src/node_modules node_modules -coverage +/coverage +npm/*/* +!npm/*/package.json test/bench/node_modules test/fixtures/output* +test/fixtures/vips-properties.xml test/leak/libvips.supp -test/saliency/report.json -test/saliency/Image* -test/saliency/[Uu]serData* -!test/saliency/userData.js -vendor -packaging/libvips* 
-packaging/*.log -!packaging/build .DS_Store .nyc_output +.vscode/ +package-lock.json +.idea +.firebase +.astro +docs/dist +release-notes.md diff --git a/.npmignore b/.npmignore deleted file mode 100644 index c4a8fd68d..000000000 --- a/.npmignore +++ /dev/null @@ -1,14 +0,0 @@ -build -node_modules -coverage -.editorconfig -.gitignore -test -.travis.yml -appveyor.yml -circle.yml -mkdocs.yml -vendor -packaging -preinstall.sh -.nyc_output diff --git a/.travis.yml b/.travis.yml deleted file mode 100644 index 24ac83ccc..000000000 --- a/.travis.yml +++ /dev/null @@ -1,27 +0,0 @@ -language: node_js -matrix: - include: - - os: linux - dist: trusty - sudo: false - node_js: "4" - - os: linux - dist: trusty - sudo: false - node_js: "6" - - os: linux - dist: trusty - sudo: false - node_js: "8" - - os: osx - osx_image: xcode8 - node_js: "4" - - os: osx - osx_image: xcode8 - node_js: "6" - - os: osx - osx_image: xcode8 - node_js: "8" -after_success: - - npm install coveralls - - cat ./coverage/lcov.info | ./node_modules/coveralls/bin/coveralls.js diff --git a/README.md b/README.md index 15731a923..47da52e8b 100644 --- a/README.md +++ b/README.md @@ -1,15 +1,18 @@ # sharp -```sh -npm install sharp -``` +sharp logo -The typical use case for this high speed Node.js module +The typical use case for this high speed Node-API module is to convert large images in common formats to -smaller, web-friendly JPEG, PNG and WebP images of varying dimensions. +smaller, web-friendly JPEG, PNG, WebP, GIF and AVIF images of varying dimensions. + +It can be used with all JavaScript runtimes +that provide support for Node-API v9, including +Node.js (^18.17.0 or >= 20.3.0), Deno and Bun. Resizing an image is typically 4x-5x faster than using the -quickest ImageMagick and GraphicsMagick settings. +quickest ImageMagick and GraphicsMagick settings +due to its use of [libvips](https://github.com/libvips/libvips). 
Colour spaces, embedded ICC profiles and alpha transparency channels are all handled correctly. Lanczos resampling ensures quality is not sacrificed for speed. @@ -17,39 +20,76 @@ Lanczos resampling ensures quality is not sacrificed for speed. As well as image resizing, operations such as rotation, extraction, compositing and gamma correction are available. -OS X, Windows (x64), Linux (x64, ARM) systems do not require -the installation of any external runtime dependencies. +Most modern macOS, Windows and Linux systems +do not require any additional install or runtime dependencies. + +## Documentation + +Visit [sharp.pixelplumbing.com](https://sharp.pixelplumbing.com/) for complete +[installation instructions](https://sharp.pixelplumbing.com/install), +[API documentation](https://sharp.pixelplumbing.com/api-constructor), +[benchmark tests](https://sharp.pixelplumbing.com/performance) and +[changelog](https://sharp.pixelplumbing.com/changelog). ## Examples +```sh +npm install sharp +``` + ```javascript -import sharp from 'sharp'; +const sharp = require('sharp'); ``` +### Callback + ```javascript sharp(inputBuffer) .resize(320, 240) - .toFile('output.webp', (err, info) => ... ); + .toFile('output.webp', (err, info) => { ... }); ``` +### Promise + ```javascript sharp('input.jpg') .rotate() .resize(200) + .jpeg({ mozjpeg: true }) .toBuffer() - .then( data => ... ) - .catch( err => ... ); + .then( data => { ... }) + .catch( err => { ... 
}); +``` + +### Async/await + +```javascript +const semiTransparentRedPng = await sharp({ + create: { + width: 48, + height: 48, + channels: 4, + background: { r: 255, g: 0, b: 0, alpha: 0.5 } + } +}) + .png() + .toBuffer(); ``` +### Stream + ```javascript -const roundedCorners = new Buffer( +const roundedCorners = Buffer.from( '' ); const roundedCornerResizer = sharp() .resize(200, 200) - .overlayWith(roundedCorners, { cutout: true }) + .composite([{ + input: roundedCorners, + blend: 'dest-in' + }]) .png(); readableStream @@ -57,29 +97,19 @@ readableStream .pipe(writableStream); ``` -[![Test Coverage](https://coveralls.io/repos/lovell/sharp/badge.png?branch=master)](https://coveralls.io/r/lovell/sharp?branch=master) - -### Documentation - -Visit [sharp.dimens.io](http://sharp.dimens.io/) for complete -[installation instructions](http://sharp.dimens.io/page/install), -[API documentation](http://sharp.dimens.io/page/api), -[benchmark tests](http://sharp.dimens.io/page/performance) and -[changelog](http://sharp.dimens.io/page/changelog). - -### Contributing +## Contributing -A [guide for contributors](https://github.com/lovell/sharp/blob/master/CONTRIBUTING.md) +A [guide for contributors](https://github.com/lovell/sharp/blob/main/.github/CONTRIBUTING.md) covers reporting bugs, requesting features and submitting code changes. -### Licence +## Licensing -Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors. +Copyright 2013 Lovell Fuller and others. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. 
You may obtain a copy of the License at -[http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0.html) +[https://www.apache.org/licenses/LICENSE-2.0](https://www.apache.org/licenses/LICENSE-2.0) Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, diff --git a/appveyor.yml b/appveyor.yml deleted file mode 100644 index c77dfaece..000000000 --- a/appveyor.yml +++ /dev/null @@ -1,15 +0,0 @@ -os: Visual Studio 2015 -version: "{build}" -build: off -platform: x64 -environment: - matrix: - - nodejs_version: "4" - - nodejs_version: "6" - - nodejs_version: "7" -install: - - ps: Install-Product node $env:nodejs_version x64 - - npm install -g npm@latest - - npm install -test_script: - - npm test diff --git a/binding.gyp b/binding.gyp deleted file mode 100644 index f20036a28..000000000 --- a/binding.gyp +++ /dev/null @@ -1,278 +0,0 @@ -{ - 'targets': [{ - 'target_name': 'libvips-cpp', - 'conditions': [ - ['OS == "win"', { - # Build libvips C++ binding for Windows due to MSVC std library ABI changes - 'type': 'shared_library', - 'variables': { - 'download_vips': '/dev/null 2>&1 && eval $(brew --env) && echo $PKG_CONFIG_LIBDIR || true):$PKG_CONFIG_PATH:/usr/local/lib/pkgconfig:/usr/lib/pkgconfig' - }, { - 'pkg_config_path': '' - }] - ], - }, - 'conditions': [ - ['OS != "win"', { - # Which version, if any, of libvips is available globally via pkg-config? - 'global_vips_version': '/dev/null || true)' - }, { - 'global_vips_version': '' - }] - ], - 'pkg_config_path%': '<(pkg_config_path)' - }, - 'pkg_config_path%': '<(pkg_config_path)', - 'runtime_link%': 'shared', - 'conditions': [ - ['OS != "win"', { - # Does the globally available version of libvips, if any, meet the minimum version requirement? 
- 'use_global_vips': '= 2.13 - if (detectLibc.family === detectLibc.GLIBC && detectLibc.version && semver.lt(`${detectLibc.version}.0`, '2.13.0')) { - error(`Use with glibc version ${detectLibc.version} requires manual installation of libvips - please see http://sharp.dimens.io/page/install`); - } - // Arch/platform-specific .tar.gz - const tarFilename = ['libvips', minimumLibvipsVersion, platformId()].join('-') + '.tar.gz'; - const tarPathLocal = path.join(__dirname, 'packaging', tarFilename); - if (isFile(tarPathLocal)) { - unpack(tarPathLocal); - } else { - // Download to per-process temporary file - const tarPathTemp = path.join(os.tmpdir(), process.pid + '-' + tarFilename); - const tmpFile = fs.createWriteStream(tarPathTemp).on('finish', function () { - unpack(tarPathTemp, function () { - // Attempt to remove temporary file - try { - fs.unlinkSync(tarPathTemp); - } catch (err) {} - }); - }); - const gotOpt = { - agent: caw(null, { - protocol: 'https' - }) - }; - const url = distBaseUrl + tarFilename; - got.stream(url, gotOpt).on('response', function (response) { - if (response.statusCode !== 200) { - error(url + ' status code ' + response.statusCode); - } - }).on('error', function (err) { - error('Download of ' + url + ' failed: ' + err.message); - }).pipe(tmpFile); - } - } -}; - -module.exports.use_global_vips = function () { - const globalVipsVersion = process.env.GLOBAL_VIPS_VERSION; - if (globalVipsVersion) { - const useGlobalVips = semver.gte( - globalVipsVersion, - minimumLibvipsVersion - ); - process.stdout.write(useGlobalVips ? 
'true' : 'false'); - } else { - process.stdout.write('false'); - } -}; diff --git a/biome.json b/biome.json new file mode 100644 index 000000000..7946049c8 --- /dev/null +++ b/biome.json @@ -0,0 +1,26 @@ +{ + "$schema": "https://biomejs.dev/schemas/2.3.4/schema.json", + "vcs": { + "enabled": true, + "clientKind": "git", + "useIgnoreFile": true + }, + "files": { + "ignoreUnknown": true + }, + "linter": { + "enabled": true, + "rules": { + "recommended": true + } + }, + "formatter": { + "enabled": false, + "useEditorconfig": true + }, + "javascript": { + "formatter": { + "quoteStyle": "single" + } + } +} diff --git a/circle.yml b/circle.yml deleted file mode 100644 index 5f7a6210b..000000000 --- a/circle.yml +++ /dev/null @@ -1,8 +0,0 @@ -machine: - node: - version: v4.8.4 - services: - - docker -test: - override: - - ./packaging/test-linux-x64.sh diff --git a/docs/api-channel.md b/docs/api-channel.md deleted file mode 100644 index e295f12a8..000000000 --- a/docs/api-channel.md +++ /dev/null @@ -1,77 +0,0 @@ - - -### Table of Contents - -- [extractChannel](#extractchannel) -- [joinChannel](#joinchannel) -- [bandbool](#bandbool) - -## extractChannel - -Extract a single channel from a multi-channel image. - -**Parameters** - -- `channel` **([Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number) \| [String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String))** zero-indexed band number to extract, or `red`, `green` or `blue` as alternative to `0`, `1` or `2` respectively. 
- -**Examples** - -```javascript -sharp(input) - .extractChannel('green') - .toFile('input_green.jpg', function(err, info) { - // info.channels === 1 - // input_green.jpg contains the green channel of the input image - }); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid channel - -Returns **Sharp** - -## joinChannel - -Join one or more channels to the image. -The meaning of the added channels depends on the output colourspace, set with `toColourspace()`. -By default the output image will be web-friendly sRGB, with additional channels interpreted as alpha channels. -Channel ordering follows vips convention: - -- sRGB: 0: Red, 1: Green, 2: Blue, 3: Alpha. -- CMYK: 0: Magenta, 1: Cyan, 2: Yellow, 3: Black, 4: Alpha. - -Buffers may be any of the image formats supported by sharp: JPEG, PNG, WebP, GIF, SVG, TIFF or raw pixel image data. -For raw pixel input, the `options` object should contain a `raw` attribute, which follows the format of the attribute of the same name in the `sharp()` constructor. - -**Parameters** - -- `images` **([Array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array)<([String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String) \| [Buffer](https://nodejs.org/api/buffer.html))> | [String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String) \| [Buffer](https://nodejs.org/api/buffer.html))** one or more images (file paths, Buffers). -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)** image options, see `sharp()` constructor. 
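**Examples**

To make the channel-ordering convention above concrete, here is a plain-JavaScript sketch (an illustration only, not sharp's implementation) of what joining one extra band to interleaved 3-channel sRGB data produces; the fourth band is interpreted as alpha:

```javascript
// Append one extra band (e.g. alpha) to interleaved RGB pixel data,
// producing RGBA ordering: 0: Red, 1: Green, 2: Blue, 3: Alpha.
function joinBand (rgb, extra) {
  const pixels = extra.length; // one extra value per pixel
  const rgba = new Uint8Array(pixels * 4);
  for (let i = 0; i < pixels; i++) {
    rgba.set(rgb.subarray(i * 3, i * 3 + 3), i * 4); // bands 0-2: R, G, B
    rgba[i * 4 + 3] = extra[i]; // band 3: alpha
  }
  return rgba;
}

// Two pixels: opaque red, half-transparent green.
const rgb = Uint8Array.from([255, 0, 0, 0, 255, 0]);
const alpha = Uint8Array.from([255, 128]);
console.log(Array.from(joinBand(rgb, alpha)));
// [ 255, 0, 0, 255, 0, 255, 0, 128 ]
```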
- - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## bandbool - -Perform a bitwise boolean operation on all input image channels (bands) to produce a single channel output image. - -**Parameters** - -- `boolOp` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. - -**Examples** - -```javascript -sharp('3-channel-rgb-input.png') - .bandbool(sharp.bool.and) - .toFile('1-channel-output.png', function (err, info) { - // The output will be a single channel image where each pixel `P = R & G & B`. - // If `I(1,1) = [247, 170, 14] = [0b11110111, 0b10101010, 0b00001111]` - // then `O(1,1) = 0b11110111 & 0b10101010 & 0b00001111 = 0b00000010 = 2`. - }); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** diff --git a/docs/api-colour.md b/docs/api-colour.md deleted file mode 100644 index 7a8141176..000000000 --- a/docs/api-colour.md +++ /dev/null @@ -1,79 +0,0 @@ - - -### Table of Contents - -- [background](#background) -- [greyscale](#greyscale) -- [grayscale](#grayscale) -- [toColourspace](#tocolourspace) -- [toColorspace](#tocolorspace) - -## background - -Set the background for the `embed`, `flatten` and `extend` operations. -The default background is `{r: 0, g: 0, b: 0, alpha: 1}`, black without transparency. - -Delegates to the _color_ module, which can throw an Error -but is liberal in what it accepts, clipping values to sensible min/max. -The alpha value is a float between `0` (transparent) and `1` (opaque). 
- -**Parameters** - -- `rgba` **([String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String) \| [Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object))** parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameter - -Returns **Sharp** - -## greyscale - -Convert to 8-bit greyscale; 256 shades of grey. -This is a linear operation. If the input image is in a non-linear colour space such as sRGB, use `gamma()` with `greyscale()` for the best results. -By default the output image will be web-friendly sRGB and contain three (identical) color channels. -This may be overridden by other sharp operations such as `toColourspace('b-w')`, -which will produce an output image containing one color channel. -An alpha channel may be present, and will be unchanged by the operation. - -**Parameters** - -- `greyscale` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## grayscale - -Alternative spelling of `greyscale`. - -**Parameters** - -- `grayscale` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## toColourspace - -Set the output colourspace. -By default output image will be web-friendly sRGB, with additional channels interpreted as alpha channels. - -**Parameters** - -- `colourspace` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)?** output colourspace e.g. 
`srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://github.com/jcupitt/libvips/blob/master/libvips/iofuncs/enumtypes.c#L568) - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## toColorspace - -Alternative spelling of `toColourspace`. - -**Parameters** - -- `colorspace` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)?** output colorspace. - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** diff --git a/docs/api-composite.md b/docs/api-composite.md deleted file mode 100644 index 284441c83..000000000 --- a/docs/api-composite.md +++ /dev/null @@ -1,59 +0,0 @@ - - -### Table of Contents - -- [overlayWith](#overlaywith) - -## overlayWith - -Overlay (composite) an image over the processed (resized, extracted etc.) image. - -The overlay image must be the same size or smaller than the processed image. -If both `top` and `left` options are provided, they take precedence over `gravity`. - -If the overlay image contains an alpha channel then composition with premultiplication will occur. - -**Parameters** - -- `overlay` **([Buffer](https://nodejs.org/api/buffer.html) \| [String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String))** Buffer containing image data or String containing the path to an image file. -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** - - `options.gravity` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** gravity at which to place the overlay. (optional, default `'centre'`) - - `options.top` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** the pixel offset from the top edge. 
- - `options.left` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** the pixel offset from the left edge. - - `options.tile` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** set to true to repeat the overlay image across the entire image with the given `gravity`. (optional, default `false`) - - `options.cutout` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** set to true to apply only the alpha channel of the overlay image to the input image, giving the appearance of one image being cut out of another. (optional, default `false`) - - `options.density` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** integral number representing the DPI for vector overlay image. (optional, default `72`) - - `options.raw` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** describes overlay when using raw pixel data. - - `options.raw.width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.raw.height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.raw.channels` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.create` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** describes a blank overlay to be created. 
- - `options.create.width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.create.height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.create.channels` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** 3-4 - - `options.create.background` **([String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String) \| [Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object))?** parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. - -**Examples** - -```javascript -sharp('input.png') - .rotate(180) - .resize(300) - .flatten() - .background('#ff6600') - .overlayWith('overlay.png', { gravity: sharp.gravity.southeast } ) - .sharpen() - .withMetadata() - .quality(90) - .webp() - .toBuffer() - .then(function(outputBuffer) { - // outputBuffer contains upside down, 300px wide, alpha channel flattened - // onto orange background, composited with overlay.png with SE gravity, - // sharpened, with metadata, 90% quality WebP image data. Phew! 
- }); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** diff --git a/docs/api-constructor.md b/docs/api-constructor.md deleted file mode 100644 index b5b48b2b0..000000000 --- a/docs/api-constructor.md +++ /dev/null @@ -1,108 +0,0 @@ - - -### Table of Contents - -- [Sharp](#sharp) - - [format](#format) - - [versions](#versions) -- [queue](#queue) - -## Sharp - -**Parameters** - -- `input` **([Buffer](https://nodejs.org/api/buffer.html) \| [String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String))?** if present, can be - a Buffer containing JPEG, PNG, WebP, GIF, SVG, TIFF or raw pixel image data, or - a String containing the path to an JPEG, PNG, WebP, GIF, SVG or TIFF image file. - JPEG, PNG, WebP, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present. -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** if present, is an Object with optional attributes. - - `options.density` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** integral number representing the DPI for vector images. (optional, default `72`) - - `options.raw` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** describes raw pixel input image data. See `raw()` for pixel ordering. 
- - `options.raw.width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.raw.height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.raw.channels` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** 1-4 - - `options.create` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** describes a new image to be created. - - `options.create.width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.create.height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.create.channels` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** 3-4 - - `options.create.background` **([String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String) \| [Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object))?** parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. 
- -**Examples** - -```javascript -sharp('input.jpg') - .resize(300, 200) - .toFile('output.jpg', function(err) { - // output.jpg is a 300 pixels wide and 200 pixels high image - // containing a scaled and cropped version of input.jpg - }); -``` - -```javascript -// Read image data from readableStream, -// resize to 300 pixels wide, -// emit an 'info' event with calculated dimensions -// and finally write image data to writableStream -var transformer = sharp() - .resize(300) - .on('info', function(info) { - console.log('Image height is ' + info.height); - }); -readableStream.pipe(transformer).pipe(writableStream); -``` - -```javascript -// Create a blank 300x200 PNG image of semi-translucent red pixels -sharp({ - create: { - width: 300, - height: 200, - channels: 4, - background: { r: 255, g: 0, b: 0, alpha: 128 } - } -}) -.png() -.toBuffer() -.then( ... ); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **[Sharp](#sharp)** - -### format - -An Object containing nested boolean values representing the available input and output formats/methods. - -**Examples** - -```javascript -console.log(sharp.format); -``` - -Returns **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)** - -### versions - -An Object containing the version numbers of libvips and its dependencies. 
- -**Examples** - -```javascript -console.log(sharp.versions); -``` - -## queue - -An EventEmitter that emits a `change` event when a task is either: - -- queued, waiting for _libuv_ to provide a worker thread -- complete - -**Examples** - -```javascript -sharp.queue.on('change', function(queueLength) { - console.log('Queue contains ' + queueLength + ' task(s)'); -}); -``` diff --git a/docs/api-input.md b/docs/api-input.md deleted file mode 100644 index 98ebb0159..000000000 --- a/docs/api-input.md +++ /dev/null @@ -1,94 +0,0 @@ - - -### Table of Contents - -- [clone](#clone) -- [metadata](#metadata) -- [limitInputPixels](#limitinputpixels) -- [sequentialRead](#sequentialread) - -## clone - -Take a "snapshot" of the Sharp instance, returning a new instance. -Cloned instances inherit the input of their parent instance. -This allows multiple output Streams and therefore multiple processing pipelines to share a single input Stream. - -**Examples** - -```javascript -const pipeline = sharp().rotate(); -pipeline.clone().resize(800, 600).pipe(firstWritableStream); -pipeline.clone().extract({ left: 20, top: 20, width: 100, height: 100 }).pipe(secondWritableStream); -readableStream.pipe(pipeline); -// firstWritableStream receives auto-rotated, resized readableStream -// secondWritableStream receives auto-rotated, extracted region of readableStream -``` - -Returns **Sharp** - -## metadata - -Fast access to (uncached) image metadata without decoding any compressed image data. -A Promises/A+ promise is returned when `callback` is not provided. - -- `format`: Name of decoder used to decompress image data e.g. `jpeg`, `png`, `webp`, `gif`, `svg` -- `width`: Number of pixels wide -- `height`: Number of pixels high -- `space`: Name of colour space interpretation e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://github.com/jcupitt/libvips/blob/master/libvips/iofuncs/enumtypes.c#L636) -- `channels`: Number of bands e.g. 
`3` for sRGB, `4` for CMYK -- `depth`: Name of pixel depth format e.g. `uchar`, `char`, `ushort`, `float` [...](https://github.com/jcupitt/libvips/blob/master/libvips/iofuncs/enumtypes.c#L672) -- `density`: Number of pixels per inch (DPI), if present -- `hasProfile`: Boolean indicating the presence of an embedded ICC profile -- `hasAlpha`: Boolean indicating the presence of an alpha transparency channel -- `orientation`: Number value of the EXIF Orientation header, if present -- `exif`: Buffer containing raw EXIF data, if present -- `icc`: Buffer containing raw [ICC](https://www.npmjs.com/package/icc) profile data, if present - -**Parameters** - -- `callback` **[Function](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/function)?** called with the arguments `(err, metadata)` - -**Examples** - -```javascript -const image = sharp(inputJpg); -image - .metadata() - .then(function(metadata) { - return image - .resize(Math.round(metadata.width / 2)) - .webp() - .toBuffer(); - }) - .then(function(data) { - // data contains a WebP image half the width and height of the original JPEG - }); -``` - -Returns **([Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise)<[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)> | Sharp)** - -## limitInputPixels - -Do not process input images where the number of pixels (width × height) exceeds this limit. -Assumes image dimensions contained in the input metadata can be trusted. -The default limit is 268402689 (0x3FFF × 0x3FFF) pixels. - -**Parameters** - -- `limit` **([Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number) \| [Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean))** an integral Number of pixels, zero or false to remove limit, true to use default limit. 
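**Examples**

A quick check of the arithmetic behind the default limit (0x3FFF is 16383, so the limit corresponds to one 16383×16383 image); this is illustrative only:

```javascript
// The default pixel limit is one square image of 0x3FFF (16383)
// pixels per side.
const maxSide = 0x3FFF;
console.log(maxSide * maxSide); // 268402689
```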
- - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid limit - -Returns **Sharp** - -## sequentialRead - -An advanced setting that switches the libvips access method to `VIPS_ACCESS_SEQUENTIAL`. -This will reduce memory usage and can improve performance on some systems. - -**Parameters** - -- `sequentialRead` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** diff --git a/docs/api-operation.md b/docs/api-operation.md deleted file mode 100644 index 86b76528c..000000000 --- a/docs/api-operation.md +++ /dev/null @@ -1,323 +0,0 @@ - - -### Table of Contents - -- [rotate](#rotate) -- [extract](#extract) -- [flip](#flip) -- [flop](#flop) -- [sharpen](#sharpen) -- [blur](#blur) -- [extend](#extend) -- [flatten](#flatten) -- [trim](#trim) -- [gamma](#gamma) -- [negate](#negate) -- [normalise](#normalise) -- [normalize](#normalize) -- [convolve](#convolve) -- [threshold](#threshold) -- [boolean](#boolean) - -## rotate - -Rotate the output image by either an explicit angle -or auto-orient based on the EXIF `Orientation` tag. - -If an angle is provided, it is converted to a valid 90/180/270deg rotation. -For example, `-450` will produce a 270deg rotation. - -If no angle is provided, it is determined from the EXIF data. -Mirroring is supported and may infer the use of a flip operation. - -The use of `rotate` implies the removal of the EXIF `Orientation` tag, if any. - -Method order is important when both rotating and extracting regions, -for example `rotate(x).extract(y)` will produce a different result to `extract(y).rotate(x)`. - -**Parameters** - -- `angle` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** angle of rotation, must be a multiple of 90. 
(optional, default `auto`) - -**Examples** - -```javascript -const pipeline = sharp() - .rotate() - .resize(null, 200) - .toBuffer(function (err, outputBuffer, info) { - // outputBuffer contains 200px high JPEG image data, - // auto-rotated using EXIF Orientation tag - // info.width and info.height contain the dimensions of the resized image - }); -readableStream.pipe(pipeline); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## extract - -Extract a region of the image. - -- Use `extract` before `resize` for pre-resize extraction. -- Use `extract` after `resize` for post-resize extraction. -- Use `extract` before and after for both. - -**Parameters** - -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)** - - `options.left` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** zero-indexed offset from left edge - - `options.top` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** zero-indexed offset from top edge - - `options.width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** dimension of extracted image - - `options.height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** dimension of extracted image - -**Examples** - -```javascript -sharp(input) - .extract({ left: left, top: top, width: width, height: height }) - .toFile(output, function(err) { - // Extract a region of the input image, saving in the same format. 
- }); -``` - -```javascript -sharp(input) - .extract({ left: leftOffsetPre, top: topOffsetPre, width: widthPre, height: heightPre }) - .resize(width, height) - .extract({ left: leftOffsetPost, top: topOffsetPost, width: widthPost, height: heightPost }) - .toFile(output, function(err) { - // Extract a region, resize, then extract from the resized image - }); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## flip - -Flip the image about the vertical Y axis. This always occurs after rotation, if any. -The use of `flip` implies the removal of the EXIF `Orientation` tag, if any. - -**Parameters** - -- `flip` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## flop - -Flop the image about the horizontal X axis. This always occurs after rotation, if any. -The use of `flop` implies the removal of the EXIF `Orientation` tag, if any. - -**Parameters** - -- `flop` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## sharpen - -Sharpen the image. -When used without parameters, performs a fast, mild sharpen of the output image. -When a `sigma` is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space. -Separate control over the level of sharpening in "flat" and "jagged" areas is available. - -**Parameters** - -- `sigma` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`. -- `flat` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** the level of sharpening to apply to "flat" areas. 
(optional, default `1.0`) -- `jagged` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** the level of sharpening to apply to "jagged" areas. (optional, default `2.0`) - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## blur - -Blur the image. -When used without parameters, performs a fast, mild blur of the output image. -When a `sigma` is provided, performs a slower, more accurate Gaussian blur. - -**Parameters** - -- `sigma` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`. - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## extend - -Extends/pads the edges of the image with the colour provided to the `background` method. -This operation will always occur after resizing and extraction, if any. 
- -**Parameters** - -- `extend` **([Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number) \| [Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object))** single pixel count to add to all edges or an Object with per-edge counts - - `extend.top` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `extend.left` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `extend.bottom` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `extend.right` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - -**Examples** - -```javascript -// Resize to 140 pixels wide, then add 10 transparent pixels -// to the top, left and right edges and 20 to the bottom edge -sharp(input) - .resize(140) - .background({r: 0, g: 0, b: 0, alpha: 0}) - .extend({top: 10, bottom: 20, left: 10, right: 10}) - ... -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## flatten - -Merge alpha transparency channel, if any, with `background`. - -**Parameters** - -- `flatten` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## trim - -Trim "boring" pixels from all edges that contain values within a percentage similarity of the top-left pixel. - -**Parameters** - -- `tolerance` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** value between 1 and 99 representing the percentage similarity. 
(optional, default `10`) - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## gamma - -Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of `1/gamma` -then increasing the encoding (brighten) post-resize at a factor of `gamma`. -This can improve the perceived brightness of a resized image in non-linear colour spaces. -JPEG and WebP input images will not take advantage of the shrink-on-load performance optimisation -when applying a gamma correction. - -**Parameters** - -- `gamma` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** value between 1.0 and 3.0. (optional, default `2.2`) - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## negate - -Produce the "negative" of the image. - -**Parameters** - -- `negate` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## normalise - -Enhance output image contrast by stretching its luminance to cover the full dynamic range. - -**Parameters** - -- `normalise` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## normalize - -Alternative spelling of normalise. - -**Parameters** - -- `normalize` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** - -## convolve - -Convolve the image with the specified kernel. 
-
-**Parameters**
-
-- `kernel` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)**
-    - `kernel.width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** width of the kernel in pixels.
-    - `kernel.height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** height of the kernel in pixels.
-    - `kernel.kernel` **[Array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array)<[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)>** Array of length `width*height` containing the kernel values.
-    - `kernel.scale` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** the scale of the kernel in pixels. (optional, default `sum`)
-    - `kernel.offset` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** the offset of the kernel in pixels. (optional, default `0`)
-
-**Examples**
-
-```javascript
-sharp(input)
-  .convolve({
-    width: 3,
-    height: 3,
-    kernel: [-1, 0, 1, -2, 0, 2, -1, 0, 1]
-  })
-  .raw()
-  .toBuffer(function(err, data, info) {
-    // data contains the raw pixel data representing the convolution
-    // of the input image with the horizontal Sobel operator
-  });
-```
-
-- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters
-
-Returns **Sharp**
-
-## threshold
-
-Any pixel value greater than or equal to the threshold value will be set to 255, otherwise it will be set to 0.
-
-**Parameters**
-
-- `threshold` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** a value in the range 0-255 representing the level at which the threshold will be applied.
(optional, default `128`)
-- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?**
-    - `options.greyscale` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** convert to single channel greyscale. (optional, default `true`)
-    - `options.grayscale` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** alternative spelling for greyscale. (optional, default `true`)
-
-
-- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters
-
-Returns **Sharp**
-
-## boolean
-
-Perform a bitwise boolean operation with operand image.
-
-This operation creates an output image where each pixel is the result of
-the selected bitwise boolean `operation` between the corresponding pixels of the input images.
-
-**Parameters**
-
-- `operand` **([Buffer](https://nodejs.org/api/buffer.html) \| [String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String))** Buffer containing image data or String containing the path to an image file.
-- `operator` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** one of `and`, `or` or `eor` to perform that bitwise operation, like the C bitwise operators `&`, `|` and `^` respectively.
-- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?**
-    - `options.raw` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** describes operand when using raw pixel data.
- - `options.raw.width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.raw.height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - `options.raw.channels` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** diff --git a/docs/api-output.md b/docs/api-output.md deleted file mode 100644 index f992c238d..000000000 --- a/docs/api-output.md +++ /dev/null @@ -1,202 +0,0 @@ - - -### Table of Contents - -- [toFile](#tofile) -- [toBuffer](#tobuffer) -- [withMetadata](#withmetadata) -- [jpeg](#jpeg) -- [png](#png) -- [webp](#webp) -- [tiff](#tiff) -- [raw](#raw) -- [toFormat](#toformat) -- [tile](#tile) - -## toFile - -Write output image data to a file. - -If an explicit output format is not selected, it will be inferred from the extension, -with JPEG, PNG, WebP, TIFF, DZI, and libvips' V format supported. -Note that raw pixel data is only supported for buffer output. - -A Promises/A+ promise is returned when `callback` is not provided. - -**Parameters** - -- `fileOut` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** the path to write the image data to. -- `callback` **[Function](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/function)?** called on completion with two arguments `(err, info)`. - `info` contains the output image `format`, `size` (bytes), `width`, `height`, - `channels` and `premultiplied` (indicating if premultiplication was used). 
- - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **[Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise)<[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)>** when no callback is provided - -## toBuffer - -Write output to a Buffer. -JPEG, PNG, WebP, TIFF and RAW output are supported. -By default, the format will match the input image, except GIF and SVG input which become PNG output. - -`callback`, if present, gets three arguments `(err, data, info)` where: - -- `err` is an error, if any. -- `data` is the output image data. -- `info` contains the output image `format`, `size` (bytes), `width`, `height`, - `channels` and `premultiplied` (indicating if premultiplication was used). - A Promise is returned when `callback` is not provided. - -**Parameters** - -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** - - `options.resolveWithObject` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)?** Resolve the Promise with an Object containing `data` and `info` properties instead of resolving only with `data`. -- `callback` **[Function](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/function)?** - -Returns **[Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise)<[Buffer](https://nodejs.org/api/buffer.html)>** when no callback is provided - -## withMetadata - -Include all metadata (EXIF, XMP, IPTC) from the input image in the output image. -The default behaviour, when `withMetadata` is not used, is to strip all metadata and convert to the device-independent sRGB colour space. -This will also convert to and add a web-friendly sRGB ICC profile. 
- -**Parameters** - -- `withMetadata` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** - - `withMetadata.orientation` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** value between 1 and 8, used to update the EXIF `Orientation` tag. - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## jpeg - -Use these JPEG options for output image. - -**Parameters** - -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** output options - - `options.quality` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** quality, integer 1-100 (optional, default `80`) - - `options.progressive` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** use progressive (interlace) scan (optional, default `false`) - - `options.chromaSubsampling` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** set to '4:4:4' to prevent chroma subsampling when quality <= 90 (optional, default `'4:2:0'`) - - `options.trellisQuantisation` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** apply trellis quantisation, requires mozjpeg (optional, default `false`) - - `options.overshootDeringing` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** apply overshoot deringing, requires mozjpeg (optional, default `false`) - - `options.optimiseScans` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** optimise progressive scans, forces progressive, requires mozjpeg (optional, default `false`) - - `options.optimizeScans` 
**[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** alternative spelling of optimiseScans (optional, default `false`) - - `options.force` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** force JPEG output, otherwise attempt to use input format (optional, default `true`) - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid options - -Returns **Sharp** - -## png - -Use these PNG options for output image. - -**Parameters** - -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** - - `options.progressive` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** use progressive (interlace) scan (optional, default `false`) - - `options.compressionLevel` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** zlib compression level (optional, default `6`) - - `options.adaptiveFiltering` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** use adaptive row filtering (optional, default `true`) - - `options.force` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** force PNG output, otherwise attempt to use input format (optional, default `true`) - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid options - -Returns **Sharp** - -## webp - -Use these WebP options for output image. 
- -**Parameters** - -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** output options - - `options.quality` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** quality, integer 1-100 (optional, default `80`) - - `options.alphaQuality` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** quality of alpha layer, integer 0-100 (optional, default `100`) - - `options.lossless` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** use lossless compression mode (optional, default `false`) - - `options.nearLossless` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** use near_lossless compression mode (optional, default `false`) - - `options.force` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** force WebP output, otherwise attempt to use input format (optional, default `true`) - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid options - -Returns **Sharp** - -## tiff - -Use these TIFF options for output image. 
-
-**Parameters**
-
-- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** output options
-    - `options.quality` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** quality, integer 1-100 (optional, default `80`)
-    - `options.force` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** force TIFF output, otherwise attempt to use input format (optional, default `true`)
-    - `options.compression` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** compression options: lzw, deflate, jpeg (optional, default `'jpeg'`)
-    - `options.predictor` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** compression predictor options: none, horizontal, float (optional, default `'none'`)
-    - `options.xres` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** horizontal resolution in pixels/mm (optional, default `1.0`)
-    - `options.yres` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** vertical resolution in pixels/mm (optional, default `1.0`)
-    - `options.squash` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** squash 8-bit images down to 1 bit (optional, default `false`)
-
-
-- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid options
-
-Returns **Sharp**
-
-## raw
-
-Force output to be raw, uncompressed uint8 pixel data.
-
-Returns **Sharp**
-
-## toFormat
-
-Force output to a given format.
- -**Parameters** - -- `format` **([String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String) \| [Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object))** as a String or an Object with an 'id' attribute -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)** output options - - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** unsupported format or options - -Returns **Sharp** - -## tile - -Use tile-based deep zoom (image pyramid) output. -Set the format and options for tile images via the `toFormat`, `jpeg`, `png` or `webp` functions. -Use a `.zip` or `.szi` file extension with `toFile` to write to a compressed archive file format. - -**Parameters** - -- `tile` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** - - `tile.size` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** tile size in pixels, a value between 1 and 8192. (optional, default `256`) - - `tile.overlap` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** tile overlap in pixels, a value between 0 and 8192. (optional, default `0`) - - `tile.container` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** tile container, with value `fs` (filesystem) or `zip` (compressed file). (optional, default `'fs'`) - - `tile.layout` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** filesystem layout, possible values are `dz`, `zoomify` or `google`. 
(optional, default `'dz'`) - -**Examples** - -```javascript -sharp('input.tiff') - .png() - .tile({ - size: 512 - }) - .toFile('output.dz', function(err, info) { - // output.dzi is the Deep Zoom XML definition - // output_files contains 512x512 tiles grouped by zoom level - }); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** diff --git a/docs/api-resize.md b/docs/api-resize.md deleted file mode 100644 index af1fdb5f9..000000000 --- a/docs/api-resize.md +++ /dev/null @@ -1,177 +0,0 @@ - - -### Table of Contents - -- [resize](#resize) -- [crop](#crop) -- [embed](#embed) -- [max](#max) -- [min](#min) -- [ignoreAspectRatio](#ignoreaspectratio) -- [withoutEnlargement](#withoutenlargement) - -## resize - -Resize image to `width` x `height`. -By default, the resized image is centre cropped to the exact size specified. - -Possible reduction kernels are: - -- `nearest`: Use [nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). -- `cubic`: Use a [Catmull-Rom spline](https://en.wikipedia.org/wiki/Centripetal_Catmull%E2%80%93Rom_spline). -- `lanczos2`: Use a [Lanczos kernel](https://en.wikipedia.org/wiki/Lanczos_resampling#Lanczos_kernel) with `a=2`. -- `lanczos3`: Use a Lanczos kernel with `a=3` (the default). - -Possible enlargement interpolators are: - -- `nearest`: Use [nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). -- `bilinear`: Use [bilinear interpolation](http://en.wikipedia.org/wiki/Bilinear_interpolation), faster than bicubic but with less smooth results. -- `vertexSplitQuadraticBasisSpline`: Use the smoother [VSQBS interpolation](https://github.com/jcupitt/libvips/blob/master/libvips/resample/vsqbs.cpp#L48) to prevent "staircasing" when enlarging. -- `bicubic`: Use [bicubic interpolation](http://en.wikipedia.org/wiki/Bicubic_interpolation) (the default). 
-- `locallyBoundedBicubic`: Use [LBB interpolation](https://github.com/jcupitt/libvips/blob/master/libvips/resample/lbb.cpp#L100), which prevents some "[acutance](http://en.wikipedia.org/wiki/Acutance)" but typically reduces performance by a factor of 2. -- `nohalo`: Use [Nohalo interpolation](http://eprints.soton.ac.uk/268086/), which prevents acutance but typically reduces performance by a factor of 3. - -**Parameters** - -- `width` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height. -- `height` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width. -- `options` **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)?** - - `options.kernel` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** the kernel to use for image reduction. (optional, default `'lanczos3'`) - - `options.interpolator` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** the interpolator to use for image enlargement. (optional, default `'bicubic'`) - - `options.centreSampling` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** use \*magick centre sampling convention instead of corner sampling. (optional, default `false`) - - `options.centerSampling` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** alternative spelling of centreSampling. 
(optional, default `false`) - -**Examples** - -```javascript -sharp(inputBuffer) - .resize(200, 300, { - kernel: sharp.kernel.lanczos2, - interpolator: sharp.interpolator.nohalo - }) - .background('white') - .embed() - .toFile('output.tiff') - .then(function() { - // output.tiff is a 200 pixels wide and 300 pixels high image - // containing a lanczos2/nohalo scaled version, embedded on a white canvas, - // of the image data in inputBuffer - }); -``` - -- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters - -Returns **Sharp** - -## crop - -Crop the resized image to the exact size specified, the default behaviour. - -Possible attributes of the optional `sharp.gravity` are `north`, `northeast`, `east`, `southeast`, `south`, -`southwest`, `west`, `northwest`, `center` and `centre`. - -The experimental strategy-based approach resizes so one dimension is at its target length -then repeatedly ranks edge regions, discarding the edge with the lowest score based on the selected strategy. - -- `entropy`: focus on the region with the highest [Shannon entropy](https://en.wikipedia.org/wiki/Entropy_%28information_theory%29). -- `attention`: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones. - -**Parameters** - -- `crop` **[String](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String)** A member of `sharp.gravity` to crop to an edge/corner or `sharp.strategy` to crop dynamically. 
(optional, default `'centre'`)
-
-**Examples**
-
-```javascript
-const transformer = sharp()
-  .resize(200, 200)
-  .crop(sharp.strategy.entropy)
-  .on('error', function(err) {
-    console.log(err);
-  });
-// Read image data from readableStream
-// Write 200px square auto-cropped image data to writableStream
-readableStream.pipe(transformer).pipe(writableStream);
-```
-
-- Throws **[Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error)** Invalid parameters
-
-Returns **Sharp**
-
-## embed
-
-Preserving aspect ratio, resize the image to the maximum `width` or `height` specified
-then embed on a background of the exact `width` and `height` specified.
-
-If the background contains an alpha value then WebP and PNG format output images will
-contain an alpha channel, even when the input image does not.
-
-**Examples**
-
-```javascript
-sharp('input.gif')
-  .resize(200, 300)
-  .background({r: 0, g: 0, b: 0, alpha: 0})
-  .embed()
-  .toFormat(sharp.format.webp)
-  .toBuffer(function(err, outputBuffer) {
-    if (err) {
-      throw err;
-    }
-    // outputBuffer contains WebP image data of a 200 pixels wide and 300 pixels high image
-    // containing a scaled version, embedded on a transparent canvas, of input.gif
-  });
-```
-
-Returns **Sharp**
-
-## max
-
-Preserving aspect ratio, resize the image to be as large as possible
-while ensuring its dimensions are less than or equal to the `width` and `height` specified.
-
-Both `width` and `height` must be provided via `resize` otherwise the behaviour will default to `crop`.
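The constraint described above amounts to scaling both dimensions by whichever ratio is smaller. As a rough sketch of that calculation (a hypothetical helper, not part of sharp's API):

```javascript
// Sketch of "fit inside" maths: scale so the result is as large as possible
// while no wider than maxWidth and no higher than maxHeight.
function maxFit (srcWidth, srcHeight, maxWidth, maxHeight) {
  const scale = Math.min(maxWidth / srcWidth, maxHeight / srcHeight);
  return {
    width: Math.round(srcWidth * scale),
    height: Math.round(srcHeight * scale)
  };
}

console.log(maxFit(400, 200, 200, 200)); // { width: 200, height: 100 }
```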
- -**Examples** - -```javascript -sharp(inputBuffer) - .resize(200, 200) - .max() - .toFormat('jpeg') - .toBuffer() - .then(function(outputBuffer) { - // outputBuffer contains JPEG image data no wider than 200 pixels and no higher - // than 200 pixels regardless of the inputBuffer image dimensions - }); -``` - -Returns **Sharp** - -## min - -Preserving aspect ratio, resize the image to be as small as possible -while ensuring its dimensions are greater than or equal to the `width` and `height` specified. - -Both `width` and `height` must be provided via `resize` otherwise the behaviour will default to `crop`. - -Returns **Sharp** - -## ignoreAspectRatio - -Ignoring the aspect ratio of the input, stretch the image to -the exact `width` and/or `height` provided via `resize`. - -Returns **Sharp** - -## withoutEnlargement - -Do not enlarge the output image if the input image width _or_ height are already less than the required dimensions. -This is equivalent to GraphicsMagick's `>` geometry option: -"_change the dimensions of the image only if its width or height exceeds the geometry specification_". - -**Parameters** - -- `withoutEnlargement` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `true`) - -Returns **Sharp** diff --git a/docs/api-utility.md b/docs/api-utility.md deleted file mode 100644 index 7a07760df..000000000 --- a/docs/api-utility.md +++ /dev/null @@ -1,106 +0,0 @@ - - -### Table of Contents - -- [cache](#cache) -- [concurrency](#concurrency) -- [counters](#counters) -- [simd](#simd) - -## cache - -Gets, or when options are provided sets, the limits of _libvips'_ operation cache. -Existing entries in the cache will be trimmed after any change in limits. -This method always returns cache statistics, -useful for determining how much working memory is required for a particular task. 
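The get/set behaviour described above can be modelled with a small closure. This is a sketch of the documented semantics only (object merges into current limits, `false` removes all caching, `true` restores defaults), not the native implementation:

```javascript
// Models the get/set semantics of the libvips operation cache limits.
const defaults = { memory: 50, files: 20, items: 100 };

function makeCache () {
  let limits = { ...defaults };
  return function cache (options) {
    if (options === false) {
      limits = { memory: 0, files: 0, items: 0 }; // remove all caching
    } else if (options === true) {
      limits = { ...defaults }; // restore default settings
    } else if (typeof options === 'object' && options !== null) {
      limits = { ...limits, ...options }; // merge new limits
    }
    return { ...limits };
  };
}

const cache = makeCache();
cache({ items: 200 }); // { memory: 50, files: 20, items: 200 }
cache(false);          // { memory: 0, files: 0, items: 0 }
```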
-
-**Parameters**
-
-- `options` **([Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object) \| [Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean))** Object with the following attributes, or Boolean where true uses default cache settings and false removes all caching.
-    - `options.memory` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** is the maximum memory in MB to use for this cache (optional, default `50`)
-    - `options.files` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** is the maximum number of files to hold open (optional, default `20`)
-    - `options.items` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** is the maximum number of operations to cache (optional, default `100`)
-
-**Examples**
-
-```javascript
-const stats = sharp.cache();
-```
-
-```javascript
-sharp.cache( { items: 200 } );
-sharp.cache( { files: 0 } );
-sharp.cache(false);
-```
-
-Returns **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)**
-
-## concurrency
-
-Gets, or when a concurrency is provided sets,
-the number of threads _libvips_ should create to process each image.
-The default value is the number of CPU cores.
-A value of `0` will reset to this default.
-
-The maximum number of images that can be processed in parallel
-is limited by libuv's `UV_THREADPOOL_SIZE` environment variable.
-
-This method always returns the current concurrency.
- -**Parameters** - -- `concurrency` **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)?** - -**Examples** - -```javascript -const threads = sharp.concurrency(); // 4 -sharp.concurrency(2); // 2 -sharp.concurrency(0); // 4 -``` - -Returns **[Number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number)** concurrency - -## counters - -Provides access to internal task counters. - -- queue is the number of tasks this module has queued waiting for _libuv_ to provide a worker thread from its pool. -- process is the number of resize tasks currently being processed. - -**Examples** - -```javascript -const counters = sharp.counters(); // { queue: 2, process: 4 } -``` - -Returns **[Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object)** - -## simd - -Get and set use of SIMD vector unit instructions. -Requires libvips to have been compiled with liborc support. - -Improves the performance of `resize`, `blur` and `sharpen` operations -by taking advantage of the SIMD vector unit of the CPU, e.g. Intel SSE and ARM NEON. - -This feature is currently off by default but future versions may reverse this. -Versions of liborc prior to 0.4.25 are known to segfault under heavy load. 
- -**Parameters** - -- `simd` **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** (optional, default `false`) - -**Examples** - -```javascript -const simd = sharp.simd(); -// simd is `true` if SIMD is currently enabled -``` - -```javascript -const simd = sharp.simd(true); -// attempts to enable the use of SIMD, returning true if available -``` - -Returns **[Boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean)** diff --git a/docs/astro.config.mjs b/docs/astro.config.mjs new file mode 100644 index 000000000..2209c3381 --- /dev/null +++ b/docs/astro.config.mjs @@ -0,0 +1,90 @@ +// @ts-check +import starlight from '@astrojs/starlight'; +import { defineConfig } from 'astro/config'; +import starlightAutoSidebar from 'starlight-auto-sidebar'; + +import { version } from '../package.json'; + +export default defineConfig({ + site: 'https://sharp.pixelplumbing.com', + integrations: [ + starlight({ + title: 'sharp', + description: + 'High performance Node.js image processing. 
The fastest module to resize JPEG, PNG, WebP and TIFF images.', + logo: { + src: './src/assets/sharp-logo.svg', + alt: '#' + }, + customCss: ['./src/styles/custom.css'], + head: [{ + tag: 'meta', + attrs: { + 'http-equiv': 'Content-Security-Policy', + content: "default-src 'self'; connect-src 'self'; object-src 'none'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; script-src 'self' 'unsafe-inline' 'unsafe-eval' https://static.cloudflareinsights.com/beacon.min.js/;" + } + }, { + tag: 'link', + attrs: { + rel: 'author', + href: '/humans.txt', + type: 'text/plain' + } + }, { + tag: 'script', + attrs: { + type: 'application/ld+json' + }, + content: JSON.stringify({ + '@context': 'https://schema.org', + '@type': 'SoftwareSourceCode', + name: 'sharp', + description: 'High performance Node.js image processing', + url: 'https://sharp.pixelplumbing.com', + codeRepository: 'https://github.com/lovell/sharp', + programmingLanguage: ['JavaScript', 'C++'], + runtimePlatform: 'Node.js', + copyrightHolder: { + '@context': 'https://schema.org', + '@type': 'Person', + name: 'Lovell Fuller' + }, + copyrightYear: 2013, + license: 'https://www.apache.org/licenses/LICENSE-2.0' + }) + }], + sidebar: [ + { label: 'Home', link: '/' }, + { label: 'Installation', slug: 'install' }, + { + label: 'API', + items: [ + { label: 'Constructor', slug: 'api-constructor' }, + { label: 'Input metadata', slug: 'api-input' }, + { label: 'Output options', slug: 'api-output' }, + { label: 'Resizing images', slug: 'api-resize' }, + { label: 'Compositing images', slug: 'api-composite' }, + { label: 'Image operations', slug: 'api-operation' }, + { label: 'Colour manipulation', slug: 'api-colour' }, + { label: 'Channel manipulation', slug: 'api-channel' }, + { label: 'Global properties', slug: 'api-utility' } + ] + }, + { label: 'Performance', slug: 'performance' }, + { + label: 'Changelog', + collapsed: true, + autogenerate: { directory: 'changelog' } + } + ], + social: [ + { icon: 
'openCollective', label: 'Open Collective', href: 'https://opencollective.com/libvips' }, + { icon: 'github', label: 'GitHub', href: 'https://github.com/lovell/sharp' } + ], + plugins: [starlightAutoSidebar()] + }) + ], + redirects: { + '/changelog': `/changelog/v${version}` + } +}); diff --git a/docs/build.mjs b/docs/build.mjs new file mode 100644 index 000000000..da43a5bf5 --- /dev/null +++ b/docs/build.mjs @@ -0,0 +1,42 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +import fs from 'node:fs/promises'; +import path from 'node:path'; +import jsdoc2md from 'jsdoc-to-markdown'; + +const pages = { + constructor: 'Constructor', + input: 'Input metadata', + resize: 'Resizing images', + composite: 'Compositing images', + operation: 'Image operations', + colour: 'Colour manipulation', + channel: 'Channel manipulation', + output: 'Output options', + utility: 'Global properties' +}; + +Object.keys(pages).forEach(async (m) => { + const input = path.join('lib', `${m}.js`); + const output = path.join('docs', 'src', 'content', 'docs', `api-${m}.md`); + + const ast = await jsdoc2md.getTemplateData({ files: input }); + const markdown = await jsdoc2md.render({ + data: ast, + 'global-index-format': 'none', + 'module-index-format': 'none' + }); + + const cleanMarkdown = + `---\n# This file was auto-generated from JSDoc in lib/${m}.js\ntitle: ${pages[m]}\n---\n\n` + + markdown + .replace(/(## )([A-Za-z0-9]+)([^\n]*)/g, '$1$2\n> $2$3\n') // simplify headings + .replace(/<\/a>/g, '') // remove anchors + .replace(/\*\*Kind\*\*: global[^\n]+/g, '') // remove all "global" Kind labels (requires JSDoc refactoring) + .trim(); + + await fs.writeFile(output, cleanMarkdown); +}); diff --git a/docs/changelog.md b/docs/changelog.md deleted file mode 100644 index 579c12163..000000000 --- a/docs/changelog.md +++ /dev/null @@ -1,616 +0,0 @@ -# Changelog - -### v0.18 - "*ridge*" - -Requires libvips v8.5.5. 
- -#### v0.18.2 - 1st July 2017 - -* Expose libvips' xres and yres properties for TIFF output. - [#828](https://github.com/lovell/sharp/pull/828) - [@YvesBos](https://github.com/YvesBos) - -* Ensure flip and flop operations work with auto-rotate. - [#837](https://github.com/lovell/sharp/issues/837) - [@rexxars](https://github.com/rexxars) - -* Allow binary download URL override via SHARP_DIST_BASE_URL env variable. - [#841](https://github.com/lovell/sharp/issues/841) - -* Add support for Solus Linux. - [#857](https://github.com/lovell/sharp/pull/857) - [@ekremkaraca](https://github.com/ekremkaraca) - -#### v0.18.1 - 30th May 2017 - -* Remove regression from #781 that could cause incorrect shrink calculation. - [#831](https://github.com/lovell/sharp/issues/831) - [@suprMax](https://github.com/suprMax) - -#### v0.18.0 - 30th May 2017 - -* Remove the previously-deprecated output format "option" functions: - quality, progressive, compressionLevel, withoutAdaptiveFiltering, - withoutChromaSubsampling, trellisQuantisation, trellisQuantization, - overshootDeringing, optimiseScans and optimizeScans. - -* Ensure maximum output dimensions are based on the format to be used. - [#176](https://github.com/lovell/sharp/issues/176) - [@stephanebachelier](https://github.com/stephanebachelier) - -* Avoid costly (un)premultiply when using overlayWith without alpha channel. - [#573](https://github.com/lovell/sharp/issues/573) - [@strarsis](https://github.com/strarsis) - -* Include pixel depth (e.g. "uchar") when reading metadata. - [#577](https://github.com/lovell/sharp/issues/577) - [@moedusa](https://github.com/moedusa) - -* Add support for Buffer and Stream-based TIFF output. 
- [#587](https://github.com/lovell/sharp/issues/587) - [@strarsis](https://github.com/strarsis) - -* Expose warnings from libvips via NODE_DEBUG=sharp environment variable. - [#607](https://github.com/lovell/sharp/issues/607) - [@puzrin](https://github.com/puzrin) - -* Switch to the libvips implementation of "attention" and "entropy" crop strategies. - [#727](https://github.com/lovell/sharp/issues/727) - -* Improve performance and accuracy of nearest neighbour integral upsampling. - [#752](https://github.com/lovell/sharp/issues/752) - [@MrIbby](https://github.com/MrIbby) - -* Constructor single argument API: allow plain object, reject null/undefined. - [#768](https://github.com/lovell/sharp/issues/768) - [@kub1x](https://github.com/kub1x) - -* Ensure ARM64 pre-built binaries use correct C++11 ABI version. - [#772](https://github.com/lovell/sharp/issues/772) - [@ajiratech2](https://github.com/ajiratech2) - -* Prevent aliasing by using dynamic values for shrink(-on-load). - [#781](https://github.com/lovell/sharp/issues/781) - [@kleisauke](https://github.com/kleisauke) - -* Expose libvips' "squash" parameter to enable 1-bit TIFF output. - [#783](https://github.com/lovell/sharp/pull/783) - [@YvesBos](https://github.com/YvesBos) - -* Add support for rotation using any multiple of +/-90 degrees. - [#791](https://github.com/lovell/sharp/pull/791) - [@ncoden](https://github.com/ncoden) - -* Add "jpg" alias to toFormat as shortened form of "jpeg". - [#814](https://github.com/lovell/sharp/pull/814) - [@jingsam](https://github.com/jingsam) - -### v0.17 - "*quill*" - -Requires libvips v8.4.2. - -#### v0.17.3 - 1st April 2017 - -* Allow toBuffer to optionally resolve a Promise with both info and data. 
- [#143](https://github.com/lovell/sharp/issues/143) - [@salzhrani](https://github.com/salzhrani) - -* Create blank image of given width, height, channels and background. - [#470](https://github.com/lovell/sharp/issues/470) - [@pjarts](https://github.com/pjarts) - -* Add support for the "nearest" kernel for image reductions. - [#732](https://github.com/lovell/sharp/pull/732) - [@alice0meta](https://github.com/alice0meta) - -* Add support for TIFF compression and predictor options. - [#738](https://github.com/lovell/sharp/pull/738) - [@kristojorg](https://github.com/kristojorg) - -#### v0.17.2 - 11th February 2017 - -* Ensure Readable side of Stream can start flowing after Writable side has finished. - [#671](https://github.com/lovell/sharp/issues/671) - [@danhaller](https://github.com/danhaller) - -* Expose WebP alpha quality, lossless and near-lossless output options. - [#685](https://github.com/lovell/sharp/pull/685) - [@rnanwani](https://github.com/rnanwani) - -#### v0.17.1 - 15th January 2017 - -* Improve error messages for invalid parameters. - [@spikeon](https://github.com/spikeon) - [#644](https://github.com/lovell/sharp/pull/644) - -* Simplify expression for finding vips-cpp libdir. - [#656](https://github.com/lovell/sharp/pull/656) - -* Allow HTTPS-over-HTTP proxy when downloading pre-compiled dependencies. - [@wangzhiwei1888](https://github.com/wangzhiwei1888) - [#679](https://github.com/lovell/sharp/issues/679) - -#### v0.17.0 - 11th December 2016 - -* Drop support for versions of Node prior to v4. 
- -* Deprecate the following output format "option" functions: - quality, progressive, compressionLevel, withoutAdaptiveFiltering, - withoutChromaSubsampling, trellisQuantisation, trellisQuantization, - overshootDeringing, optimiseScans and optimizeScans. - Access to these is now via output format functions, for example `quality(n)` - is now `jpeg({quality: n})` and/or `webp({quality: n})`. - -* Autoconvert GIF and SVG input to PNG output if no other format is specified. - -* Expose libvips' "centre" resize option to mimic \*magick's +0.5px convention. - [#568](https://github.com/lovell/sharp/issues/568) - -* Ensure support for embedded base64 PNG and JPEG images within an SVG. - [#601](https://github.com/lovell/sharp/issues/601) - [@dynamite-ready](https://github.com/dynamite-ready) - -* Ensure premultiply operation occurs before box filter shrink. - [#605](https://github.com/lovell/sharp/issues/605) - [@CmdrShepardsPie](https://github.com/CmdrShepardsPie) - [@teroparvinen](https://github.com/teroparvinen) - -* Add support for PNG and WebP tile-based output formats (in addition to JPEG). - [#622](https://github.com/lovell/sharp/pull/622) - [@ppaskaris](https://github.com/ppaskaris) - -* Allow use of extend with greyscale input. - [#623](https://github.com/lovell/sharp/pull/623) - [@ppaskaris](https://github.com/ppaskaris) - -* Allow non-RGB input to embed/extend onto background with an alpha channel. - [#646](https://github.com/lovell/sharp/issues/646) - [@DaGaMs](https://github.com/DaGaMs) - -### v0.16 - "*pencil*" - -Requires libvips v8.3.3 - -#### v0.16.2 - 22nd October 2016 - -* Restrict readelf usage to Linux only when detecting global libvips version. 
- [#602](https://github.com/lovell/sharp/issues/602) - [@caoko](https://github.com/caoko) - -#### v0.16.1 - 13th October 2016 - -* C++11 ABI version is now auto-detected, remove sharp-cxx11 installation flag. - -* Add experimental 'attention' crop strategy. - [#295](https://github.com/lovell/sharp/issues/295) - -* Include .node extension for Meteor's require() implementation. - [#537](https://github.com/lovell/sharp/issues/537) - [@isjackwild](https://github.com/isjackwild) - -* Ensure convolution kernel scale is clamped to a minimum value of 1. - [#561](https://github.com/lovell/sharp/issues/561) - [@abagshaw](https://github.com/abagshaw) - -* Correct calculation of y-axis placement when overlaying image at a fixed point. - [#566](https://github.com/lovell/sharp/issues/566) - [@Nateowami](https://github.com/Nateowami) - -#### v0.16.0 - 18th August 2016 - -* Add pre-compiled libvips for OS X, ARMv7 and ARMv8. - [#312](https://github.com/lovell/sharp/issues/312) - -* Ensure boolean, bandbool, extractChannel ops occur before sRGB conversion. - [#504](https://github.com/lovell/sharp/pull/504) - [@mhirsch](https://github.com/mhirsch) - -* Recalculate factors after WebP shrink-on-load to avoid round-to-zero errors. - [#508](https://github.com/lovell/sharp/issues/508) - [@asilvas](https://github.com/asilvas) - -* Prevent boolean errors during extract operation. - [#511](https://github.com/lovell/sharp/pull/511) - [@mhirsch](https://github.com/mhirsch) - -* Add joinChannel and toColourspace/toColorspace operations. - [#513](https://github.com/lovell/sharp/pull/513) - [@mhirsch](https://github.com/mhirsch) - -* Add support for raw pixel data with boolean and withOverlay operations. 
- [#516](https://github.com/lovell/sharp/pull/516) - [@mhirsch](https://github.com/mhirsch) - -* Prevent bandbool creating a single channel sRGB image. - [#519](https://github.com/lovell/sharp/pull/519) - [@mhirsch](https://github.com/mhirsch) - -* Ensure ICC profiles are removed from PNG output unless withMetadata used. - [#521](https://github.com/lovell/sharp/issues/521) - [@ChrisPinewood](https://github.com/ChrisPinewood) - -* Add alpha channels, if missing, to overlayWith images. - [#540](https://github.com/lovell/sharp/pull/540) - [@cmtt](https://github.com/cmtt) - -* Remove deprecated interpolateWith method - use resize(w, h, { interpolator: ... }) - [#310](https://github.com/lovell/sharp/issues/310) - -### v0.15 - "*outfit*" - -Requires libvips v8.3.1 - -#### v0.15.1 - 12th July 2016 - -* Concat Stream-based input in single operation for ~+3% perf and less GC. - [#429](https://github.com/lovell/sharp/issues/429) - [@papandreou](https://github.com/papandreou) - -* Add alpha channel, if required, before extend operation. - [#439](https://github.com/lovell/sharp/pull/439) - [@frulo](https://github.com/frulo) - -* Allow overlay image to be repeated across entire image via tile option. - [#443](https://github.com/lovell/sharp/pull/443) - [@lemnisk8](https://github.com/lemnisk8) - -* Add cutout option to overlayWith feature, applies only the alpha channel of the overlay image. - [#448](https://github.com/lovell/sharp/pull/448) - [@kleisauke](https://github.com/kleisauke) - -* Ensure scaling factors are calculated independently to prevent rounding errors. 
- [#452](https://github.com/lovell/sharp/issues/452)
- [@puzrin](https://github.com/puzrin)
-
-* Add --sharp-cxx11 flag to compile with gcc's new C++11 ABI.
- [#456](https://github.com/lovell/sharp/pull/456)
- [@kapouer](https://github.com/kapouer)
-
-* Add top/left offset support to overlayWith operation.
- [#473](https://github.com/lovell/sharp/pull/473)
- [@rnanwani](https://github.com/rnanwani)
-
-* Add convolve operation for kernel-based convolution.
- [#479](https://github.com/lovell/sharp/pull/479)
- [@mhirsch](https://github.com/mhirsch)
-
-* Add greyscale option to threshold operation for colourspace conversion control.
- [#480](https://github.com/lovell/sharp/pull/480)
- [@mhirsch](https://github.com/mhirsch)
-
-* Ensure ICC profiles are licensed for distribution.
- [#486](https://github.com/lovell/sharp/issues/486)
- [@kapouer](https://github.com/kapouer)
-
-* Allow images with an alpha channel to work with LAB-colourspace based sharpen.
- [#490](https://github.com/lovell/sharp/issues/490)
- [@jwagner](https://github.com/jwagner)
-
-* Add trim operation to remove "boring" edges.
- [#492](https://github.com/lovell/sharp/pull/492)
- [@kleisauke](https://github.com/kleisauke)
-
-* Add bandbool feature for channel-wise boolean operations.
- [#496](https://github.com/lovell/sharp/pull/496)
- [@mhirsch](https://github.com/mhirsch)
-
-* Add extractChannel operation to extract a channel from an image.
- [#497](https://github.com/lovell/sharp/pull/497)
- [@mhirsch](https://github.com/mhirsch)
-
-* Add ability to read and write native libvips .v files.
- [#500](https://github.com/lovell/sharp/pull/500) - [@mhirsch](https://github.com/mhirsch) - -* Add boolean feature for bitwise image operations. - [#501](https://github.com/lovell/sharp/pull/501) - [@mhirsch](https://github.com/mhirsch) - -#### v0.15.0 - 21st May 2016 - -* Use libvips' new Lanczos 3 kernel as default for image reduction. - Deprecate interpolateWith method, now provided as a resize option. - [#310](https://github.com/lovell/sharp/issues/310) - [@jcupitt](https://github.com/jcupitt) - -* Take advantage of libvips v8.3 features. - Add support for libvips' new GIF and SVG loaders. - Pre-built binaries now include giflib and librsvg, exclude *magick. - Use shrink-on-load for WebP input. - Break existing sharpen API to accept sigma and improve precision. - [#369](https://github.com/lovell/sharp/issues/369) - -* Remove unnecessary (un)premultiply operations when not resizing/compositing. - [#413](https://github.com/lovell/sharp/issues/413) - [@jardakotesovec](https://github.com/jardakotesovec) - -### v0.14 - "*needle*" - -Requires libvips v8.2.3 - -#### v0.14.1 - 16th April 2016 - -* Allow removal of limitation on input pixel count via limitInputPixels. Use with care. - [#250](https://github.com/lovell/sharp/issues/250) - [#316](https://github.com/lovell/sharp/pull/316) - [@anandthakker](https://github.com/anandthakker) - [@kentongray](https://github.com/kentongray) - -* Use final output image for metadata passed to callback. - [#399](https://github.com/lovell/sharp/pull/399) - [@salzhrani](https://github.com/salzhrani) - -* Add support for writing tiled images to a zip container. 
- [#402](https://github.com/lovell/sharp/pull/402) - [@felixbuenemann](https://github.com/felixbuenemann) - -* Allow use of embed with 1 and 2 channel images. - [#411](https://github.com/lovell/sharp/issues/411) - [@janaz](https://github.com/janaz) - -* Improve Electron compatibility by allowing node-gyp rebuilds without npm. - [#412](https://github.com/lovell/sharp/issues/412) - [@nouh](https://github.com/nouh) - -#### v0.14.0 - 2nd April 2016 - -* Add ability to extend (pad) the edges of an image. - [#128](https://github.com/lovell/sharp/issues/128) - [@blowsie](https://github.com/blowsie) - -* Add support for Zoomify and Google tile layouts. Breaks existing tile API. - [#223](https://github.com/lovell/sharp/issues/223) - [@bdunnette](https://github.com/bdunnette) - -* Improvements to overlayWith: differing sizes/formats, gravity, buffer input. - [#239](https://github.com/lovell/sharp/issues/239) - [@chrisriley](https://github.com/chrisriley) - -* Add entropy-based crop strategy to remove least interesting edges. - [#295](https://github.com/lovell/sharp/issues/295) - [@rightaway](https://github.com/rightaway) - -* Expose density metadata; set density of images from vector input. - [#338](https://github.com/lovell/sharp/issues/338) - [@lookfirst](https://github.com/lookfirst) - -* Emit post-processing 'info' event for Stream output. - [#367](https://github.com/lovell/sharp/issues/367) - [@salzhrani](https://github.com/salzhrani) - -* Ensure output image EXIF Orientation values are within 1-8 range. - [#385](https://github.com/lovell/sharp/pull/385) - [@jtobinisaniceguy](https://github.com/jtobinisaniceguy) - -* Ensure ratios are not swapped when rotating 90/270 and ignoring aspect. 
- [#387](https://github.com/lovell/sharp/issues/387) - [@kleisauke](https://github.com/kleisauke) - -* Remove deprecated style of calling extract API. Breaks calls using positional arguments. - [#276](https://github.com/lovell/sharp/issues/276) - -### v0.13 - "*mind*" - -Requires libvips v8.2.2 - -#### v0.13.1 - 27th February 2016 - -* Fix embedding onto transparent backgrounds; regression introduced in v0.13.0. - [#366](https://github.com/lovell/sharp/issues/366) - [@diegocsandrim](https://github.com/diegocsandrim) - -#### v0.13.0 - 15th February 2016 - -* Improve vector image support by allowing control of density/DPI. - Switch pre-built libs from Imagemagick to Graphicsmagick. - [#110](https://github.com/lovell/sharp/issues/110) - [@bradisbell](https://github.com/bradisbell) - -* Add support for raw, uncompressed pixel Buffer/Stream input. - [#220](https://github.com/lovell/sharp/issues/220) - [@mikemorris](https://github.com/mikemorris) - -* Switch from libvips' C to C++ bindings, requires upgrade to v8.2.2. - [#299](https://github.com/lovell/sharp/issues/299) - -* Control number of open files in libvips' cache; breaks existing `cache` behaviour. - [#315](https://github.com/lovell/sharp/issues/315) - [@impomezia](https://github.com/impomezia) - -* Ensure 16-bit input images can be normalised and embedded onto transparent backgrounds. - [#339](https://github.com/lovell/sharp/issues/339) - [#340](https://github.com/lovell/sharp/issues/340) - [@janaz](https://github.com/janaz) - -* Ensure selected format takes precedence over any unknown output filename extension. 
- [#344](https://github.com/lovell/sharp/issues/344) - [@ubaltaci](https://github.com/ubaltaci) - -* Add support for libvips' PBM, PGM, PPM and FITS image format loaders. - [#347](https://github.com/lovell/sharp/issues/347) - [@oaleynik](https://github.com/oaleynik) - -* Ensure default crop gravity is center/centre. - [#351](https://github.com/lovell/sharp/pull/351) - [@joelmukuthu](https://github.com/joelmukuthu) - -* Improve support for musl libc systems e.g. Alpine Linux. - [#354](https://github.com/lovell/sharp/issues/354) - [#359](https://github.com/lovell/sharp/pull/359) - [@download13](https://github.com/download13) - [@wjordan](https://github.com/wjordan) - -* Small optimisation when reducing by an integral factor to favour shrink over affine. - -* Add support for gamma correction of images with an alpha channel. - -### v0.12 - "*look*" - -Requires libvips v8.2.0 - -#### v0.12.2 - 16th January 2016 - -* Upgrade libvips to v8.2.0 for improved vips_shrink. - -* Add pre-compiled libvips for ARMv6+ CPUs. - -* Ensure 16-bit input images work with embed option. - [#325](https://github.com/lovell/sharp/issues/325) - [@janaz](https://github.com/janaz) - -* Allow compilation with gmake to provide FreeBSD support. - [#326](https://github.com/lovell/sharp/issues/326) - [@c0decafe](https://github.com/c0decafe) - -* Attempt to remove temporary file after installation. - [#331](https://github.com/lovell/sharp/issues/331) - [@dtoubelis](https://github.com/dtoubelis) - -#### v0.12.1 - 12th December 2015 - -* Allow use of SIMD vector instructions (via liborc) to be toggled on/off. 
- [#172](https://github.com/lovell/sharp/issues/172)
- [@bkw](https://github.com/bkw)
- [@puzrin](https://github.com/puzrin)
-
-* Ensure embedded ICC profiles output with perceptual intent.
- [#321](https://github.com/lovell/sharp/issues/321)
- [@vlapo](https://github.com/vlapo)
-
-* Use the NPM-configured HTTPS proxy, if any, for binary downloads.
-
-#### v0.12.0 - 23rd November 2015
-
-* Bundle pre-compiled libvips and its dependencies for 64-bit Linux and Windows.
- [#42](https://github.com/lovell/sharp/issues/42)
-
-* Take advantage of libvips v8.1.0+ features.
- [#152](https://github.com/lovell/sharp/issues/152)
-
-* Add support for 64-bit Windows. Drop support for 32-bit Windows.
- [#224](https://github.com/lovell/sharp/issues/224)
- [@sabrehagen](https://github.com/sabrehagen)
-
-* Switch default interpolator to bicubic.
- [#289](https://github.com/lovell/sharp/issues/289)
- [@mahnunchik](https://github.com/mahnunchik)
-
-* Pre-extract rotation should not swap width/height.
- [#296](https://github.com/lovell/sharp/issues/296)
- [@asilvas](https://github.com/asilvas)
-
-* Ensure 16-bit+alpha input images are (un)premultiplied correctly.
- [#301](https://github.com/lovell/sharp/issues/301)
- [@izaakschroeder](https://github.com/izaakschroeder)
-
-* Add `threshold` operation.
- [#303](https://github.com/lovell/sharp/pull/303)
- [@dacarley](https://github.com/dacarley)
-
-* Add `negate` operation.
- [#306](https://github.com/lovell/sharp/pull/306)
- [@dacarley](https://github.com/dacarley)
-
-* Support `options` Object with existing `extract` operation.
- [#309](https://github.com/lovell/sharp/pull/309)
- [@papandreou](https://github.com/papandreou)
-
-### v0.11 - "*knife*"
-
-#### v0.11.4 - 5th November 2015
-
-* Add corners, e.g. `northeast`, to existing `gravity` option.
- [#291](https://github.com/lovell/sharp/pull/291)
- [@brandonaaron](https://github.com/brandonaaron)
-
-* Ensure correct auto-rotation for EXIF Orientation values 2 and 4.
- [#288](https://github.com/lovell/sharp/pull/288)
- [@brandonaaron](https://github.com/brandonaaron)
-
-* Make static linking possible via `--runtime_link` install option.
- [#287](https://github.com/lovell/sharp/pull/287)
- [@vlapo](https://github.com/vlapo)
-
-#### v0.11.3 - 8th September 2015
-
-* Interpret blurSigma, sharpenFlat, and sharpenJagged as double precision.
- [#263](https://github.com/lovell/sharp/pull/263)
- [@chrisriley](https://github.com/chrisriley)
-
-#### v0.11.2 - 28th August 2015
-
-* Allow crop gravity to be provided as a String.
- [#255](https://github.com/lovell/sharp/pull/255)
- [@papandreou](https://github.com/papandreou)
-* Add support for io.js v3 and Node v4.
- [#246](https://github.com/lovell/sharp/issues/246)
-
-#### v0.11.1 - 12th August 2015
-
-* Silence MSVC warning: "C4530: C++ exception handler used, but unwind semantics are not enabled".
- [#244](https://github.com/lovell/sharp/pull/244)
- [@TheThing](https://github.com/TheThing)
-
-* Suppress gamma correction for input image with alpha transparency.
- [#249](https://github.com/lovell/sharp/issues/249)
- [@compeak](https://github.com/compeak)
-
-#### v0.11.0 - 15th July 2015
-
-* Allow alpha transparency compositing via new `overlayWith` method.
- [#97](https://github.com/lovell/sharp/issues/97) - [@gasi](https://github.com/gasi) - -* Expose raw ICC profile data as a Buffer when using `metadata`. - [#129](https://github.com/lovell/sharp/issues/129) - [@homerjam](https://github.com/homerjam) - -* Allow image header updates via a parameter passed to existing `withMetadata` method. - Provide initial support for EXIF `Orientation` tag, - which if present is now removed when using `rotate`, `flip` or `flop`. - [#189](https://github.com/lovell/sharp/issues/189) - [@h2non](https://github.com/h2non) - -* Tighten constructor parameter checks. - [#221](https://github.com/lovell/sharp/issues/221) - [@mikemorris](https://github.com/mikemorris) - -* Allow one input Stream to be shared with two or more output Streams via new `clone` method. - [#235](https://github.com/lovell/sharp/issues/235) - [@jaubourg](https://github.com/jaubourg) - -* Use `round` instead of `floor` when auto-scaling dimensions to avoid floating-point rounding errors. - [#238](https://github.com/lovell/sharp/issues/238) - [@richardadjogah](https://github.com/richardadjogah) - -### v0.10 - "*judgment*" - -#### v0.10.1 - 1st June 2015 - -* Allow embed of image with alpha transparency onto non-transparent background. - [#204](https://github.com/lovell/sharp/issues/204) - [@mikemliu](https://github.com/mikemliu) - -* Include C standard library for `atoi` as Xcode 6.3 appears to no longer do this. - [#228](https://github.com/lovell/sharp/issues/228) - [@doggan](https://github.com/doggan) - -#### v0.10.0 - 23rd April 2015 - -* Add support for Windows (x86). 
- [#19](https://github.com/lovell/sharp/issues/19) - [@DullReferenceException](https://github.com/DullReferenceException) - [@itsananderson](https://github.com/itsananderson) - -* Add support for Openslide input and DeepZoom output. - [#146](https://github.com/lovell/sharp/issues/146) - [@mvictoras](https://github.com/mvictoras) - -* Allow arbitrary aspect ratios when resizing images via new `ignoreAspectRatio` method. - [#192](https://github.com/lovell/sharp/issues/192) - [@skedastik](https://github.com/skedastik) - -* Enhance output image contrast by stretching its luminance to cover the full dynamic range via new `normalize` method. - [#194](https://github.com/lovell/sharp/issues/194) - [@bkw](https://github.com/bkw) - [@codingforce](https://github.com/codingforce) diff --git a/docs/firebase.json b/docs/firebase.json new file mode 100644 index 000000000..f00270acb --- /dev/null +++ b/docs/firebase.json @@ -0,0 +1,16 @@ +{ + "hosting": { + "site": "pixelplumbing-sharp", + "public": "dist", + "headers": [ + { + "source": "**", + "headers": [ + { "key": "Cache-Control", "value": "max-age=86400" }, + { "key": "X-Content-Type-Options", "value": "nosniff" }, + { "key": "X-Frame-Options", "value": "SAMEORIGIN" } + ] + } + ] + } +} diff --git a/docs/index.md b/docs/index.md deleted file mode 100644 index bcfdd81ec..000000000 --- a/docs/index.md +++ /dev/null @@ -1,119 +0,0 @@ -# sharp - -The typical use case for this high speed Node.js module -is to convert large images in common formats to -smaller, web-friendly JPEG, PNG and WebP images of varying dimensions. - -Resizing an image is typically 4x-5x faster than using the -quickest ImageMagick and GraphicsMagick settings. - -Colour spaces, embedded ICC profiles and alpha transparency channels are all handled correctly. -Lanczos resampling ensures quality is not sacrificed for speed. 
- -As well as image resizing, operations such as -rotation, extraction, compositing and gamma correction are available. - -OS X, Windows (x64), Linux (x64, ARM) systems do not require -the installation of any external runtime dependencies. - -[![Test Coverage](https://coveralls.io/repos/lovell/sharp/badge.png?branch=master)](https://coveralls.io/r/lovell/sharp?branch=master) - -### Formats - -This module supports reading JPEG, PNG, WebP, TIFF, GIF and SVG images. - -Output images can be in JPEG, PNG, WebP and TIFF formats as well as uncompressed raw pixel data. - -Streams, Buffer objects and the filesystem can be used for input and output. - -A single input Stream can be split into multiple processing pipelines and output Streams. - -Deep Zoom image pyramids can be generated, -suitable for use with "slippy map" tile viewers like -[OpenSeadragon](https://github.com/openseadragon/openseadragon) -and [Leaflet](https://github.com/turban/Leaflet.Zoomify). - -### Fast - -This module is powered by the blazingly fast -[libvips](https://github.com/jcupitt/libvips) image processing library, -originally created in 1989 at Birkbeck College -and currently maintained by -[John Cupitt](https://github.com/jcupitt). - -Only small regions of uncompressed image data -are held in memory and processed at a time, -taking full advantage of multiple CPU cores and L1/L2/L3 cache. - -Everything remains non-blocking thanks to _libuv_, -no child processes are spawned and Promises/A+ are supported. - -### Optimal - -Huffman tables are optimised when generating JPEG output images -without having to use separate command line tools like -[jpegoptim](https://github.com/tjko/jpegoptim) and -[jpegtran](http://jpegclub.org/jpegtran/). - -PNG filtering can be disabled, -which for diagrams and line art often produces the same result -as [pngcrush](http://pmt.sourceforge.net/pngcrush/). 
- -### Contributing - -A [guide for contributors](https://github.com/lovell/sharp/blob/master/CONTRIBUTING.md) -covers reporting bugs, requesting features and submitting code changes. - -### Credits - -This module would never have been possible without -the help and code contributions of the following people: - -* [John Cupitt](https://github.com/jcupitt) -* [Pierre Inglebert](https://github.com/pierreinglebert) -* [Jonathan Ong](https://github.com/jonathanong) -* [Chanon Sajjamanochai](https://github.com/chanon) -* [Juliano Julio](https://github.com/julianojulio) -* [Daniel Gasienica](https://github.com/gasi) -* [Julian Walker](https://github.com/julianwa) -* [Amit Pitaru](https://github.com/apitaru) -* [Brandon Aaron](https://github.com/brandonaaron) -* [Andreas Lind](https://github.com/papandreou) -* [Maurus Cuelenaere](https://github.com/mcuelenaere) -* [Linus Unnebäck](https://github.com/LinusU) -* [Victor Mateevitsi](https://github.com/mvictoras) -* [Alaric Holloway](https://github.com/skedastik) -* [Bernhard K. Weisshuhn](https://github.com/bkw) -* [David A. Carley](https://github.com/dacarley) -* [John Tobin](https://github.com/jtobinisaniceguy) -* [Kenton Gray](https://github.com/kentongray) -* [Felix Bünemann](https://github.com/felixbuenemann) -* [Samy Al Zahrani](https://github.com/salzhrani) -* [Chintan Thakkar](https://github.com/lemnisk8) -* [F. 
Orlando Galashan](https://github.com/frulo) -* [Kleis Auke Wolthuizen](https://github.com/kleisauke) -* [Matt Hirsch](https://github.com/mhirsch) -* [Rahul Nanwani](https://github.com/rnanwani) -* [Matthias Thoemmes](https://github.com/cmtt) -* [Patrick Paskaris](https://github.com/ppaskaris) -* [Jérémy Lal](https://github.com/kapouer) -* [Alice Monday](https://github.com/alice0meta) -* [Kristo Jorgenson](https://github.com/kristojorg) -* [Yves Bos](https://github.com/YvesBos) - -Thank you! - -### Licence - -Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors. - -Licensed under the Apache License, Version 2.0 (the "License"); -you may not use this file except in compliance with the License. -You may obtain a copy of the License at -[http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0.html) - -Unless required by applicable law or agreed to in writing, software -distributed under the License is distributed on an "AS IS" BASIS, -WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -See the License for the specific language governing permissions and -limitations under the License. 
diff --git a/docs/install.md b/docs/install.md deleted file mode 100644 index b2950e0a4..000000000 --- a/docs/install.md +++ /dev/null @@ -1,253 +0,0 @@
-# Installation
-
-```sh
-npm install sharp
-```
-
-```sh
-yarn add sharp
-```
-
-### Prerequisites
-
-* Node v4.5.0+
-* C++11 compatible compiler such as gcc 4.8+, clang 3.0+ or MSVC 2013+
-* [node-gyp](https://github.com/TooTallNate/node-gyp#installation) and its dependencies (includes Python)
-
-### Linux
-
-[![Ubuntu 14.04 Build Status](https://travis-ci.org/lovell/sharp.png?branch=master)](https://travis-ci.org/lovell/sharp)
-[![Linux Build Status](https://circleci.com/gh/lovell/sharp.svg?style=svg&circle-token=6cb6d1d287a51af83722b19ed8885377fbc85e5c)](https://circleci.com/gh/lovell/sharp)
-
-libvips and its dependencies are fetched and stored within `node_modules/sharp/vendor` during `npm install`.
-This involves an automated HTTPS download of approximately 7MB.
-
-Most recent Linux-based operating systems with glibc running on x64 and ARMv6+ CPUs should "just work", e.g.:
-
-* Debian 7, 8
-* Ubuntu 12.04, 14.04, 16.04
-* Centos 7
-* Fedora
-* openSUSE 13.2
-* Archlinux
-* Raspbian Jessie
-* Amazon Linux 2016.03, 2016.09
-* Solus
-
-To use a globally-installed version of libvips instead of the provided binaries,
-make sure it is at least the version listed under `config.libvips` in the `package.json` file
-and that it can be located using `pkg-config --modversion vips-cpp`.
-
-If you are using non-standard paths (anything other than `/usr` or `/usr/local`),
-you might need to set `PKG_CONFIG_PATH` during `npm install`
-and `LD_LIBRARY_PATH` at runtime.
-
-This allows the use of newer versions of libvips with older versions of sharp.
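As a sketch, assuming a hypothetical self-compiled libvips prefix of `/opt/vips` (the path is illustrative; substitute your own), the two variables might be set like this:

```shell
# Hypothetical install prefix for a self-compiled libvips.
VIPS_PREFIX=/opt/vips

# Let pkg-config locate vips-cpp.pc during `npm install`.
export PKG_CONFIG_PATH="$VIPS_PREFIX/lib/pkgconfig${PKG_CONFIG_PATH:+:$PKG_CONFIG_PATH}"

# Let the dynamic linker locate libvips at runtime.
export LD_LIBRARY_PATH="$VIPS_PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```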
- -For 32-bit Intel CPUs and older Linux-based operating systems such as Centos 6, -it is recommended to install a system-wide installation of libvips from source: - -https://jcupitt.github.io/libvips/install.html#building-libvips-from-a-source-tarball - -#### Alpine Linux - -libvips is available in the -[testing repository](https://pkgs.alpinelinux.org/packages?name=vips-dev): - -```sh -apk add vips-dev fftw-dev --update-cache --repository https://dl-3.alpinelinux.org/alpine/edge/testing/ -``` - -The smaller stack size of musl libc means -libvips may need to be used without a cache -via `sharp.cache(false)` to avoid a stack overflow. - -### Mac OS - -[![OS X 10.9.5 Build Status](https://travis-ci.org/lovell/sharp.png?branch=master)](https://travis-ci.org/lovell/sharp) - -libvips and its dependencies are fetched and stored within `node_modules/sharp/vendor` during `npm install`. -This involves an automated HTTPS download of approximately 7MB. - -To use your own version of libvips instead of the provided binaries, make sure it is -at least the version listed under `config.libvips` in the `package.json` file and -that it can be located using `pkg-config --modversion vips-cpp`. - -### Windows x64 - -[![Windows x64 Build Status](https://ci.appveyor.com/api/projects/status/pgtul704nkhhg6sg)](https://ci.appveyor.com/project/lovell/sharp) - -libvips and its dependencies are fetched and stored within `node_modules\sharp\vendor` during `npm install`. -This involves an automated HTTPS download of approximately 11MB. - -Only 64-bit (x64) `node.exe` is supported. - -### FreeBSD - -libvips must be installed before `npm install` is run. 
-This can be achieved via [FreshPorts](https://www.freshports.org/graphics/vips/):
-
-```sh
-cd /usr/ports/graphics/vips/ && make install clean
-```
-
-FreeBSD's gcc v4 and v5 need `CXXFLAGS=-D_GLIBCXX_USE_C99` set for C++11 support due to
-https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=193528
-
-### Heroku
-
-libvips and its dependencies are fetched and stored within `node_modules/sharp/vendor` during `npm install`.
-This involves an automated HTTPS download of approximately 7MB.
-
-Set [NODE_MODULES_CACHE](https://devcenter.heroku.com/articles/nodejs-support#cache-behavior)
-to `false` when using the `yarn` package manager.
-
-### Docker
-
-[Marc Bachmann](https://github.com/marcbachmann) maintains an
-[Ubuntu-based Dockerfile for libvips](https://github.com/marcbachmann/dockerfile-libvips).
-
-```sh
-docker pull marcbachmann/libvips
-```
-
-[Will Jordan](https://github.com/wjordan) maintains an
-[Alpine-based Dockerfile for libvips](https://github.com/wjordan/dockerfile-libvips).
-
-```sh
-docker pull wjordan/libvips
-```
-
-[Tailor Brands](https://github.com/TailorBrands) maintain
-[Debian-based Dockerfiles for libvips and nodejs](https://github.com/TailorBrands/docker-libvips).
-
-```sh
-docker pull tailor/docker-libvips
-```
-
-### AWS Lambda
-
-In order to use sharp on AWS Lambda, you need to [create a deployment package](http://docs.aws.amazon.com/lambda/latest/dg/nodejs-create-deployment-pkg.html). Because sharp
-downloads and links libraries for the current platform during `npm install`, you have to
-do this on a system similar to the [Lambda Execution Environment](http://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html). The easiest way to do this is to set up a
-small t2.micro instance using the AMI ID listed in the previous link, ssh into it as ec2-user
-and follow the instructions below.
-
-Install dependencies:
-
-```sh
-curl -s https://rpm.nodesource.com/setup_4.x | sudo bash -
-sudo yum install -y gcc-c++ nodejs
-```
-
-Copy your code and package.json to the instance using `scp` and create a deployment package:
-
-```sh
-cd sharp-lambda-example
-npm install
-zip -ur9 ../sharp-lambda-example.zip index.js node_modules
-```
-
-You can now download your deployment ZIP using `scp` and upload it to Lambda. Be sure to set your Lambda runtime to Node.js 4.3.
-
-**Performance Tip:** To get the best performance on Lambda, choose the largest memory available because this also gives you the most CPU time (a 1536 MB function is 12x faster than a 128 MB function).
-
-### Build tools
-
-* [gulp-responsive](https://www.npmjs.com/package/gulp-responsive)
-* [grunt-sharp](https://www.npmjs.com/package/grunt-sharp)
-
-### CLI tools
-
-* [sharp-cli](https://www.npmjs.com/package/sharp-cli)
-
-### Security
-
-Many users of this module process untrusted, user-supplied images,
-but there are aspects of security to consider when doing so.
-
-It is possible to compile libvips with support for various third-party image loaders.
-Each of these libraries has undergone differing levels of security testing.
-
-Whilst tools such as [American Fuzzy Lop](http://lcamtuf.coredump.cx/afl/)
-and [Valgrind](http://valgrind.org/) have been used to test
-the most popular web-based formats, as well as libvips itself,
-you are advised to perform your own testing and sandboxing.
-
-ImageMagick in particular has a relatively large attack surface,
-which can be partially mitigated with a
-[policy.xml](http://www.imagemagick.org/script/resources.php)
-configuration file to prevent the use of coders known to be vulnerable.
-
-```xml
-<policymap>
-  <policy domain="coder" rights="none" pattern="EPHEMERAL" />
-  <policy domain="coder" rights="none" pattern="HTTPS" />
-  <policy domain="coder" rights="none" pattern="HTTP" />
-  <policy domain="coder" rights="none" pattern="URL" />
-  <policy domain="coder" rights="none" pattern="FTP" />
-  <policy domain="coder" rights="none" pattern="MVG" />
-  <policy domain="coder" rights="none" pattern="MSL" />
-  <policy domain="coder" rights="none" pattern="TEXT" />
-  <policy domain="coder" rights="none" pattern="LABEL" />
-</policymap>
-```
-
-Set the `MAGICK_CONFIGURE_PATH` environment variable
-to the directory containing the `policy.xml` file.
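A minimal sketch of putting such a policy in place; the directory and the pattern list are illustrative only, so extend the patterns to suit your threat model:

```shell
# Create a directory for the policy and write a small example policy to it.
POLICY_DIR="$(mktemp -d)"
cat > "$POLICY_DIR/policy.xml" <<'EOF'
<policymap>
  <policy domain="coder" rights="none" pattern="MVG" />
  <policy domain="coder" rights="none" pattern="MSL" />
</policymap>
EOF

# ImageMagick reads policy.xml from this directory at startup.
export MAGICK_CONFIGURE_PATH="$POLICY_DIR"
```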
- -### Pre-compiled libvips binaries - -If a global installation of libvips that meets the -minimum version requirement cannot be found, -this module will attempt to download a pre-compiled bundle of libvips -and its dependencies on Linux and Windows machines. - -Should you need to manually download and inspect these files, -you can do so via https://dl.bintray.com/lovell/sharp/ - -Should you wish to install these from your own location, -set the `SHARP_DIST_BASE_URL` environment variable, e.g. - -```sh -SHARP_DIST_BASE_URL="https://hostname/path/" npm install sharp -``` - -to use `https://hostname/path/libvips-x.y.z-platform.tar.gz`. - -### Licences - -This module is licensed under the terms of the -[Apache 2.0 Licence](https://github.com/lovell/sharp/blob/master/LICENSE). - -The libraries downloaded and used by this module -are done so under the terms of the following licences, -all of which are compatible with the Apache 2.0 Licence. - -Use of libraries under the terms of the LGPLv3 is via the -"any later version" clause of the LGPLv2 or LGPLv2.1. 
- -| Library | Used under the terms of | -|---------------|----------------------------------------------------------------------------------------------------------| -| cairo | Mozilla Public License 2.0 | -| expat | MIT Licence | -| fontconfig | [fontconfig Licence](https://cgit.freedesktop.org/fontconfig/tree/COPYING) (BSD-like) | -| freetype | [freetype Licence](http://git.savannah.gnu.org/cgit/freetype/freetype2.git/tree/docs/FTL.TXT) (BSD-like) | -| giflib | MIT Licence | -| glib | LGPLv3 | -| harfbuzz | MIT Licence | -| lcms | MIT Licence | -| libcroco | LGPLv3 | -| libexif | LGPLv3 | -| libffi | MIT Licence | -| libgsf | LGPLv3 | -| libjpeg-turbo | [zlib License, IJG License](https://github.com/libjpeg-turbo/libjpeg-turbo/blob/master/LICENSE.md) | -| libpng | [libpng License](http://www.libpng.org/pub/png/src/libpng-LICENSE.txt) | -| librsvg | LGPLv3 | -| libtiff | [libtiff License](http://www.libtiff.org/misc.html) (BSD-like) | -| libvips | LGPLv3 | -| libwebp | New BSD License | -| libxml2 | MIT Licence | -| pango | LGPLv3 | -| pixman | MIT Licence | -| zlib | [zlib Licence](https://github.com/madler/zlib/blob/master/zlib.h) | diff --git a/docs/package.json b/docs/package.json new file mode 100644 index 000000000..7eaa41943 --- /dev/null +++ b/docs/package.json @@ -0,0 +1,18 @@ +{ + "name": "sharp-docs", + "type": "module", + "version": "0.0.1", + "private": true, + "scripts": { + "dev": "astro dev", + "start": "astro dev", + "build": "astro build", + "preview": "astro preview", + "astro": "astro" + }, + "dependencies": { + "@astrojs/starlight": "^0.36.2", + "astro": "^5.15.3", + "starlight-auto-sidebar": "^0.1.3" + } +} diff --git a/docs/performance.md b/docs/performance.md deleted file mode 100644 index 376d0c5df..000000000 --- a/docs/performance.md +++ /dev/null @@ -1,73 +0,0 @@ -# Performance - -### Test environment - -* AWS EC2 eu-central-1 [c4.xlarge](http://aws.amazon.com/ec2/instance-types/#c4) (4x E5-2666 v3 @ 2.90GHz) 
-* Ubuntu 16.04.1 LTS (HVM, SSD, 20161115, ami-82cf0aed) -* Node.js v6.9.1 - -### The contenders - -* [jimp](https://www.npmjs.com/package/jimp) v0.2.27 - Image processing in pure JavaScript. Bilinear interpolation only. -* [lwip](https://www.npmjs.com/package/lwip) v0.0.9 - Wrapper around CImg. Compiles outdated, insecure dependencies from source. -* [mapnik](https://www.npmjs.org/package/mapnik) v3.5.14 - Whilst primarily a map renderer, Mapnik contains bitmap image utilities. -* [imagemagick-native](https://www.npmjs.com/package/imagemagick-native) v1.9.3 - Wrapper around libmagick++, supports Buffers only. -* [imagemagick](https://www.npmjs.com/package/imagemagick) v0.1.3 - Supports filesystem only and "*has been unmaintained for a long time*". -* [gm](https://www.npmjs.com/package/gm) v1.23.0 - Fully featured wrapper around GraphicsMagick's `gm` command line utility. -* sharp v0.17.0 / libvips v8.4.2 - Caching within libvips disabled to ensure a fair comparison. - -### The task - -Decompress a 2725x2225 JPEG image, -resize to 720x588 using Lanczos 3 resampling (where available), -then compress to JPEG at a "quality" setting of 80. - -### Results - -| Module | Input | Output | Ops/sec | Speed-up | -| :----------------- | :----- | :----- | ------: | -------: | -| jimp (bilinear) | buffer | buffer | 1.06 | 1.0 | -| lwip | buffer | buffer | 1.87 | 1.8 | -| mapnik | buffer | buffer | 2.91 | 2.7 | -| imagemagick-native | buffer | buffer | 4.03 | 3.8 | -| imagemagick | file | file | 7.10 | 6.7 | -| gm | buffer | buffer | 7.08 | 6.7 | -| gm | file | file | 7.10 | 6.7 | -| sharp | stream | stream | 27.61 | 26.0 | -| sharp | file | file | 28.41 | 26.8 | -| sharp | buffer | file | 28.71 | 27.1 | -| sharp | file | buffer | 28.60 | 27.0 | -| sharp | buffer | buffer | 29.08 | 27.4 | - -Greater libvips performance can be expected with caching enabled (default) -and using 8+ core machines, especially those with larger L1/L2 CPU caches. 
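The speed-up column above is simply each module's ops/sec relative to the jimp baseline of 1.06 ops/sec, rounded to one decimal place — a quick check:

```javascript
// Reproduce the speed-up column from the ops/sec figures in the table.
const baselineOpsPerSec = 1.06; // jimp (bilinear), buffer-to-buffer

const speedUp = (opsPerSec) =>
  Math.round((opsPerSec / baselineOpsPerSec) * 10) / 10;

console.log(speedUp(28.41)); // sharp, file-to-file: 26.8
console.log(speedUp(7.08)); // gm, buffer-to-buffer: 6.7
```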
- -The I/O limits of the relevant (de)compression library will generally determine maximum throughput. - -### Benchmark test prerequisites - -Requires both _ImageMagick_ and _GraphicsMagick_: - -```sh -brew install imagemagick -brew install graphicsmagick -``` - -```sh -sudo apt-get install imagemagick libmagick++-dev graphicsmagick -``` - -```sh -sudo yum install ImageMagick-devel ImageMagick-c++-devel GraphicsMagick -``` - -### Running the benchmark test - -```sh -git clone https://github.com/lovell/sharp.git -cd sharp -npm install -cd test/bench -npm install -npm test -``` diff --git a/docs/public/api-resize-fit.svg b/docs/public/api-resize-fit.svg new file mode 100644 index 000000000..9227a0d2f --- /dev/null +++ b/docs/public/api-resize-fit.svg @@ -0,0 +1,61 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + cover + + + + contain + + + + fill + + + + + inside + + + + + outside + + diff --git a/docs/public/favicon.svg b/docs/public/favicon.svg new file mode 100644 index 000000000..fc185469f --- /dev/null +++ b/docs/public/favicon.svg @@ -0,0 +1,5 @@ + + + + + \ No newline at end of file diff --git a/docs/public/humans.txt b/docs/public/humans.txt new file mode 100644 index 000000000..7119152ab --- /dev/null +++ b/docs/public/humans.txt @@ -0,0 +1,328 @@ +/* THANKS */ + +Name: John Cupitt +GitHub: https://github.com/jcupitt + +Name: Pierre Inglebert +GitHub: https://github.com/pierreinglebert + +Name: Jonathan Ong +GitHub: https://github.com/jonathanong + +Name: Chanon Sajjamanochai +GitHub: https://github.com/chanon + +Name: Juliano Julio +GitHub: https://github.com/julianojulio + +Name: Daniel Gasienica +GitHub: https://github.com/gasi + +Name: Julian Walker +GitHub: https://github.com/julianwa + +Name: Amit Pitaru +GitHub: https://github.com/apitaru + +Name: Brandon Aaron +GitHub: 
https://github.com/brandonaaron + +Name: Andreas Lind +GitHub: https://github.com/papandreou + +Name: Maurus Cuelenaere +GitHub: https://github.com/mcuelenaere + +Name: Linus Unnebäck +GitHub: https://github.com/LinusU + +Name: Victor Mateevitsi +GitHub: https://github.com/mvictoras + +Name: Alaric Holloway +GitHub: https://github.com/skedastik + +Name: Bernhard K. Weisshuhn +GitHub: https://github.com/bkw + +Name: David A. Carley +GitHub: https://github.com/dacarley + +Name: John Tobin +GitHub: https://github.com/jtobinisaniceguy + +Name: Kenton Gray +GitHub: https://github.com/kentongray + +Name: Felix Bünemann +GitHub: https://github.com/felixbuenemann + +Name: Samy Al Zahrani +GitHub: https://github.com/salzhrani + +Name: Chintan Thakkar +GitHub: https://github.com/lemnisk8 + +Name: F. Orlando Galashan +GitHub: https://github.com/frulo + +Name: Kleis Auke Wolthuizen +GitHub: https://github.com/kleisauke + +Name: Matt Hirsch +GitHub: https://github.com/mhirsch + +Name: Rahul Nanwani +GitHub: https://github.com/rnanwani + +Name: Matthias Thoemmes +GitHub: https://github.com/cmtt + +Name: Patrick Paskaris +GitHub: https://github.com/ppaskaris + +Name: Jérémy Lal +GitHub: https://github.com/kapouer + +Name: Alice Monday +GitHub: https://github.com/alice0meta + +Name: Kristo Jorgenson +GitHub: https://github.com/kristojorg + +Name: Yves Bos +GitHub: https://github.com/YvesBos + +Name: Nicolas Coden +GitHub: https://github.com/ncoden + +Name: Matt Parrish +GitHub: https://github.com/pbomb + +Name: Matthew McEachen +GitHub: https://github.com/mceachen + +Name: Jarda Kotěšovec +GitHub: 
https://github.com/jardakotesovec + +Name: Kenric D'Souza +GitHub: https://github.com/AzureByte + +Name: Oleh Aleinyk +GitHub: https://github.com/oaleynik + +Name: Marcel Bretschneider +GitHub: https://github.com/3epnm + +Name: Andrea Bianco +GitHub: https://github.com/BiancoA + +Name: Rik Heywood +GitHub: https://github.com/rikh42 + +Name: Thomas Parisot +GitHub: https://github.com/oncletom + +Name: Nathan Graves +GitHub: https://github.com/woolite64 + +Name: Tom Lokhorst +GitHub: https://github.com/tomlokhorst + +Name: Espen Hovlandsdal +GitHub: https://github.com/rexxars + +Name: Sylvain Dumont +GitHub: https://github.com/sylvaindumont + +Name: Alun Davies +GitHub: https://github.com/alundavies + +Name: Aidan Hoolachan +GitHub: https://github.com/ajhool + +Name: Axel Eirola +GitHub: https://github.com/aeirola + +Name: Freezy +GitHub: https://github.com/freezy + +Name: Julian Aubourg +GitHub: https://github.com/jaubourg + +Name: Keith Belovay +GitHub: https://github.com/fromkeith + +Name: Michael B. 
Klein +GitHub: https://github.com/mbklein + +Name: Jakub Michálek +GitHub: https://github.com/Goues + +Name: Ilya Ovdin +GitHub: https://github.com/iovdin + +Name: Andargor +GitHub: https://github.com/Andargor + +Name: Nicolas Stepien +GitHub: https://github.com/MayhemYDG + +Name: Paul Neave +GitHub: https://github.com/neave + +Name: Brendan Kennedy +GitHub: https://github.com/rustyguts + +Name: Brychan Bennett-Odlum +GitHub: https://github.com/BrychanOdlum + +Name: Edward Silverton +GitHub: https://github.com/edsilv + +Name: Dumitru Deveatii +GitHub: https://github.com/dimadeveatii + +Name: Roland Asmann +GitHub: https://github.com/malice00 + +Name: Roman Malieiev +GitHub: https://github.com/romaleev + +Name: Jerome Vouillon +GitHub: https://github.com/vouillon + +Name: Tomáš Szabo +GitHub: https://github.com/deftomat + +Name: Robert O'Rourke +GitHub: https://github.com/roborourke + +Name: Denis Soldatov +GitHub: https://github.com/derom + +Name: Stefan Probst +GitHub: https://github.com/stefanprobst + +Name: Thomas Beiganz +GitHub: https://github.com/beig + +Name: Florian Busch +GitHub: https://github.com/florian-busch + +Name: Matthieu Salettes +GitHub: https://github.com/msalettes + +Name: Taneli Vatanen +GitHub: https://github.com/Daiz + +Name: Mart Jansink +GitHub: https://github.com/mart-jansink + +Name: Tenpi +GitHub: https://github.com/Tenpi + +Name: Zaruike +GitHub: https://github.com/Zaruike + +Name: Erlend F +GitHub: https://github.com/erf + +Name: Drian Naude +GitHub: https://github.com/driannaude + +Name: Max Gordon +GitHub: https://github.com/gforge + +Name: Chris Banks +GitHub: 
https://github.com/christopherbradleybanks + +Name: codepage949 +GitHub: https://github.com/codepage949 + +Name: Chris Hranj +GitHub: https://github.com/Brodan + +Name: Ankur Parihar +GitHub: https://github.com/ankurparihar + +Name: Joona Heinikoski +GitHub: https://github.com/joonamo + +Name: AlexanderTheGrey +GitHub: https://github.com/AlexanderTheGrey + +Name: Blayne Chard +GitHub: https://github.com/blacha + +Name: Brahim +GitHub: https://github.com/brahima + +Name: Anton Marsden +GitHub: https://github.com/antonmarsden + +Name: Marcos Casagrande +GitHub: https://github.com/marcosc90 + +Name: Emanuel Jöbstl +GitHub: https://github.com/ejoebstl + +Name: Tomasz Janowski +GitHub: https://github.com/janaz + +Name: Lachlan Newman +GitHub: https://github.com/LachlanNewman + +Name: BJJ +GitHub: https://github.com/bianjunjie1981 + +Name: Dennis Beatty +GitHub: https://github.com/dnsbty + +Name: Ingvar Stepanyan +GitHub: https://github.com/RReverser + +Name: Tamás András Horváth +GitHub: https://github.com/icetee + +Name: Aaron Che +GitHub: https://github.com/yolopunk + +Name: Mert Alev +GitHub: https://github.com/mertalev + +Name: Adriaan Meuris +GitHub: https://github.com/adriaanmeuris + +Name: Richard Hillmann +GitHub: https://github.com/project0 + +Name: Pongsatorn Manusopit +GitHub: https://github.com/ton11797 + +Name: Nathan Keynes +GitHub: https://github.com/nkeynes + +Name: Sumit D +GitHub: https://github.com/sumitd2 + +Name: Caleb Meredith +GitHub: https://github.com/calebmer + +Name: Don Denton +GitHub: https://github.com/happycollision + +Name: Florent Zabera +GitHub: 
https://github.com/florentzabera + +Name: Quentin Pinçon +GitHub: https://github.com/qpincon + +Name: Hans Chen +GitHub: https://github.com/hans00 + +Name: Thibaut Patel +GitHub: https://github.com/tpatel + +Name: Maël Nison +GitHub: https://github.com/arcanis diff --git a/docs/public/robots.txt b/docs/public/robots.txt new file mode 100644 index 000000000..a829753d9 --- /dev/null +++ b/docs/public/robots.txt @@ -0,0 +1,4 @@ +User-agent: * +Disallow: + +Sitemap: https://sharp.pixelplumbing.com/sitemap-index.xml diff --git a/docs/public/sharp-logo-mono.svg b/docs/public/sharp-logo-mono.svg new file mode 100644 index 000000000..65bb881b7 --- /dev/null +++ b/docs/public/sharp-logo-mono.svg @@ -0,0 +1,11 @@ + + + + + + + + + + + \ No newline at end of file diff --git a/docs/public/sharp-logo.svg b/docs/public/sharp-logo.svg new file mode 100644 index 000000000..fc185469f --- /dev/null +++ b/docs/public/sharp-logo.svg @@ -0,0 +1,5 @@ + + + + + \ No newline at end of file diff --git a/docs/src/assets/sharp-logo.svg b/docs/src/assets/sharp-logo.svg new file mode 100644 index 000000000..fc185469f --- /dev/null +++ b/docs/src/assets/sharp-logo.svg @@ -0,0 +1,5 @@ + + + + + \ No newline at end of file diff --git a/docs/src/content.config.ts b/docs/src/content.config.ts new file mode 100644 index 000000000..06cf12929 --- /dev/null +++ b/docs/src/content.config.ts @@ -0,0 +1,10 @@ +import { defineCollection } from 'astro:content'; +import { docsLoader } from '@astrojs/starlight/loaders'; +import { docsSchema } from '@astrojs/starlight/schema'; +import { autoSidebarLoader } from 'starlight-auto-sidebar/loader' +import { autoSidebarSchema } from 'starlight-auto-sidebar/schema' + +export const collections = { + docs: defineCollection({ loader: docsLoader(), schema: docsSchema() }), + autoSidebar: defineCollection({ loader: autoSidebarLoader(), schema: autoSidebarSchema() }) +}; diff --git 
a/docs/src/content/docs/api-channel.md b/docs/src/content/docs/api-channel.md new file mode 100644 index 000000000..214e8611d --- /dev/null +++ b/docs/src/content/docs/api-channel.md @@ -0,0 +1,143 @@ +--- +# This file was auto-generated from JSDoc in lib/channel.js +title: Channel manipulation +--- + +## removeAlpha +> removeAlpha() ⇒ Sharp + +Remove alpha channels, if any. This is a no-op if the image does not have an alpha channel. + +See also [flatten](/api-operation/#flatten). + + +**Example** +```js +sharp('rgba.png') + .removeAlpha() + .toFile('rgb.png', function(err, info) { + // rgb.png is a 3 channel image without an alpha channel + }); +``` + + +## ensureAlpha +> ensureAlpha([alpha]) ⇒ Sharp + +Ensure the output image has an alpha transparency channel. +If missing, the added alpha channel will have the specified +transparency level, defaulting to fully-opaque (1). +This is a no-op if the image already has an alpha channel. + + +**Throws**: + +- Error Invalid alpha transparency level + +**Since**: 0.21.2 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [alpha] | number | 1 | alpha transparency level (0=fully-transparent, 1=fully-opaque) | + +**Example** +```js +// rgba.png will be a 4 channel image with a fully-opaque alpha channel +await sharp('rgb.jpg') + .ensureAlpha() + .toFile('rgba.png') +``` +**Example** +```js +// rgba is a 4 channel image with a fully-transparent alpha channel +const rgba = await sharp(rgb) + .ensureAlpha(0) + .toBuffer(); +``` + + +## extractChannel +> extractChannel(channel) ⇒ Sharp + +Extract a single channel from a multi-channel image. + +The output colourspace will be either `b-w` (8-bit) or `grey16` (16-bit). + + +**Throws**: + +- Error Invalid channel + + +| Param | Type | Description | +| --- | --- | --- | +| channel | number \| string | zero-indexed channel/band number to extract, or `red`, `green`, `blue` or `alpha`. 
|
+
+**Example**
+```js
+// green.jpg is a greyscale image containing the green channel of the input
+await sharp(input)
+  .extractChannel('green')
+  .toFile('green.jpg');
+```
+**Example**
+```js
+// red1 is the red value of the first pixel, red2 the second pixel etc.
+const [red1, red2, ...] = await sharp(input)
+  .extractChannel(0)
+  .raw()
+  .toBuffer();
+```
+
+
+## joinChannel
+> joinChannel(images, options) ⇒ Sharp
+
+Join one or more channels to the image.
+The meaning of the added channels depends on the output colourspace, set with `toColourspace()`.
+By default the output image will be web-friendly sRGB, with additional channels interpreted as alpha channels.
+Channel ordering follows vips convention:
+- sRGB: 0: Red, 1: Green, 2: Blue, 3: Alpha.
+- CMYK: 0: Cyan, 1: Magenta, 2: Yellow, 3: Black, 4: Alpha.
+
+Buffers may be any of the image formats supported by sharp.
+For raw pixel input, the `options` object should contain a `raw` attribute, which follows the format of the attribute of the same name in the `sharp()` constructor.
+
+
+**Throws**:
+
+- Error Invalid parameters
+
+
+| Param | Type | Description |
+| --- | --- | --- |
+| images | Array.<(string\|Buffer)> \| string \| Buffer | one or more images (file paths, Buffers). |
+| options | Object | image options, see `sharp()` constructor. |
+
+
+
+## bandbool
+> bandbool(boolOp) ⇒ Sharp
+
+Perform a bitwise boolean operation on all input image channels (bands) to produce a single channel output image.
+
+
+**Throws**:
+
+- Error Invalid parameters
+
+
+| Param | Type | Description |
+| --- | --- | --- |
+| boolOp | string | one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. |
+
+**Example**
+```js
+sharp('3-channel-rgb-input.png')
+  .bandbool(sharp.bool.and)
+  .toFile('1-channel-output.png', function (err, info) {
+    // The output will be a single channel image where each pixel `P = R & G & B`.
+ // If `I(1,1) = [247, 170, 14] = [0b11110111, 0b10101010, 0b00001111]` + // then `O(1,1) = 0b11110111 & 0b10101010 & 0b00001111 = 0b00000010 = 2`. + }); +``` \ No newline at end of file diff --git a/docs/src/content/docs/api-colour.md b/docs/src/content/docs/api-colour.md new file mode 100644 index 000000000..6fb81debd --- /dev/null +++ b/docs/src/content/docs/api-colour.md @@ -0,0 +1,150 @@ +--- +# This file was auto-generated from JSDoc in lib/colour.js +title: Colour manipulation +--- + +## tint +> tint(tint) ⇒ Sharp + +Tint the image using the provided colour. +An alpha channel may be present and will be unchanged by the operation. + + +**Throws**: + +- Error Invalid parameter + + +| Param | Type | Description | +| --- | --- | --- | +| tint | string \| Object | Parsed by the [color](https://www.npmjs.org/package/color) module. | + +**Example** +```js +const output = await sharp(input) + .tint({ r: 255, g: 240, b: 16 }) + .toBuffer(); +``` + + +## greyscale +> greyscale([greyscale]) ⇒ Sharp + +Convert to 8-bit greyscale; 256 shades of grey. +This is a linear operation. If the input image is in a non-linear colour space such as sRGB, use `gamma()` with `greyscale()` for the best results. +By default the output image will be web-friendly sRGB and contain three (identical) colour channels. +This may be overridden by other sharp operations such as `toColourspace('b-w')`, +which will produce an output image containing one colour channel. +An alpha channel may be present, and will be unchanged by the operation. + + + +| Param | Type | Default | +| --- | --- | --- | +| [greyscale] | Boolean | true | + +**Example** +```js +const output = await sharp(input).greyscale().toBuffer(); +``` + + +## grayscale +> grayscale([grayscale]) ⇒ Sharp + +Alternative spelling of `greyscale`. 
+ + + +| Param | Type | Default | +| --- | --- | --- | +| [grayscale] | Boolean | true | + + + +## pipelineColourspace +> pipelineColourspace([colourspace]) ⇒ Sharp + +Set the pipeline colourspace. + +The input image will be converted to the provided colourspace at the start of the pipeline. +All operations will use this colourspace before converting to the output colourspace, +as defined by [toColourspace](#tocolourspace). + + +**Throws**: + +- Error Invalid parameters + +**Since**: 0.29.0 + +| Param | Type | Description | +| --- | --- | --- | +| [colourspace] | string | pipeline colourspace e.g. `rgb16`, `scrgb`, `lab`, `grey16` [...](https://www.libvips.org/API/current/enum.Interpretation.html) | + +**Example** +```js +// Run pipeline in 16 bits per channel RGB while converting final result to 8 bits per channel sRGB. +await sharp(input) + .pipelineColourspace('rgb16') + .toColourspace('srgb') + .toFile('16bpc-pipeline-to-8bpc-output.png') +``` + + +## pipelineColorspace +> pipelineColorspace([colorspace]) ⇒ Sharp + +Alternative spelling of `pipelineColourspace`. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Description | +| --- | --- | --- | +| [colorspace] | string | pipeline colorspace. | + + + +## toColourspace +> toColourspace([colourspace]) ⇒ Sharp + +Set the output colourspace. +By default output image will be web-friendly sRGB, with additional channels interpreted as alpha channels. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Description | +| --- | --- | --- | +| [colourspace] | string | output colourspace e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://www.libvips.org/API/current/enum.Interpretation.html) | + +**Example** +```js +// Output 16 bits per pixel RGB +await sharp(input) + .toColourspace('rgb16') + .toFile('16-bpp.png') +``` + + +## toColorspace +> toColorspace([colorspace]) ⇒ Sharp + +Alternative spelling of `toColourspace`. 
+ + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Description | +| --- | --- | --- | +| [colorspace] | string | output colorspace. | \ No newline at end of file diff --git a/docs/src/content/docs/api-composite.md b/docs/src/content/docs/api-composite.md new file mode 100644 index 000000000..d4f9b8550 --- /dev/null +++ b/docs/src/content/docs/api-composite.md @@ -0,0 +1,103 @@ +--- +# This file was auto-generated from JSDoc in lib/composite.js +title: Compositing images +--- + +## composite +> composite(images) ⇒ Sharp + +Composite image(s) over the processed (resized, extracted etc.) image. + +The images to composite must be the same size or smaller than the processed image. +If both `top` and `left` options are provided, they take precedence over `gravity`. + +Other operations in the same processing pipeline (e.g. resize, rotate, flip, +flop, extract) will always be applied to the input image before composition. + +The `blend` option can be one of `clear`, `source`, `over`, `in`, `out`, `atop`, +`dest`, `dest-over`, `dest-in`, `dest-out`, `dest-atop`, +`xor`, `add`, `saturate`, `multiply`, `screen`, `overlay`, `darken`, `lighten`, +`colour-dodge`, `color-dodge`, `colour-burn`,`color-burn`, +`hard-light`, `soft-light`, `difference`, `exclusion`. + +More information about blend modes can be found at +https://www.libvips.org/API/current/enum.BlendMode.html +and https://www.cairographics.org/operators/ + + +**Throws**: + +- Error Invalid parameters + +**Since**: 0.22.0 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| images | Array.<Object> | | Ordered list of images to composite | +| [images[].input] | Buffer \| String | | Buffer containing image data, String containing the path to an image file, or Create object (see below) | +| [images[].input.create] | Object | | describes a blank overlay to be created. 
|
+| [images[].input.create.width] | Number | | |
+| [images[].input.create.height] | Number | | |
+| [images[].input.create.channels] | Number | | 3-4 |
+| [images[].input.create.background] | String \| Object | | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
+| [images[].input.text] | Object | | describes a new text image to be created. |
+| [images[].input.text.text] | string | | text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`. |
+| [images[].input.text.font] | string | | font name to render with. |
+| [images[].input.text.fontfile] | string | | absolute filesystem path to a font file that can be used by `font`. |
+| [images[].input.text.width] | number | 0 | integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. |
+| [images[].input.text.height] | number | 0 | integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. |
+| [images[].input.text.align] | string | "'left'" | text alignment (`'left'`, `'centre'`, `'center'`, `'right'`). |
+| [images[].input.text.justify] | boolean | false | set this to true to apply justification to the text. |
+| [images[].input.text.dpi] | number | 72 | the resolution (size) at which to render the text. Does not take effect if `height` is specified. |
+| [images[].input.text.rgba] | boolean | false | set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`. |
+| [images[].input.text.spacing] | number | 0 | text line height in points. Will use the font line height if none is specified. |
+| [images[].autoOrient] | Boolean | false | set to true to use EXIF orientation data, if present, to orient the image.
| +| [images[].blend] | String | 'over' | how to blend this image with the image below. | +| [images[].gravity] | String | 'centre' | gravity at which to place the overlay. | +| [images[].top] | Number | | the pixel offset from the top edge. | +| [images[].left] | Number | | the pixel offset from the left edge. | +| [images[].tile] | Boolean | false | set to true to repeat the overlay image across the entire image with the given `gravity`. | +| [images[].premultiplied] | Boolean | false | set to true to avoid premultiplying the image below. Equivalent to the `--premultiplied` vips option. | +| [images[].density] | Number | 72 | number representing the DPI for vector overlay image. | +| [images[].raw] | Object | | describes overlay when using raw pixel data. | +| [images[].raw.width] | Number | | | +| [images[].raw.height] | Number | | | +| [images[].raw.channels] | Number | | | +| [images[].animated] | boolean | false | Set to `true` to read all frames/pages of an animated image. | +| [images[].failOn] | string | "'warning'" | @see [constructor parameters](/api-constructor/) | +| [images[].limitInputPixels] | number \| boolean | 268402689 | @see [constructor parameters](/api-constructor/) | + +**Example** +```js +await sharp(background) + .composite([ + { input: layer1, gravity: 'northwest' }, + { input: layer2, gravity: 'southeast' }, + ]) + .toFile('combined.png'); +``` +**Example** +```js +const output = await sharp('input.gif', { animated: true }) + .composite([ + { input: 'overlay.png', tile: true, blend: 'saturate' } + ]) + .toBuffer(); +``` +**Example** +```js +sharp('input.png') + .rotate(180) + .resize(300) + .flatten( { background: '#ff6600' } ) + .composite([{ input: 'overlay.png', gravity: 'southeast' }]) + .sharpen() + .withMetadata() + .webp( { quality: 90 } ) + .toBuffer() + .then(function(outputBuffer) { + // outputBuffer contains upside down, 300px wide, alpha channel flattened + // onto orange background, composited with overlay.png with SE 
gravity,
+    // sharpened, with metadata, 90% quality WebP image data. Phew!
+  });
+```
\ No newline at end of file
diff --git a/docs/src/content/docs/api-constructor.md b/docs/src/content/docs/api-constructor.md
new file mode 100644
index 000000000..3504239dc
--- /dev/null
+++ b/docs/src/content/docs/api-constructor.md
@@ -0,0 +1,274 @@
+---
+# This file was auto-generated from JSDoc in lib/constructor.js
+title: Constructor
+---
+
+## Sharp
+> Sharp
+
+
+**Emits**: Sharp#event:info, Sharp#event:warning
+
+
+### new
+> new Sharp([input], [options])
+
+Constructor factory to create an instance of `sharp`, to which further methods are chained.
+
+JPEG, PNG, WebP, GIF, AVIF or TIFF format image data can be streamed out from this object.
+When using Stream based output, derived attributes are available from the `info` event.
+
+Non-critical problems encountered during processing are emitted as `warning` events.
+
+Implements the [stream.Duplex](http://nodejs.org/api/stream.html#stream_class_stream_duplex) class.
+
+When loading more than one page/frame of an animated image,
+these are combined as a vertically-stacked "toilet roll" image
+where the overall height is the `pageHeight` multiplied by the number of `pages`.
+
+**Throws**:
+
+- Error Invalid parameters
+
+
+| Param | Type | Default | Description |
+| --- | --- | --- | --- |
+| [input] | Buffer \| ArrayBuffer \| Uint8Array \| Uint8ClampedArray \| Int8Array \| Uint16Array \| Int16Array \| Uint32Array \| Int32Array \| Float32Array \| Float64Array \| string \| Array | | if present, can be a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or a TypedArray containing raw pixel image data, or a String containing the filesystem path to a JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file. An array of inputs can be provided, and these will be joined together.
JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present. |
+| [options] | Object | | if present, is an Object with optional attributes. |
+| [options.failOn] | string | "'warning'" | When to abort processing of invalid pixel data, one of (in order of sensitivity, least to most): 'none', 'truncated', 'error', 'warning'. Higher levels imply lower levels. Invalid metadata will always abort. |
+| [options.limitInputPixels] | number \| boolean | 268402689 | Do not process input images where the number of pixels (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted. An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF). |
+| [options.unlimited] | boolean | false | Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF). |
+| [options.autoOrient] | boolean | false | Set this to `true` to rotate/flip the image to match EXIF `Orientation`, if any. |
+| [options.sequentialRead] | boolean | true | Set this to `false` to use random access rather than sequential read. Some operations will do this automatically. |
+| [options.density] | number | 72 | number representing the DPI for vector images in the range 1 to 100000. |
+| [options.ignoreIcc] | boolean | false | should the embedded ICC profile, if any, be ignored. |
+| [options.pages] | number | 1 | Number of pages to extract for multi-page input (GIF, WebP, TIFF), use -1 for all pages. |
+| [options.page] | number | 0 | Page number to start extracting from for multi-page input (GIF, WebP, TIFF), zero based. |
+| [options.animated] | boolean | false | Set to `true` to read all frames/pages of an animated image (GIF, WebP, TIFF), equivalent of setting `pages` to `-1`. |
+| [options.raw] | Object | | describes raw pixel input image data. See `raw()` for pixel ordering.
|
+| [options.raw.width] | number | | integral number of pixels wide. |
+| [options.raw.height] | number | | integral number of pixels high. |
+| [options.raw.channels] | number | | integral number of channels, between 1 and 4. |
+| [options.raw.premultiplied] | boolean | | specifies that the raw input has already been premultiplied, set to `true` to avoid sharp premultiplying the image. (optional, default `false`) |
+| [options.raw.pageHeight] | number | | The pixel height of each page/frame for animated images, must be an integral factor of `raw.height`. |
+| [options.create] | Object | | describes a new image to be created. |
+| [options.create.width] | number | | integral number of pixels wide. |
+| [options.create.height] | number | | integral number of pixels high. |
+| [options.create.channels] | number | | integral number of channels, either 3 (RGB) or 4 (RGBA). |
+| [options.create.background] | string \| Object | | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
+| [options.create.pageHeight] | number | | The pixel height of each page/frame for animated images, must be an integral factor of `create.height`. |
+| [options.create.noise] | Object | | describes a noise to be created. |
+| [options.create.noise.type] | string | | type of generated noise, currently only `gaussian` is supported. |
+| [options.create.noise.mean] | number | 128 | Mean value of pixels in the generated noise. |
+| [options.create.noise.sigma] | number | 30 | Standard deviation of pixel values in the generated noise. |
+| [options.text] | Object | | describes a new text image to be created. |
+| [options.text.text] | string | | text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`. |
+| [options.text.font] | string | | font name to render with. |
+| [options.text.fontfile] | string | | absolute filesystem path to a font file that can be used by `font`.
|
+| [options.text.width] | number | 0 | Integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. |
+| [options.text.height] | number | 0 | Maximum integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. |
+| [options.text.align] | string | "'left'" | Alignment style for multi-line text (`'left'`, `'centre'`, `'center'`, `'right'`). |
+| [options.text.justify] | boolean | false | set this to true to apply justification to the text. |
+| [options.text.dpi] | number | 72 | the resolution (size) at which to render the text. Does not take effect if `height` is specified. |
+| [options.text.rgba] | boolean | false | set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`. |
+| [options.text.spacing] | number | 0 | text line height in points. Will use the font line height if none is specified. |
+| [options.text.wrap] | string | "'word'" | word wrapping style when width is provided, one of: 'word', 'char', 'word-char' (prefer word, fallback to char) or 'none'. |
+| [options.join] | Object | | describes how an array of input images should be joined. |
+| [options.join.across] | number | 1 | number of images to join horizontally. |
+| [options.join.animated] | boolean | false | set this to `true` to join the images as an animated image. |
+| [options.join.shim] | number | 0 | number of pixels to insert between joined images. |
+| [options.join.background] | string \| Object | | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
+| [options.join.halign] | string | "'left'" | horizontal alignment style for images joined horizontally (`'left'`, `'centre'`, `'center'`, `'right'`).
| +| [options.join.valign] | string | "'top'" | vertical alignment style for images joined vertically (`'top'`, `'centre'`, `'center'`, `'bottom'`). | +| [options.tiff] | Object | | Describes TIFF specific options. | +| [options.tiff.subifd] | number | -1 | Sub Image File Directory to extract for OME-TIFF, defaults to main image. | +| [options.svg] | Object | | Describes SVG specific options. | +| [options.svg.stylesheet] | string | | Custom CSS for SVG input, applied with a User Origin during the CSS cascade. | +| [options.svg.highBitdepth] | boolean | false | Set to `true` to render SVG input at 32-bits per channel (128-bit) instead of 8-bits per channel (32-bit) RGBA. | +| [options.pdf] | Object | | Describes PDF specific options. Requires the use of a globally-installed libvips compiled with support for PDFium, Poppler, ImageMagick or GraphicsMagick. | +| [options.pdf.background] | string \| Object | | Background colour to use when PDF is partially transparent. Parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. | +| [options.openSlide] | Object | | Describes OpenSlide specific options. Requires the use of a globally-installed libvips compiled with support for OpenSlide. | +| [options.openSlide.level] | number | 0 | Level to extract from a multi-level input, zero based. | +| [options.jp2] | Object | | Describes JPEG 2000 specific options. Requires the use of a globally-installed libvips compiled with support for OpenJPEG. | +| [options.jp2.oneshot] | boolean | false | Set to `true` to decode tiled JPEG 2000 images in a single operation, improving compatibility. 
|
+
+**Example**
+```js
+sharp('input.jpg')
+  .resize(300, 200)
+  .toFile('output.jpg', function(err) {
+    // output.jpg is a 300 pixels wide and 200 pixels high image
+    // containing a scaled and cropped version of input.jpg
+  });
+```
+**Example**
+```js
+// Read image data from remote URL,
+// resize to 300 pixels wide,
+// emit an 'info' event with calculated dimensions
+// and finally write image data to writableStream
+const { body } = await fetch('https://...');
+const readableStream = Readable.fromWeb(body);
+const transformer = sharp()
+  .resize(300)
+  .on('info', ({ height }) => {
+    console.log(`Image height is ${height}`);
+  });
+readableStream.pipe(transformer).pipe(writableStream);
+```
+**Example**
+```js
+// Create a blank 300x200 PNG image of semi-translucent red pixels
+sharp({
+  create: {
+    width: 300,
+    height: 200,
+    channels: 4,
+    background: { r: 255, g: 0, b: 0, alpha: 0.5 }
+  }
+})
+.png()
+.toBuffer()
+.then( ... );
+```
+**Example**
+```js
+// Convert an animated GIF to an animated WebP
+await sharp('in.gif', { animated: true }).toFile('out.webp');
+```
+**Example**
+```js
+// Read a raw array of pixels and save it to a png
+const input = Uint8Array.from([255, 255, 255, 0, 0, 0]); // or Uint8ClampedArray
+const image = sharp(input, {
+  // because the input does not contain its dimensions or how many channels it has
+  // we need to specify it in the constructor options
+  raw: {
+    width: 2,
+    height: 1,
+    channels: 3
+  }
+});
+await image.toFile('my-two-pixels.png');
+```
+**Example**
+```js
+// Generate RGB Gaussian noise
+await sharp({
+  create: {
+    width: 300,
+    height: 200,
+    channels: 3,
+    noise: {
+      type: 'gaussian',
+      mean: 128,
+      sigma: 30
+    }
+  }
+}).toFile('noise.png');
+```
+**Example**
+```js
+// Generate an image from text
+await sharp({
+  text: {
+    text: 'Hello, world!',
+    width: 400, // max width
+    height: 300 // max height
+  }
+}).toFile('text_bw.png');
+```
+**Example**
+```js
+// Generate an rgba image from text using Pango markup
and font
+await sharp({
+  text: {
+    text: '<span foreground="red">Red!</span><span background="cyan">blue</span>',
+    font: 'sans',
+    rgba: true,
+    dpi: 300
+  }
+}).toFile('text_rgba.png');
+```
+**Example**
+```js
+// Join four input images as a 2x2 grid with a 4 pixel gutter
+const data = await sharp(
+  [image1, image2, image3, image4],
+  { join: { across: 2, shim: 4 } }
+).toBuffer();
+```
+**Example**
+```js
+// Generate a two-frame animated image from emoji
+const images = ['😀', '😛'].map(text => ({
+  text: { text, width: 64, height: 64, channels: 4, rgba: true }
+}));
+await sharp(images, { join: { animated: true } }).toFile('out.gif');
+```
+
+
+## clone
+> clone() ⇒ [Sharp](#Sharp)
+
+Take a "snapshot" of the Sharp instance, returning a new instance.
+Cloned instances inherit the input of their parent instance.
+This allows multiple output Streams and therefore multiple processing pipelines to share a single input Stream.
+
+
+**Example**
+```js
+const pipeline = sharp().rotate();
+pipeline.clone().resize(800, 600).pipe(firstWritableStream);
+pipeline.clone().extract({ left: 20, top: 20, width: 100, height: 100 }).pipe(secondWritableStream);
+readableStream.pipe(pipeline);
+// firstWritableStream receives auto-rotated, resized readableStream
+// secondWritableStream receives auto-rotated, extracted region of readableStream
+```
+**Example**
+```js
+// Create a pipeline that will download an image, resize it and format it to different files
+// Using Promises to know when the pipeline is complete
+const fs = require("fs");
+const got = require("got");
+const sharpStream = sharp({ failOn: 'none' });
+
+const promises = [];
+
+promises.push(
+  sharpStream
+    .clone()
+    .jpeg({ quality: 100 })
+    .toFile("originalFile.jpg")
+);
+
+promises.push(
+  sharpStream
+    .clone()
+    .resize({ width: 500 })
+    .jpeg({ quality: 80 })
+    .toFile("optimized-500.jpg")
+);
+
+promises.push(
+  sharpStream
+    .clone()
+    .resize({ width: 500 })
+    .webp({ quality: 80 })
+    .toFile("optimized-500.webp")
+);
+
+// 
https://github.com/sindresorhus/got/blob/main/documentation/3-streams.md +got.stream("https://www.example.com/some-file.jpg").pipe(sharpStream); + +Promise.all(promises) + .then(res => { console.log("Done!", res); }) + .catch(err => { + console.error("Error processing files, let's clean it up", err); + try { + fs.unlinkSync("originalFile.jpg"); + fs.unlinkSync("optimized-500.jpg"); + fs.unlinkSync("optimized-500.webp"); + } catch (e) {} + }); +``` \ No newline at end of file diff --git a/docs/src/content/docs/api-input.md b/docs/src/content/docs/api-input.md new file mode 100644 index 000000000..8477f1ff5 --- /dev/null +++ b/docs/src/content/docs/api-input.md @@ -0,0 +1,139 @@ +--- +# This file was auto-generated from JSDoc in lib/input.js +title: Input metadata +--- + +## metadata +> metadata([callback]) ⇒ Promise.<Object> \| Sharp + +Fast access to (uncached) image metadata without decoding any compressed pixel data. + +This is read from the header of the input image. +It does not take into consideration any operations to be applied to the output image, +such as resize or rotate. + +Dimensions in the response will respect the `page` and `pages` properties of the +[constructor parameters](/api-constructor/). + +A `Promise` is returned when `callback` is not provided. + +- `format`: Name of decoder used to parse image e.g. `jpeg`, `png`, `webp`, `gif`, `svg`, `heif`, `tiff` +- `size`: Total size of image in bytes, for Stream and Buffer input only +- `width`: Number of pixels wide (EXIF orientation is not taken into consideration, see example below) +- `height`: Number of pixels high (EXIF orientation is not taken into consideration, see example below) +- `space`: Name of colour space interpretation e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://www.libvips.org/API/current/enum.Interpretation.html) +- `channels`: Number of bands e.g. `3` for sRGB, `4` for CMYK +- `depth`: Name of pixel depth format e.g. 
`uchar`, `char`, `ushort`, `float` [...](https://www.libvips.org/API/current/enum.BandFormat.html) +- `density`: Number of pixels per inch (DPI), if present +- `chromaSubsampling`: String containing JPEG chroma subsampling, `4:2:0` or `4:4:4` for RGB, `4:2:0:4` or `4:4:4:4` for CMYK +- `isProgressive`: Boolean indicating whether the image is interlaced using a progressive scan +- `isPalette`: Boolean indicating whether the image is palette-based (GIF, PNG). +- `bitsPerSample`: Number of bits per sample for each channel (GIF, PNG, HEIF). +- `pages`: Number of pages/frames contained within the image, with support for TIFF, HEIF, PDF, animated GIF and animated WebP +- `pageHeight`: Number of pixels high each page in a multi-page image will be. +- `loop`: Number of times to loop an animated image, zero refers to a continuous loop. +- `delay`: Delay in ms between each page in an animated image, provided as an array of integers. +- `pagePrimary`: Number of the primary page in a HEIF image +- `levels`: Details of each level in a multi-level image provided as an array of objects, requires libvips compiled with support for OpenSlide +- `subifds`: Number of Sub Image File Directories in an OME-TIFF image +- `background`: Default background colour, if present, for PNG (bKGD) and GIF images +- `compression`: The encoder used to compress an HEIF file, `av1` (AVIF) or `hevc` (HEIC) +- `resolutionUnit`: The unit of resolution (density), either `inch` or `cm`, if present +- `hasProfile`: Boolean indicating the presence of an embedded ICC profile +- `hasAlpha`: Boolean indicating the presence of an alpha transparency channel +- `orientation`: Number value of the EXIF Orientation header, if present +- `exif`: Buffer containing raw EXIF data, if present +- `icc`: Buffer containing raw [ICC](https://www.npmjs.com/package/icc) profile data, if present +- `iptc`: Buffer containing raw IPTC data, if present +- `xmp`: Buffer containing raw XMP data, if present +- `xmpAsString`: String 
containing XMP data, if valid UTF-8.
+- `tifftagPhotoshop`: Buffer containing raw TIFFTAG_PHOTOSHOP data, if present
+- `formatMagick`: String containing format for images loaded via *magick
+- `comments`: Array of keyword/text pairs representing PNG text blocks, if present.
+
+
+
+| Param | Type | Description |
+| --- | --- | --- |
+| [callback] | function | called with the arguments `(err, metadata)` |
+
+**Example**
+```js
+const metadata = await sharp(input).metadata();
+```
+**Example**
+```js
+const image = sharp(inputJpg);
+image
+  .metadata()
+  .then(function(metadata) {
+    return image
+      .resize(Math.round(metadata.width / 2))
+      .webp()
+      .toBuffer();
+  })
+  .then(function(data) {
+    // data contains a WebP image half the width and height of the original JPEG
+  });
+```
+**Example**
+```js
+// Get dimensions taking EXIF Orientation into account.
+const { autoOrient } = await sharp(input).metadata();
+const { width, height } = autoOrient;
+```
+
+
+## stats
+> stats([callback]) ⇒ Promise.<Object>
+
+Access to pixel-derived image statistics for every channel in the image.
+A `Promise` is returned when `callback` is not provided.
+
+- `channels`: Array of channel statistics for each channel in the image. Each channel statistic contains
+    - `min` (minimum value in the channel)
+    - `max` (maximum value in the channel)
+    - `sum` (sum of all values in a channel)
+    - `squaresSum` (sum of squared values in a channel)
+    - `mean` (mean of the values in a channel)
+    - `stdev` (standard deviation for the values in a channel)
+    - `minX` (x-coordinate of one of the pixels where the minimum lies)
+    - `minY` (y-coordinate of one of the pixels where the minimum lies)
+    - `maxX` (x-coordinate of one of the pixels where the maximum lies)
+    - `maxY` (y-coordinate of one of the pixels where the maximum lies)
+- `isOpaque`: Is the image fully opaque? Will be `true` if the image has no alpha channel or if every pixel is fully opaque.
+- `entropy`: Histogram-based estimation of greyscale entropy, discarding alpha channel if any. +- `sharpness`: Estimation of greyscale sharpness based on the standard deviation of a Laplacian convolution, discarding alpha channel if any. +- `dominant`: Object containing most dominant sRGB colour based on a 4096-bin 3D histogram. + +**Note**: Statistics are derived from the original input image. Any operations performed on the image must first be +written to a buffer in order to run `stats` on the result (see third example). + + + +| Param | Type | Description | +| --- | --- | --- | +| [callback] | function | called with the arguments `(err, stats)` | + +**Example** +```js +const image = sharp(inputJpg); +image + .stats() + .then(function(stats) { + // stats contains the channel-wise statistics array and the isOpaque value + }); +``` +**Example** +```js +const { entropy, sharpness, dominant } = await sharp(input).stats(); +const { r, g, b } = dominant; +``` +**Example** +```js +const image = sharp(input); +// store intermediate result +const part = await image.extract(region).toBuffer(); +// create new instance to obtain statistics of extracted region +const stats = await sharp(part).stats(); +``` \ No newline at end of file diff --git a/docs/src/content/docs/api-operation.md b/docs/src/content/docs/api-operation.md new file mode 100644 index 000000000..abcb41fc1 --- /dev/null +++ b/docs/src/content/docs/api-operation.md @@ -0,0 +1,723 @@ +--- +# This file was auto-generated from JSDoc in lib/operation.js +title: Image operations +--- + +## rotate +> rotate([angle], [options]) ⇒ Sharp + +Rotate the output image. + +The provided angle is converted to a valid positive degree rotation. +For example, `-450` will produce a 270 degree rotation. + +When rotating by an angle other than a multiple of 90, +the background colour can be provided with the `background` option. + +For backwards compatibility, if no angle is provided, `.autoOrient()` will be called. 
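The angle conversion described above is plain modular arithmetic. As a rough sketch, a hypothetical `normalizeAngle` helper (not part of the sharp API) shows how any input angle maps to the positive degree rotation sharp would apply:

```js
// Hypothetical helper illustrating how an arbitrary angle is converted to a
// valid positive degree rotation; not part of the sharp API.
// JavaScript's % keeps the sign of the dividend, hence the double modulo.
const normalizeAngle = (angle) => ((angle % 360) + 360) % 360;

console.log(normalizeAngle(-450)); // 270, matching the example above
console.log(normalizeAngle(450)); // 90
```

So `.rotate(-450)` and `.rotate(270)` describe the same rotation.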
+ +Only one rotation can occur per pipeline (aside from an initial call without +arguments to orient via EXIF data). Previous calls to `rotate` in the same +pipeline will be ignored. + +Multi-page images can only be rotated by 180 degrees. + +Method order is important when rotating, resizing and/or extracting regions, +for example `.rotate(x).extract(y)` will produce a different result to `.extract(y).rotate(x)`. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [angle] | number | auto | angle of rotation. | +| [options] | Object | | if present, is an Object with optional attributes. | +| [options.background] | string \| Object | "\"#000000\"" | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. | + +**Example** +```js +const rotateThenResize = await sharp(input) + .rotate(90) + .resize({ width: 16, height: 8, fit: 'fill' }) + .toBuffer(); +const resizeThenRotate = await sharp(input) + .resize({ width: 16, height: 8, fit: 'fill' }) + .rotate(90) + .toBuffer(); +``` + + +## autoOrient +> autoOrient() ⇒ Sharp + +Auto-orient based on the EXIF `Orientation` tag, then remove the tag. +Mirroring is supported and may infer the use of a flip operation. + +Previous or subsequent use of `rotate(angle)` and either `flip()` or `flop()` +will logically occur after auto-orientation, regardless of call order. + + +**Example** +```js +const output = await sharp(input).autoOrient().toBuffer(); +``` +**Example** +```js +const pipeline = sharp() + .autoOrient() + .resize(null, 200) + .toBuffer(function (err, outputBuffer, info) { + // outputBuffer contains 200px high JPEG image data, + // auto-oriented using EXIF Orientation tag + // info.width and info.height contain the dimensions of the resized image + }); +readableStream.pipe(pipeline); +``` + + +## flip +> flip([flip]) ⇒ Sharp + +Mirror the image vertically (up-down) about the x-axis. 
+This always occurs before rotation, if any. + +This operation does not work correctly with multi-page images. + + + +| Param | Type | Default | +| --- | --- | --- | +| [flip] | Boolean | true | + +**Example** +```js +const output = await sharp(input).flip().toBuffer(); +``` + + +## flop +> flop([flop]) ⇒ Sharp + +Mirror the image horizontally (left-right) about the y-axis. +This always occurs before rotation, if any. + + + +| Param | Type | Default | +| --- | --- | --- | +| [flop] | Boolean | true | + +**Example** +```js +const output = await sharp(input).flop().toBuffer(); +``` + + +## affine +> affine(matrix, [options]) ⇒ Sharp + +Perform an affine transform on an image. This operation will always occur after resizing, extraction and rotation, if any. + +You must provide an array of length 4 or a 2x2 affine transformation matrix. +By default, new pixels are filled with a black background. You can provide a background colour with the `background` option. +A particular interpolator may also be specified. Set the `interpolator` option to an attribute of the `sharp.interpolators` Object e.g. `sharp.interpolators.nohalo`. + +In the case of a 2x2 matrix, the transform is: +- X = `matrix[0, 0]` \* (x + `idx`) + `matrix[0, 1]` \* (y + `idy`) + `odx` +- Y = `matrix[1, 0]` \* (x + `idx`) + `matrix[1, 1]` \* (y + `idy`) + `ody` + +where: +- x and y are the coordinates in input image. +- X and Y are the coordinates in output image. +- (0,0) is the upper left corner. + + +**Throws**: + +- Error Invalid parameters + +**Since**: 0.27.0 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| matrix | Array.<Array.<number>> \| Array.<number> | | affine transformation matrix | +| [options] | Object | | if present, is an Object with optional attributes. | +| [options.background] | String \| Object | "#000000" | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. 
| +| [options.idx] | Number | 0 | input horizontal offset | +| [options.idy] | Number | 0 | input vertical offset | +| [options.odx] | Number | 0 | output horizontal offset | +| [options.ody] | Number | 0 | output vertical offset | +| [options.interpolator] | String | sharp.interpolators.bicubic | interpolator | + +**Example** +```js +const pipeline = sharp() + .affine([[1, 0.3], [0.1, 0.7]], { + background: 'white', + interpolator: sharp.interpolators.nohalo + }) + .toBuffer((err, outputBuffer, info) => { + // outputBuffer contains the transformed image + // info.width and info.height contain the new dimensions + }); + +inputStream + .pipe(pipeline); +``` + + +## sharpen +> sharpen([options], [flat], [jagged]) ⇒ Sharp + +Sharpen the image. + +When used without parameters, performs a fast, mild sharpen of the output image. + +When a `sigma` is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space. +Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available. + +See [libvips sharpen](https://www.libvips.org/API/current/method.Image.sharpen.html) operation. 
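As a quick sketch of the `sigma = 1 + radius / 2` relationship described above (hypothetical helper names, not part of the sharp API):

```js
// Hypothetical helpers illustrating the sigma/radius relationship used by
// sharpen(): sigma = 1 + radius / 2, so radius = (sigma - 1) * 2.
const radiusToSigma = (radius) => 1 + radius / 2;
const sigmaToRadius = (sigma) => (sigma - 1) * 2;

console.log(radiusToSigma(2)); // 2
console.log(sigmaToRadius(3)); // 4
```

For example, to sharpen with a conventional radius of 2 pixels, pass `{ sigma: radiusToSigma(2) }`.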
+ + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object \| number | | if present, is an Object with attributes | +| [options.sigma] | number | | the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10 | +| [options.m1] | number | 1.0 | the level of sharpening to apply to "flat" areas, between 0 and 1000000 | +| [options.m2] | number | 2.0 | the level of sharpening to apply to "jagged" areas, between 0 and 1000000 | +| [options.x1] | number | 2.0 | threshold between "flat" and "jagged", between 0 and 1000000 | +| [options.y2] | number | 10.0 | maximum amount of brightening, between 0 and 1000000 | +| [options.y3] | number | 20.0 | maximum amount of darkening, between 0 and 1000000 | +| [flat] | number | | (deprecated) see `options.m1`. | +| [jagged] | number | | (deprecated) see `options.m2`. | + +**Example** +```js +const data = await sharp(input).sharpen().toBuffer(); +``` +**Example** +```js +const data = await sharp(input).sharpen({ sigma: 2 }).toBuffer(); +``` +**Example** +```js +const data = await sharp(input) + .sharpen({ + sigma: 2, + m1: 0, + m2: 3, + x1: 3, + y2: 15, + y3: 15, + }) + .toBuffer(); +``` + + +## median +> median([size]) ⇒ Sharp + +Apply median filter. +When used without parameters the default window is 3x3. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [size] | number | 3 | square mask size: size x size | + +**Example** +```js +const output = await sharp(input).median().toBuffer(); +``` +**Example** +```js +const output = await sharp(input).median(5).toBuffer(); +``` + + +## blur +> blur([options]) ⇒ Sharp + +Blur the image. + +When used without parameters, performs a fast 3x3 box blur (equivalent to a box linear filter). + +When a `sigma` is provided, performs a slower, more accurate Gaussian blur. 
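As an illustration of what the default box blur computes at each pixel (a sketch only, not sharp's implementation, which runs inside libvips):

```js
// Illustration only: a 3x3 box blur replaces each pixel with the mean of its
// neighbourhood. Single-channel image stored as a flat array.
function boxBlurAt (pixels, width, x, y) {
  let sum = 0;
  for (let dy = -1; dy <= 1; dy += 1) {
    for (let dx = -1; dx <= 1; dx += 1) {
      sum += pixels[(y + dy) * width + (x + dx)];
    }
  }
  return Math.round(sum / 9);
}

// The centre pixel of a 3x3 image becomes the mean of all nine values
const img = [0, 0, 0, 0, 90, 0, 0, 0, 0];
console.log(boxBlurAt(img, 3, 1, 1)); // 10
```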
+ + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object \| number \| Boolean | | | +| [options.sigma] | number | | a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`. | +| [options.precision] | string | "'integer'" | How accurate the operation should be, one of: integer, float, approximate. | +| [options.minAmplitude] | number | 0.2 | A value between 0.001 and 1. A smaller value will generate a larger, more accurate mask. | + +**Example** +```js +const boxBlurred = await sharp(input) + .blur() + .toBuffer(); +``` +**Example** +```js +const gaussianBlurred = await sharp(input) + .blur(5) + .toBuffer(); +``` + + +## dilate +> dilate([width]) ⇒ Sharp + +Expand foreground objects using the dilate morphological operator. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [width] | Number | 1 | dilation width in pixels. | + +**Example** +```js +const output = await sharp(input) + .dilate() + .toBuffer(); +``` + + +## erode +> erode([width]) ⇒ Sharp + +Shrink foreground objects using the erode morphological operator. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [width] | Number | 1 | erosion width in pixels. | + +**Example** +```js +const output = await sharp(input) + .erode() + .toBuffer(); +``` + + +## flatten +> flatten([options]) ⇒ Sharp + +Merge alpha transparency channel, if any, with a background, then remove the alpha channel. + +See also [removeAlpha](/api-channel#removealpha). + + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | | +| [options.background] | string \| Object | "{r: 0, g: 0, b: 0}" | background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black. 
|
+
+**Example**
+```js
+await sharp(rgbaInput)
+  .flatten({ background: '#F0A703' })
+  .toBuffer();
+```
+
+
+## unflatten
+> unflatten()
+
+Ensure the image has an alpha channel
+with all white pixel values made fully transparent.
+
+Existing alpha channel values for non-white pixels remain unchanged.
+
+This feature is experimental and the API may change.
+
+
+**Since**: 0.32.1
+**Example**
+```js
+await sharp(rgbInput)
+  .unflatten()
+  .toBuffer();
+```
+**Example**
+```js
+await sharp(rgbInput)
+  .threshold(128, { grayscale: false }) // convert bright pixels to white
+  .unflatten()
+  .toBuffer();
+```
+
+
+## gamma
+> gamma([gamma], [gammaOut]) ⇒ Sharp
+
+Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of `1/gamma`
+then increasing the encoding (brighten) post-resize at a factor of `gamma`.
+This can improve the perceived brightness of a resized image in non-linear colour spaces.
+JPEG and WebP input images will not take advantage of the shrink-on-load performance optimisation
+when applying a gamma correction.
+
+Supply a second argument to use a different output gamma value, otherwise the first value is used in both cases.
+
+
+**Throws**:
+
+- Error Invalid parameters
+
+
+| Param | Type | Default | Description |
+| --- | --- | --- | --- |
+| [gamma] | number | 2.2 | value between 1.0 and 3.0. |
+| [gammaOut] | number | | value between 1.0 and 3.0. (optional, defaults to same as `gamma`) |
+
+
+
+## negate
+> negate([options]) ⇒ Sharp
+
+Produce the "negative" of the image.
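For 8-bit channels this is simply `255 - value` applied per channel (a sketch, not sharp internals):

```js
// Illustration only: negation of an 8-bit channel value.
const negate = (value) => 255 - value;

console.log([0, 64, 255].map(negate)); // [ 255, 191, 0 ]
```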
+ + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | | +| [options.alpha] | Boolean | true | Whether or not to negate any alpha channel | + +**Example** +```js +const output = await sharp(input) + .negate() + .toBuffer(); +``` +**Example** +```js +const output = await sharp(input) + .negate({ alpha: false }) + .toBuffer(); +``` + + +## normalise +> normalise([options]) ⇒ Sharp + +Enhance output image contrast by stretching its luminance to cover a full dynamic range. + +Uses a histogram-based approach, taking a default range of 1% to 99% to reduce sensitivity to noise at the extremes. + +Luminance values below the `lower` percentile will be underexposed by clipping to zero. +Luminance values above the `upper` percentile will be overexposed by clipping to the max pixel value. + + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | | +| [options.lower] | number | 1 | Percentile below which luminance values will be underexposed. | +| [options.upper] | number | 99 | Percentile above which luminance values will be overexposed. | + +**Example** +```js +const output = await sharp(input) + .normalise() + .toBuffer(); +``` +**Example** +```js +const output = await sharp(input) + .normalise({ lower: 0, upper: 100 }) + .toBuffer(); +``` + + +## normalize +> normalize([options]) ⇒ Sharp + +Alternative spelling of normalise. + + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | | +| [options.lower] | number | 1 | Percentile below which luminance values will be underexposed. | +| [options.upper] | number | 99 | Percentile above which luminance values will be overexposed. 
| + +**Example** +```js +const output = await sharp(input) + .normalize() + .toBuffer(); +``` + + +## clahe +> clahe(options) ⇒ Sharp + +Perform contrast limiting adaptive histogram equalization +[CLAHE](https://en.wikipedia.org/wiki/Adaptive_histogram_equalization#Contrast_Limited_AHE). + +This will, in general, enhance the clarity of the image by bringing out darker details. + + +**Throws**: + +- Error Invalid parameters + +**Since**: 0.28.3 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| options | Object | | | +| options.width | number | | Integral width of the search window, in pixels. | +| options.height | number | | Integral height of the search window, in pixels. | +| [options.maxSlope] | number | 3 | Integral level of brightening, between 0 and 100, where 0 disables contrast limiting. | + +**Example** +```js +const output = await sharp(input) + .clahe({ + width: 3, + height: 3, + }) + .toBuffer(); +``` + + +## convolve +> convolve(kernel) ⇒ Sharp + +Convolve the image with the specified kernel. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| kernel | Object | | | +| kernel.width | number | | width of the kernel in pixels. | +| kernel.height | number | | height of the kernel in pixels. | +| kernel.kernel | Array.<number> | | Array of length `width*height` containing the kernel values. | +| [kernel.scale] | number | sum | the scale of the kernel in pixels. | +| [kernel.offset] | number | 0 | the offset of the kernel in pixels. 
| + +**Example** +```js +sharp(input) + .convolve({ + width: 3, + height: 3, + kernel: [-1, 0, 1, -2, 0, 2, -1, 0, 1] + }) + .raw() + .toBuffer(function(err, data, info) { + // data contains the raw pixel data representing the convolution + // of the input image with the horizontal Sobel operator + }); +``` + + +## threshold +> threshold([threshold], [options]) ⇒ Sharp + +Any pixel value greater than or equal to the threshold value will be set to 255, otherwise it will be set to 0. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [threshold] | number | 128 | a value in the range 0-255 representing the level at which the threshold will be applied. | +| [options] | Object | | | +| [options.greyscale] | Boolean | true | convert to single channel greyscale. | +| [options.grayscale] | Boolean | true | alternative spelling for greyscale. | + + + +## boolean +> boolean(operand, operator, [options]) ⇒ Sharp + +Perform a bitwise boolean operation with operand image. + +This operation creates an output image where each pixel is the result of +the selected bitwise boolean `operation` between the corresponding pixels of the input images. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Description | +| --- | --- | --- | +| operand | Buffer \| string | Buffer containing image data or string containing the path to an image file. | +| operator | string | one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. | +| [options] | Object | | +| [options.raw] | Object | describes operand when using raw pixel data. | +| [options.raw.width] | number | | +| [options.raw.height] | number | | +| [options.raw.channels] | number | | + + + +## linear +> linear([a], [b]) ⇒ Sharp + +Apply the linear formula `a` * input + `b` to the image to adjust image levels. + +When a single number is provided, it will be used for all image channels. 
+When an array of numbers is provided, the array length must match the number of channels. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [a] | number \| Array.<number> | [] | multiplier | +| [b] | number \| Array.<number> | [] | offset | + +**Example** +```js +await sharp(input) + .linear(0.5, 2) + .toBuffer(); +``` +**Example** +```js +await sharp(rgbInput) + .linear( + [0.25, 0.5, 0.75], + [150, 100, 50] + ) + .toBuffer(); +``` + + +## recomb +> recomb(inputMatrix) ⇒ Sharp + +Recombine the image with the specified matrix. + + +**Throws**: + +- Error Invalid parameters + +**Since**: 0.21.1 + +| Param | Type | Description | +| --- | --- | --- | +| inputMatrix | Array.<Array.<number>> | 3x3 or 4x4 Recombination matrix | + +**Example** +```js +sharp(input) + .recomb([ + [0.3588, 0.7044, 0.1368], + [0.2990, 0.5870, 0.1140], + [0.2392, 0.4696, 0.0912], + ]) + .raw() + .toBuffer(function(err, data, info) { + // data contains the raw pixel data after applying the matrix + // With this example input, a sepia filter has been applied + }); +``` + + +## modulate +> modulate([options]) ⇒ Sharp + +Transforms the image using brightness, saturation, hue rotation, and lightness. +Brightness and lightness both operate on luminance, with the difference being that +brightness is multiplicative whereas lightness is additive. 
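The multiplicative/additive distinction can be sketched numerically on a LAB-style lightness value in the range 0-100 (hypothetical helper names, not sharp internals):

```js
// Illustration only: brightness multiplies luminance, lightness adds to it.
const applyBrightness = (L, brightness) => L * brightness;
const applyLightness = (L, lightness) => L + lightness;

console.log(applyBrightness(40, 2)); // 80
console.log(applyLightness(40, 50)); // 90
```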
+ + +**Since**: 0.22.1 + +| Param | Type | Description | +| --- | --- | --- | +| [options] | Object | | +| [options.brightness] | number | Brightness multiplier | +| [options.saturation] | number | Saturation multiplier | +| [options.hue] | number | Degrees for hue rotation | +| [options.lightness] | number | Lightness addend | + +**Example** +```js +// increase brightness by a factor of 2 +const output = await sharp(input) + .modulate({ + brightness: 2 + }) + .toBuffer(); +``` +**Example** +```js +// hue-rotate by 180 degrees +const output = await sharp(input) + .modulate({ + hue: 180 + }) + .toBuffer(); +``` +**Example** +```js +// increase lightness by +50 +const output = await sharp(input) + .modulate({ + lightness: 50 + }) + .toBuffer(); +``` +**Example** +```js +// decrease brightness and saturation while also hue-rotating by 90 degrees +const output = await sharp(input) + .modulate({ + brightness: 0.5, + saturation: 0.5, + hue: 90, + }) + .toBuffer(); +``` \ No newline at end of file diff --git a/docs/src/content/docs/api-output.md b/docs/src/content/docs/api-output.md new file mode 100644 index 000000000..5acd2fadb --- /dev/null +++ b/docs/src/content/docs/api-output.md @@ -0,0 +1,911 @@ +--- +# This file was auto-generated from JSDoc in lib/output.js +title: Output options +--- + +## toFile +> toFile(fileOut, [callback]) ⇒ Promise.<Object> + +Write output image data to a file. + +If an explicit output format is not selected, it will be inferred from the extension, +with JPEG, PNG, WebP, AVIF, TIFF, GIF, DZI, and libvips' V format supported. +Note that raw pixel data is only supported for buffer output. + +By default all metadata will be removed, which includes EXIF-based orientation. +See [withMetadata](#withmetadata) for control over this. + +The caller is responsible for ensuring directory structures and permissions exist. + +A `Promise` is returned when `callback` is not provided. 
+ + +**Returns**: Promise.<Object> - - when no callback is provided +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Description | +| --- | --- | --- | +| fileOut | string | the path to write the image data to. | +| [callback] | function | called on completion with two arguments `(err, info)`. `info` contains the output image `format`, `size` (bytes), `width`, `height`, `channels` and `premultiplied` (indicating if premultiplication was used). When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`. When using the attention crop strategy also contains `attentionX` and `attentionY`, the focal point of the cropped region. Animated output will also contain `pageHeight` and `pages`. May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text. | + +**Example** +```js +sharp(input) + .toFile('output.png', (err, info) => { ... }); +``` +**Example** +```js +sharp(input) + .toFile('output.png') + .then(info => { ... }) + .catch(err => { ... }); +``` + + +## toBuffer +> toBuffer([options], [callback]) ⇒ Promise.<Buffer> + +Write output to a Buffer. +JPEG, PNG, WebP, AVIF, TIFF, GIF and raw pixel data output are supported. + +Use [toFormat](#toformat) or one of the format-specific functions such as [jpeg](#jpeg), [png](#png) etc. to set the output format. + +If no explicit format is set, the output format will match the input image, except SVG input which becomes PNG output. + +By default all metadata will be removed, which includes EXIF-based orientation. +See [withMetadata](#withmetadata) for control over this. + +`callback`, if present, gets three arguments `(err, data, info)` where: +- `err` is an error, if any. +- `data` is the output image data. +- `info` contains the output image `format`, `size` (bytes), `width`, `height`, +`channels` and `premultiplied` (indicating if premultiplication was used). +When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`. 
+Animated output will also contain `pageHeight` and `pages`. +May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text. + +A `Promise` is returned when `callback` is not provided. + + +**Returns**: Promise.<Buffer> - - when no callback is provided + +| Param | Type | Description | +| --- | --- | --- | +| [options] | Object | | +| [options.resolveWithObject] | boolean | Resolve the Promise with an Object containing `data` and `info` properties instead of resolving only with `data`. | +| [callback] | function | | + +**Example** +```js +sharp(input) + .toBuffer((err, data, info) => { ... }); +``` +**Example** +```js +sharp(input) + .toBuffer() + .then(data => { ... }) + .catch(err => { ... }); +``` +**Example** +```js +sharp(input) + .png() + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { ... }) + .catch(err => { ... }); +``` +**Example** +```js +const { data, info } = await sharp('my-image.jpg') + // output the raw pixels + .raw() + .toBuffer({ resolveWithObject: true }); + +// create a more type safe way to work with the raw pixel data +// this will not copy the data, instead it will change `data`s underlying ArrayBuffer +// so `data` and `pixelArray` point to the same memory location +const pixelArray = new Uint8ClampedArray(data.buffer); + +// When you are done changing the pixelArray, sharp takes the `pixelArray` as an input +const { width, height, channels } = info; +await sharp(pixelArray, { raw: { width, height, channels } }) + .toFile('my-changed-image.jpg'); +``` + + +## keepExif +> keepExif() ⇒ Sharp + +Keep all EXIF metadata from the input image in the output image. + +EXIF metadata is unsupported for TIFF output. + + +**Since**: 0.33.0 +**Example** +```js +const outputWithExif = await sharp(inputWithExif) + .keepExif() + .toBuffer(); +``` + + +## withExif +> withExif(exif) ⇒ Sharp + +Set EXIF metadata in the output image, ignoring any EXIF in the input image. 
+ + +**Throws**: + +- Error Invalid parameters + +**Since**: 0.33.0 + +| Param | Type | Description | +| --- | --- | --- | +| exif | Object.<string, Object.<string, string>> | Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. | + +**Example** +```js +const dataWithExif = await sharp(input) + .withExif({ + IFD0: { + Copyright: 'The National Gallery' + }, + IFD3: { + GPSLatitudeRef: 'N', + GPSLatitude: '51/1 30/1 3230/100', + GPSLongitudeRef: 'W', + GPSLongitude: '0/1 7/1 4366/100' + } + }) + .toBuffer(); +``` + + +## withExifMerge +> withExifMerge(exif) ⇒ Sharp + +Update EXIF metadata from the input image in the output image. + + +**Throws**: + +- Error Invalid parameters + +**Since**: 0.33.0 + +| Param | Type | Description | +| --- | --- | --- | +| exif | Object.<string, Object.<string, string>> | Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. | + +**Example** +```js +const dataWithMergedExif = await sharp(inputWithExif) + .withExifMerge({ + IFD0: { + Copyright: 'The National Gallery' + } + }) + .toBuffer(); +``` + + +## keepIccProfile +> keepIccProfile() ⇒ Sharp + +Keep ICC profile from the input image in the output image. + +When input and output colour spaces differ, use with [toColourspace](/api-colour/#tocolourspace) and optionally [pipelineColourspace](/api-colour/#pipelinecolourspace). + + +**Since**: 0.33.0 +**Example** +```js +const outputWithIccProfile = await sharp(inputWithIccProfile) + .keepIccProfile() + .toBuffer(); +``` +**Example** +```js +const cmykOutputWithIccProfile = await sharp(cmykInputWithIccProfile) + .pipelineColourspace('cmyk') + .toColourspace('cmyk') + .keepIccProfile() + .toBuffer(); +``` + + +## withIccProfile +> withIccProfile(icc, [options]) ⇒ Sharp + +Transform using an ICC profile and attach to the output image. + +This can either be an absolute filesystem path or +built-in profile name (`srgb`, `p3`, `cmyk`). 
+
+
+**Throws**:
+
+- Error Invalid parameters
+
+**Since**: 0.33.0
+
+| Param | Type | Default | Description |
+| --- | --- | --- | --- |
+| icc | string | | Absolute filesystem path to output ICC profile or built-in profile name (srgb, p3, cmyk). |
+| [options] | Object | | |
+| [options.attach] | number | true | Should the ICC profile be included in the output image metadata? |
+
+**Example**
+```js
+const outputWithP3 = await sharp(input)
+  .withIccProfile('p3')
+  .toBuffer();
+```
+
+
+## keepXmp
+> keepXmp() ⇒ Sharp
+
+Keep XMP metadata from the input image in the output image.
+
+
+**Since**: 0.34.3
+**Example**
+```js
+const outputWithXmp = await sharp(inputWithXmp)
+  .keepXmp()
+  .toBuffer();
+```
+
+
+## withXmp
+> withXmp(xmp) ⇒ Sharp
+
+Set XMP metadata in the output image.
+
+Supported by PNG, JPEG, WebP, and TIFF output.
+
+
+**Throws**:
+
+- Error Invalid parameters
+
+**Since**: 0.34.3
+
+| Param | Type | Description |
+| --- | --- | --- |
+| xmp | string | String containing XMP metadata to be embedded in the output image. |
+
+**Example**
+```js
+const xmpString = `
+  <?xml version="1.0" encoding="UTF-8"?>
+  <x:xmpmeta xmlns:x="adobe:ns:meta/">
+    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
+      <rdf:Description rdf:about="">
+        <dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">
+          <rdf:Seq>
+            <rdf:li>John Doe</rdf:li>
+          </rdf:Seq>
+        </dc:creator>
+      </rdf:Description>
+    </rdf:RDF>
+  </x:xmpmeta>
+`;
+
+const data = await sharp(input)
+  .withXmp(xmpString)
+  .toBuffer();
+```
+
+
+## keepMetadata
+> keepMetadata() ⇒ Sharp
+
+Keep all metadata (EXIF, ICC, XMP, IPTC) from the input image in the output image.
+
+The default behaviour, when `keepMetadata` is not used, is to convert to the device-independent
+sRGB colour space and strip all metadata, including the removal of any ICC profile.
+
+
+**Since**: 0.33.0
+**Example**
+```js
+const outputWithMetadata = await sharp(inputWithMetadata)
+  .keepMetadata()
+  .toBuffer();
+```
+
+
+## withMetadata
+> withMetadata([options]) ⇒ Sharp
+
+Keep most metadata (EXIF, XMP, IPTC) from the input image in the output image.
+
+This will also convert to and add a web-friendly sRGB ICC profile if appropriate.
+
+Allows orientation and density to be set or updated.
+ + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Description | +| --- | --- | --- | +| [options] | Object | | +| [options.orientation] | number | Used to update the EXIF `Orientation` tag, integer between 1 and 8. | +| [options.density] | number | Number of pixels per inch (DPI). | + +**Example** +```js +const outputSrgbWithMetadata = await sharp(inputRgbWithMetadata) + .withMetadata() + .toBuffer(); +``` +**Example** +```js +// Set output metadata to 96 DPI +const data = await sharp(input) + .withMetadata({ density: 96 }) + .toBuffer(); +``` + + +## toFormat +> toFormat(format, options) ⇒ Sharp + +Force output to a given format. + + +**Throws**: + +- Error unsupported format or options + + +| Param | Type | Description | +| --- | --- | --- | +| format | string \| Object | as a string or an Object with an 'id' attribute | +| options | Object | output options | + +**Example** +```js +// Convert any input to PNG output +const data = await sharp(input) + .toFormat('png') + .toBuffer(); +``` + + +## jpeg +> jpeg([options]) ⇒ Sharp + +Use these JPEG options for output image. 
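As an illustration of the `chromaSubsampling` option: '4:2:0' keeps one chroma sample per 2x2 block of pixels, whereas '4:4:4' keeps them all (a sketch only, not libjpeg internals):

```js
// Illustration only: 4:2:0 subsampling averages each 2x2 block of a chroma
// plane, quartering the amount of chroma data.
function subsample420 (chroma, width, height) {
  const out = [];
  for (let y = 0; y < height; y += 2) {
    for (let x = 0; x < width; x += 2) {
      const sum =
        chroma[y * width + x] + chroma[y * width + x + 1] +
        chroma[(y + 1) * width + x] + chroma[(y + 1) * width + x + 1];
      out.push(Math.round(sum / 4));
    }
  }
  return out;
}

// A 2x2 chroma plane collapses to a single averaged sample
console.log(subsample420([100, 104, 96, 100], 2, 2)); // [ 100 ]
```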
+ + +**Throws**: + +- Error Invalid options + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.quality] | number | 80 | quality, integer 1-100 | +| [options.progressive] | boolean | false | use progressive (interlace) scan | +| [options.chromaSubsampling] | string | "'4:2:0'" | set to '4:4:4' to prevent chroma subsampling otherwise defaults to '4:2:0' chroma subsampling | +| [options.optimiseCoding] | boolean | true | optimise Huffman coding tables | +| [options.optimizeCoding] | boolean | true | alternative spelling of optimiseCoding | +| [options.mozjpeg] | boolean | false | use mozjpeg defaults, equivalent to `{ trellisQuantisation: true, overshootDeringing: true, optimiseScans: true, quantisationTable: 3 }` | +| [options.trellisQuantisation] | boolean | false | apply trellis quantisation | +| [options.overshootDeringing] | boolean | false | apply overshoot deringing | +| [options.optimiseScans] | boolean | false | optimise progressive scans, forces progressive | +| [options.optimizeScans] | boolean | false | alternative spelling of optimiseScans | +| [options.quantisationTable] | number | 0 | quantization table to use, integer 0-8 | +| [options.quantizationTable] | number | 0 | alternative spelling of quantisationTable | +| [options.force] | boolean | true | force JPEG output, otherwise attempt to use input format | + +**Example** +```js +// Convert any input to very high quality JPEG output +const data = await sharp(input) + .jpeg({ + quality: 100, + chromaSubsampling: '4:4:4' + }) + .toBuffer(); +``` +**Example** +```js +// Use mozjpeg to reduce output JPEG file size (slower) +const data = await sharp(input) + .jpeg({ mozjpeg: true }) + .toBuffer(); +``` + + +## png +> png([options]) ⇒ Sharp + +Use these PNG options for output image. + +By default, PNG output is full colour at 8 bits per pixel. + +Indexed PNG input at 1, 2 or 4 bits per pixel is converted to 8 bits per pixel. 
+Set `palette` to `true` for slower, indexed PNG output. + +For 16 bits per pixel output, convert to `rgb16` via +[toColourspace](/api-colour/#tocolourspace). + + +**Throws**: + +- Error Invalid options + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | | +| [options.progressive] | boolean | false | use progressive (interlace) scan | +| [options.compressionLevel] | number | 6 | zlib compression level, 0 (fastest, largest) to 9 (slowest, smallest) | +| [options.adaptiveFiltering] | boolean | false | use adaptive row filtering | +| [options.palette] | boolean | false | quantise to a palette-based image with alpha transparency support | +| [options.quality] | number | 100 | use the lowest number of colours needed to achieve given quality, sets `palette` to `true` | +| [options.effort] | number | 7 | CPU effort, between 1 (fastest) and 10 (slowest), sets `palette` to `true` | +| [options.colours] | number | 256 | maximum number of palette entries, sets `palette` to `true` | +| [options.colors] | number | 256 | alternative spelling of `options.colours`, sets `palette` to `true` | +| [options.dither] | number | 1.0 | level of Floyd-Steinberg error diffusion, sets `palette` to `true` | +| [options.force] | boolean | true | force PNG output, otherwise attempt to use input format | + +**Example** +```js +// Convert any input to full colour PNG output +const data = await sharp(input) + .png() + .toBuffer(); +``` +**Example** +```js +// Convert any input to indexed PNG output (slower) +const data = await sharp(input) + .png({ palette: true }) + .toBuffer(); +``` +**Example** +```js +// Output 16 bits per pixel RGB(A) +const data = await sharp(input) + .toColourspace('rgb16') + .png() + .toBuffer(); +``` + + +## webp +> webp([options]) ⇒ Sharp + +Use these WebP options for output image. 
+ + +**Throws**: + +- Error Invalid options + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.quality] | number | 80 | quality, integer 1-100 | +| [options.alphaQuality] | number | 100 | quality of alpha layer, integer 0-100 | +| [options.lossless] | boolean | false | use lossless compression mode | +| [options.nearLossless] | boolean | false | use near_lossless compression mode | +| [options.smartSubsample] | boolean | false | use high quality chroma subsampling | +| [options.smartDeblock] | boolean | false | auto-adjust the deblocking filter, can improve low contrast edges (slow) | +| [options.preset] | string | "'default'" | named preset for preprocessing/filtering, one of: default, photo, picture, drawing, icon, text | +| [options.effort] | number | 4 | CPU effort, between 0 (fastest) and 6 (slowest) | +| [options.loop] | number | 0 | number of animation iterations, use 0 for infinite animation | +| [options.delay] | number \| Array.<number> | | delay(s) between animation frames (in milliseconds) | +| [options.minSize] | boolean | false | prevent use of animation key frames to minimise file size (slow) | +| [options.mixed] | boolean | false | allow mixture of lossy and lossless animation frames (slow) | +| [options.force] | boolean | true | force WebP output, otherwise attempt to use input format | + +**Example** +```js +// Convert any input to lossless WebP output +const data = await sharp(input) + .webp({ lossless: true }) + .toBuffer(); +``` +**Example** +```js +// Optimise the file size of an animated WebP +const outputWebp = await sharp(inputWebp, { animated: true }) + .webp({ effort: 6 }) + .toBuffer(); +``` + + +## gif +> gif([options]) ⇒ Sharp + +Use these GIF options for the output image. + +The first entry in the palette is reserved for transparency. + +The palette of the input image will be re-used if possible. 
+ + +**Throws**: + +- Error Invalid options + +**Since**: 0.30.0 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.reuse] | boolean | true | re-use existing palette, otherwise generate new (slow) | +| [options.progressive] | boolean | false | use progressive (interlace) scan | +| [options.colours] | number | 256 | maximum number of palette entries, including transparency, between 2 and 256 | +| [options.colors] | number | 256 | alternative spelling of `options.colours` | +| [options.effort] | number | 7 | CPU effort, between 1 (fastest) and 10 (slowest) | +| [options.dither] | number | 1.0 | level of Floyd-Steinberg error diffusion, between 0 (least) and 1 (most) | +| [options.interFrameMaxError] | number | 0 | maximum inter-frame error for transparency, between 0 (lossless) and 32 | +| [options.interPaletteMaxError] | number | 3 | maximum inter-palette error for palette reuse, between 0 and 256 | +| [options.keepDuplicateFrames] | boolean | false | keep duplicate frames in the output instead of combining them | +| [options.loop] | number | 0 | number of animation iterations, use 0 for infinite animation | +| [options.delay] | number \| Array.<number> | | delay(s) between animation frames (in milliseconds) | +| [options.force] | boolean | true | force GIF output, otherwise attempt to use input format | + +**Example** +```js +// Convert PNG to GIF +await sharp(pngBuffer) + .gif() + .toBuffer(); +``` +**Example** +```js +// Convert animated WebP to animated GIF +await sharp('animated.webp', { animated: true }) + .toFile('animated.gif'); +``` +**Example** +```js +// Create a 128x128, cropped, non-dithered, animated thumbnail of an animated GIF +const out = await sharp('in.gif', { animated: true }) + .resize({ width: 128, height: 128 }) + .gif({ dither: 0 }) + .toBuffer(); +``` +**Example** +```js +// Lossy file size reduction of animated GIF +await sharp('in.gif', { animated: true }) + .gif({ 
interFrameMaxError: 8 }) + .toFile('optim.gif'); +``` + + +## jp2 +> jp2([options]) ⇒ Sharp + +Use these JP2 options for output image. + +Requires libvips compiled with support for OpenJPEG. +The prebuilt binaries do not include this - see +[installing a custom libvips](/install#custom-libvips). + + +**Throws**: + +- Error Invalid options + +**Since**: 0.29.1 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.quality] | number | 80 | quality, integer 1-100 | +| [options.lossless] | boolean | false | use lossless compression mode | +| [options.tileWidth] | number | 512 | horizontal tile size | +| [options.tileHeight] | number | 512 | vertical tile size | +| [options.chromaSubsampling] | string | "'4:4:4'" | set to '4:2:0' to use chroma subsampling | + +**Example** +```js +// Convert any input to lossless JP2 output +const data = await sharp(input) + .jp2({ lossless: true }) + .toBuffer(); +``` +**Example** +```js +// Convert any input to very high quality JP2 output +const data = await sharp(input) + .jp2({ + quality: 100, + chromaSubsampling: '4:4:4' + }) + .toBuffer(); +``` + + +## tiff +> tiff([options]) ⇒ Sharp + +Use these TIFF options for output image. + +The `density` can be set in pixels/inch via [withMetadata](#withmetadata) +instead of providing `xres` and `yres` in pixels/mm. 
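The two resolution conventions differ only by the 25.4 mm-per-inch factor (hypothetical helper names):

```js
// Convert between pixels/inch (the density set via withMetadata) and
// pixels/mm (the xres/yres values used by the TIFF options). 1 inch = 25.4 mm.
const dpiToPixelsPerMm = (dpi) => dpi / 25.4;
const pixelsPerMmToDpi = (pixelsPerMm) => pixelsPerMm * 25.4;

console.log(dpiToPixelsPerMm(254).toFixed(2)); // 10.00
```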
+ + +**Throws**: + +- Error Invalid options + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.quality] | number | 80 | quality, integer 1-100 | +| [options.force] | boolean | true | force TIFF output, otherwise attempt to use input format | +| [options.compression] | string | "'jpeg'" | compression options: none, jpeg, deflate, packbits, ccittfax4, lzw, webp, zstd, jp2k | +| [options.bigtiff] | boolean | false | use BigTIFF variant (has no effect when compression is none) | +| [options.predictor] | string | "'horizontal'" | compression predictor options: none, horizontal, float | +| [options.pyramid] | boolean | false | write an image pyramid | +| [options.tile] | boolean | false | write a tiled tiff | +| [options.tileWidth] | number | 256 | horizontal tile size | +| [options.tileHeight] | number | 256 | vertical tile size | +| [options.xres] | number | 1.0 | horizontal resolution in pixels/mm | +| [options.yres] | number | 1.0 | vertical resolution in pixels/mm | +| [options.resolutionUnit] | string | "'inch'" | resolution unit options: inch, cm | +| [options.bitdepth] | number | 8 | reduce bitdepth to 1, 2 or 4 bit | +| [options.miniswhite] | boolean | false | write 1-bit images as miniswhite | + +**Example** +```js +// Convert SVG input to LZW-compressed, 1 bit per pixel TIFF output +sharp('input.svg') + .tiff({ + compression: 'lzw', + bitdepth: 1 + }) + .toFile('1-bpp-output.tiff') + .then(info => { ... }); +``` + + +## avif +> avif([options]) ⇒ Sharp + +Use these AVIF options for output image. + +AVIF image sequences are not supported. +Prebuilt binaries support a bitdepth of 8 only. + +This feature is experimental on the Windows ARM64 platform +and requires a CPU with ARM64v8.4 or later. 
+ + +**Throws**: + +- Error Invalid options + +**Since**: 0.27.0 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.quality] | number | 50 | quality, integer 1-100 | +| [options.lossless] | boolean | false | use lossless compression | +| [options.effort] | number | 4 | CPU effort, between 0 (fastest) and 9 (slowest) | +| [options.chromaSubsampling] | string | "'4:4:4'" | set to '4:2:0' to use chroma subsampling | +| [options.bitdepth] | number | 8 | set bitdepth to 8, 10 or 12 bit | + +**Example** +```js +const data = await sharp(input) + .avif({ effort: 2 }) + .toBuffer(); +``` +**Example** +```js +const data = await sharp(input) + .avif({ lossless: true }) + .toBuffer(); +``` + + +## heif +> heif(options) ⇒ Sharp + +Use these HEIF options for output image. + +Support for patent-encumbered HEIC images using `hevc` compression requires the use of a +globally-installed libvips compiled with support for libheif, libde265 and x265. + + +**Throws**: + +- Error Invalid options + +**Since**: 0.23.0 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| options | Object | | output options | +| options.compression | string | | compression format: av1, hevc | +| [options.quality] | number | 50 | quality, integer 1-100 | +| [options.lossless] | boolean | false | use lossless compression | +| [options.effort] | number | 4 | CPU effort, between 0 (fastest) and 9 (slowest) | +| [options.chromaSubsampling] | string | "'4:4:4'" | set to '4:2:0' to use chroma subsampling | +| [options.bitdepth] | number | 8 | set bitdepth to 8, 10 or 12 bit | + +**Example** +```js +const data = await sharp(input) + .heif({ compression: 'hevc' }) + .toBuffer(); +``` + + +## jxl +> jxl([options]) ⇒ Sharp + +Use these JPEG-XL (JXL) options for output image. + +This feature is experimental, please do not use in production systems. + +Requires libvips compiled with support for libjxl. 
+The prebuilt binaries do not include this - see +[installing a custom libvips](/install/#custom-libvips). + + +**Throws**: + +- Error Invalid options + +**Since**: 0.31.3 + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.distance] | number | 1.0 | maximum encoding error, between 0 (highest quality) and 15 (lowest quality) | +| [options.quality] | number | | calculate `distance` based on JPEG-like quality, between 1 and 100, overrides distance if specified | +| [options.decodingTier] | number | 0 | target decode speed tier, between 0 (highest quality) and 4 (lowest quality) | +| [options.lossless] | boolean | false | use lossless compression | +| [options.effort] | number | 7 | CPU effort, between 1 (fastest) and 9 (slowest) | +| [options.loop] | number | 0 | number of animation iterations, use 0 for infinite animation | +| [options.delay] | number \| Array.<number> | | delay(s) between animation frames (in milliseconds) | + + + +## raw +> raw([options]) ⇒ Sharp + +Force output to be raw, uncompressed pixel data. +Pixel ordering is left-to-right, top-to-bottom, without padding. +Channel ordering will be RGB or RGBA for non-greyscale colourspaces. 
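+
+Because there is no padding, the byte length of the output is predictable. A sketch, assuming the default 8-bit `uchar` depth and a placeholder `input`:
+
+```js
+// Sketch: raw buffer length is width × height × channels bytes
+const { data, info } = await sharp(input)
+  .raw()
+  .toBuffer({ resolveWithObject: true });
+// data.length === info.width * info.height * info.channels
+```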
+ + +**Throws**: + +- Error Invalid options + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | output options | +| [options.depth] | string | "'uchar'" | bit depth, one of: char, uchar (default), short, ushort, int, uint, float, complex, double, dpcomplex | + +**Example** +```js +// Extract raw, unsigned 8-bit RGB pixel data from JPEG input +const { data, info } = await sharp('input.jpg') + .raw() + .toBuffer({ resolveWithObject: true }); +``` +**Example** +```js +// Extract alpha channel as raw, unsigned 16-bit pixel data from PNG input +const data = await sharp('input.png') + .ensureAlpha() + .extractChannel(3) + .toColourspace('b-w') + .raw({ depth: 'ushort' }) + .toBuffer(); +``` + + +## tile +> tile([options]) ⇒ Sharp + +Use tile-based deep zoom (image pyramid) output. + +Set the format and options for tile images via the `toFormat`, `jpeg`, `png` or `webp` functions. +Use a `.zip` or `.szi` file extension with `toFile` to write to a compressed archive file format. + +The container will be set to `zip` when the output is a Buffer or Stream, otherwise it will default to `fs`. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | | +| [options.size] | number | 256 | tile size in pixels, a value between 1 and 8192. | +| [options.overlap] | number | 0 | tile overlap in pixels, a value between 0 and 8192. | +| [options.angle] | number | 0 | tile angle of rotation, must be a multiple of 90. | +| [options.background] | string \| Object | "{r: 255, g: 255, b: 255, alpha: 1}" | background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to white without transparency. | +| [options.depth] | string | | how deep to make the pyramid, possible values are `onepixel`, `onetile` or `one`, default based on layout. | +| [options.skipBlanks] | number | -1 | Threshold to skip tile generation. 
Range is 0-255 for 8-bit images, 0-65535 for 16-bit images. Default is 5 for `google` layout, -1 (no skip) otherwise. | +| [options.container] | string | "'fs'" | tile container, with value `fs` (filesystem) or `zip` (compressed file). | +| [options.layout] | string | "'dz'" | filesystem layout, possible values are `dz`, `iiif`, `iiif3`, `zoomify` or `google`. | +| [options.centre] | boolean | false | centre image in tile. | +| [options.center] | boolean | false | alternative spelling of centre. | +| [options.id] | string | "'https://example.com/iiif'" | when `layout` is `iiif`/`iiif3`, sets the `@id`/`id` attribute of `info.json` | +| [options.basename] | string | | the name of the directory within the zip file when container is `zip`. | + +**Example** +```js +sharp('input.tiff') + .png() + .tile({ + size: 512 + }) + .toFile('output.dz', function(err, info) { + // output.dzi is the Deep Zoom XML definition + // output_files contains 512x512 tiles grouped by zoom level + }); +``` +**Example** +```js +const zipFileWithTiles = await sharp(input) + .tile({ basename: "tiles" }) + .toBuffer(); +``` +**Example** +```js +const iiififier = sharp().tile({ layout: "iiif" }); +readableStream + .pipe(iiififier) + .pipe(writeableStream); +``` + + +## timeout +> timeout(options) ⇒ Sharp + +Set a timeout for processing, in seconds. +Use a value of zero to continue processing indefinitely, the default behaviour. + +The clock starts when libvips opens an input image for processing. +Time spent waiting for a libuv thread to become available is not included. + + +**Since**: 0.29.2 + +| Param | Type | Description | +| --- | --- | --- | +| options | Object | | +| options.seconds | number | Number of seconds after which processing will be stopped | + +**Example** +```js +// Ensure processing takes no longer than 3 seconds +try { + const data = await sharp(input) + .blur(1000) + .timeout({ seconds: 3 }) + .toBuffer(); +} catch (err) { + if (err.message.includes('timeout')) { ... 
} +} +``` \ No newline at end of file diff --git a/docs/src/content/docs/api-resize.md b/docs/src/content/docs/api-resize.md new file mode 100644 index 000000000..6452c3455 --- /dev/null +++ b/docs/src/content/docs/api-resize.md @@ -0,0 +1,323 @@ +--- +# This file was auto-generated from JSDoc in lib/resize.js +title: Resizing images +--- + +## resize +> resize([width], [height], [options]) ⇒ Sharp + +Resize image to `width`, `height` or `width x height`. + +When both a `width` and `height` are provided, the possible methods by which the image should **fit** these are: +- `cover`: (default) Preserving aspect ratio, attempt to ensure the image covers both provided dimensions by cropping/clipping to fit. +- `contain`: Preserving aspect ratio, contain within both provided dimensions using "letterboxing" where necessary. +- `fill`: Ignore the aspect ratio of the input and stretch to both provided dimensions. +- `inside`: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified. +- `outside`: Preserving aspect ratio, resize the image to be as small as possible while ensuring its dimensions are greater than or equal to both those specified. + +Some of these values are based on the [object-fit](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit) CSS property. + +Examples of various values for the fit property when resizing + +When using a **fit** of `cover` or `contain`, the default **position** is `centre`. Other options are: +- `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`. +- `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`. +- `sharp.strategy`: `cover` only, dynamically crop using either the `entropy` or `attention` strategy. 
+ +Some of these values are based on the [object-position](https://developer.mozilla.org/en-US/docs/Web/CSS/object-position) CSS property. + +The strategy-based approach initially resizes so one dimension is at its target length +then repeatedly ranks edge regions, discarding the edge with the lowest score based on the selected strategy. +- `entropy`: focus on the region with the highest [Shannon entropy](https://en.wikipedia.org/wiki/Entropy_%28information_theory%29). +- `attention`: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones. + +Possible downsizing kernels are: +- `nearest`: Use [nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). +- `linear`: Use a [triangle filter](https://en.wikipedia.org/wiki/Triangular_function). +- `cubic`: Use a [Catmull-Rom spline](https://en.wikipedia.org/wiki/Centripetal_Catmull%E2%80%93Rom_spline). +- `mitchell`: Use a [Mitchell-Netravali spline](https://www.cs.utexas.edu/~fussell/courses/cs384g-fall2013/lectures/mitchell/Mitchell.pdf). +- `lanczos2`: Use a [Lanczos kernel](https://en.wikipedia.org/wiki/Lanczos_resampling#Lanczos_kernel) with `a=2`. +- `lanczos3`: Use a Lanczos kernel with `a=3` (the default). +- `mks2013`: Use a [Magic Kernel Sharp](https://johncostella.com/magic/mks.pdf) 2013 kernel, as adopted by Facebook. +- `mks2021`: Use a Magic Kernel Sharp 2021 kernel, with more accurate (reduced) sharpening than the 2013 version. + +When upsampling, these kernels map to `nearest`, `linear` and `cubic` interpolators. +Downsampling kernels without a matching upsampling interpolator map to `cubic`. + +Only one resize can occur per pipeline. +Previous calls to `resize` in the same pipeline will be ignored. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [width] | number | | How many pixels wide the resultant image should be. 
Use `null` or `undefined` to auto-scale the width to match the height. | +| [height] | number | | How many pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width. | +| [options] | Object | | | +| [options.width] | number | | An alternative means of specifying `width`. If both are present this takes priority. | +| [options.height] | number | | An alternative means of specifying `height`. If both are present this takes priority. | +| [options.fit] | String | 'cover' | How the image should be resized/cropped to fit the target dimension(s), one of `cover`, `contain`, `fill`, `inside` or `outside`. | +| [options.position] | String | 'centre' | A position, gravity or strategy to use when `fit` is `cover` or `contain`. | +| [options.background] | String \| Object | {r: 0, g: 0, b: 0, alpha: 1} | background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. | +| [options.kernel] | String | 'lanczos3' | The kernel to use for image reduction and the inferred interpolator to use for upsampling. Use the `fastShrinkOnLoad` option to control kernel vs shrink-on-load. | +| [options.withoutEnlargement] | Boolean | false | Do not scale up if the width *or* height are already less than the target dimensions, equivalent to GraphicsMagick's `>` geometry option. This may result in output dimensions smaller than the target dimensions. | +| [options.withoutReduction] | Boolean | false | Do not scale down if the width *or* height are already greater than the target dimensions, equivalent to GraphicsMagick's `<` geometry option. This may still result in a crop to reach the target dimensions. | +| [options.fastShrinkOnLoad] | Boolean | true | Take greater advantage of the JPEG and WebP shrink-on-load feature, which can lead to a slight moiré pattern or round-down of an auto-scaled dimension. 
| + +**Example** +```js +sharp(input) + .resize({ width: 100 }) + .toBuffer() + .then(data => { + // 100 pixels wide, auto-scaled height + }); +``` +**Example** +```js +sharp(input) + .resize({ height: 100 }) + .toBuffer() + .then(data => { + // 100 pixels high, auto-scaled width + }); +``` +**Example** +```js +sharp(input) + .resize(200, 300, { + kernel: sharp.kernel.nearest, + fit: 'contain', + position: 'right top', + background: { r: 255, g: 255, b: 255, alpha: 0.5 } + }) + .toFile('output.png') + .then(() => { + // output.png is a 200 pixels wide and 300 pixels high image + // containing a nearest-neighbour scaled version + // contained within the north-east corner of a semi-transparent white canvas + }); +``` +**Example** +```js +const transformer = sharp() + .resize({ + width: 200, + height: 200, + fit: sharp.fit.cover, + position: sharp.strategy.entropy + }); +// Read image data from readableStream +// Write 200px square auto-cropped image data to writableStream +readableStream + .pipe(transformer) + .pipe(writableStream); +``` +**Example** +```js +sharp(input) + .resize(200, 200, { + fit: sharp.fit.inside, + withoutEnlargement: true + }) + .toFormat('jpeg') + .toBuffer() + .then(function(outputBuffer) { + // outputBuffer contains JPEG image data + // no wider and no higher than 200 pixels + // and no larger than the input image + }); +``` +**Example** +```js +sharp(input) + .resize(200, 200, { + fit: sharp.fit.outside, + withoutReduction: true + }) + .toFormat('jpeg') + .toBuffer() + .then(function(outputBuffer) { + // outputBuffer contains JPEG image data + // of at least 200 pixels wide and 200 pixels high while maintaining aspect ratio + // and no smaller than the input image + }); +``` +**Example** +```js +const scaleByHalf = await sharp(input) + .metadata() + .then(({ width }) => sharp(input) + .resize(Math.round(width * 0.5)) + .toBuffer() + ); +``` + + +## extend +> extend(extend) ⇒ Sharp + +Extend / pad / extrude one or more edges of the image with 
either +the provided background colour or pixels derived from the image. +This operation will always occur after resizing and extraction, if any. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| extend | number \| Object | | single pixel count to add to all edges or an Object with per-edge counts | +| [extend.top] | number | 0 | | +| [extend.left] | number | 0 | | +| [extend.bottom] | number | 0 | | +| [extend.right] | number | 0 | | +| [extend.extendWith] | String | 'background' | populate new pixels using this method, one of: background, copy, repeat, mirror. | +| [extend.background] | String \| Object | {r: 0, g: 0, b: 0, alpha: 1} | background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. | + +**Example** +```js +// Resize to 140 pixels wide, then add 10 transparent pixels +// to the top, left and right edges and 20 to the bottom edge +sharp(input) + .resize(140) + .extend({ + top: 10, + bottom: 20, + left: 10, + right: 10, + background: { r: 0, g: 0, b: 0, alpha: 0 } + }) + ... +``` +**Example** +```js +// Add a row of 10 red pixels to the bottom +sharp(input) + .extend({ + bottom: 10, + background: 'red' + }) + ... +``` +**Example** +```js +// Extrude image by 8 pixels to the right, mirroring existing right hand edge +sharp(input) + .extend({ + right: 8, + background: 'mirror' + }) + ... +``` + + +## extract +> extract(options) ⇒ Sharp + +Extract/crop a region of the image. + +- Use `extract` before `resize` for pre-resize extraction. +- Use `extract` after `resize` for post-resize extraction. +- Use `extract` twice and `resize` once for extract-then-resize-then-extract in a fixed operation order. 
+ + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Description | +| --- | --- | --- | +| options | Object | describes the region to extract using integral pixel values | +| options.left | number | zero-indexed offset from left edge | +| options.top | number | zero-indexed offset from top edge | +| options.width | number | width of region to extract | +| options.height | number | height of region to extract | + +**Example** +```js +sharp(input) + .extract({ left: left, top: top, width: width, height: height }) + .toFile(output, function(err) { + // Extract a region of the input image, saving in the same format. + }); +``` +**Example** +```js +sharp(input) + .extract({ left: leftOffsetPre, top: topOffsetPre, width: widthPre, height: heightPre }) + .resize(width, height) + .extract({ left: leftOffsetPost, top: topOffsetPost, width: widthPost, height: heightPost }) + .toFile(output, function(err) { + // Extract a region, resize, then extract from the resized image + }); +``` + + +## trim +> trim([options]) ⇒ Sharp + +Trim pixels from all edges that contain values similar to the given background colour, which defaults to that of the top-left pixel. + +Images with an alpha channel will use the combined bounding box of alpha and non-alpha channels. + +If the result of this operation would trim an image to nothing then no change is made. + +The `info` response Object will contain `trimOffsetLeft` and `trimOffsetTop` properties. + + +**Throws**: + +- Error Invalid parameters + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object | | | +| [options.background] | string \| Object | "'top-left pixel'" | Background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to that of the top-left pixel. | +| [options.threshold] | number | 10 | Allowed difference from the above colour, a positive number. | +| [options.lineArt] | boolean | false | Does the input more closely resemble line art (e.g. 
vector) rather than being photographic? | + +**Example** +```js +// Trim pixels with a colour similar to that of the top-left pixel. +await sharp(input) + .trim() + .toFile(output); +``` +**Example** +```js +// Trim pixels with the exact same colour as that of the top-left pixel. +await sharp(input) + .trim({ + threshold: 0 + }) + .toFile(output); +``` +**Example** +```js +// Assume input is line art and trim only pixels with a similar colour to red. +const output = await sharp(input) + .trim({ + background: "#FF0000", + lineArt: true + }) + .toBuffer(); +``` +**Example** +```js +// Trim all "yellow-ish" pixels, being more lenient with the higher threshold. +const output = await sharp(input) + .trim({ + background: "yellow", + threshold: 42, + }) + .toBuffer(); +``` \ No newline at end of file diff --git a/docs/src/content/docs/api-utility.md b/docs/src/content/docs/api-utility.md new file mode 100644 index 000000000..73190fd01 --- /dev/null +++ b/docs/src/content/docs/api-utility.md @@ -0,0 +1,232 @@ +--- +# This file was auto-generated from JSDoc in lib/utility.js +title: Global properties +--- + +## versions +> versions + +An Object containing the version numbers of sharp, libvips +and (when using prebuilt binaries) its dependencies. + + +**Example** +```js +console.log(sharp.versions); +``` + + +## interpolators +> interpolators : enum + +An Object containing the available interpolators and their proper values + + +**Read only**: true +**Properties** + +| Name | Type | Default | Description | +| --- | --- | --- | --- | +| nearest | string | "nearest" | [Nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). Suitable for image enlargement only. | +| bilinear | string | "bilinear" | [Bilinear interpolation](http://en.wikipedia.org/wiki/Bilinear_interpolation). Faster than bicubic but with less smooth results. 
| +| bicubic | string | "bicubic" | [Bicubic interpolation](http://en.wikipedia.org/wiki/Bicubic_interpolation) (the default). | +| locallyBoundedBicubic | string | "lbb" | [LBB interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/lbb.cpp#L100). Prevents some "[acutance](http://en.wikipedia.org/wiki/Acutance)" but typically reduces performance by a factor of 2. | +| nohalo | string | "nohalo" | [Nohalo interpolation](http://eprints.soton.ac.uk/268086/). Prevents acutance but typically reduces performance by a factor of 3. | +| vertexSplitQuadraticBasisSpline | string | "vsqbs" | [VSQBS interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/vsqbs.cpp#L48). Prevents "staircasing" when enlarging. | + + + +## format +> format ⇒ Object + +An Object containing nested boolean values representing the available input and output formats/methods. + + +**Example** +```js +console.log(sharp.format); +``` + + +## queue +> queue + +An EventEmitter that emits a `change` event when a task is either: +- queued, waiting for _libuv_ to provide a worker thread +- complete + + +**Example** +```js +sharp.queue.on('change', function(queueLength) { + console.log('Queue contains ' + queueLength + ' task(s)'); +}); +``` + + +## cache +> cache([options]) ⇒ Object + +Gets or, when options are provided, sets the limits of _libvips'_ operation cache. +Existing entries in the cache will be trimmed after any change in limits. +This method always returns cache statistics, +useful for determining how much working memory is required for a particular task. 
+ + + +| Param | Type | Default | Description | +| --- | --- | --- | --- | +| [options] | Object \| boolean | true | Object with the following attributes, or boolean where true uses default cache settings and false removes all caching | +| [options.memory] | number | 50 | is the maximum memory in MB to use for this cache | +| [options.files] | number | 20 | is the maximum number of files to hold open | +| [options.items] | number | 100 | is the maximum number of operations to cache | + +**Example** +```js +const stats = sharp.cache(); +``` +**Example** +```js +sharp.cache( { items: 200 } ); +sharp.cache( { files: 0 } ); +sharp.cache(false); +``` + + +## concurrency +> concurrency([concurrency]) ⇒ number + +Gets or, when a concurrency is provided, sets +the maximum number of threads _libvips_ should use to process _each image_. +These are from a thread pool managed by glib, +which helps avoid the overhead of creating new threads. + +This method always returns the current concurrency. + +The default value is the number of CPU cores, +except when using glibc-based Linux without jemalloc, +where the default is `1` to help reduce memory fragmentation. + +A value of `0` will reset this to the number of CPU cores. + +Some image format libraries spawn additional threads, +e.g. libaom manages its own 4 threads when encoding AVIF images, +and these are independent of the value set here. + +:::note +Further [control over performance](/performance/) is available. +::: + + +**Returns**: number - concurrency + +| Param | Type | +| --- | --- | +| [concurrency] | number | + +**Example** +```js +const threads = sharp.concurrency(); // 4 +sharp.concurrency(2); // 2 +sharp.concurrency(0); // 4 +``` + + +## counters +> counters() ⇒ Object + +Provides access to internal task counters. +- queue is the number of tasks this module has queued waiting for _libuv_ to provide a worker thread from its pool. +- process is the number of resize tasks currently being processed. 
+ + +**Example** +```js +const counters = sharp.counters(); // { queue: 2, process: 4 } +``` + + +## simd +> simd([simd]) ⇒ boolean + +Get and set use of SIMD vector unit instructions. +Requires libvips to have been compiled with highway support. + +Improves the performance of `resize`, `blur` and `sharpen` operations +by taking advantage of the SIMD vector unit of the CPU, e.g. Intel SSE and ARM NEON. + + + +| Param | Type | Default | +| --- | --- | --- | +| [simd] | boolean | true | + +**Example** +```js +const simd = sharp.simd(); +// simd is `true` if the runtime use of highway is currently enabled +``` +**Example** +```js +const simd = sharp.simd(false); +// prevent libvips from using highway at runtime +``` + + +## block +> block(options) + +Block libvips operations at runtime. + +This is in addition to the `VIPS_BLOCK_UNTRUSTED` environment variable, +which when set will block all "untrusted" operations. + + +**Since**: 0.32.4 + +| Param | Type | Description | +| --- | --- | --- | +| options | Object | | +| options.operation | Array.<string> | List of libvips low-level operation names to block. | + +**Example** *(Block all TIFF input.)* +```js +sharp.block({ + operation: ['VipsForeignLoadTiff'] +}); +``` + + +## unblock +> unblock(options) + +Unblock libvips operations at runtime. + +This is useful for defining a list of allowed operations. + + +**Since**: 0.32.4 + +| Param | Type | Description | +| --- | --- | --- | +| options | Object | | +| options.operation | Array.<string> | List of libvips low-level operation names to unblock. 
| + +**Example** *(Block all input except WebP from the filesystem.)* +```js +sharp.block({ + operation: ['VipsForeignLoad'] +}); +sharp.unblock({ + operation: ['VipsForeignLoadWebpFile'] +}); +``` +**Example** *(Block all input except JPEG and PNG from a Buffer or Stream.)* +```js +sharp.block({ + operation: ['VipsForeignLoad'] +}); +sharp.unblock({ + operation: ['VipsForeignLoadJpegBuffer', 'VipsForeignLoadPngBuffer'] +}); +``` \ No newline at end of file diff --git a/docs/src/content/docs/changelog/_meta.yml b/docs/src/content/docs/changelog/_meta.yml new file mode 100644 index 000000000..5dd257241 --- /dev/null +++ b/docs/src/content/docs/changelog/_meta.yml @@ -0,0 +1 @@ +sort: reverse-slug diff --git a/docs/src/content/docs/changelog/v0.10.0.md b/docs/src/content/docs/changelog/v0.10.0.md new file mode 100644 index 000000000..aae2ca0ee --- /dev/null +++ b/docs/src/content/docs/changelog/v0.10.0.md @@ -0,0 +1,22 @@ +--- +title: v0.10.0 - 23rd April 2015 +slug: changelog/v0.10.0 +--- + +* Add support for Windows (x86). + [#19](https://github.com/lovell/sharp/issues/19) + [@DullReferenceException](https://github.com/DullReferenceException) + [@itsananderson](https://github.com/itsananderson) + +* Add support for Openslide input and DeepZoom output. + [#146](https://github.com/lovell/sharp/issues/146) + [@mvictoras](https://github.com/mvictoras) + +* Allow arbitrary aspect ratios when resizing images via new `ignoreAspectRatio` method. + [#192](https://github.com/lovell/sharp/issues/192) + [@skedastik](https://github.com/skedastik) + +* Enhance output image contrast by stretching its luminance to cover the full dynamic range via new `normalize` method. 
+ [#194](https://github.com/lovell/sharp/issues/194) + [@bkw](https://github.com/bkw) + [@codingforce](https://github.com/codingforce) diff --git a/docs/src/content/docs/changelog/v0.10.1.md b/docs/src/content/docs/changelog/v0.10.1.md new file mode 100644 index 000000000..6f0336861 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.10.1.md @@ -0,0 +1,12 @@ +--- +title: v0.10.1 - 1st June 2015 +slug: changelog/v0.10.1 +--- + +* Allow embed of image with alpha transparency onto non-transparent background. + [#204](https://github.com/lovell/sharp/issues/204) + [@mikemliu](https://github.com/mikemliu) + +* Include C standard library for `atoi` as Xcode 6.3 appears to no longer do this. + [#228](https://github.com/lovell/sharp/issues/228) + [@doggan](https://github.com/doggan) diff --git a/docs/src/content/docs/changelog/v0.11.0.md b/docs/src/content/docs/changelog/v0.11.0.md new file mode 100644 index 000000000..2b07a93a5 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.11.0.md @@ -0,0 +1,30 @@ +--- +title: v0.11.0 - 15th July 2015 +slug: changelog/v0.11.0 +--- + +* Allow alpha transparency compositing via new `overlayWith` method. + [#97](https://github.com/lovell/sharp/issues/97) + [@gasi](https://github.com/gasi) + +* Expose raw ICC profile data as a Buffer when using `metadata`. + [#129](https://github.com/lovell/sharp/issues/129) + [@homerjam](https://github.com/homerjam) + +* Allow image header updates via a parameter passed to existing `withMetadata` method. + Provide initial support for EXIF `Orientation` tag, + which if present is now removed when using `rotate`, `flip` or `flop`. + [#189](https://github.com/lovell/sharp/issues/189) + [@h2non](https://github.com/h2non) + +* Tighten constructor parameter checks. 
+ [#221](https://github.com/lovell/sharp/issues/221) + [@mikemorris](https://github.com/mikemorris) + +* Allow one input Stream to be shared with two or more output Streams via new `clone` method. + [#235](https://github.com/lovell/sharp/issues/235) + [@jaubourg](https://github.com/jaubourg) + +* Use `round` instead of `floor` when auto-scaling dimensions to avoid floating-point rounding errors. + [#238](https://github.com/lovell/sharp/issues/238) + [@richardadjogah](https://github.com/richardadjogah) diff --git a/docs/src/content/docs/changelog/v0.11.1.md b/docs/src/content/docs/changelog/v0.11.1.md new file mode 100644 index 000000000..44bdfc925 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.11.1.md @@ -0,0 +1,12 @@ +--- +title: v0.11.1 - 12th August 2015 +slug: changelog/v0.11.1 +--- + +* Silence MSVC warning: "C4530: C++ exception handler used, but unwind semantics are not enabled". + [#244](https://github.com/lovell/sharp/pull/244) + [@TheThing](https://github.com/TheThing) + +* Suppress gamma correction for input image with alpha transparency. + [#249](https://github.com/lovell/sharp/issues/249) + [@compeak](https://github.com/compeak) diff --git a/docs/src/content/docs/changelog/v0.11.2.md b/docs/src/content/docs/changelog/v0.11.2.md new file mode 100644 index 000000000..9990b05f7 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.11.2.md @@ -0,0 +1,10 @@ +--- +title: v0.11.2 - 28th August 2015 +slug: changelog/v0.11.2 +--- + +* Allow crop gravity to be provided as a String. + [#255](https://github.com/lovell/sharp/pull/255) + [@papandreou](https://github.com/papandreou) +* Add support for io.js v3 and Node v4. 
+ [#246](https://github.com/lovell/sharp/issues/246) diff --git a/docs/src/content/docs/changelog/v0.11.3.md b/docs/src/content/docs/changelog/v0.11.3.md new file mode 100644 index 000000000..a062d1b8b --- /dev/null +++ b/docs/src/content/docs/changelog/v0.11.3.md @@ -0,0 +1,8 @@ +--- +title: v0.11.3 - 8th September 2015 +slug: changelog/v0.11.3 +--- + +* Interpret blurSigma, sharpenFlat, and sharpenJagged as double precision. + [#263](https://github.com/lovell/sharp/pull/263) + [@chrisriley](https://github.com/chrisriley) diff --git a/docs/src/content/docs/changelog/v0.11.4.md b/docs/src/content/docs/changelog/v0.11.4.md new file mode 100644 index 000000000..f8cba2cd6 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.11.4.md @@ -0,0 +1,16 @@ +--- +title: v0.11.4 - 5th November 2015 +slug: changelog/v0.11.4 +--- + +* Add corners, e.g. `northeast`, to existing `gravity` option. + [#291](https://github.com/lovell/sharp/pull/291) + [@brandonaaron](https://github.com/brandonaaron) + +* Ensure correct auto-rotation for EXIF Orientation values 2 and 4. + [#288](https://github.com/lovell/sharp/pull/288) + [@brandonaaron](https://github.com/brandonaaron) + +* Make static linking possible via `--runtime_link` install option. + [#287](https://github.com/lovell/sharp/pull/287) + [@vlapo](https://github.com/vlapo) diff --git a/docs/src/content/docs/changelog/v0.12.0.md b/docs/src/content/docs/changelog/v0.12.0.md new file mode 100644 index 000000000..c081be9de --- /dev/null +++ b/docs/src/content/docs/changelog/v0.12.0.md @@ -0,0 +1,38 @@ +--- +title: v0.12.0 - 23rd November 2015 +slug: changelog/v0.12.0 +--- + +* Bundle pre-compiled libvips and its dependencies for 64-bit Linux and Windows. + [#42](https://github.com/lovell/sharp/issues/42) + +* Take advantage of libvips v8.1.0+ features.
+ [#152](https://github.com/lovell/sharp/issues/152) + +* Add support for 64-bit Windows. Drop support for 32-bit Windows. + [#224](https://github.com/lovell/sharp/issues/224) + [@sabrehagen](https://github.com/sabrehagen) + +* Switch default interpolator to bicubic. + [#289](https://github.com/lovell/sharp/issues/289) + [@mahnunchik](https://github.com/mahnunchik) + +* Pre-extract rotation should not swap width/height. + [#296](https://github.com/lovell/sharp/issues/296) + [@asilvas](https://github.com/asilvas) + +* Ensure 16-bit+alpha input images are (un)premultiplied correctly. + [#301](https://github.com/lovell/sharp/issues/301) + [@izaakschroeder](https://github.com/izaakschroeder) + +* Add `threshold` operation. + [#303](https://github.com/lovell/sharp/pull/303) + [@dacarley](https://github.com/dacarley) + +* Add `negate` operation. + [#306](https://github.com/lovell/sharp/pull/306) + [@dacarley](https://github.com/dacarley) + +* Support `options` Object with existing `extract` operation. + [#309](https://github.com/lovell/sharp/pull/309) + [@papandreou](https://github.com/papandreou) diff --git a/docs/src/content/docs/changelog/v0.12.1.md b/docs/src/content/docs/changelog/v0.12.1.md new file mode 100644 index 000000000..61fb46a80 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.12.1.md @@ -0,0 +1,15 @@ +--- +title: v0.12.1 - 12th December 2015 +slug: changelog/v0.12.1 +--- + +* Allow use of SIMD vector instructions (via liborc) to be toggled on/off. + [#172](https://github.com/lovell/sharp/issues/172) + [@bkw](https://github.com/bkw) + [@puzrin](https://github.com/puzrin) + +* Ensure embedded ICC profiles output with perceptual intent.
+ [#321](https://github.com/lovell/sharp/issues/321) + [@vlapo](https://github.com/vlapo) + +* Use the NPM-configured HTTPS proxy, if any, for binary downloads. diff --git a/docs/src/content/docs/changelog/v0.12.2.md b/docs/src/content/docs/changelog/v0.12.2.md new file mode 100644 index 000000000..91dd927de --- /dev/null +++ b/docs/src/content/docs/changelog/v0.12.2.md @@ -0,0 +1,20 @@ +--- +title: v0.12.2 - 16th January 2016 +slug: changelog/v0.12.2 +--- + +* Upgrade libvips to v8.2.0 for improved vips_shrink. + +* Add pre-compiled libvips for ARMv6+ CPUs. + +* Ensure 16-bit input images work with embed option. + [#325](https://github.com/lovell/sharp/issues/325) + [@janaz](https://github.com/janaz) + +* Allow compilation with gmake to provide FreeBSD support. + [#326](https://github.com/lovell/sharp/issues/326) + [@c0decafe](https://github.com/c0decafe) + +* Attempt to remove temporary file after installation. + [#331](https://github.com/lovell/sharp/issues/331) + [@dtoubelis](https://github.com/dtoubelis) diff --git a/docs/src/content/docs/changelog/v0.13.0.md b/docs/src/content/docs/changelog/v0.13.0.md new file mode 100644 index 000000000..f1c989a6b --- /dev/null +++ b/docs/src/content/docs/changelog/v0.13.0.md @@ -0,0 +1,47 @@ +--- +title: v0.13.0 - 15th February 2016 +slug: changelog/v0.13.0 +--- + +* Improve vector image support by allowing control of density/DPI. + Switch pre-built libs from ImageMagick to GraphicsMagick. + [#110](https://github.com/lovell/sharp/issues/110) + [@bradisbell](https://github.com/bradisbell) + +* Add support for raw, uncompressed pixel Buffer/Stream input. + [#220](https://github.com/lovell/sharp/issues/220) + [@mikemorris](https://github.com/mikemorris) + +* Switch from libvips' C to C++ bindings; requires upgrade to v8.2.2.
+ [#299](https://github.com/lovell/sharp/issues/299) + +* Control number of open files in libvips' cache; breaks existing `cache` behaviour. + [#315](https://github.com/lovell/sharp/issues/315) + [@impomezia](https://github.com/impomezia) + +* Ensure 16-bit input images can be normalised and embedded onto transparent backgrounds. + [#339](https://github.com/lovell/sharp/issues/339) + [#340](https://github.com/lovell/sharp/issues/340) + [@janaz](https://github.com/janaz) + +* Ensure selected format takes precedence over any unknown output filename extension. + [#344](https://github.com/lovell/sharp/issues/344) + [@ubaltaci](https://github.com/ubaltaci) + +* Add support for libvips' PBM, PGM, PPM and FITS image format loaders. + [#347](https://github.com/lovell/sharp/issues/347) + [@oaleynik](https://github.com/oaleynik) + +* Ensure default crop gravity is center/centre. + [#351](https://github.com/lovell/sharp/pull/351) + [@joelmukuthu](https://github.com/joelmukuthu) + +* Improve support for musl libc systems e.g. Alpine Linux. + [#354](https://github.com/lovell/sharp/issues/354) + [#359](https://github.com/lovell/sharp/pull/359) + [@download13](https://github.com/download13) + [@wjordan](https://github.com/wjordan) + +* Small optimisation when reducing by an integral factor to favour shrink over affine. + +* Add support for gamma correction of images with an alpha channel. 
diff --git a/docs/src/content/docs/changelog/v0.13.1.md b/docs/src/content/docs/changelog/v0.13.1.md new file mode 100644 index 000000000..cbaf11ec5 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.13.1.md @@ -0,0 +1,8 @@ +--- +title: v0.13.1 - 27th February 2016 +slug: changelog/v0.13.1 +--- + +* Fix embedding onto transparent backgrounds; regression introduced in v0.13.0. + [#366](https://github.com/lovell/sharp/issues/366) + [@diegocsandrim](https://github.com/diegocsandrim) diff --git a/docs/src/content/docs/changelog/v0.14.0.md b/docs/src/content/docs/changelog/v0.14.0.md new file mode 100644 index 000000000..a23f2e7f0 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.14.0.md @@ -0,0 +1,39 @@ +--- +title: v0.14.0 - 2nd April 2016 +slug: changelog/v0.14.0 +--- + +* Add ability to extend (pad) the edges of an image. + [#128](https://github.com/lovell/sharp/issues/128) + [@blowsie](https://github.com/blowsie) + +* Add support for Zoomify and Google tile layouts. Breaks existing tile API. + [#223](https://github.com/lovell/sharp/issues/223) + [@bdunnette](https://github.com/bdunnette) + +* Improvements to overlayWith: differing sizes/formats, gravity, buffer input. + [#239](https://github.com/lovell/sharp/issues/239) + [@chrisriley](https://github.com/chrisriley) + +* Add entropy-based crop strategy to remove least interesting edges. + [#295](https://github.com/lovell/sharp/issues/295) + [@rightaway](https://github.com/rightaway) + +* Expose density metadata; set density of images from vector input. + [#338](https://github.com/lovell/sharp/issues/338) + [@lookfirst](https://github.com/lookfirst) + +* Emit post-processing 'info' event for Stream output. 
+ [#367](https://github.com/lovell/sharp/issues/367) + [@salzhrani](https://github.com/salzhrani) + +* Ensure output image EXIF Orientation values are within 1-8 range. + [#385](https://github.com/lovell/sharp/pull/385) + [@jtobinisaniceguy](https://github.com/jtobinisaniceguy) + +* Ensure ratios are not swapped when rotating 90/270 and ignoring aspect. + [#387](https://github.com/lovell/sharp/issues/387) + [@kleisauke](https://github.com/kleisauke) + +* Remove deprecated style of calling extract API. Breaks calls using positional arguments. + [#276](https://github.com/lovell/sharp/issues/276) diff --git a/docs/src/content/docs/changelog/v0.14.1.md b/docs/src/content/docs/changelog/v0.14.1.md new file mode 100644 index 000000000..85e88c8b1 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.14.1.md @@ -0,0 +1,26 @@ +--- +title: v0.14.1 - 16th April 2016 +slug: changelog/v0.14.1 +--- + +* Allow removal of limitation on input pixel count via limitInputPixels. Use with care. + [#250](https://github.com/lovell/sharp/issues/250) + [#316](https://github.com/lovell/sharp/pull/316) + [@anandthakker](https://github.com/anandthakker) + [@kentongray](https://github.com/kentongray) + +* Use final output image for metadata passed to callback. + [#399](https://github.com/lovell/sharp/pull/399) + [@salzhrani](https://github.com/salzhrani) + +* Add support for writing tiled images to a zip container. + [#402](https://github.com/lovell/sharp/pull/402) + [@felixbuenemann](https://github.com/felixbuenemann) + +* Allow use of embed with 1 and 2 channel images. + [#411](https://github.com/lovell/sharp/issues/411) + [@janaz](https://github.com/janaz) + +* Improve Electron compatibility by allowing node-gyp rebuilds without npm. 
+ [#412](https://github.com/lovell/sharp/issues/412) + [@nouh](https://github.com/nouh) diff --git a/docs/src/content/docs/changelog/v0.15.0.md b/docs/src/content/docs/changelog/v0.15.0.md new file mode 100644 index 000000000..ed7807f98 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.15.0.md @@ -0,0 +1,20 @@ +--- +title: v0.15.0 - 21st May 2016 +slug: changelog/v0.15.0 +--- + +* Use libvips' new Lanczos 3 kernel as default for image reduction. + Deprecate interpolateWith method, now provided as a resize option. + [#310](https://github.com/lovell/sharp/issues/310) + [@jcupitt](https://github.com/jcupitt) + +* Take advantage of libvips v8.3 features. + Add support for libvips' new GIF and SVG loaders. + Pre-built binaries now include giflib and librsvg, exclude *magick. + Use shrink-on-load for WebP input. + Break existing sharpen API to accept sigma and improve precision. + [#369](https://github.com/lovell/sharp/issues/369) + +* Remove unnecessary (un)premultiply operations when not resizing/compositing. + [#413](https://github.com/lovell/sharp/issues/413) + [@jardakotesovec](https://github.com/jardakotesovec) diff --git a/docs/src/content/docs/changelog/v0.15.1.md b/docs/src/content/docs/changelog/v0.15.1.md new file mode 100644 index 000000000..4cfdfba7d --- /dev/null +++ b/docs/src/content/docs/changelog/v0.15.1.md @@ -0,0 +1,68 @@ +--- +title: v0.15.1 - 12th July 2016 +slug: changelog/v0.15.1 +--- + +* Concat Stream-based input in single operation for ~+3% perf and less GC. + [#429](https://github.com/lovell/sharp/issues/429) + [@papandreou](https://github.com/papandreou) + +* Add alpha channel, if required, before extend operation. + [#439](https://github.com/lovell/sharp/pull/439) + [@frulo](https://github.com/frulo) + +* Allow overlay image to be repeated across entire image via tile option. 
+ [#443](https://github.com/lovell/sharp/pull/443) + [@lemnisk8](https://github.com/lemnisk8) + +* Add cutout option to overlayWith feature, applies only the alpha channel of the overlay image. + [#448](https://github.com/lovell/sharp/pull/448) + [@kleisauke](https://github.com/kleisauke) + +* Ensure scaling factors are calculated independently to prevent rounding errors. + [#452](https://github.com/lovell/sharp/issues/452) + [@puzrin](https://github.com/puzrin) + +* Add --sharp-cxx11 flag to compile with gcc's new C++11 ABI. + [#456](https://github.com/lovell/sharp/pull/456) + [@kapouer](https://github.com/kapouer) + +* Add top/left offset support to overlayWith operation. + [#473](https://github.com/lovell/sharp/pull/473) + [@rnanwani](https://github.com/rnanwani) + +* Add convolve operation for kernel-based convolution. + [#479](https://github.com/lovell/sharp/pull/479) + [@mhirsch](https://github.com/mhirsch) + +* Add greyscale option to threshold operation for colourspace conversion control. + [#480](https://github.com/lovell/sharp/pull/480) + [@mhirsch](https://github.com/mhirsch) + +* Ensure ICC profiles are licensed for distribution. + [#486](https://github.com/lovell/sharp/issues/486) + [@kapouer](https://github.com/kapouer) + +* Allow images with an alpha channel to work with LAB-colourspace based sharpen. + [#490](https://github.com/lovell/sharp/issues/490) + [@jwagner](https://github.com/jwagner) + +* Add trim operation to remove "boring" edges. + [#492](https://github.com/lovell/sharp/pull/492) + [@kleisauke](https://github.com/kleisauke) + +* Add bandbool feature for channel-wise boolean operations.
+ [#496](https://github.com/lovell/sharp/pull/496) + [@mhirsch](https://github.com/mhirsch) + +* Add extractChannel operation to extract a channel from an image. + [#497](https://github.com/lovell/sharp/pull/497) + [@mhirsch](https://github.com/mhirsch) + +* Add ability to read and write native libvips .v files. + [#500](https://github.com/lovell/sharp/pull/500) + [@mhirsch](https://github.com/mhirsch) + +* Add boolean feature for bitwise image operations. + [#501](https://github.com/lovell/sharp/pull/501) + [@mhirsch](https://github.com/mhirsch) diff --git a/docs/src/content/docs/changelog/v0.16.0.md b/docs/src/content/docs/changelog/v0.16.0.md new file mode 100644 index 000000000..b6af330fd --- /dev/null +++ b/docs/src/content/docs/changelog/v0.16.0.md @@ -0,0 +1,42 @@ +--- +title: v0.16.0 - 18th August 2016 +slug: changelog/v0.16.0 +--- + +* Add pre-compiled libvips for OS X, ARMv7 and ARMv8. + [#312](https://github.com/lovell/sharp/issues/312) + +* Ensure boolean, bandbool, extractChannel ops occur before sRGB conversion. + [#504](https://github.com/lovell/sharp/pull/504) + [@mhirsch](https://github.com/mhirsch) + +* Recalculate factors after WebP shrink-on-load to avoid round-to-zero errors. + [#508](https://github.com/lovell/sharp/issues/508) + [@asilvas](https://github.com/asilvas) + +* Prevent boolean errors during extract operation. + [#511](https://github.com/lovell/sharp/pull/511) + [@mhirsch](https://github.com/mhirsch) + +* Add joinChannel and toColourspace/toColorspace operations. + [#513](https://github.com/lovell/sharp/pull/513) + [@mhirsch](https://github.com/mhirsch) + +* Add support for raw pixel data with boolean and withOverlay operations. 
+ [#516](https://github.com/lovell/sharp/pull/516) + [@mhirsch](https://github.com/mhirsch) + +* Prevent bandbool creating a single channel sRGB image. + [#519](https://github.com/lovell/sharp/pull/519) + [@mhirsch](https://github.com/mhirsch) + +* Ensure ICC profiles are removed from PNG output unless withMetadata used. + [#521](https://github.com/lovell/sharp/issues/521) + [@ChrisPinewood](https://github.com/ChrisPinewood) + +* Add alpha channels, if missing, to overlayWith images. + [#540](https://github.com/lovell/sharp/pull/540) + [@cmtt](https://github.com/cmtt) + +* Remove deprecated interpolateWith method - use resize(w, h, { interpolator: ... }) + [#310](https://github.com/lovell/sharp/issues/310) diff --git a/docs/src/content/docs/changelog/v0.16.1.md b/docs/src/content/docs/changelog/v0.16.1.md new file mode 100644 index 000000000..c49d28981 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.16.1.md @@ -0,0 +1,21 @@ +--- +title: v0.16.1 - 13th October 2016 +slug: changelog/v0.16.1 +--- + +* C++11 ABI version is now auto-detected, remove sharp-cxx11 installation flag. + +* Add experimental 'attention' crop strategy. + [#295](https://github.com/lovell/sharp/issues/295) + +* Include .node extension for Meteor's require() implementation. + [#537](https://github.com/lovell/sharp/issues/537) + [@isjackwild](https://github.com/isjackwild) + +* Ensure convolution kernel scale is clamped to a minimum value of 1. + [#561](https://github.com/lovell/sharp/issues/561) + [@abagshaw](https://github.com/abagshaw) + +* Correct calculation of y-axis placement when overlaying image at a fixed point. 
+ [#566](https://github.com/lovell/sharp/issues/566) + [@Nateowami](https://github.com/Nateowami) diff --git a/docs/src/content/docs/changelog/v0.16.2.md b/docs/src/content/docs/changelog/v0.16.2.md new file mode 100644 index 000000000..8c3976c64 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.16.2.md @@ -0,0 +1,8 @@ +--- +title: v0.16.2 - 22nd October 2016 +slug: changelog/v0.16.2 +--- + +* Restrict readelf usage to Linux only when detecting global libvips version. + [#602](https://github.com/lovell/sharp/issues/602) + [@caoko](https://github.com/caoko) diff --git a/docs/src/content/docs/changelog/v0.17.0.md b/docs/src/content/docs/changelog/v0.17.0.md new file mode 100644 index 000000000..4ec1f1856 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.17.0.md @@ -0,0 +1,39 @@ +--- +title: v0.17.0 - 11th December 2016 +slug: changelog/v0.17.0 +--- + +* Drop support for versions of Node prior to v4. + +* Deprecate the following output format "option" functions: + quality, progressive, compressionLevel, withoutAdaptiveFiltering, + withoutChromaSubsampling, trellisQuantisation, trellisQuantization, + overshootDeringing, optimiseScans and optimizeScans. + Access to these is now via output format functions, for example `quality(n)` + is now `jpeg({quality: n})` and/or `webp({quality: n})`. + +* Autoconvert GIF and SVG input to PNG output if no other format is specified. + +* Expose libvips' "centre" resize option to mimic \*magick's +0.5px convention. + [#568](https://github.com/lovell/sharp/issues/568) + +* Ensure support for embedded base64 PNG and JPEG images within an SVG. + [#601](https://github.com/lovell/sharp/issues/601) + [@dynamite-ready](https://github.com/dynamite-ready) + +* Ensure premultiply operation occurs before box filter shrink. 
+ [#605](https://github.com/lovell/sharp/issues/605) + [@CmdrShepardsPie](https://github.com/CmdrShepardsPie) + [@teroparvinen](https://github.com/teroparvinen) + +* Add support for PNG and WebP tile-based output formats (in addition to JPEG). + [#622](https://github.com/lovell/sharp/pull/622) + [@ppaskaris](https://github.com/ppaskaris) + +* Allow use of extend with greyscale input. + [#623](https://github.com/lovell/sharp/pull/623) + [@ppaskaris](https://github.com/ppaskaris) + +* Allow non-RGB input to embed/extend onto background with an alpha channel. + [#646](https://github.com/lovell/sharp/issues/646) + [@DaGaMs](https://github.com/DaGaMs) diff --git a/docs/src/content/docs/changelog/v0.17.1.md b/docs/src/content/docs/changelog/v0.17.1.md new file mode 100644 index 000000000..28cf12167 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.17.1.md @@ -0,0 +1,15 @@ +--- +title: v0.17.1 - 15th January 2017 +slug: changelog/v0.17.1 +--- + +* Improve error messages for invalid parameters. + [@spikeon](https://github.com/spikeon) + [#644](https://github.com/lovell/sharp/pull/644) + +* Simplify expression for finding vips-cpp libdir. + [#656](https://github.com/lovell/sharp/pull/656) + +* Allow HTTPS-over-HTTP proxy when downloading pre-compiled dependencies. + [@wangzhiwei1888](https://github.com/wangzhiwei1888) + [#679](https://github.com/lovell/sharp/issues/679) diff --git a/docs/src/content/docs/changelog/v0.17.2.md b/docs/src/content/docs/changelog/v0.17.2.md new file mode 100644 index 000000000..02859307e --- /dev/null +++ b/docs/src/content/docs/changelog/v0.17.2.md @@ -0,0 +1,12 @@ +--- +title: v0.17.2 - 11th February 2017 +slug: changelog/v0.17.2 +--- + +* Ensure Readable side of Stream can start flowing after Writable side has finished. 
+ [#671](https://github.com/lovell/sharp/issues/671) + [@danhaller](https://github.com/danhaller) + +* Expose WebP alpha quality, lossless and near-lossless output options. + [#685](https://github.com/lovell/sharp/pull/685) + [@rnanwani](https://github.com/rnanwani) diff --git a/docs/src/content/docs/changelog/v0.17.3.md b/docs/src/content/docs/changelog/v0.17.3.md new file mode 100644 index 000000000..0bc48d942 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.17.3.md @@ -0,0 +1,20 @@ +--- +title: v0.17.3 - 1st April 2017 +slug: changelog/v0.17.3 +--- + +* Allow toBuffer to optionally resolve a Promise with both info and data. + [#143](https://github.com/lovell/sharp/issues/143) + [@salzhrani](https://github.com/salzhrani) + +* Create blank image of given width, height, channels and background. + [#470](https://github.com/lovell/sharp/issues/470) + [@pjarts](https://github.com/pjarts) + +* Add support for the "nearest" kernel for image reductions. + [#732](https://github.com/lovell/sharp/pull/732) + [@alice0meta](https://github.com/alice0meta) + +* Add support for TIFF compression and predictor options. + [#738](https://github.com/lovell/sharp/pull/738) + [@kristojorg](https://github.com/kristojorg) diff --git a/docs/src/content/docs/changelog/v0.18.0.md b/docs/src/content/docs/changelog/v0.18.0.md new file mode 100644 index 000000000..d8afbdd04 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.18.0.md @@ -0,0 +1,60 @@ +--- +title: v0.18.0 - 30th May 2017 +slug: changelog/v0.18.0 +--- + +* Remove the previously-deprecated output format "option" functions: + quality, progressive, compressionLevel, withoutAdaptiveFiltering, + withoutChromaSubsampling, trellisQuantisation, trellisQuantization, + overshootDeringing, optimiseScans and optimizeScans. 
+ +* Ensure maximum output dimensions are based on the format to be used. + [#176](https://github.com/lovell/sharp/issues/176) + [@stephanebachelier](https://github.com/stephanebachelier) + +* Avoid costly (un)premultiply when using overlayWith without alpha channel. + [#573](https://github.com/lovell/sharp/issues/573) + [@strarsis](https://github.com/strarsis) + +* Include pixel depth (e.g. "uchar") when reading metadata. + [#577](https://github.com/lovell/sharp/issues/577) + [@moedusa](https://github.com/moedusa) + +* Add support for Buffer and Stream-based TIFF output. + [#587](https://github.com/lovell/sharp/issues/587) + [@strarsis](https://github.com/strarsis) + +* Expose warnings from libvips via NODE_DEBUG=sharp environment variable. + [#607](https://github.com/lovell/sharp/issues/607) + [@puzrin](https://github.com/puzrin) + +* Switch to the libvips implementation of "attention" and "entropy" crop strategies. + [#727](https://github.com/lovell/sharp/issues/727) + +* Improve performance and accuracy of nearest neighbour integral upsampling. + [#752](https://github.com/lovell/sharp/issues/752) + [@MrIbby](https://github.com/MrIbby) + +* Constructor single argument API: allow plain object, reject null/undefined. + [#768](https://github.com/lovell/sharp/issues/768) + [@kub1x](https://github.com/kub1x) + +* Ensure ARM64 pre-built binaries use correct C++11 ABI version. + [#772](https://github.com/lovell/sharp/issues/772) + [@ajiratech2](https://github.com/ajiratech2) + +* Prevent aliasing by using dynamic values for shrink(-on-load). + [#781](https://github.com/lovell/sharp/issues/781) + [@kleisauke](https://github.com/kleisauke) + +* Expose libvips' "squash" parameter to enable 1-bit TIFF output. 
+ [#783](https://github.com/lovell/sharp/pull/783) + [@YvesBos](https://github.com/YvesBos) + +* Add support for rotation using any multiple of +/-90 degrees. + [#791](https://github.com/lovell/sharp/pull/791) + [@ncoden](https://github.com/ncoden) + +* Add "jpg" alias to toFormat as shortened form of "jpeg". + [#814](https://github.com/lovell/sharp/pull/814) + [@jingsam](https://github.com/jingsam) diff --git a/docs/src/content/docs/changelog/v0.18.1.md b/docs/src/content/docs/changelog/v0.18.1.md new file mode 100644 index 000000000..3121a9cf3 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.18.1.md @@ -0,0 +1,8 @@ +--- +title: v0.18.1 - 30th May 2017 +slug: changelog/v0.18.1 +--- + +* Remove regression from #781 that could cause incorrect shrink calculation. + [#831](https://github.com/lovell/sharp/issues/831) + [@suprMax](https://github.com/suprMax) diff --git a/docs/src/content/docs/changelog/v0.18.2.md b/docs/src/content/docs/changelog/v0.18.2.md new file mode 100644 index 000000000..2eabf8374 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.18.2.md @@ -0,0 +1,19 @@ +--- +title: v0.18.2 - 1st July 2017 +slug: changelog/v0.18.2 +--- + +* Expose libvips' xres and yres properties for TIFF output. + [#828](https://github.com/lovell/sharp/pull/828) + [@YvesBos](https://github.com/YvesBos) + +* Ensure flip and flop operations work with auto-rotate. + [#837](https://github.com/lovell/sharp/issues/837) + [@rexxars](https://github.com/rexxars) + +* Allow binary download URL override via SHARP_DIST_BASE_URL env variable. + [#841](https://github.com/lovell/sharp/issues/841) + +* Add support for Solus Linux. 
+ [#857](https://github.com/lovell/sharp/pull/857) + [@ekremkaraca](https://github.com/ekremkaraca) diff --git a/docs/src/content/docs/changelog/v0.18.3.md b/docs/src/content/docs/changelog/v0.18.3.md new file mode 100644 index 000000000..ea66e8112 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.18.3.md @@ -0,0 +1,12 @@ +--- +title: v0.18.3 - 13th September 2017 +slug: changelog/v0.18.3 +--- + +* Skip shrink-on-load when trimming. + [#888](https://github.com/lovell/sharp/pull/888) + [@kleisauke](https://github.com/kleisauke) + +* Migrate from got to simple-get for basic auth support. + [#945](https://github.com/lovell/sharp/pull/945) + [@pbomb](https://github.com/pbomb) diff --git a/docs/src/content/docs/changelog/v0.18.4.md b/docs/src/content/docs/changelog/v0.18.4.md new file mode 100644 index 000000000..002e06c26 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.18.4.md @@ -0,0 +1,8 @@ +--- +title: v0.18.4 - 18th September 2017 +slug: changelog/v0.18.4 +--- + +* Ensure input Buffer really is marked as Persistent, prevents mark-sweep GC. + [#950](https://github.com/lovell/sharp/issues/950) + [@lfdoherty](https://github.com/lfdoherty) diff --git a/docs/src/content/docs/changelog/v0.19.0.md b/docs/src/content/docs/changelog/v0.19.0.md new file mode 100644 index 000000000..d9ecfefcb --- /dev/null +++ b/docs/src/content/docs/changelog/v0.19.0.md @@ -0,0 +1,46 @@ +--- +title: v0.19.0 - 11th January 2018 +slug: changelog/v0.19.0 +--- + +* Expose offset coordinates of strategy-based crop. 
+ [#868](https://github.com/lovell/sharp/issues/868) + [@mirohristov-com](https://github.com/mirohristov-com) + +* PNG output now defaults to adaptiveFiltering=false, compressionLevel=9 + [#872](https://github.com/lovell/sharp/issues/872) + [@wmertens](https://github.com/wmertens) + +* Add stats feature for pixel-derived image statistics. + [#915](https://github.com/lovell/sharp/pull/915) + [@rnanwani](https://github.com/rnanwani) + +* Add failOnError option to fail-fast on bad input image data. + [#976](https://github.com/lovell/sharp/pull/976) + [@mceachen](https://github.com/mceachen) + +* Resize: switch to libvips' implementation, make fastShrinkOnLoad optional, remove interpolator and centreSampling options. + [#977](https://github.com/lovell/sharp/pull/977) + [@jardakotesovec](https://github.com/jardakotesovec) + +* Attach finish event listener to a clone only for Stream-based input. + [#995](https://github.com/lovell/sharp/issues/995) + [@whmountains](https://github.com/whmountains) + +* Add tilecache before smartcrop to avoid over-computation of previous operations. + [#1028](https://github.com/lovell/sharp/issues/1028) + [@coffeebite](https://github.com/coffeebite) + +* Prevent toFile extension taking precedence over requested format. + [#1037](https://github.com/lovell/sharp/issues/1037) + [@tomgallagher](https://github.com/tomgallagher) + +* Add support for gravity option to existing embed feature. + [#1038](https://github.com/lovell/sharp/pull/1038) + [@AzureByte](https://github.com/AzureByte) + +* Expose IPTC and XMP metadata when available. 
+ [#1079](https://github.com/lovell/sharp/pull/1079)
+ [@oaleynik](https://github.com/oaleynik)
+
+* TIFF output: switch default predictor from 'none' to 'horizontal' to match libvips' behaviour.
diff --git a/docs/src/content/docs/changelog/v0.19.1.md b/docs/src/content/docs/changelog/v0.19.1.md
new file mode 100644
index 000000000..deeea20d1
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.19.1.md
@@ -0,0 +1,16 @@
+---
+title: v0.19.1 - 24th February 2018
+slug: changelog/v0.19.1
+---
+
+* Expose libvips' linear transform feature.
+ [#1024](https://github.com/lovell/sharp/pull/1024)
+ [@3epnm](https://github.com/3epnm)
+
+* Expose angle option for tile-based output.
+ [#1121](https://github.com/lovell/sharp/pull/1121)
+ [@BiancoA](https://github.com/BiancoA)
+
+* Prevent crop operation when image already at or below target dimensions.
+ [#1134](https://github.com/lovell/sharp/issues/1134)
+ [@pieh](https://github.com/pieh)
diff --git a/docs/src/content/docs/changelog/v0.20.0.md b/docs/src/content/docs/changelog/v0.20.0.md
new file mode 100644
index 000000000..1f0bf56a2
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.0.md
@@ -0,0 +1,7 @@
+---
+title: v0.20.0 - 5th March 2018
+slug: changelog/v0.20.0
+---
+
+* Add support for prebuilt sharp binaries on common platforms.
+ [#186](https://github.com/lovell/sharp/issues/186)
diff --git a/docs/src/content/docs/changelog/v0.20.1.md b/docs/src/content/docs/changelog/v0.20.1.md
new file mode 100644
index 000000000..f4af4314c
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.1.md
@@ -0,0 +1,15 @@
+---
+title: v0.20.1 - 17th March 2018
+slug: changelog/v0.20.1
+---
+
+* Improve installation experience when a globally-installed libvips below the minimum required version is found.
+ [#1148](https://github.com/lovell/sharp/issues/1148)
+
+* Prevent smartcrop error when cumulative rounding is below target size.
+ [#1154](https://github.com/lovell/sharp/issues/1154)
+ [@ralrom](https://github.com/ralrom)
+
+* Expose libvips' median filter operation.
+ [#1161](https://github.com/lovell/sharp/pull/1161)
+ [@BiancoA](https://github.com/BiancoA)
diff --git a/docs/src/content/docs/changelog/v0.20.2.md b/docs/src/content/docs/changelog/v0.20.2.md
new file mode 100644
index 000000000..202d5c339
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.2.md
@@ -0,0 +1,20 @@
+---
+title: v0.20.2 - 28th April 2018
+slug: changelog/v0.20.2
+---
+
+* Add tint operation to set image chroma.
+ [#825](https://github.com/lovell/sharp/pull/825)
+ [@rikh42](https://github.com/rikh42)
+
+* Add environment variable to ignore globally-installed libvips.
+ [#1165](https://github.com/lovell/sharp/pull/1165)
+ [@oncletom](https://github.com/oncletom)
+
+* Add support for page selection with multi-page input (GIF/TIFF).
+ [#1204](https://github.com/lovell/sharp/pull/1204)
+ [@woolite64](https://github.com/woolite64)
+
+* Add support for Group4 (CCITTFAX4) compression with TIFF output.
+ [#1208](https://github.com/lovell/sharp/pull/1208)
+ [@woolite64](https://github.com/woolite64)
diff --git a/docs/src/content/docs/changelog/v0.20.3.md b/docs/src/content/docs/changelog/v0.20.3.md
new file mode 100644
index 000000000..b00c64910
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.3.md
@@ -0,0 +1,8 @@
+---
+title: v0.20.3 - 29th May 2018
+slug: changelog/v0.20.3
+---
+
+* Fix tint operation by ensuring LAB interpretation and allowing negative values.
+ [#1235](https://github.com/lovell/sharp/issues/1235)
+ [@wezside](https://github.com/wezside)
diff --git a/docs/src/content/docs/changelog/v0.20.4.md b/docs/src/content/docs/changelog/v0.20.4.md
new file mode 100644
index 000000000..a9580dea3
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.4.md
@@ -0,0 +1,12 @@
+---
+title: v0.20.4 - 20th June 2018
+slug: changelog/v0.20.4
+---
+
+* Prevent possible rounding error when using shrink-on-load and 90/270 degree rotation.
+ [#1241](https://github.com/lovell/sharp/issues/1241)
+ [@anahit42](https://github.com/anahit42)
+
+* Ensure extractChannel sets correct single-channel colour space interpretation.
+ [#1257](https://github.com/lovell/sharp/issues/1257)
+ [@jeremychone](https://github.com/jeremychone)
diff --git a/docs/src/content/docs/changelog/v0.20.5.md b/docs/src/content/docs/changelog/v0.20.5.md
new file mode 100644
index 000000000..07f166149
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.5.md
@@ -0,0 +1,8 @@
+---
+title: v0.20.5 - 27th June 2018
+slug: changelog/v0.20.5
+---
+
+* Expose libjpeg optimize_coding flag.
+ [#1265](https://github.com/lovell/sharp/pull/1265)
+ [@tomlokhorst](https://github.com/tomlokhorst)
diff --git a/docs/src/content/docs/changelog/v0.20.6.md b/docs/src/content/docs/changelog/v0.20.6.md
new file mode 100644
index 000000000..284a67a27
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.6.md
@@ -0,0 +1,33 @@
+---
+title: v0.20.6 - 20th August 2018
+slug: changelog/v0.20.6
+---
+
+* Add removeAlpha operation to remove alpha channel, if any.
+ [#1248](https://github.com/lovell/sharp/issues/1248)
+
+* Expose mozjpeg quant_table flag.
+ [#1285](https://github.com/lovell/sharp/pull/1285)
+ [@rexxars](https://github.com/rexxars)
+
+* Allow full WebP alphaQuality range of 0-100.
+ [#1290](https://github.com/lovell/sharp/pull/1290)
+ [@sylvaindumont](https://github.com/sylvaindumont)
+
+* Cache libvips binaries to reduce re-install time.
+ [#1301](https://github.com/lovell/sharp/issues/1301)
+
+* Ensure vendor platform mismatch throws error at install time.
+ [#1303](https://github.com/lovell/sharp/issues/1303)
+
+* Improve install time error messages for FreeBSD users.
+ [#1310](https://github.com/lovell/sharp/issues/1310)
+
+* Ensure extractChannel works with 16-bit images.
+ [#1330](https://github.com/lovell/sharp/issues/1330)
+
+* Expose depth option for tile-based output.
+ [#1342](https://github.com/lovell/sharp/pull/1342)
+ [@alundavies](https://github.com/alundavies)
+
+* Add experimental entropy field to stats response.
diff --git a/docs/src/content/docs/changelog/v0.20.7.md b/docs/src/content/docs/changelog/v0.20.7.md
new file mode 100644
index 000000000..e9503a29d
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.7.md
@@ -0,0 +1,7 @@
+---
+title: v0.20.7 - 21st August 2018
+slug: changelog/v0.20.7
+---
+
+* Use copy+unlink if rename operation fails during installation.
+ [#1345](https://github.com/lovell/sharp/issues/1345)
diff --git a/docs/src/content/docs/changelog/v0.20.8.md b/docs/src/content/docs/changelog/v0.20.8.md
new file mode 100644
index 000000000..8ab9a062a
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.20.8.md
@@ -0,0 +1,12 @@
+---
+title: v0.20.8 - 5th September 2018
+slug: changelog/v0.20.8
+---
+
+* Avoid race conditions when creating directories during installation.
+ [#1358](https://github.com/lovell/sharp/pull/1358)
+ [@ajhool](https://github.com/ajhool)
+
+* Accept floating point values for input density parameter.
+ [#1362](https://github.com/lovell/sharp/pull/1362)
+ [@aeirola](https://github.com/aeirola)
diff --git a/docs/src/content/docs/changelog/v0.21.0.md b/docs/src/content/docs/changelog/v0.21.0.md
new file mode 100644
index 000000000..762cc65a4
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.21.0.md
@@ -0,0 +1,39 @@
+---
+title: v0.21.0 - 4th October 2018
+slug: changelog/v0.21.0
+---
+
+* Deprecate the following resize-related functions:
+ `crop`, `embed`, `ignoreAspectRatio`, `max`, `min` and `withoutEnlargement`.
+ Access to these is now via options passed to the `resize` function.
+ For example:
+ `embed('north')` is now `resize(width, height, { fit: 'contain', position: 'north' })`,
+ `crop('attention')` is now `resize(width, height, { fit: 'cover', position: 'attention' })`,
+ `max().withoutEnlargement()` is now `resize(width, height, { fit: 'inside', withoutEnlargement: true })`.
+ [#1135](https://github.com/lovell/sharp/issues/1135)
+
+* Deprecate the `background` function.
+ Per-operation `background` options added to `resize`, `extend` and `flatten` operations.
+ [#1392](https://github.com/lovell/sharp/issues/1392)
+
+* Add `size` to `metadata` response (Stream and Buffer input only).
+ [#695](https://github.com/lovell/sharp/issues/695)
+
+* Switch from custom trim operation to `vips_find_trim`.
+ [#914](https://github.com/lovell/sharp/issues/914)
+
+* Add `chromaSubsampling` and `isProgressive` properties to `metadata` response.
+ [#1186](https://github.com/lovell/sharp/issues/1186)
+
+* Drop Node 4 support.
+ [#1212](https://github.com/lovell/sharp/issues/1212)
+
+* Enable SIMD convolution by default.
+ [#1213](https://github.com/lovell/sharp/issues/1213)
+
+* Add experimental prebuilt binaries for musl-based Linux.
+ [#1379](https://github.com/lovell/sharp/issues/1379)
+
+* Add support for arbitrary rotation angle via vips_rotate.
+ [#1385](https://github.com/lovell/sharp/pull/1385)
+ [@freezy](https://github.com/freezy)
diff --git a/docs/src/content/docs/changelog/v0.21.1.md b/docs/src/content/docs/changelog/v0.21.1.md
new file mode 100644
index 000000000..bb685ef2b
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.21.1.md
@@ -0,0 +1,31 @@
+---
+title: v0.21.1 - 7th December 2018
+slug: changelog/v0.21.1
+---
+
+* Install: support `sharp_dist_base_url` npm config, like existing `SHARP_DIST_BASE_URL`.
+ [#1422](https://github.com/lovell/sharp/pull/1422)
+ [@SethWen](https://github.com/SethWen)
+
+* Ensure `channel` metadata is correct for raw, greyscale output.
+ [#1425](https://github.com/lovell/sharp/issues/1425)
+
+* Add support for the "mitchell" kernel for image reductions.
+ [#1438](https://github.com/lovell/sharp/pull/1438)
+ [@Daiz](https://github.com/Daiz)
+
+* Allow separate parameters for gamma encoding and decoding.
+ [#1439](https://github.com/lovell/sharp/pull/1439)
+ [@Daiz](https://github.com/Daiz)
+
+* Build prototype with `Object.assign` to allow minification.
+ [#1475](https://github.com/lovell/sharp/pull/1475)
+ [@jaubourg](https://github.com/jaubourg)
+
+* Expose libvips' recombination matrix operation.
+ [#1477](https://github.com/lovell/sharp/pull/1477)
+ [@fromkeith](https://github.com/fromkeith)
+
+* Expose libvips' pyramid/tile options for TIFF output.
+ [#1483](https://github.com/lovell/sharp/pull/1483)
+ [@mbklein](https://github.com/mbklein)
diff --git a/docs/src/content/docs/changelog/v0.21.2.md b/docs/src/content/docs/changelog/v0.21.2.md
new file mode 100644
index 000000000..be7f6758d
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.21.2.md
@@ -0,0 +1,27 @@
+---
+title: v0.21.2 - 13th January 2019
+slug: changelog/v0.21.2
+---
+
+* Ensure all metadata is removed from PNG output unless `withMetadata` used.
+
+* Ensure shortest edge is at least one pixel after resizing.
+ [#1003](https://github.com/lovell/sharp/issues/1003)
+
+* Add `ensureAlpha` operation to add an alpha channel, if missing.
+ [#1153](https://github.com/lovell/sharp/issues/1153)
+
+* Expose `pages` and `pageHeight` metadata for multi-page input images.
+ [#1205](https://github.com/lovell/sharp/issues/1205)
+
+* Expose PNG output options requiring libimagequant.
+ [#1484](https://github.com/lovell/sharp/issues/1484)
+
+* Expose underlying error message for invalid input.
+ [#1505](https://github.com/lovell/sharp/issues/1505)
+
+* Prevent mutation of options passed to `jpeg`.
+ [#1516](https://github.com/lovell/sharp/issues/1516)
+
+* Ensure forced output format applied correctly when output chaining.
+ [#1528](https://github.com/lovell/sharp/issues/1528)
diff --git a/docs/src/content/docs/changelog/v0.21.3.md b/docs/src/content/docs/changelog/v0.21.3.md
new file mode 100644
index 000000000..1cde28c63
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.21.3.md
@@ -0,0 +1,9 @@
+---
+title: v0.21.3 - 19th January 2019
+slug: changelog/v0.21.3
+---
+
+* Input image decoding now fails fast, set `failOnError` to change this behaviour.
+
+* Failed filesystem-based input now separates missing file and invalid format errors.
+ [#1542](https://github.com/lovell/sharp/issues/1542)
diff --git a/docs/src/content/docs/changelog/v0.22.0.md b/docs/src/content/docs/changelog/v0.22.0.md
new file mode 100644
index 000000000..797c743bc
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.22.0.md
@@ -0,0 +1,20 @@
+---
+title: v0.22.0 - 18th March 2019
+slug: changelog/v0.22.0
+---
+
+* Remove functions previously deprecated in v0.21.0:
+ `background`, `crop`, `embed`, `ignoreAspectRatio`, `max`, `min` and `withoutEnlargement`.
+
+* Add `composite` operation supporting multiple images and blend modes; deprecate `overlayWith`.
+ [#728](https://github.com/lovell/sharp/issues/728)
+
+* Add support for `pages` input option for multi-page input.
+ [#1566](https://github.com/lovell/sharp/issues/1566)
+
+* Allow Stream-based input of raw pixel data.
+ [#1579](https://github.com/lovell/sharp/issues/1579)
+
+* Add support for `page` input option to GIF and PDF.
+ [#1595](https://github.com/lovell/sharp/pull/1595)
+ [@ramiel](https://github.com/ramiel)
diff --git a/docs/src/content/docs/changelog/v0.22.1.md b/docs/src/content/docs/changelog/v0.22.1.md
new file mode 100644
index 000000000..cf8e3d464
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.22.1.md
@@ -0,0 +1,15 @@
+---
+title: v0.22.1 - 25th April 2019
+slug: changelog/v0.22.1
+---
+
+* Add `modulate` operation for brightness, saturation and hue.
+ [#1601](https://github.com/lovell/sharp/pull/1601)
+ [@Goues](https://github.com/Goues)
+
+* Improve help messaging should `require("sharp")` fail.
+ [#1638](https://github.com/lovell/sharp/pull/1638)
+ [@sidharthachatterjee](https://github.com/sidharthachatterjee)
+
+* Add support for Node 12.
+ [#1668](https://github.com/lovell/sharp/issues/1668)
diff --git a/docs/src/content/docs/changelog/v0.23.0.md b/docs/src/content/docs/changelog/v0.23.0.md
new file mode 100644
index 000000000..24b18827c
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.23.0.md
@@ -0,0 +1,32 @@
+---
+title: v0.23.0 - 29th July 2019
+slug: changelog/v0.23.0
+---
+
+* Remove `overlayWith` previously deprecated in v0.22.0.
+
+* Add experimental support for HEIF images. Requires libvips compiled with libheif.
+ [#1105](https://github.com/lovell/sharp/issues/1105)
+
+* Expose libwebp `smartSubsample` and `reductionEffort` options.
+ [#1545](https://github.com/lovell/sharp/issues/1545)
+
+* Add experimental support for Worker Threads.
+ [#1558](https://github.com/lovell/sharp/issues/1558)
+
+* Use libvips' built-in CMYK and sRGB profiles when required.
+ [#1619](https://github.com/lovell/sharp/issues/1619)
+
+* Drop support for Node.js versions 6 and 11.
+ [#1674](https://github.com/lovell/sharp/issues/1674)
+
+* Expose `skipBlanks` option for tile-based output.
+ [#1687](https://github.com/lovell/sharp/pull/1687)
+ [@RaboliotTheGrey](https://github.com/RaboliotTheGrey)
+
+* Allow use of `failOnError` option with Stream-based input.
+ [#1691](https://github.com/lovell/sharp/issues/1691)
+
+* Fix rotate/extract ordering for non-90 angles.
+ [#1755](https://github.com/lovell/sharp/pull/1755)
+ [@iovdin](https://github.com/iovdin)
diff --git a/docs/src/content/docs/changelog/v0.23.1.md b/docs/src/content/docs/changelog/v0.23.1.md
new file mode 100644
index 000000000..c180d9952
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.23.1.md
@@ -0,0 +1,24 @@
+---
+title: v0.23.1 - 26th September 2019
+slug: changelog/v0.23.1
+---
+
+* Ensure `sharp.format.vips` is present and correct (filesystem only).
+ [#1813](https://github.com/lovell/sharp/issues/1813)
+
+* Ensure invalid `width` and `height` provided as options to `resize` throw.
+ [#1817](https://github.com/lovell/sharp/issues/1817)
+
+* Allow use of 'heic' and 'heif' identifiers with `toFormat`.
+ [#1834](https://github.com/lovell/sharp/pull/1834)
+ [@jaubourg](https://github.com/jaubourg)
+
+* Add `premultiplied` option to `composite` operation.
+ [#1835](https://github.com/lovell/sharp/pull/1835)
+ [@Andargor](https://github.com/Andargor)
+
+* Allow instance reuse with differing `toBuffer` options.
+ [#1860](https://github.com/lovell/sharp/pull/1860)
+ [@RaboliotTheGrey](https://github.com/RaboliotTheGrey)
+
+* Ensure image is at least 3x3 pixels before attempting trim operation.
diff --git a/docs/src/content/docs/changelog/v0.23.2.md b/docs/src/content/docs/changelog/v0.23.2.md
new file mode 100644
index 000000000..1eb59491d
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.23.2.md
@@ -0,0 +1,12 @@
+---
+title: v0.23.2 - 28th October 2019
+slug: changelog/v0.23.2
+---
+
+* Add `background` option to tile output operation.
+ [#1924](https://github.com/lovell/sharp/pull/1924)
+ [@neave](https://github.com/neave)
+
+* Add support for Node.js 13.
+ [#1932](https://github.com/lovell/sharp/pull/1932)
+ [@MayhemYDG](https://github.com/MayhemYDG)
diff --git a/docs/src/content/docs/changelog/v0.23.3.md b/docs/src/content/docs/changelog/v0.23.3.md
new file mode 100644
index 000000000..4c19795a5
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.23.3.md
@@ -0,0 +1,18 @@
+---
+title: v0.23.3 - 17th November 2019
+slug: changelog/v0.23.3
+---
+
+* Ensure `trim` operation supports images contained in the alpha channel.
+ [#1597](https://github.com/lovell/sharp/issues/1597)
+
+* Ensure tile `overlap` option works as expected.
+ [#1921](https://github.com/lovell/sharp/pull/1921)
+ [@rustyguts](https://github.com/rustyguts)
+
+* Allow compilation on FreeBSD and variants (broken since v0.23.0).
+ [#1952](https://github.com/lovell/sharp/pull/1952)
+ [@pouya-eghbali](https://github.com/pouya-eghbali)
+
+* Ensure `modulate` and other colour-based operations can co-exist.
+ [#1958](https://github.com/lovell/sharp/issues/1958)
diff --git a/docs/src/content/docs/changelog/v0.23.4.md b/docs/src/content/docs/changelog/v0.23.4.md
new file mode 100644
index 000000000..3d8bfed73
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.23.4.md
@@ -0,0 +1,12 @@
+---
+title: v0.23.4 - 5th December 2019
+slug: changelog/v0.23.4
+---
+
+* Handle zero-length Buffer objects when using Node.js v13.2.0+.
+
+* Expose raw TIFFTAG_PHOTOSHOP metadata.
+ [#1600](https://github.com/lovell/sharp/issues/1600)
+
+* Improve thread safety by using copy-on-write when updating metadata.
+ [#1986](https://github.com/lovell/sharp/issues/1986)
diff --git a/docs/src/content/docs/changelog/v0.24.0.md b/docs/src/content/docs/changelog/v0.24.0.md
new file mode 100644
index 000000000..378dad771
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.24.0.md
@@ -0,0 +1,28 @@
+---
+title: v0.24.0 - 16th January 2020
+slug: changelog/v0.24.0
+---
+
+* Drop support for Node.js 8.
+ [#1910](https://github.com/lovell/sharp/issues/1910)
+
+* Drop support for undefined input where options also provided.
+ [#1768](https://github.com/lovell/sharp/issues/1768)
+
+* Move `limitInputPixels` and `sequentialRead` to input options, deprecating functions of the same name.
+
+* Expose `delay` and `loop` metadata for animated images.
+ [#1905](https://github.com/lovell/sharp/issues/1905)
+
+* Ensure correct colour output for 16-bit, 2-channel PNG input with ICC profile.
+ [#2013](https://github.com/lovell/sharp/issues/2013)
+
+* Prevent use of sequentialRead for rotate operations.
+ [#2016](https://github.com/lovell/sharp/issues/2016)
+
+* Correctly bind max width and height values when using withoutEnlargement.
+ [#2024](https://github.com/lovell/sharp/pull/2024)
+ [@BrychanOdlum](https://github.com/BrychanOdlum)
+
+* Add support for input with 16-bit RGB profile.
+ [#2037](https://github.com/lovell/sharp/issues/2037)
diff --git a/docs/src/content/docs/changelog/v0.24.1.md b/docs/src/content/docs/changelog/v0.24.1.md
new file mode 100644
index 000000000..2155c2005
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.24.1.md
@@ -0,0 +1,10 @@
+---
+title: v0.24.1 - 15th February 2020
+slug: changelog/v0.24.1
+---
+
+* Prevent use of sequentialRead for EXIF-based rotate operation.
+ [#2042](https://github.com/lovell/sharp/issues/2042)
+
+* Ensure RGBA LZW TIFF returns correct channel count.
+ [#2064](https://github.com/lovell/sharp/issues/2064)
diff --git a/docs/src/content/docs/changelog/v0.25.0.md b/docs/src/content/docs/changelog/v0.25.0.md
new file mode 100644
index 000000000..b42978a98
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.25.0.md
@@ -0,0 +1,18 @@
+---
+title: v0.25.0 - 7th March 2020
+slug: changelog/v0.25.0
+---
+
+* Remove `limitInputPixels` and `sequentialRead` previously deprecated in v0.24.0.
+
+* Migrate internals to N-API.
+ [#1282](https://github.com/lovell/sharp/issues/1282)
+
+* Add support for 32-bit Windows.
+ [#2088](https://github.com/lovell/sharp/issues/2088)
+
+* Ensure correct ordering of rotate-then-trim operations.
+ [#2087](https://github.com/lovell/sharp/issues/2087)
+
+* Ensure composite accepts `limitInputPixels` and `sequentialRead` input options.
+ [#2099](https://github.com/lovell/sharp/issues/2099)
diff --git a/docs/src/content/docs/changelog/v0.25.1.md b/docs/src/content/docs/changelog/v0.25.1.md
new file mode 100644
index 000000000..af0fe39d4
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.25.1.md
@@ -0,0 +1,7 @@
+---
+title: v0.25.1 - 7th March 2020
+slug: changelog/v0.25.1
+---
+
+* Ensure prebuilt binaries are fetched based on N-API version.
+ [#2117](https://github.com/lovell/sharp/issues/2117)
diff --git a/docs/src/content/docs/changelog/v0.25.2.md b/docs/src/content/docs/changelog/v0.25.2.md
new file mode 100644
index 000000000..298db561a
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.25.2.md
@@ -0,0 +1,19 @@
+---
+title: v0.25.2 - 20th March 2020
+slug: changelog/v0.25.2
+---
+
+* Provide prebuilt binaries for Linux ARM64v8.
+
+* Add IIIF layout support to tile-based output.
+ [#2098](https://github.com/lovell/sharp/pull/2098)
+ [@edsilv](https://github.com/edsilv)
+
+* Ensure input options are consistently and correctly detected.
+ [#2118](https://github.com/lovell/sharp/issues/2118)
+
+* Ensure N-API prebuilt binaries work on RHEL7 and its derivatives.
+ [#2119](https://github.com/lovell/sharp/issues/2119)
+
+* Ensure AsyncWorker options are persisted.
+ [#2130](https://github.com/lovell/sharp/issues/2130)
diff --git a/docs/src/content/docs/changelog/v0.25.3.md b/docs/src/content/docs/changelog/v0.25.3.md
new file mode 100644
index 000000000..0a1a76038
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.25.3.md
@@ -0,0 +1,14 @@
+---
+title: v0.25.3 - 17th May 2020
+slug: changelog/v0.25.3
+---
+
+* Ensure libvips is initialised only once, improves worker thread safety.
+ [#2143](https://github.com/lovell/sharp/issues/2143)
+
+* Ensure npm platform flag is respected when copying DLLs.
+ [#2188](https://github.com/lovell/sharp/pull/2188)
+ [@dimadeveatii](https://github.com/dimadeveatii)
+
+* Allow SVG input with large inline images to be parsed.
+ [#2195](https://github.com/lovell/sharp/issues/2195)
diff --git a/docs/src/content/docs/changelog/v0.25.4.md b/docs/src/content/docs/changelog/v0.25.4.md
new file mode 100644
index 000000000..6bd4deb59
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.25.4.md
@@ -0,0 +1,25 @@
+---
+title: v0.25.4 - 12th June 2020
+slug: changelog/v0.25.4
+---
+
+* Allow libvips binary location override where version is appended.
+ [#2217](https://github.com/lovell/sharp/pull/2217)
+ [@malice00](https://github.com/malice00)
+
+* Enable PNG palette when setting quality, colours, colors or dither.
+ [#2226](https://github.com/lovell/sharp/pull/2226)
+ [@romaleev](https://github.com/romaleev)
+
+* Add `level` constructor option to use a specific level of a multi-level image.
+ Expose `levels` metadata for multi-level images.
+ [#2222](https://github.com/lovell/sharp/issues/2222)
+
+* Add support for named `alpha` channel to `extractChannel` operation.
+ [#2138](https://github.com/lovell/sharp/issues/2138)
+
+* Add experimental `sharpness` calculation to `stats()` response.
+ [#2251](https://github.com/lovell/sharp/issues/2251)
+
+* Emit `warning` event for non-critical processing problems.
+ [#2032](https://github.com/lovell/sharp/issues/2032)
diff --git a/docs/src/content/docs/changelog/v0.26.0.md b/docs/src/content/docs/changelog/v0.26.0.md
new file mode 100644
index 000000000..6fedd7e35
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.26.0.md
@@ -0,0 +1,33 @@
+---
+title: v0.26.0 - 25th August 2020
+slug: changelog/v0.26.0
+---
+
+* Prebuilt libvips binaries are now statically-linked and Brotli-compressed, requiring Node.js 10.16.0+.
+
+* TIFF output `squash` is replaced by `bitdepth` to reduce to 1, 2 or 4 bit.
+
+* JPEG output `quality` >= 90 no longer automatically sets `chromaSubsampling` to `4:4:4`.
+
+* Add most `dominant` colour to image `stats`.
+ [#640](https://github.com/lovell/sharp/issues/640)
+
+* Add support for animated GIF (requires \*magick) and WebP output.
+ [#2012](https://github.com/lovell/sharp/pull/2012)
+ [@deftomat](https://github.com/deftomat)
+
+* Add support for libvips ImageMagick v7 loaders.
+ [#2258](https://github.com/lovell/sharp/pull/2258)
+ [@vouillon](https://github.com/vouillon)
+
+* Allow multi-page input via \*magick.
+ [#2259](https://github.com/lovell/sharp/pull/2259)
+ [@vouillon](https://github.com/vouillon)
+
+* Add support to `withMetadata` for custom ICC profile.
+ [#2271](https://github.com/lovell/sharp/pull/2271)
+ [@roborourke](https://github.com/roborourke)
+
+* Ensure prebuilt binaries for ARM default to v7 when using Electron.
+ [#2292](https://github.com/lovell/sharp/pull/2292)
+ [@diegodev3](https://github.com/diegodev3)
diff --git a/docs/src/content/docs/changelog/v0.26.1.md b/docs/src/content/docs/changelog/v0.26.1.md
new file mode 100644
index 000000000..080b05da3
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.26.1.md
@@ -0,0 +1,22 @@
+---
+title: v0.26.1 - 20th September 2020
+slug: changelog/v0.26.1
+---
+
+* Ensure correct pageHeight when verifying multi-page image dimensions.
+ [#2343](https://github.com/lovell/sharp/pull/2343)
+ [@derom](https://github.com/derom)
+
+* Allow input density range up to 100000 DPI.
+ [#2348](https://github.com/lovell/sharp/pull/2348)
+ [@stefanprobst](https://github.com/stefanprobst)
+
+* Ensure animation-related properties can be set for Stream-based input.
+ [#2369](https://github.com/lovell/sharp/pull/2369)
+ [@AcrylicShrimp](https://github.com/AcrylicShrimp)
+
+* Ensure `stats` can be calculated for 1x1 input.
+ [#2372](https://github.com/lovell/sharp/issues/2372)
+
+* Ensure animated GIF output is optimised.
+ [#2376](https://github.com/lovell/sharp/issues/2376)
diff --git a/docs/src/content/docs/changelog/v0.26.2.md b/docs/src/content/docs/changelog/v0.26.2.md
new file mode 100644
index 000000000..2031300a2
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.26.2.md
@@ -0,0 +1,15 @@
+---
+title: v0.26.2 - 14th October 2020
+slug: changelog/v0.26.2
+---
+
+* Add support for EXR input. Requires libvips compiled with OpenEXR.
+ [#698](https://github.com/lovell/sharp/issues/698)
+
+* Ensure support for yarn v2.
+ [#2379](https://github.com/lovell/sharp/pull/2379)
+ [@jalovatt](https://github.com/jalovatt)
+
+* Add centre/center option to tile-based output.
+ [#2397](https://github.com/lovell/sharp/pull/2397)
+ [@beig](https://github.com/beig)
diff --git a/docs/src/content/docs/changelog/v0.26.3.md b/docs/src/content/docs/changelog/v0.26.3.md
new file mode 100644
index 000000000..a12a54480
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.26.3.md
@@ -0,0 +1,12 @@
+---
+title: v0.26.3 - 16th November 2020
+slug: changelog/v0.26.3
+---
+
+* Expose libvips' affine operation.
+ [#2336](https://github.com/lovell/sharp/pull/2336)
+ [@guillevc](https://github.com/guillevc)
+
+* Fallback to tar.gz for prebuilt libvips when Brotli not available.
+ [#2412](https://github.com/lovell/sharp/pull/2412)
+ [@ascorbic](https://github.com/ascorbic)
diff --git a/docs/src/content/docs/changelog/v0.27.0.md b/docs/src/content/docs/changelog/v0.27.0.md
new file mode 100644
index 000000000..d21378479
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.27.0.md
@@ -0,0 +1,15 @@
+---
+title: v0.27.0 - 22nd December 2020
+slug: changelog/v0.27.0
+---
+
+* Add support for AVIF to prebuilt binaries.
+
+* Remove experimental status from `heif` output, defaults are now AVIF-centric.
+
+* Allow negative top/left offsets for composite operation.
+ [#2391](https://github.com/lovell/sharp/pull/2391)
+ [@CurosMJ](https://github.com/CurosMJ)
+
+* Ensure all platforms use fontconfig for font rendering.
+ [#2399](https://github.com/lovell/sharp/issues/2399)
diff --git a/docs/src/content/docs/changelog/v0.27.1.md b/docs/src/content/docs/changelog/v0.27.1.md
new file mode 100644
index 000000000..09b24da60
--- /dev/null
+++ b/docs/src/content/docs/changelog/v0.27.1.md
@@ -0,0 +1,19 @@
+---
+title: v0.27.1 - 27th January 2021
+slug: changelog/v0.27.1
+---
+
+* Ensure TIFF is cast when using float predictor.
+ [#2502](https://github.com/lovell/sharp/pull/2502)
+ [@randyridge](https://github.com/randyridge)
+
+* Add support for Uint8Array and Uint8ClampedArray input.
+ [#2511](https://github.com/lovell/sharp/pull/2511)
+ [@leon](https://github.com/leon)
+
+* Revert: ensure all platforms use fontconfig for font rendering.
+ [#2515](https://github.com/lovell/sharp/issues/2515)
+
+* Expose libvips gaussnoise operation to allow creation of Gaussian noise.
+ [#2527](https://github.com/lovell/sharp/pull/2527) + [@alza54](https://github.com/alza54) diff --git a/docs/src/content/docs/changelog/v0.27.2.md b/docs/src/content/docs/changelog/v0.27.2.md new file mode 100644 index 000000000..b8eb9c386 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.27.2.md @@ -0,0 +1,20 @@ +--- +title: v0.27.2 - 22nd February 2021 +slug: changelog/v0.27.2 +--- + +* macOS: Prevent use of globally-installed ARM64 libvips with Rosetta x64 emulation. + [#2460](https://github.com/lovell/sharp/issues/2460) + +* Linux (musl): Prevent use of prebuilt linuxmusl-x64 binaries with musl >= 1.2.0. + [#2570](https://github.com/lovell/sharp/issues/2570) + +* Improve 16-bit grey+alpha support by using libvips' `has_alpha` detection. + [#2569](https://github.com/lovell/sharp/issues/2569) + +* Allow the use of non lower case extensions with `toFormat`. + [#2581](https://github.com/lovell/sharp/pull/2581) + [@florian-busch](https://github.com/florian-busch) + +* Allow use of `recomb` operation with single channel input. + [#2584](https://github.com/lovell/sharp/issues/2584) diff --git a/docs/src/content/docs/changelog/v0.28.0.md b/docs/src/content/docs/changelog/v0.28.0.md new file mode 100644 index 000000000..e3a72f815 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.28.0.md @@ -0,0 +1,32 @@ +--- +title: v0.28.0 - 29th March 2021 +slug: changelog/v0.28.0 +--- + +* Prebuilt binaries now include mozjpeg and libimagequant (BSD 2-Clause). + +* Prebuilt binaries limit AVIF support to the most common 8-bit depth. + +* Add `mozjpeg` option to `jpeg` method, sets mozjpeg defaults. + +* Reduce the default PNG `compressionLevel` to the more commonly used 6. + +* Reduce concurrency on glibc-based Linux when using the default memory allocator to help prevent fragmentation. + +* Default missing edge properties of extend operation to zero. 
+ [#2578](https://github.com/lovell/sharp/issues/2578) + +* Ensure composite does not clip top and left offsets. + [#2594](https://github.com/lovell/sharp/pull/2594) + [@SHG42](https://github.com/SHG42) + +* Improve error handling of network failure at install time. + [#2608](https://github.com/lovell/sharp/pull/2608) + [@abradley](https://github.com/abradley) + +* Ensure `@id` attribute can be set for IIIF tile-based output. + [#2612](https://github.com/lovell/sharp/issues/2612) + [@edsilv](https://github.com/edsilv) + +* Ensure composite replicates the correct number of tiles for centred gravities. + [#2626](https://github.com/lovell/sharp/issues/2626) diff --git a/docs/src/content/docs/changelog/v0.28.1.md b/docs/src/content/docs/changelog/v0.28.1.md new file mode 100644 index 000000000..38e11a38a --- /dev/null +++ b/docs/src/content/docs/changelog/v0.28.1.md @@ -0,0 +1,15 @@ +--- +title: v0.28.1 - 5th April 2021 +slug: changelog/v0.28.1 +--- + +* Ensure all installation errors are logged with a more obvious prefix. + +* Allow `withMetadata` to set and update EXIF metadata. + [#650](https://github.com/lovell/sharp/issues/650) + +* Add support for OME-TIFF Sub Image File Directories (subIFD). + [#2557](https://github.com/lovell/sharp/issues/2557) + +* Allow `ensureAlpha` to set the alpha transparency level. + [#2634](https://github.com/lovell/sharp/issues/2634) diff --git a/docs/src/content/docs/changelog/v0.28.2.md b/docs/src/content/docs/changelog/v0.28.2.md new file mode 100644 index 000000000..9f067c44f --- /dev/null +++ b/docs/src/content/docs/changelog/v0.28.2.md @@ -0,0 +1,26 @@ +--- +title: v0.28.2 - 10th May 2021 +slug: changelog/v0.28.2 +--- + +* Allow `withMetadata` to set `density`. + [#967](https://github.com/lovell/sharp/issues/967) + +* Skip shrink-on-load where one dimension <4px. 
+ [#2653](https://github.com/lovell/sharp/issues/2653) + +* Allow escaped proxy credentials. + [#2664](https://github.com/lovell/sharp/pull/2664) + [@msalettes](https://github.com/msalettes) + +* Add `premultiplied` flag for raw pixel data input. + [#2685](https://github.com/lovell/sharp/pull/2685) + [@mnutt](https://github.com/mnutt) + +* Detect empty input and throw a helpful error. + [#2687](https://github.com/lovell/sharp/pull/2687) + [@JakobJingleheimer](https://github.com/JakobJingleheimer) + +* Add install-time flag to skip version compatibility checks. + [#2692](https://github.com/lovell/sharp/pull/2692) + [@xemle](https://github.com/xemle) diff --git a/docs/src/content/docs/changelog/v0.28.3.md b/docs/src/content/docs/changelog/v0.28.3.md new file mode 100644 index 000000000..dff933269 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.28.3.md @@ -0,0 +1,13 @@ +--- +title: v0.28.3 - 24th May 2021 +slug: changelog/v0.28.3 +--- + +* Ensure presence of libvips, vendored or global, before invoking node-gyp. + +* Skip shrink-on-load for multi-page WebP. + [#2714](https://github.com/lovell/sharp/issues/2714) + +* Add contrast limiting adaptive histogram equalization (CLAHE) operator. + [#2726](https://github.com/lovell/sharp/pull/2726) + [@baparham](https://github.com/baparham) diff --git a/docs/src/content/docs/changelog/v0.29.0.md b/docs/src/content/docs/changelog/v0.29.0.md new file mode 100644 index 000000000..087545991 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.29.0.md @@ -0,0 +1,35 @@ +--- +title: v0.29.0 - 17th August 2021 +slug: changelog/v0.29.0 +--- + +* Drop support for Node.js 10, now requires Node.js >= 12.13.0. + +* Add `background` property to PNG and GIF image metadata. + +* Add `compression` property to HEIF image metadata. 
+ [#2504](https://github.com/lovell/sharp/issues/2504) + +* AVIF encoding now defaults to `4:4:4` chroma subsampling. + [#2562](https://github.com/lovell/sharp/issues/2562) + +* Allow multiple platform-arch binaries in same `node_modules` installation tree. + [#2575](https://github.com/lovell/sharp/issues/2575) + +* Default to single-channel `b-w` space when `extractChannel` is used. + [#2658](https://github.com/lovell/sharp/issues/2658) + +* Allow installation directory to contain spaces (regression in v0.26.0). + [#2777](https://github.com/lovell/sharp/issues/2777) + +* Add `pipelineColourspace` operator to set the processing space. + [#2704](https://github.com/lovell/sharp/pull/2704) + [@Daiz](https://github.com/Daiz) + +* Allow bit depth to be set when using raw input and output. + [#2762](https://github.com/lovell/sharp/pull/2762) + [@mart-jansink](https://github.com/mart-jansink) + +* Allow `negate` to act only on non-alpha channels. + [#2808](https://github.com/lovell/sharp/pull/2808) + [@rexxars](https://github.com/rexxars) diff --git a/docs/src/content/docs/changelog/v0.29.1.md b/docs/src/content/docs/changelog/v0.29.1.md new file mode 100644 index 000000000..58809cbf4 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.29.1.md @@ -0,0 +1,30 @@ +--- +title: v0.29.1 - 7th September 2021 +slug: changelog/v0.29.1 +--- + +* Add `lightness` option to `modulate` operation. + [#2846](https://github.com/lovell/sharp/pull/2846) + +* Ensure correct PNG bitdepth is set based on number of colours. + [#2855](https://github.com/lovell/sharp/issues/2855) + +* Ensure background is always premultiplied when compositing. + [#2858](https://github.com/lovell/sharp/issues/2858) + +* Ensure images with P3 profiles retain full gamut. 
+ [#2862](https://github.com/lovell/sharp/issues/2862) + +* Add support for libvips compiled with OpenJPEG. + [#2868](https://github.com/lovell/sharp/pull/2868) + +* Remove unsupported animation properties from AVIF output. + [#2870](https://github.com/lovell/sharp/issues/2870) + +* Resolve paths before comparing input/output filenames. + [#2878](https://github.com/lovell/sharp/pull/2878) + [@rexxars](https://github.com/rexxars) + +* Allow use of speed 9 (fastest) for HEIF encoding. + [#2879](https://github.com/lovell/sharp/pull/2879) + [@rexxars](https://github.com/rexxars) diff --git a/docs/src/content/docs/changelog/v0.29.2.md b/docs/src/content/docs/changelog/v0.29.2.md new file mode 100644 index 000000000..0628963f0 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.29.2.md @@ -0,0 +1,22 @@ +--- +title: v0.29.2 - 21st October 2021 +slug: changelog/v0.29.2 +--- + +* Add `timeout` function to limit processing time. + +* Ensure `sharp.versions` is populated from vendored libvips. + +* Remove animation properties from single page images. + [#2890](https://github.com/lovell/sharp/issues/2890) + +* Allow use of 'tif' to select TIFF output. + [#2893](https://github.com/lovell/sharp/pull/2893) + [@erf](https://github.com/erf) + +* Improve error message on Windows for version conflict. + [#2918](https://github.com/lovell/sharp/pull/2918) + [@dkrnl](https://github.com/dkrnl) + +* Throw error rather than exit when invalid binaries detected. 
+ [#2931](https://github.com/lovell/sharp/issues/2931) diff --git a/docs/src/content/docs/changelog/v0.29.3.md b/docs/src/content/docs/changelog/v0.29.3.md new file mode 100644 index 000000000..e344d790e --- /dev/null +++ b/docs/src/content/docs/changelog/v0.29.3.md @@ -0,0 +1,11 @@ +--- +title: v0.29.3 - 14th November 2021 +slug: changelog/v0.29.3 +--- + +* Ensure correct dimensions when containing image resized to 1px. + [#2951](https://github.com/lovell/sharp/issues/2951) + +* Impute TIFF `xres`/`yres` from `density` provided to `withMetadata`. + [#2952](https://github.com/lovell/sharp/pull/2952) + [@mbklein](https://github.com/mbklein) diff --git a/docs/src/content/docs/changelog/v0.30.0.md b/docs/src/content/docs/changelog/v0.30.0.md new file mode 100644 index 000000000..2605fbcaa --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.0.md @@ -0,0 +1,48 @@ +--- +title: v0.30.0 - 1st February 2022 +slug: changelog/v0.30.0 +--- + +* Add support for GIF output to prebuilt binaries. + +* Reduce minimum Linux ARM64v8 glibc requirement to 2.17. + +* Verify prebuilt binaries with a Subresource Integrity check. + +* Standardise WebP `effort` option name, deprecate `reductionEffort`. + +* Standardise HEIF `effort` option name, deprecate `speed`. + +* Add support for IIIF v3 tile-based output. + +* Expose control over CPU effort for palette-based PNG output. + [#2541](https://github.com/lovell/sharp/issues/2541) + +* Improve animated (multi-page) image resize and extract. + [#2789](https://github.com/lovell/sharp/pull/2789) + [@kleisauke](https://github.com/kleisauke) + +* Expose platform and architecture of vendored binaries as `sharp.vendor`. + [#2928](https://github.com/lovell/sharp/issues/2928) + +* Ensure 16-bit PNG output uses correct bitdepth. 
+ [#2958](https://github.com/lovell/sharp/pull/2958) + [@gforge](https://github.com/gforge) + +* Properly emit close events for duplex streams. + [#2976](https://github.com/lovell/sharp/pull/2976) + [@driannaude](https://github.com/driannaude) + +* Expose `unlimited` option for SVG and PNG input, switches off safety features. + [#2984](https://github.com/lovell/sharp/issues/2984) + +* Add `withoutReduction` option to resize operation. + [#3006](https://github.com/lovell/sharp/pull/3006) + [@christopherbradleybanks](https://github.com/christopherbradleybanks) + +* Add `resolutionUnit` as `tiff` option and expose in metadata. + [#3023](https://github.com/lovell/sharp/pull/3023) + [@ompal-sisodiya](https://github.com/ompal-sisodiya) + +* Ensure rotate-then-extract works with EXIF mirroring. + [#3024](https://github.com/lovell/sharp/issues/3024) diff --git a/docs/src/content/docs/changelog/v0.30.1.md b/docs/src/content/docs/changelog/v0.30.1.md new file mode 100644 index 000000000..06f890da8 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.1.md @@ -0,0 +1,19 @@ +--- +title: v0.30.1 - 9th February 2022 +slug: changelog/v0.30.1 +--- + +* Allow use of `toBuffer` and `toFile` on the same instance. + [#3044](https://github.com/lovell/sharp/issues/3044) + +* Skip shrink-on-load for known libjpeg rounding errors. + [#3066](https://github.com/lovell/sharp/issues/3066) + [@kleisauke](https://github.com/kleisauke) + +* Ensure withoutReduction does not interfere with contain/crop/embed. + [#3081](https://github.com/lovell/sharp/pull/3081) + [@kleisauke](https://github.com/kleisauke) + +* Ensure affine interpolator is correctly finalised. 
+ [#3083](https://github.com/lovell/sharp/pull/3083) + [@kleisauke](https://github.com/kleisauke) diff --git a/docs/src/content/docs/changelog/v0.30.2.md b/docs/src/content/docs/changelog/v0.30.2.md new file mode 100644 index 000000000..1b5c7e1c5 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.2.md @@ -0,0 +1,17 @@ +--- +title: v0.30.2 - 2nd March 2022 +slug: changelog/v0.30.2 +--- + +* Improve performance and accuracy when compositing multiple images. + [#2286](https://github.com/lovell/sharp/issues/2286) + +* Expand pkgconfig search path for wider BSD support. + [#3106](https://github.com/lovell/sharp/issues/3106) + +* Ensure Windows C++ runtime is linked statically (regression in 0.30.0). + [#3110](https://github.com/lovell/sharp/pull/3110) + [@kleisauke](https://github.com/kleisauke) + +* Temporarily ignore greyscale ICC profiles to workaround lcms bug. + [#3112](https://github.com/lovell/sharp/issues/3112) diff --git a/docs/src/content/docs/changelog/v0.30.3.md b/docs/src/content/docs/changelog/v0.30.3.md new file mode 100644 index 000000000..77a2584a8 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.3.md @@ -0,0 +1,13 @@ +--- +title: v0.30.3 - 14th March 2022 +slug: changelog/v0.30.3 +--- + +* Allow `sharpen` options to be provided more consistently as an Object. + [#2561](https://github.com/lovell/sharp/issues/2561) + +* Expose `x1`, `y2` and `y3` parameters of `sharpen` operation. + [#2935](https://github.com/lovell/sharp/issues/2935) + +* Prevent double unpremultiply with some composite blend modes (regression in 0.30.2). 
+ [#3118](https://github.com/lovell/sharp/issues/3118) diff --git a/docs/src/content/docs/changelog/v0.30.4.md b/docs/src/content/docs/changelog/v0.30.4.md new file mode 100644 index 000000000..a4f0040bf --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.4.md @@ -0,0 +1,20 @@ +--- +title: v0.30.4 - 18th April 2022 +slug: changelog/v0.30.4 +--- + +* Increase control over sensitivity to invalid images via `failOn`, deprecate `failOnError` (equivalent to `failOn: 'warning'`). + +* Ensure `create` input image has correct bit depth and colour space. + [#3139](https://github.com/lovell/sharp/issues/3139) + +* Add support for `TypedArray` input with `byteOffset` and `length`. + [#3146](https://github.com/lovell/sharp/pull/3146) + [@codepage949](https://github.com/codepage949) + +* Improve error message when attempting to render SVG input greater than 32767x32767. + [#3167](https://github.com/lovell/sharp/issues/3167) + +* Add missing file name to 'Input file is missing' error message. + [#3178](https://github.com/lovell/sharp/pull/3178) + [@Brodan](https://github.com/Brodan) diff --git a/docs/src/content/docs/changelog/v0.30.5.md b/docs/src/content/docs/changelog/v0.30.5.md new file mode 100644 index 000000000..406357619 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.5.md @@ -0,0 +1,19 @@ +--- +title: v0.30.5 - 23rd May 2022 +slug: changelog/v0.30.5 +--- + +* Install: pass `PKG_CONFIG_PATH` via env rather than substitution. + [@dwisiswant0](https://github.com/dwisiswant0) + +* Add support for `--libc` flag to improve cross-platform installation. + [#3160](https://github.com/lovell/sharp/pull/3160) + [@joonamo](https://github.com/joonamo) + +* Allow installation of prebuilt libvips binaries from filesystem. 
+ [#3196](https://github.com/lovell/sharp/pull/3196) + [@ankurparihar](https://github.com/ankurparihar) + +* Fix rotate-then-extract for EXIF orientation 2. + [#3218](https://github.com/lovell/sharp/pull/3218) + [@jakob0fischl](https://github.com/jakob0fischl) diff --git a/docs/src/content/docs/changelog/v0.30.6.md b/docs/src/content/docs/changelog/v0.30.6.md new file mode 100644 index 000000000..340921040 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.6.md @@ -0,0 +1,10 @@ +--- +title: v0.30.6 - 30th May 2022 +slug: changelog/v0.30.6 +--- + +* Allow values for `limitInputPixels` larger than 32-bit. + [#3238](https://github.com/lovell/sharp/issues/3238) + +* Ensure brew-installed `vips` can be detected (regression in 0.30.5). + [#3239](https://github.com/lovell/sharp/issues/3239) diff --git a/docs/src/content/docs/changelog/v0.30.7.md b/docs/src/content/docs/changelog/v0.30.7.md new file mode 100644 index 000000000..fd90d905c --- /dev/null +++ b/docs/src/content/docs/changelog/v0.30.7.md @@ -0,0 +1,15 @@ +--- +title: v0.30.7 - 22nd June 2022 +slug: changelog/v0.30.7 +--- + +* Ensure tiled composition always works with outside resizing. + [#3227](https://github.com/lovell/sharp/issues/3227) + +* Allow WebP encoding effort of 0. + [#3261](https://github.com/lovell/sharp/pull/3261) + [@AlexanderTheGrey](https://github.com/AlexanderTheGrey) + +* Prevent upsampling via libwebp. 
+ [#3267](https://github.com/lovell/sharp/pull/3267) + [@blacha](https://github.com/blacha) diff --git a/docs/src/content/docs/changelog/v0.31.0.md b/docs/src/content/docs/changelog/v0.31.0.md new file mode 100644 index 000000000..1998f9be8 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.31.0.md @@ -0,0 +1,63 @@ +--- +title: v0.31.0 - 5th September 2022 +slug: changelog/v0.31.0 +--- + +* Drop support for Node.js 12, now requires Node.js >= 14.15.0. + +* GIF output now re-uses input palette if possible. Use `reoptimise` option to generate a new palette. + +* Add WebP `minSize` and `mixed` options for greater control over animation frames. + +* Remove previously-deprecated WebP `reductionEffort` and HEIF `speed` options. Use `effort` to control these. + +* The `flip` and `flop` operations will now occur before the `rotate` operation. + +* Improve `normalise` operation with use of histogram. + [#200](https://github.com/lovell/sharp/issues/200) + +* Use combined bounding box of alpha and non-alpha channels for `trim` operation. + [#2166](https://github.com/lovell/sharp/issues/2166) + +* Add Buffer and Stream support to tile-based output. + [#2238](https://github.com/lovell/sharp/issues/2238) + +* Add input `fileSuffix` and output `alias` to `format` information. + [#2642](https://github.com/lovell/sharp/issues/2642) + +* Re-introduce support for greyscale ICC profiles (temporarily removed in 0.30.2). + [#3114](https://github.com/lovell/sharp/issues/3114) + +* Add support for WebP and PackBits `compression` options with TIFF output. + [#3198](https://github.com/lovell/sharp/issues/3198) + +* Ensure OpenSlide and FITS input works with custom libvips. + [#3226](https://github.com/lovell/sharp/issues/3226) + +* Ensure `trim` operation is a no-op when it would reduce an image to nothing. 
+ [#3223](https://github.com/lovell/sharp/issues/3223) + +* Expose `vips_text` to create an image containing rendered text. + [#3252](https://github.com/lovell/sharp/pull/3252) + [@brahima](https://github.com/brahima) + +* Ensure only properties owned by the `withMetadata` EXIF Object are parsed. + [#3292](https://github.com/lovell/sharp/issues/3292) + +* Expand `linear` operation to allow use of per-channel arrays. + [#3303](https://github.com/lovell/sharp/pull/3303) + [@antonmarsden](https://github.com/antonmarsden) + +* Ensure the order of `rotate`, `resize` and `extend` operations is respected where possible. + Emit warnings when previous calls in the same pipeline will be ignored. + [#3319](https://github.com/lovell/sharp/issues/3319) + +* Ensure PNG bitdepth can be set for non-palette output. + [#3322](https://github.com/lovell/sharp/issues/3322) + +* Add trim option to provide a specific background colour. + [#3332](https://github.com/lovell/sharp/pull/3332) + [@mart-jansink](https://github.com/mart-jansink) + +* Ensure resized image is unpremultiplied before composite. + [#3334](https://github.com/lovell/sharp/issues/3334) diff --git a/docs/src/content/docs/changelog/v0.31.1.md b/docs/src/content/docs/changelog/v0.31.1.md new file mode 100644 index 000000000..44467514f --- /dev/null +++ b/docs/src/content/docs/changelog/v0.31.1.md @@ -0,0 +1,22 @@ +--- +title: v0.31.1 - 29th September 2022 +slug: changelog/v0.31.1 +--- + +* Upgrade to libvips v8.13.2 for upstream bug fixes. + +* Ensure `close` event occurs after `end` event for Stream-based output. + [#3313](https://github.com/lovell/sharp/issues/3313) + +* Ensure `limitInputPixels` constructor option uses uint64. 
+ [#3349](https://github.com/lovell/sharp/pull/3349) + [@marcosc90](https://github.com/marcosc90) + +* Ensure auto-rotation works with shrink-on-load and extract (regression in 0.31.0). + [#3352](https://github.com/lovell/sharp/issues/3352) + +* Ensure AVIF output is always 8-bit. + [#3358](https://github.com/lovell/sharp/issues/3358) + +* Ensure greyscale images can be trimmed (regression in 0.31.0). + [#3386](https://github.com/lovell/sharp/issues/3386) diff --git a/docs/src/content/docs/changelog/v0.31.2.md b/docs/src/content/docs/changelog/v0.31.2.md new file mode 100644 index 000000000..c0be292b0 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.31.2.md @@ -0,0 +1,12 @@ +--- +title: v0.31.2 - 4th November 2022 +slug: changelog/v0.31.2 +--- + +* Upgrade to libvips v8.13.3 for upstream bug fixes. + +* Ensure manual flip, rotate, resize operation ordering (regression in 0.31.1) + [#3391](https://github.com/lovell/sharp/issues/3391) + +* Ensure auto-rotation works without resize (regression in 0.31.1) + [#3422](https://github.com/lovell/sharp/issues/3422) diff --git a/docs/src/content/docs/changelog/v0.31.3.md b/docs/src/content/docs/changelog/v0.31.3.md new file mode 100644 index 000000000..84c6ac899 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.31.3.md @@ -0,0 +1,34 @@ +--- +title: v0.31.3 - 21st December 2022 +slug: changelog/v0.31.3 +--- + +* Add experimental support for JPEG-XL images. Requires libvips compiled with libjxl. + [#2731](https://github.com/lovell/sharp/issues/2731) + +* Add runtime detection of V8 memory cage, ensures compatibility with Electron 21 onwards. + [#3384](https://github.com/lovell/sharp/issues/3384) + +* Expose `interFrameMaxError` and `interPaletteMaxError` GIF optimisation properties. 
+ [#3401](https://github.com/lovell/sharp/issues/3401) + +* Allow installation on Linux with glibc patch versions e.g. Fedora 38. + [#3423](https://github.com/lovell/sharp/issues/3423) + +* Expand range of existing `sharpen` parameters to match libvips. + [#3427](https://github.com/lovell/sharp/issues/3427) + +* Prevent possible race condition awaiting metadata of Stream-based input. + [#3451](https://github.com/lovell/sharp/issues/3451) + +* Improve `extractChannel` support for 16-bit output colourspaces. + [#3453](https://github.com/lovell/sharp/issues/3453) + +* Ignore `sequentialRead` option when calculating image statistics. + [#3462](https://github.com/lovell/sharp/issues/3462) + +* Small performance improvement for operations that introduce a non-opaque background. + [#3465](https://github.com/lovell/sharp/issues/3465) + +* Ensure integral output of `linear` operation. + [#3468](https://github.com/lovell/sharp/issues/3468) diff --git a/docs/src/content/docs/changelog/v0.32.0.md b/docs/src/content/docs/changelog/v0.32.0.md new file mode 100644 index 000000000..f86104bfa --- /dev/null +++ b/docs/src/content/docs/changelog/v0.32.0.md @@ -0,0 +1,61 @@ +--- +title: v0.32.0 - 24th March 2023 +slug: changelog/v0.32.0 +--- + +* Default to using sequential rather than random access read where possible. + +* Replace GIF output `optimise` / `optimize` option with `reuse`. + +* Add `progressive` option to GIF output for interlacing. + +* Add `wrap` option to text image creation. + +* Add `formatMagick` property to metadata of images loaded via *magick. + +* Prefer integer (un)premultiply for faster resizing of RGBA images. + +* Add `ignoreIcc` input option to ignore embedded ICC profile. + +* Allow use of GPS (IFD3) EXIF metadata. 
+ [#2767](https://github.com/lovell/sharp/issues/2767) + +* TypeScript definitions are now maintained and published directly, deprecating the `@types/sharp` package. + [#3369](https://github.com/lovell/sharp/issues/3369) + +* Prebuilt binaries: ensure macOS 10.13+ support, as documented. + [#3438](https://github.com/lovell/sharp/issues/3438) + +* Prebuilt binaries: prevent use of glib slice allocator, improves QEMU support. + [#3448](https://github.com/lovell/sharp/issues/3448) + +* Add focus point coordinates to output when using attention based crop. + [#3470](https://github.com/lovell/sharp/pull/3470) + [@ejoebstl](https://github.com/ejoebstl) + +* Expose sharp version as `sharp.versions.sharp`. + [#3471](https://github.com/lovell/sharp/issues/3471) + +* Respect `fastShrinkOnLoad` resize option for WebP input. + [#3516](https://github.com/lovell/sharp/issues/3516) + +* Reduce sharpen `sigma` maximum from 10000 to 10. + [#3521](https://github.com/lovell/sharp/issues/3521) + +* Add support for `ArrayBuffer` input. + [#3548](https://github.com/lovell/sharp/pull/3548) + [@kapouer](https://github.com/kapouer) + +* Add support to `extend` operation for `extendWith` to allow copy/mirror/repeat. + [#3556](https://github.com/lovell/sharp/pull/3556) + [@janaz](https://github.com/janaz) + +* Ensure all async JS callbacks are wrapped to help avoid possible race condition. + [#3569](https://github.com/lovell/sharp/issues/3569) + +* Prebuilt binaries: support for tile-based output temporarily removed due to licensing issue. + [#3581](https://github.com/lovell/sharp/issues/3581) + +* Add support to `normalise` for `lower` and `upper` percentiles. 
+ [#3583](https://github.com/lovell/sharp/pull/3583) + [@LachlanNewman](https://github.com/LachlanNewman) diff --git a/docs/src/content/docs/changelog/v0.32.1.md b/docs/src/content/docs/changelog/v0.32.1.md new file mode 100644 index 000000000..a73cfb93e --- /dev/null +++ b/docs/src/content/docs/changelog/v0.32.1.md @@ -0,0 +1,30 @@ +--- +title: v0.32.1 - 27th April 2023 +slug: changelog/v0.32.1 +--- + +* Add experimental `unflatten` operation. + [#3461](https://github.com/lovell/sharp/pull/3461) + [@antonmarsden](https://github.com/antonmarsden) + +* Ensure use of `flip` operation forces random access read (regression in 0.32.0). + [#3600](https://github.com/lovell/sharp/issues/3600) + +* Ensure `linear` operation works with 16-bit input (regression in 0.31.3). + [#3605](https://github.com/lovell/sharp/issues/3605) + +* Install: ensure proxy URLs are logged correctly. + [#3615](https://github.com/lovell/sharp/pull/3615) + [@TomWis97](https://github.com/TomWis97) + +* Ensure profile-less CMYK to CMYK roundtrip skips colourspace conversion. + [#3620](https://github.com/lovell/sharp/issues/3620) + +* Add support for `modulate` operation when using non-sRGB pipeline colourspace. + [#3620](https://github.com/lovell/sharp/issues/3620) + +* Ensure `trim` operation works with CMYK images (regression in 0.31.0). + [#3636](https://github.com/lovell/sharp/issues/3636) + +* Install: coerce libc version to semver. 
+ [#3641](https://github.com/lovell/sharp/issues/3641) diff --git a/docs/src/content/docs/changelog/v0.32.2.md b/docs/src/content/docs/changelog/v0.32.2.md new file mode 100644 index 000000000..a14aa1410 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.32.2.md @@ -0,0 +1,25 @@ +--- +title: v0.32.2 - 11th July 2023 +slug: changelog/v0.32.2 +--- + +* Limit HEIF output dimensions to 16384x16384, matches libvips. + +* Ensure exceptions are not thrown when terminating. + [#3569](https://github.com/lovell/sharp/issues/3569) + +* Ensure the same access method is used for all inputs (regression in 0.32.0). + [#3669](https://github.com/lovell/sharp/issues/3669) + +* Improve detection of jp2 filename extensions. + [#3674](https://github.com/lovell/sharp/pull/3674) + [@bianjunjie1981](https://github.com/bianjunjie1981) + +* Guard use of smartcrop premultiplied option to prevent warning (regression in 0.32.1). + [#3710](https://github.com/lovell/sharp/issues/3710) + +* Prevent over-compute in affine-based rotate before resize. + [#3722](https://github.com/lovell/sharp/issues/3722) + +* Allow sequential read for EXIF-based auto-orientation. + [#3725](https://github.com/lovell/sharp/issues/3725) diff --git a/docs/src/content/docs/changelog/v0.32.3.md b/docs/src/content/docs/changelog/v0.32.3.md new file mode 100644 index 000000000..ac3151982 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.32.3.md @@ -0,0 +1,10 @@ +--- +title: v0.32.3 - 14th July 2023 +slug: changelog/v0.32.3 +--- + +* Expose `preset` option for WebP output. + [#3639](https://github.com/lovell/sharp/issues/3639) + +* Ensure decoding remains sequential for all operations (regression in 0.32.2). 
+ [#3725](https://github.com/lovell/sharp/issues/3725) diff --git a/docs/src/content/docs/changelog/v0.32.4.md b/docs/src/content/docs/changelog/v0.32.4.md new file mode 100644 index 000000000..f327f8d0b --- /dev/null +++ b/docs/src/content/docs/changelog/v0.32.4.md @@ -0,0 +1,11 @@ +--- +title: v0.32.4 - 21st July 2023 +slug: changelog/v0.32.4 +--- + +* Upgrade to libvips v8.14.3 for upstream bug fixes. + +* Expose ability to (un)block low-level libvips operations by name. + +* Prebuilt binaries: restore support for tile-based output. + [#3581](https://github.com/lovell/sharp/issues/3581) diff --git a/docs/src/content/docs/changelog/v0.32.5.md b/docs/src/content/docs/changelog/v0.32.5.md new file mode 100644 index 000000000..3a38ec7ec --- /dev/null +++ b/docs/src/content/docs/changelog/v0.32.5.md @@ -0,0 +1,24 @@ +--- +title: v0.32.5 - 15th August 2023 +slug: changelog/v0.32.5 +--- + +* Upgrade to libvips v8.14.4 for upstream bug fixes. + +* TypeScript: Add missing `WebpPresetEnum` to definitions. + [#3748](https://github.com/lovell/sharp/pull/3748) + [@pilotso11](https://github.com/pilotso11) + +* Ensure compilation using musl v1.2.4. + [#3755](https://github.com/lovell/sharp/pull/3755) + [@kleisauke](https://github.com/kleisauke) + +* Ensure resize with a `fit` of `inside` respects 90/270 degree rotation. + [#3756](https://github.com/lovell/sharp/issues/3756) + +* TypeScript: Ensure `minSize` property of `WebpOptions` is boolean. + [#3758](https://github.com/lovell/sharp/pull/3758) + [@sho-xizz](https://github.com/sho-xizz) + +* Ensure `withMetadata` adds default sRGB profile. 
+ [#3761](https://github.com/lovell/sharp/issues/3761) diff --git a/docs/src/content/docs/changelog/v0.32.6.md b/docs/src/content/docs/changelog/v0.32.6.md new file mode 100644 index 000000000..6b7c3d431 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.32.6.md @@ -0,0 +1,19 @@ +--- +title: v0.32.6 - 18th September 2023 +slug: changelog/v0.32.6 +--- + +* Upgrade to libvips v8.14.5 for upstream bug fixes. + +* Ensure composite tile images are fully decoded (regression in 0.32.0). + [#3767](https://github.com/lovell/sharp/issues/3767) + +* Ensure `withMetadata` can add ICC profiles to RGB16 output. + [#3773](https://github.com/lovell/sharp/issues/3773) + +* Ensure `withMetadata` does not reduce 16-bit images to 8-bit (regression in 0.32.5). + [#3773](https://github.com/lovell/sharp/issues/3773) + +* TypeScript: Add definitions for block and unblock. + [#3799](https://github.com/lovell/sharp/pull/3799) + [@ldrick](https://github.com/ldrick) diff --git a/docs/src/content/docs/changelog/v0.33.0.md b/docs/src/content/docs/changelog/v0.33.0.md new file mode 100644 index 000000000..1ca8435de --- /dev/null +++ b/docs/src/content/docs/changelog/v0.33.0.md @@ -0,0 +1,47 @@ +--- +title: v0.33.0 - 29th November 2023 +slug: changelog/v0.33.0 +--- + +* Drop support for Node.js 14 and 16, now requires Node.js ^18.17.0 or >= 20.3.0 + +* Prebuilt binaries distributed via npm registry and installed via package manager. + +* Building from source requires dependency on `node-addon-api`. + +* Remove `sharp.vendor`. + +* Partially deprecate `withMetadata()`, use `withExif()` and `withIccProfile()`. + +* Add experimental support for WebAssembly-based runtimes. + [@RReverser](https://github.com/RReverser) + +* Options for `trim` operation must be an Object, add new `lineArt` option. 
+ [#2363](https://github.com/lovell/sharp/issues/2363) + +* Improve luminance of `tint` operation with weighting function. + [#3338](https://github.com/lovell/sharp/issues/3338) + [@jcupitt](https://github.com/jcupitt) + +* Ensure all `Error` objects contain a `stack` property. + [#3653](https://github.com/lovell/sharp/issues/3653) + +* Make `compression` option of `heif` mandatory to help reduce HEIF vs HEIC confusion. + [#3740](https://github.com/lovell/sharp/issues/3740) + +* Ensure correct interpretation of 16-bit raw input. + [#3808](https://github.com/lovell/sharp/issues/3808) + +* Add support for `miniswhite` when using TIFF output. + [#3812](https://github.com/lovell/sharp/pull/3812) + [@dnsbty](https://github.com/dnsbty) + +* TypeScript: add missing definition for `withMetadata` boolean. + [#3823](https://github.com/lovell/sharp/pull/3823) + [@uhthomas](https://github.com/uhthomas) + +* Add more fine-grained control over output metadata. + [#3824](https://github.com/lovell/sharp/issues/3824) + +* Ensure multi-page extract remains sequential. + [#3837](https://github.com/lovell/sharp/issues/3837) diff --git a/docs/src/content/docs/changelog/v0.33.1.md b/docs/src/content/docs/changelog/v0.33.1.md new file mode 100644 index 000000000..9a280c82f --- /dev/null +++ b/docs/src/content/docs/changelog/v0.33.1.md @@ -0,0 +1,14 @@ +--- +title: v0.33.1 - 17th December 2023 +slug: changelog/v0.33.1 +--- + +* Add support for Yarn Plug'n'Play filesystem layout. + [#3888](https://github.com/lovell/sharp/issues/3888) + +* Emit warning when attempting to use invalid ICC profiles. + [#3895](https://github.com/lovell/sharp/issues/3895) + +* Ensure `VIPS_NOVECTOR` environment variable is respected. 
+ [#3897](https://github.com/lovell/sharp/pull/3897) + [@icetee](https://github.com/icetee) diff --git a/docs/src/content/docs/changelog/v0.33.2.md b/docs/src/content/docs/changelog/v0.33.2.md new file mode 100644 index 000000000..7d88c0adf --- /dev/null +++ b/docs/src/content/docs/changelog/v0.33.2.md @@ -0,0 +1,16 @@ +--- +title: v0.33.2 - 12th January 2024 +slug: changelog/v0.33.2 +--- + +* Upgrade to libvips v8.15.1 for upstream bug fixes. + +* TypeScript: add definition for `keepMetadata`. + [#3914](https://github.com/lovell/sharp/pull/3914) + [@abhi0498](https://github.com/abhi0498) + +* Ensure `extend` operation stays sequential when copying (regression in 0.32.0). + [#3928](https://github.com/lovell/sharp/issues/3928) + +* Improve error handling for unsupported multi-page rotation. + [#3940](https://github.com/lovell/sharp/issues/3940) diff --git a/docs/src/content/docs/changelog/v0.33.3.md b/docs/src/content/docs/changelog/v0.33.3.md new file mode 100644 index 000000000..17e6639fe --- /dev/null +++ b/docs/src/content/docs/changelog/v0.33.3.md @@ -0,0 +1,21 @@ +--- +title: v0.33.3 - 23rd March 2024 +slug: changelog/v0.33.3 +--- + +* Upgrade to libvips v8.15.2 for upstream bug fixes. + +* Ensure `keepIccProfile` retains P3 and CMYK input profiles. + [#3906](https://github.com/lovell/sharp/issues/3906) + [#4008](https://github.com/lovell/sharp/issues/4008) + +* Ensure `text.wrap` property can accept `word-char` as value. + [#4028](https://github.com/lovell/sharp/pull/4028) + [@yolopunk](https://github.com/yolopunk) + +* Ensure `clone` takes a deep copy of existing options. + [#4029](https://github.com/lovell/sharp/issues/4029) + +* Add `bitdepth` option to `heif` output (prebuilt binaries support 8-bit only). 
+ [#4036](https://github.com/lovell/sharp/pull/4036) + [@mertalev](https://github.com/mertalev) diff --git a/docs/src/content/docs/changelog/v0.33.4.md b/docs/src/content/docs/changelog/v0.33.4.md new file mode 100644 index 000000000..9c4500f91 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.33.4.md @@ -0,0 +1,32 @@ +--- +title: v0.33.4 - 16th May 2024 +slug: changelog/v0.33.4 +--- + +* Remove experimental status from `pipelineColourspace`. + +* Reduce default concurrency when musl thread over-subscription detected. + +* TypeScript: add missing definitions for `OverlayOptions`. + [#4048](https://github.com/lovell/sharp/pull/4048) + [@ike-gg](https://github.com/ike-gg) + +* Install: add advanced option to force use of a globally-installed libvips. + [#4060](https://github.com/lovell/sharp/issues/4060) + +* Expose `bilinear` resizing kernel (and interpolator). + [#4061](https://github.com/lovell/sharp/issues/4061) + +* Ensure `extend` operation stays sequential for multi-page TIFF (regression in 0.32.0). + [#4069](https://github.com/lovell/sharp/issues/4069) + +* Tighten validation of constructor `text` integer properties. + [#4071](https://github.com/lovell/sharp/issues/4071) + +* Simplify internal StaySequential logic. + [#4074](https://github.com/lovell/sharp/pull/4074) + [@kleisauke](https://github.com/kleisauke) + +* Ensure negate operation occurs after profile conversion. 
+ [#4096](https://github.com/lovell/sharp/pull/4096) + [@adriaanmeuris](https://github.com/adriaanmeuris) diff --git a/docs/src/content/docs/changelog/v0.33.5.md b/docs/src/content/docs/changelog/v0.33.5.md new file mode 100644 index 000000000..7fb99baa2 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.33.5.md @@ -0,0 +1,39 @@ +--- +title: v0.33.5 - 16th August 2024 +slug: changelog/v0.33.5 +--- + +* Upgrade to libvips v8.15.3 for upstream bug fixes. + +* Add `pageHeight` and `pages` to response of multi-page output. + [#3411](https://github.com/lovell/sharp/issues/3411) + +* Ensure option to force use of a globally-installed libvips works correctly. + [#4111](https://github.com/lovell/sharp/pull/4111) + [@project0](https://github.com/project0) + +* Minimise use of `engines` property to improve yarn v1 support. + [#4130](https://github.com/lovell/sharp/issues/4130) + +* Ensure `sharp.format.heif` includes only AVIF when using prebuilt binaries. + [#4132](https://github.com/lovell/sharp/issues/4132) + +* Add support to recomb operation for 4x4 matrices. + [#4147](https://github.com/lovell/sharp/pull/4147) + [@ton11797](https://github.com/ton11797) + +* Expose PNG text chunks as `comments` metadata. + [#4157](https://github.com/lovell/sharp/pull/4157) + [@nkeynes](https://github.com/nkeynes) + +* Expose optional `precision` and `minAmplitude` parameters of `blur` operation. + [#4168](https://github.com/lovell/sharp/pull/4168) + [#4172](https://github.com/lovell/sharp/pull/4172) + [@marcosc90](https://github.com/marcosc90) + +* Ensure `keepIccProfile` avoids colour transformation where possible. + [#4186](https://github.com/lovell/sharp/issues/4186) + +* TypeScript: `chromaSubsampling` metadata is optional. 
+ [#4191](https://github.com/lovell/sharp/pull/4191) + [@DavidVaness](https://github.com/DavidVaness) diff --git a/docs/src/content/docs/changelog/v0.34.0.md b/docs/src/content/docs/changelog/v0.34.0.md new file mode 100644 index 000000000..4fc83109e --- /dev/null +++ b/docs/src/content/docs/changelog/v0.34.0.md @@ -0,0 +1,52 @@ +--- +title: v0.34.0 - 4th April 2025 +slug: changelog/v0.34.0 +--- + +* Breaking: Support array of input images to be joined or animated. + [#1580](https://github.com/lovell/sharp/issues/1580) + +* Breaking: Ensure `removeAlpha` removes all alpha channels. + [#2266](https://github.com/lovell/sharp/issues/2266) + +* Breaking: Non-animated GIF output defaults to no-loop instead of loop-forever. + [#3394](https://github.com/lovell/sharp/issues/3394) + +* Breaking: Support `info.size` on wide-character systems via upgrade to C++17. + [#3943](https://github.com/lovell/sharp/issues/3943) + +* Breaking: Ensure `background` metadata can be parsed by `color` package. + [#4090](https://github.com/lovell/sharp/issues/4090) + +* Add `isPalette` and `bitsPerSample` to metadata, deprecate `paletteBitDepth`. + +* Expose WebP `smartDeblock` output option. + +* Prevent use of linux-x64 binaries with v1 microarchitecture. + +* Add `autoOrient` operation and constructor option. + [#4151](https://github.com/lovell/sharp/pull/4151) + [@happycollision](https://github.com/happycollision) + +* TypeScript: Ensure channel counts use the correct range. + [#4197](https://github.com/lovell/sharp/pull/4197) + [@DavidVaness](https://github.com/DavidVaness) + +* Improve support for ppc64le architecture. + [#4203](https://github.com/lovell/sharp/pull/4203) + [@sumitd2](https://github.com/sumitd2) + +* Add `pdfBackground` constructor property. 
+ [#4207](https://github.com/lovell/sharp/pull/4207) + [@calebmer](https://github.com/calebmer) + +* Expose erode and dilate operations. + [#4243](https://github.com/lovell/sharp/pull/4243) + [@qpincon](https://github.com/qpincon) + +* Add support for RGBE images. Requires libvips compiled with radiance support. + [#4316](https://github.com/lovell/sharp/pull/4316) + [@florentzabera](https://github.com/florentzabera) + +* Allow wide-gamut HEIF output at higher bitdepths. + [#4344](https://github.com/lovell/sharp/issues/4344) diff --git a/docs/src/content/docs/changelog/v0.34.1.md b/docs/src/content/docs/changelog/v0.34.1.md new file mode 100644 index 000000000..a5ae94dcf --- /dev/null +++ b/docs/src/content/docs/changelog/v0.34.1.md @@ -0,0 +1,8 @@ +--- +title: v0.34.1 - 7th April 2025 +slug: changelog/v0.34.1 +--- + +* TypeScript: Ensure new `autoOrient` property is optional. + [#4362](https://github.com/lovell/sharp/pull/4362) + [@styfle](https://github.com/styfle) diff --git a/docs/src/content/docs/changelog/v0.34.2.md b/docs/src/content/docs/changelog/v0.34.2.md new file mode 100644 index 000000000..2292304ea --- /dev/null +++ b/docs/src/content/docs/changelog/v0.34.2.md @@ -0,0 +1,28 @@ +--- +title: v0.34.2 - 20th May 2025 +slug: changelog/v0.34.2 +--- + +* Ensure animated GIF to WebP conversion retains loop (regression in 0.34.0). + [#3394](https://github.com/lovell/sharp/issues/3394) + +* Ensure `pdfBackground` constructor property is used. + [#4207](https://github.com/lovell/sharp/pull/4207) + [#4398](https://github.com/lovell/sharp/issues/4398) + +* Add experimental support for prebuilt Windows ARM64 binaries. 
+ [#4375](https://github.com/lovell/sharp/pull/4375) + [@hans00](https://github.com/hans00) + +* Ensure resizing with a `fit` of `contain` supports multiple alpha channels. + [#4382](https://github.com/lovell/sharp/issues/4382) + +* TypeScript: Ensure `metadata` response more closely matches reality. + [#4383](https://github.com/lovell/sharp/issues/4383) + +* TypeScript: Ensure `smartDeblock` property is included in WebP definition. + [#4387](https://github.com/lovell/sharp/pull/4387) + [@Stephen-X](https://github.com/Stephen-X) + +* Ensure support for wide-character filenames on Windows (regression in 0.34.0). + [#4391](https://github.com/lovell/sharp/issues/4391) diff --git a/docs/src/content/docs/changelog/v0.34.3.md b/docs/src/content/docs/changelog/v0.34.3.md new file mode 100644 index 000000000..1f3016741 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.34.3.md @@ -0,0 +1,33 @@ +--- +title: v0.34.3 - 10th July 2025 +slug: changelog/v0.34.3 +--- + +* Upgrade to libvips v8.17.1 for upstream bug fixes. + +* Add "Magic Kernel Sharp" (no relation) to resizing kernels. + +* Deprecate top-level, format-specific constructor parameters, e.g. `subifd` becomes `tiff.subifd`. + +* Expose `stylesheet` and `highBitdepth` SVG input parameters. + +* Expose `keepDuplicateFrames` GIF output parameter. + +* Add support for RAW digital camera image input. Requires libvips compiled with libraw support. + +* Provide XMP metadata as a string, as well as a Buffer, where possible. + +* Add `pageHeight` option to `create` and `raw` input for animated images. + [#3236](https://github.com/lovell/sharp/issues/3236) + +* Expose JPEG 2000 `oneshot` decoder option. + [#4262](https://github.com/lovell/sharp/pull/4262) + [@mbklein](https://github.com/mbklein) + +* Support composite operation with non-sRGB pipeline colourspace. 
+ [#4412](https://github.com/lovell/sharp/pull/4412) + [@kleisauke](https://github.com/kleisauke) + +* Add `keepXmp` and `withXmp` for control over output XMP metadata. + [#4416](https://github.com/lovell/sharp/pull/4416) + [@tpatel](https://github.com/tpatel) diff --git a/docs/src/content/docs/changelog/v0.34.4.md b/docs/src/content/docs/changelog/v0.34.4.md new file mode 100644 index 000000000..83c95ce43 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.34.4.md @@ -0,0 +1,28 @@ +--- +title: v0.34.4 - 17th September 2025 +slug: changelog/v0.34.4 +--- + +* Upgrade to libvips v8.17.2 for upstream bug fixes. + +* Ensure TIFF `subifd` and OpenSlide `level` input options are respected (regression in 0.34.3). + +* Ensure `autoOrient` occurs before non-90 angle rotation. + [#4425](https://github.com/lovell/sharp/issues/4425) + +* Ensure `autoOrient` removes existing metadata after shrink-on-load. + [#4431](https://github.com/lovell/sharp/issues/4431) + +* TypeScript: Ensure `KernelEnum` includes `linear`. + [#4441](https://github.com/lovell/sharp/pull/4441) + [@BayanBennett](https://github.com/BayanBennett) + +* Ensure `unlimited` flag is passed upstream when reading TIFF images. + [#4446](https://github.com/lovell/sharp/issues/4446) + +* Support Electron memory cage when reading XMP metadata (regression in 0.34.3). + [#4451](https://github.com/lovell/sharp/issues/4451) + +* Add sharp-libvips rpath for yarn v5 support. 
+ [#4452](https://github.com/lovell/sharp/pull/4452) + [@arcanis](https://github.com/arcanis) diff --git a/docs/src/content/docs/changelog/v0.34.5.md b/docs/src/content/docs/changelog/v0.34.5.md new file mode 100644 index 000000000..d8321baa4 --- /dev/null +++ b/docs/src/content/docs/changelog/v0.34.5.md @@ -0,0 +1,21 @@ +--- +title: v0.34.5 - 6th November 2025 +slug: changelog/v0.34.5 +--- + +* Upgrade to libvips v8.17.3 for upstream bug fixes. + +* Add experimental support for prebuilt Linux RISC-V 64-bit binaries. + +* Support building from source with npm v12+, deprecate `--build-from-source` flag. + [#4458](https://github.com/lovell/sharp/issues/4458) + +* Add support for BigTIFF output. + [#4459](https://github.com/lovell/sharp/pull/4459) + [@throwbi](https://github.com/throwbi) + +* Improve error messaging when only warnings issued. + [#4465](https://github.com/lovell/sharp/issues/4465) + +* Simplify ICC processing when retaining input profiles. + [#4468](https://github.com/lovell/sharp/issues/4468) diff --git a/docs/src/content/docs/index.md b/docs/src/content/docs/index.md new file mode 100644 index 000000000..3ba704f43 --- /dev/null +++ b/docs/src/content/docs/index.md @@ -0,0 +1,98 @@ +--- +title: "High performance Node.js image processing" +--- + +sharp logo + +The typical use case for this high speed Node-API module +is to convert large images in common formats to +smaller, web-friendly JPEG, PNG, WebP, GIF and AVIF images of varying dimensions. + +It can be used with all JavaScript runtimes +that provide support for Node-API v9, including +Node.js >= 18.17.0, Deno and Bun. + +Resizing an image is typically 4x-5x faster than using the +quickest ImageMagick and GraphicsMagick settings +due to its use of [libvips](https://github.com/libvips/libvips). 
+ +Colour spaces, embedded ICC profiles and alpha transparency channels are all handled correctly. +Lanczos resampling ensures quality is not sacrificed for speed. + +As well as image resizing, operations such as +rotation, extraction, compositing and gamma correction are available. + +Most modern macOS, Windows and Linux systems +do not require any additional install or runtime dependencies. + +```sh frame="none" +npm install sharp +``` + +## Formats + +This module supports reading JPEG, PNG, WebP, GIF, AVIF, TIFF and SVG images. + +Output images can be in JPEG, PNG, WebP, GIF, AVIF and TIFF formats as well as uncompressed raw pixel data. + +Streams, Buffer objects and the filesystem can be used for input and output. + +A single input Stream can be split into multiple processing pipelines and output Streams. + +Deep Zoom image pyramids can be generated, +suitable for use with "slippy map" tile viewers like +[OpenSeadragon](https://github.com/openseadragon/openseadragon). + +## Fast + +This module is powered by the blazingly fast +[libvips](https://github.com/libvips/libvips) image processing library, +originally created in 1989 at Birkbeck College +and currently maintained by a small team led by +[John Cupitt](https://github.com/jcupitt). + +Only small regions of uncompressed image data +are held in memory and processed at a time, +taking full advantage of multiple CPU cores and L1/L2/L3 cache. + +Everything remains non-blocking thanks to _libuv_, +no child processes are spawned and Promises/async/await are supported. + +## Optimal + +The features of `mozjpeg` and `pngquant` can be used +to optimise the file size of JPEG and PNG images respectively, +without having to invoke separate `imagemin` processes. 
+ +Huffman tables are optimised when generating JPEG output images +without having to use separate command line tools like +[jpegoptim](https://github.com/tjko/jpegoptim) and +[jpegtran](http://jpegclub.org/jpegtran/). + +PNG filtering is disabled by default, +which for diagrams and line art often produces the same result +as [pngcrush](https://pmt.sourceforge.io/pngcrush/). + +The file size of animated GIF output is optimised +without having to use separate command line tools such as +[gifsicle](https://www.lcdf.org/gifsicle/). + +## Contributing + +A [guide for contributors](https://github.com/lovell/sharp/blob/main/.github/CONTRIBUTING.md) +covers reporting bugs, requesting features and submitting code changes. + +## Licensing + +Copyright 2013 Lovell Fuller and others. + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at +[https://www.apache.org/licenses/LICENSE-2.0](https://www.apache.org/licenses/LICENSE-2.0) + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. diff --git a/docs/src/content/docs/install.md b/docs/src/content/docs/install.md new file mode 100644 index 000000000..083db5ccd --- /dev/null +++ b/docs/src/content/docs/install.md @@ -0,0 +1,361 @@ +--- +title: Installation +--- + +Works with your choice of JavaScript package manager. + +:::caution +Please ensure your package manager is configured to install optional dependencies +::: + +If a package manager lockfile must support multiple platforms, +please see the [cross-platform](#cross-platform) section +to help decide which package manager is appropriate. 
+ +```sh frame="none" +npm install sharp +``` + +```sh frame="none" +pnpm add sharp +``` + +When using `pnpm`, add `sharp` to +[ignoredBuiltDependencies](https://pnpm.io/settings#ignoredbuiltdependencies) +to silence warnings. + +```sh frame="none" +yarn add sharp +``` + +```sh frame="none" +bun add sharp +``` + +```sh frame="none" +deno add --quiet npm:sharp +deno run --allow-env --allow-ffi --allow-read --allow-sys ... +``` + +## Prerequisites + +* Node-API v9 compatible runtime e.g. Node.js ^18.17.0 or >=20.3.0. + +## Prebuilt binaries + +Ready-compiled sharp and libvips binaries are provided for use on the most common platforms: + +* macOS x64 (>= 10.15) +* macOS ARM64 +* Linux ARM (glibc >= 2.31) +* Linux ARM64 (glibc >= 2.26, musl >= 1.2.2) +* Linux RISC-V 64-bit (glibc >= 2.41) +* Linux ppc64 (glibc >= 2.36) +* Linux s390x (glibc >= 2.36) +* Linux x64 (glibc >= 2.26, musl >= 1.2.2, CPU with SSE4.2) +* Windows x64 +* Windows x86 +* Windows ARM64 (experimental, CPU with ARMv8.4 required for all features) + +This provides support for the +JPEG, PNG, WebP, AVIF (limited to 8-bit depth), TIFF, GIF and SVG (input) image formats. + +## Cross-platform + +At install time, package managers will automatically select prebuilt binaries +for the current OS platform and CPU architecture, where available. + +Some package managers support multiple platforms and architectures +within the same installation tree and/or using the same lockfile. + +### npm v10+ + +:::caution +npm `package-lock.json` files shared by multiple platforms can cause installation problems due to [npm bug #4828](https://github.com/npm/cli/issues/4828) +::: + +Provides limited support via `--os`, `--cpu` and `--libc` flags. + +To support macOS with Intel x64 and ARM64 CPUs: +```sh frame="none" +npm install --cpu=x64 --os=darwin sharp +npm install --cpu=arm64 --os=darwin sharp +``` + +When the cross-target is Linux, the C standard library must be specified. + +To support glibc (e.g. 
Debian) and musl (e.g. Alpine) Linux with Intel x64 CPUs: +```sh frame="none" +npm install --cpu=x64 --os=linux --libc=glibc sharp +npm install --cpu=x64 --os=linux --libc=musl sharp +``` + +### yarn v3+ + +Use the [supportedArchitectures](https://yarnpkg.com/configuration/yarnrc#supportedArchitectures) configuration. + +### pnpm v8+ + +Use the [supportedArchitectures](https://pnpm.io/settings#supportedarchitectures) configuration. + +## Custom libvips + +To use a custom, globally-installed version of libvips instead of the provided binaries, +make sure it is at least the version listed under `config.libvips` in the `package.json` file +and that it can be located using `pkg-config --modversion vips-cpp`. + +For help compiling libvips and its dependencies, please see +[building libvips from source](https://www.libvips.org/install.html#building-libvips-from-source). + +The use of a globally-installed libvips is unsupported on Windows +and on macOS when running Node.js under Rosetta. + +## Building from source + +This module will be compiled from source when: + +* a globally-installed libvips is detected, or +* using `npm explore sharp -- npm run build`, or +* using the deprecated `npm run --build-from-source` at `npm install` time. + +The logic to detect a globally-installed libvips can be skipped by setting the +`SHARP_IGNORE_GLOBAL_LIBVIPS` (never try to use it) or +`SHARP_FORCE_GLOBAL_LIBVIPS` (always try to use it, even when missing or outdated) +environment variables. + +Building from source requires: + +* C++17 compiler +* [node-addon-api](https://www.npmjs.com/package/node-addon-api) version 7+ +* [node-gyp](https://github.com/nodejs/node-gyp#installation) version 9+ and its dependencies + +There is an install-time check for these dependencies. 
+If `node-addon-api` or `node-gyp` cannot be found, try adding them via: + +```sh frame="none" +npm install --save node-addon-api node-gyp +``` + +When using `pnpm`, you may need to add `sharp` to +[onlyBuiltDependencies](https://pnpm.io/settings#onlybuiltdependencies) +to ensure the installation script can be run. + +For cross-compiling, the `--platform`, `--arch` and `--libc` npm flags +(or the `npm_config_platform`, `npm_config_arch` and `npm_config_libc` environment variables) +can be used to configure the target environment. + +## WebAssembly + +Experimental support is provided for runtime environments that provide +multi-threaded Wasm via Workers. + +Use in web browsers is unsupported. + +Native text rendering is unsupported. + +[Tile-based output](/api-output#tile) is unsupported. + +```sh frame="none" +npm install --cpu=wasm32 sharp +``` + +## FreeBSD + +The `vips` package must be installed before `npm install` is run, +as well as the additional [building from source](#building-from-source) dependencies. + +```sh frame="none" +pkg install -y pkgconf vips +``` + +```sh frame="none" +cd /usr/ports/graphics/vips/ && make install clean +``` + +## Linux memory allocator + +The default memory allocator on most glibc-based Linux systems +(e.g. Debian, Red Hat) is unsuitable for long-running, multi-threaded +processes that involve lots of small memory allocations. + +For this reason, by default, sharp will limit the use of thread-based +[concurrency](/api-utility#concurrency) when the glibc allocator is +detected at runtime. + +To help avoid fragmentation and improve performance on these systems, +the use of an alternative memory allocator such as +[jemalloc](https://github.com/jemalloc/jemalloc) is recommended. + +Those using musl-based Linux (e.g. Alpine) and non-Linux systems are +unaffected. 
+ +## AWS Lambda + +The `node_modules` directory of the +[deployment package](https://docs.aws.amazon.com/lambda/latest/dg/nodejs-package.html) +must include binaries for either the linux-x64 or linux-arm64 platforms +depending on the chosen architecture. + +When building your deployment package on a machine that differs from the target architecture, +see the [cross-platform](#cross-platform) section to help decide which package manager is appropriate +and how to configure it. + +Some package managers use symbolic links +but AWS Lambda does not support these within deployment packages. + +To get the best performance select the largest memory available. +A 1536 MB function provides ~12x more CPU time than a 128 MB function. + +When integrating with AWS API Gateway, ensure it is configured with the relevant +[binary media types](https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings.html). + +## Bundlers + +### webpack + +Ensure sharp is excluded from bundling via the +[externals](https://webpack.js.org/configuration/externals/) +configuration. + +```js frame="none" +externals: { + 'sharp': 'commonjs sharp' +} +``` + +### esbuild + +Ensure sharp is excluded from bundling via the +[external](https://esbuild.github.io/api/#external) +configuration. + +```js frame="none" +buildSync({ + entryPoints: ['app.js'], + bundle: true, + platform: 'node', + external: ['sharp'], +}) +``` + +```sh frame="none" +esbuild app.js --bundle --platform=node --external:sharp +``` + +For `serverless-esbuild`, ensure platform-specific binaries are installed +via the `serverless.yml` configuration. 
+
+```yaml frame="none"
+custom:
+  esbuild:
+    external:
+      - sharp
+    packagerOptions:
+      scripts:
+        - npm install --os=linux --cpu=x64 sharp
+```
+
+### electron
+
+#### electron-builder
+
+Ensure `sharp` is unpacked from the ASAR archive file using the
+[asarUnpack](https://www.electron.build/app-builder-lib.interface.platformspecificbuildoptions#asarunpack)
+option.
+
+```json frame="none"
+{
+  "build": {
+    "asar": true,
+    "asarUnpack": [
+      "**/node_modules/sharp/**/*",
+      "**/node_modules/@img/**/*"
+    ]
+  }
+}
+```
+
+#### electron-forge
+
+Ensure `sharp` is unpacked from the ASAR archive file using the
+[unpack](https://js.electronforge.io/interfaces/_electron_forge_maker_squirrel.InternalOptions.Options.html#asar)
+option.
+
+```json frame="none"
+{
+  "packagerConfig": {
+    "asar": {
+      "unpack": "**/node_modules/{sharp,@img}/**/*"
+    }
+  }
+}
+```
+
+When using `electron-forge` with [Webpack](#webpack),
+you may also need to add
+[forge-externals-plugin](https://www.npmjs.com/package/@timfish/forge-externals-plugin).
+
+### vite
+
+Ensure `sharp` is excluded from bundling via the
+[build.rollupOptions](https://vitejs.dev/config/build-options.html)
+configuration.
+
+```js frame="none"
+import { defineConfig } from 'vite';
+
+export default defineConfig({
+  build: {
+    rollupOptions: {
+      external: [
+        "sharp"
+      ]
+    }
+  }
+});
+```
+
+## TypeScript
+
+TypeScript definitions are published as part of
+the `sharp` package from v0.32.0.
+
+Previously these were available via the `@types/sharp` package,
+which is now deprecated.
+
+When using TypeScript, please ensure `devDependencies` includes
+the `@types/node` package.
+
+## Fonts
+
+When creating text images or rendering SVG images that contain text elements,
+`fontconfig` is used to find the relevant fonts.
+
+On Windows and macOS systems, all system fonts are available for use.
+ +On macOS systems using Homebrew, you may need to set the +`PANGOCAIRO_BACKEND` environment variable to a value of `fontconfig` +to ensure it is used for font discovery instead of Core Text. + +On Linux systems, fonts that include the relevant +[`fontconfig` configuration](https://www.freedesktop.org/software/fontconfig/fontconfig-user.html) +when installed via package manager are available for use. + +If `fontconfig` configuration is not found, the following error will occur: +``` +Fontconfig error: Cannot load default config file +``` + +In serverless environments where there is no control over font packages, +use the `FONTCONFIG_PATH` environment variable to point to a custom location. + +Embedded SVG fonts are unsupported. + +## Known conflicts + +### Canvas and Windows + +If both `canvas` and `sharp` modules are used in the same Windows process, the following error may occur: +``` +The specified procedure could not be found. +``` diff --git a/docs/src/content/docs/performance.md b/docs/src/content/docs/performance.md new file mode 100644 index 000000000..19aead63d --- /dev/null +++ b/docs/src/content/docs/performance.md @@ -0,0 +1,138 @@ +--- +title: Performance +--- + +## Parallelism and concurrency + +Node.js uses a libuv-managed thread pool when processing asynchronous calls to native modules such as sharp. + +The maximum number of images that sharp can process in parallel is controlled by libuv's +[`UV_THREADPOOL_SIZE`](https://nodejs.org/api/cli.html#uv_threadpool_sizesize) +environment variable, which defaults to 4. + +When using more than 4 physical CPU cores, set this environment variable +before the Node.js process starts to increase the thread pool size. + +```sh frame="none" +export UV_THREADPOOL_SIZE="$(lscpu -p | egrep -v "^#" | sort -u -t, -k 2,4 | wc -l)" +``` + +libvips uses a shared thread pool to avoid the overhead of spawning new threads. +The size of this thread pool will grow on demand and shrink when idle. 
+ +The default number of threads used to concurrently process each image is the same as the number of CPU cores, +except when using glibc-based Linux without jemalloc, where the default is `1` to help reduce memory fragmentation. + +Use [`sharp.concurrency()`](/api-utility/#concurrency) to manage the number of threads per image. + +To reduce memory fragmentation when using the default Linux glibc memory allocator, set the +[`MALLOC_ARENA_MAX`](https://sourceware.org/glibc/manual/latest/html_node/Memory-Allocation-Tunables.html) +environment variable before the Node.js process starts to reduce the number of memory pools. + +```sh frame="none" +export MALLOC_ARENA_MAX="2" +``` + +## Benchmark + +A test to benchmark the performance of this module relative to alternatives. + +Greater libvips performance can be expected with caching enabled (default) +and using 8+ core machines, especially those with larger L1/L2 CPU caches. + +The I/O limits of the relevant (de)compression library will generally determine maximum throughput. + +### Contenders + +- [jimp](https://www.npmjs.com/package/jimp) v1.6.0 - Image processing in pure JavaScript. +- [imagemagick](https://www.npmjs.com/package/imagemagick) v0.1.3 - Supports filesystem only and "_has been unmaintained for a long time_". +- [gm](https://www.npmjs.com/package/gm) v1.25.1 - Fully featured wrapper around GraphicsMagick's `gm` command line utility, but "_has been sunset_". +- sharp v0.34.3 / libvips v8.17.0 - Caching within libvips disabled to ensure a fair comparison. 
+ +### Environment + +#### AMD64 + +- AWS EC2 us-west-2 [c7a.xlarge](https://aws.amazon.com/ec2/instance-types/c7a/) (4x AMD EPYC 9R14) +- Ubuntu 25.04 +- Node.js 24.3.0 + +#### ARM64 + +- AWS EC2 us-west-2 [c8g.xlarge](https://aws.amazon.com/ec2/instance-types/c8g/) (4x ARM Graviton4) +- Ubuntu 25.04 +- Node.js 24.3.0 + +### Task: JPEG + +Decompress a 2725x2225 JPEG image, +resize to 720x588 using Lanczos 3 resampling (where available), +then compress to JPEG at a "quality" setting of 80. + +Note: jimp does not support Lanczos 3, bicubic resampling used instead. + +#### Results: JPEG (AMD64) + +| Package | I/O | Ops/sec | Speed-up | +| :---------- | :----- | ------: | -------: | +| jimp | buffer | 2.40 | 1.0 | +| jimp | file | 2.60 | 1.1 | +| imagemagick | file | 9.70 | 4.0 | +| gm | buffer | 11.60 | 4.8 | +| gm | file | 11.72 | 4.9 | +| sharp | stream | 59.40 | 24.8 | +| sharp | file | 62.67 | 26.1 | +| sharp | buffer | 64.42 | 26.8 | + +#### Results: JPEG (ARM64) + +| Package | I/O | Ops/sec | Speed-up | +| :---------- | :----- | ------: | -------: | +| jimp | buffer | 2.24 | 1.0 | +| jimp | file | 2.47 | 1.1 | +| imagemagick | file | 10.42 | 4.7 | +| gm | buffer | 12.80 | 5.7 | +| gm | file | 12.88 | 5.7 | +| sharp | stream | 45.58 | 20.3 | +| sharp | file | 47.99 | 21.4 | +| sharp | buffer | 49.20 | 22.0 | + +### Task: PNG + +Decompress a 2048x1536 RGBA PNG image, +premultiply the alpha channel, +resize to 720x540 using Lanczos 3 resampling (where available), +unpremultiply then compress as PNG with a "default" zlib compression level of 6 +and without adaptive filtering. + +Note: jimp does not support premultiply/unpremultiply. 
+ +#### Results: PNG (AMD64) + +| Package | I/O | Ops/sec | Speed-up | +| :---------- | :----- | ------: | -------: | +| imagemagick | file | 6.06 | 1.0 | +| gm | file | 8.44 | 1.4 | +| jimp | buffer | 10.98 | 1.8 | +| sharp | file | 28.26 | 4.7 | +| sharp | buffer | 28.70 | 4.7 | + +#### Results: PNG (ARM64) + +| Package | I/O | Ops/sec | Speed-up | +| :---------- | :----- | ------: | -------: | +| imagemagick | file | 7.09 | 1.0 | +| gm | file | 8.93 | 1.3 | +| jimp | buffer | 10.28 | 1.5 | +| sharp | file | 23.81 | 3.4 | +| sharp | buffer | 24.19 | 3.4 | + +## Running the benchmark test + +Requires Docker. + +```sh frame="none" +git clone https://github.com/lovell/sharp.git +cd sharp/test/bench +./run-with-docker.sh +``` diff --git a/docs/src/styles/custom.css b/docs/src/styles/custom.css new file mode 100644 index 000000000..197ccf567 --- /dev/null +++ b/docs/src/styles/custom.css @@ -0,0 +1,45 @@ +@view-transition { + navigation: auto; +} + +:root { + --sl-content-width: 60rem; + --sl-color-accent-low: #072d00; + --sl-color-accent: #247f00; + --sl-color-accent-high: #aad7a0; + --sl-color-white: #ffffff; + --sl-color-gray-1: #eaf0e8; + --sl-color-gray-2: #c5cdc3; + --sl-color-gray-3: #99a796; + --sl-color-gray-4: #4f5c4d; + --sl-color-gray-5: #303c2d; + --sl-color-gray-6: #1f2a1c; + --sl-color-black: #151a13; +} + +:root[data-theme="light"] { + --sl-color-accent-low: #c0e2b8; + --sl-color-accent: #165800; + --sl-color-accent-high: #0d3e00; + --sl-color-white: #151a13; + --sl-color-gray-1: #1f2a1c; + --sl-color-gray-2: #303c2d; + --sl-color-gray-3: #4f5c4d; + --sl-color-gray-4: #82907f; + --sl-color-gray-5: #bdc4bb; + --sl-color-gray-6: #eaf0e8; + --sl-color-gray-7: #f4f7f3; + --sl-color-black: #ffffff; +} + +blockquote { + background-color: var(--sl-color-gray-6); + padding: 1rem; +} + +.site-title::after { + content: "High performance Node.js image processing"; + color: var(--sl-color-text); + font-size: var(--sl-text-sm); + padding-top: 0.3rem; 
+} diff --git a/docs/tsconfig.json b/docs/tsconfig.json new file mode 100644 index 000000000..8bf91d3bb --- /dev/null +++ b/docs/tsconfig.json @@ -0,0 +1,5 @@ +{ + "extends": "astro/tsconfigs/strict", + "include": [".astro/types.d.ts", "**/*"], + "exclude": ["dist"] +} diff --git a/install/build.js b/install/build.js new file mode 100644 index 000000000..2ca224586 --- /dev/null +++ b/install/build.js @@ -0,0 +1,38 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { + useGlobalLibvips, + globalLibvipsVersion, + log, + spawnRebuild, +} = require('../lib/libvips'); + +log('Attempting to build from source via node-gyp'); +log('See https://sharp.pixelplumbing.com/install#building-from-source'); + +try { + const addonApi = require('node-addon-api'); + log(`Found node-addon-api ${addonApi.version || ''}`); +} catch (_err) { + log('Please add node-addon-api to your dependencies'); + process.exit(1); +} +try { + const gyp = require('node-gyp'); + log(`Found node-gyp ${gyp().version}`); +} catch (_err) { + log('Please add node-gyp to your dependencies'); + process.exit(1); +} + +if (useGlobalLibvips(log)) { + log(`Detected globally-installed libvips v${globalLibvipsVersion()}`); +} + +const status = spawnRebuild(); +if (status !== 0) { + process.exit(status); +} diff --git a/install/check.js b/install/check.js new file mode 100644 index 000000000..1cfb7d32e --- /dev/null +++ b/install/check.js @@ -0,0 +1,14 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +try { + const { useGlobalLibvips } = require('../lib/libvips'); + if (useGlobalLibvips() || process.env.npm_config_build_from_source) { + process.exit(1); + } +} catch (err) { + const summary = err.message.split(/\n/).slice(0, 1); + console.log(`sharp: skipping install check: ${summary}`); +} diff --git a/lib/channel.js b/lib/channel.js index 8aee9b1aa..3c6c0b439 100644 --- a/lib/channel.js +++ b/lib/channel.js @@ -1,4 +1,7 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ const is = require('./is'); @@ -12,33 +15,93 @@ const bool = { eor: 'eor' }; +/** + * Remove alpha channels, if any. This is a no-op if the image does not have an alpha channel. + * + * See also {@link /api-operation/#flatten flatten}. + * + * @example + * sharp('rgba.png') + * .removeAlpha() + * .toFile('rgb.png', function(err, info) { + * // rgb.png is a 3 channel image without an alpha channel + * }); + * + * @returns {Sharp} + */ +function removeAlpha () { + this.options.removeAlpha = true; + return this; +} + +/** + * Ensure the output image has an alpha transparency channel. + * If missing, the added alpha channel will have the specified + * transparency level, defaulting to fully-opaque (1). + * This is a no-op if the image already has an alpha channel. 
+ * + * @since 0.21.2 + * + * @example + * // rgba.png will be a 4 channel image with a fully-opaque alpha channel + * await sharp('rgb.jpg') + * .ensureAlpha() + * .toFile('rgba.png') + * + * @example + * // rgba is a 4 channel image with a fully-transparent alpha channel + * const rgba = await sharp(rgb) + * .ensureAlpha(0) + * .toBuffer(); + * + * @param {number} [alpha=1] - alpha transparency level (0=fully-transparent, 1=fully-opaque) + * @returns {Sharp} + * @throws {Error} Invalid alpha transparency level + */ +function ensureAlpha (alpha) { + if (is.defined(alpha)) { + if (is.number(alpha) && is.inRange(alpha, 0, 1)) { + this.options.ensureAlpha = alpha; + } else { + throw is.invalidParameterError('alpha', 'number between 0 and 1', alpha); + } + } else { + this.options.ensureAlpha = 1; + } + return this; +} + /** * Extract a single channel from a multi-channel image. * + * The output colourspace will be either `b-w` (8-bit) or `grey16` (16-bit). + * * @example - * sharp(input) + * // green.jpg is a greyscale image containing the green channel of the input + * await sharp(input) * .extractChannel('green') - * .toFile('input_green.jpg', function(err, info) { - * // info.channels === 1 - * // input_green.jpg contains the green channel of the input image - * }); + * .toFile('green.jpg'); + * + * @example + * // red1 is the red value of the first pixel, red2 the second pixel etc. + * const [red1, red2, ...] = await sharp(input) + * .extractChannel(0) + * .raw() + * .toBuffer(); * - * @param {Number|String} channel - zero-indexed band number to extract, or `red`, `green` or `blue` as alternative to `0`, `1` or `2` respectively. + * @param {number|string} channel - zero-indexed channel/band number to extract, or `red`, `green`, `blue` or `alpha`. 
* @returns {Sharp} * @throws {Error} Invalid channel */ function extractChannel (channel) { - if (channel === 'red') { - channel = 0; - } else if (channel === 'green') { - channel = 1; - } else if (channel === 'blue') { - channel = 2; + const channelMap = { red: 0, green: 1, blue: 2, alpha: 3 }; + if (Object.keys(channelMap).includes(channel)) { + channel = channelMap[channel]; } if (is.integer(channel) && is.inRange(channel, 0, 4)) { this.options.extractChannel = channel; } else { - throw new Error('Cannot extract invalid channel ' + channel); + throw is.invalidParameterError('channel', 'integer or one of: red, green, blue, alpha', channel); } return this; } @@ -51,10 +114,10 @@ function extractChannel (channel) { * - sRGB: 0: Red, 1: Green, 2: Blue, 3: Alpha. * - CMYK: 0: Magenta, 1: Cyan, 2: Yellow, 3: Black, 4: Alpha. * - * Buffers may be any of the image formats supported by sharp: JPEG, PNG, WebP, GIF, SVG, TIFF or raw pixel image data. + * Buffers may be any of the image formats supported by sharp. * For raw pixel input, the `options` object should contain a `raw` attribute, which follows the format of the attribute of the same name in the `sharp()` constructor. * - * @param {Array|String|Buffer} images - one or more images (file paths, Buffers). + * @param {Array|string|Buffer} images - one or more images (file paths, Buffers). * @param {Object} options - image options, see `sharp()` constructor. * @returns {Sharp} * @throws {Error} Invalid parameters @@ -82,7 +145,7 @@ function joinChannel (images, options) { * // then `O(1,1) = 0b11110111 & 0b10101010 & 0b00001111 = 0b00000010 = 2`. * }); * - * @param {String} boolOp - one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. + * @param {string} boolOp - one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. 
* @returns {Sharp} * @throws {Error} Invalid parameters */ @@ -90,23 +153,24 @@ function bandbool (boolOp) { if (is.string(boolOp) && is.inArray(boolOp, ['and', 'or', 'eor'])) { this.options.bandBoolOp = boolOp; } else { - throw new Error('Invalid bandbool operation ' + boolOp); + throw is.invalidParameterError('boolOp', 'one of: and, or, eor', boolOp); } return this; } /** * Decorate the Sharp prototype with channel-related functions. + * @module Sharp * @private */ -module.exports = function (Sharp) { - // Public instance functions - [ +module.exports = (Sharp) => { + Object.assign(Sharp.prototype, { + // Public instance functions + removeAlpha, + ensureAlpha, extractChannel, joinChannel, bandbool - ].forEach(function (f) { - Sharp.prototype[f.name] = f; }); // Class attributes Sharp.bool = bool; diff --git a/lib/colour.js b/lib/colour.js index e115946c8..e61c248a8 100644 --- a/lib/colour.js +++ b/lib/colour.js @@ -1,6 +1,9 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const color = require('color'); +const color = require('@img/colour'); const is = require('./is'); /** @@ -16,35 +19,34 @@ const colourspace = { }; /** - * Set the background for the `embed`, `flatten` and `extend` operations. - * The default background is `{r: 0, g: 0, b: 0, alpha: 1}`, black without transparency. + * Tint the image using the provided colour. + * An alpha channel may be present and will be unchanged by the operation. * - * Delegates to the _color_ module, which can throw an Error - * but is liberal in what it accepts, clipping values to sensible min/max. - * The alpha value is a float between `0` (transparent) and `1` (opaque). + * @example + * const output = await sharp(input) + * .tint({ r: 255, g: 240, b: 16 }) + * .toBuffer(); * - * @param {String|Object} rgba - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. 
+ * @param {string|Object} tint - Parsed by the [color](https://www.npmjs.org/package/color) module. * @returns {Sharp} * @throws {Error} Invalid parameter */ -function background (rgba) { - const colour = color(rgba); - this.options.background = [ - colour.red(), - colour.green(), - colour.blue(), - Math.round(colour.alpha() * 255) - ]; +function tint (tint) { + this._setBackgroundColourOption('tint', tint); return this; } /** * Convert to 8-bit greyscale; 256 shades of grey. * This is a linear operation. If the input image is in a non-linear colour space such as sRGB, use `gamma()` with `greyscale()` for the best results. - * By default the output image will be web-friendly sRGB and contain three (identical) color channels. + * By default the output image will be web-friendly sRGB and contain three (identical) colour channels. * This may be overridden by other sharp operations such as `toColourspace('b-w')`, - * which will produce an output image containing one color channel. + * which will produce an output image containing one colour channel. * An alpha channel may be present, and will be unchanged by the operation. + * + * @example + * const output = await sharp(input).greyscale().toBuffer(); + * * @param {Boolean} [greyscale=true] * @returns {Sharp} */ @@ -62,16 +64,61 @@ function grayscale (grayscale) { return this.greyscale(grayscale); } +/** + * Set the pipeline colourspace. + * + * The input image will be converted to the provided colourspace at the start of the pipeline. + * All operations will use this colourspace before converting to the output colourspace, + * as defined by {@link #tocolourspace toColourspace}. + * + * @since 0.29.0 + * + * @example + * // Run pipeline in 16 bits per channel RGB while converting final result to 8 bits per channel sRGB. + * await sharp(input) + * .pipelineColourspace('rgb16') + * .toColourspace('srgb') + * .toFile('16bpc-pipeline-to-8bpc-output.png') + * + * @param {string} [colourspace] - pipeline colourspace e.g. 
`rgb16`, `scrgb`, `lab`, `grey16` [...](https://www.libvips.org/API/current/enum.Interpretation.html) + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function pipelineColourspace (colourspace) { + if (!is.string(colourspace)) { + throw is.invalidParameterError('colourspace', 'string', colourspace); + } + this.options.colourspacePipeline = colourspace; + return this; +} + +/** + * Alternative spelling of `pipelineColourspace`. + * @param {string} [colorspace] - pipeline colorspace. + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function pipelineColorspace (colorspace) { + return this.pipelineColourspace(colorspace); +} + /** * Set the output colourspace. * By default output image will be web-friendly sRGB, with additional channels interpreted as alpha channels. - * @param {String} [colourspace] - output colourspace e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://github.com/jcupitt/libvips/blob/master/libvips/iofuncs/enumtypes.c#L568) + * + * @example + * // Output 16 bits per pixel RGB + * await sharp(input) + * .toColourspace('rgb16') + * .toFile('16-bpp.png') + * + * @param {string} [colourspace] - output colourspace e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://www.libvips.org/API/current/enum.Interpretation.html) * @returns {Sharp} * @throws {Error} Invalid parameters */ function toColourspace (colourspace) { if (!is.string(colourspace)) { - throw new Error('Invalid output colourspace ' + colourspace); + throw is.invalidParameterError('colourspace', 'string', colourspace); } this.options.colourspace = colourspace; return this; @@ -79,7 +126,7 @@ function toColourspace (colourspace) { /** * Alternative spelling of `toColourspace`. - * @param {String} [colorspace] - output colorspace. + * @param {string} [colorspace] - output colorspace. 
* @returns {Sharp} * @throws {Error} Invalid parameters */ @@ -87,20 +134,60 @@ function toColorspace (colorspace) { return this.toColourspace(colorspace); } +/** + * Create a RGBA colour array from a given value. + * @private + * @param {string|Object} value + * @throws {Error} Invalid value + */ +function _getBackgroundColourOption (value) { + if ( + is.object(value) || + (is.string(value) && value.length >= 3 && value.length <= 200) + ) { + const colour = color(value); + return [ + colour.red(), + colour.green(), + colour.blue(), + Math.round(colour.alpha() * 255) + ]; + } else { + throw is.invalidParameterError('background', 'object or string', value); + } +} + +/** + * Update a colour attribute of the this.options Object. + * @private + * @param {string} key + * @param {string|Object} value + * @throws {Error} Invalid value + */ +function _setBackgroundColourOption (key, value) { + if (is.defined(value)) { + this.options[key] = _getBackgroundColourOption(value); + } +} + /** * Decorate the Sharp prototype with colour-related functions. + * @module Sharp * @private */ -module.exports = function (Sharp) { - // Public instance functions - [ - background, +module.exports = (Sharp) => { + Object.assign(Sharp.prototype, { + // Public + tint, greyscale, grayscale, + pipelineColourspace, + pipelineColorspace, toColourspace, - toColorspace - ].forEach(function (f) { - Sharp.prototype[f.name] = f; + toColorspace, + // Private + _getBackgroundColourOption, + _setBackgroundColourOption }); // Class attributes Sharp.colourspace = colourspace; diff --git a/lib/composite.js b/lib/composite.js index f29df7f0f..1c3e5e629 100644 --- a/lib/composite.js +++ b/lib/composite.js @@ -1,26 +1,90 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ const is = require('./is'); /** - * Overlay (composite) an image over the processed (resized, extracted etc.) image. + * Blend modes. 
+ * @member + * @private + */ +const blend = { + clear: 'clear', + source: 'source', + over: 'over', + in: 'in', + out: 'out', + atop: 'atop', + dest: 'dest', + 'dest-over': 'dest-over', + 'dest-in': 'dest-in', + 'dest-out': 'dest-out', + 'dest-atop': 'dest-atop', + xor: 'xor', + add: 'add', + saturate: 'saturate', + multiply: 'multiply', + screen: 'screen', + overlay: 'overlay', + darken: 'darken', + lighten: 'lighten', + 'colour-dodge': 'colour-dodge', + 'color-dodge': 'colour-dodge', + 'colour-burn': 'colour-burn', + 'color-burn': 'colour-burn', + 'hard-light': 'hard-light', + 'soft-light': 'soft-light', + difference: 'difference', + exclusion: 'exclusion' +}; + +/** + * Composite image(s) over the processed (resized, extracted etc.) image. * - * The overlay image must be the same size or smaller than the processed image. + * The images to composite must be the same size or smaller than the processed image. * If both `top` and `left` options are provided, they take precedence over `gravity`. * - * If the overlay image contains an alpha channel then composition with premultiplication will occur. + * Other operations in the same processing pipeline (e.g. resize, rotate, flip, + * flop, extract) will always be applied to the input image before composition. + * + * The `blend` option can be one of `clear`, `source`, `over`, `in`, `out`, `atop`, + * `dest`, `dest-over`, `dest-in`, `dest-out`, `dest-atop`, + * `xor`, `add`, `saturate`, `multiply`, `screen`, `overlay`, `darken`, `lighten`, + * `colour-dodge`, `color-dodge`, `colour-burn`, `color-burn`, + * `hard-light`, `soft-light`, `difference`, `exclusion`.
+ * + * More information about blend modes can be found at + * https://www.libvips.org/API/current/enum.BlendMode.html + * and https://www.cairographics.org/operators/ + * + * @since 0.22.0 + * + * @example + * await sharp(background) + * .composite([ + * { input: layer1, gravity: 'northwest' }, + * { input: layer2, gravity: 'southeast' }, + * ]) + * .toFile('combined.png'); + * + * @example + * const output = await sharp('input.gif', { animated: true }) + * .composite([ + * { input: 'overlay.png', tile: true, blend: 'saturate' } + * ]) + * .toBuffer(); * * @example * sharp('input.png') * .rotate(180) * .resize(300) - * .flatten() - * .background('#ff6600') - * .overlayWith('overlay.png', { gravity: sharp.gravity.southeast } ) + * .flatten( { background: '#ff6600' } ) + * .composite([{ input: 'overlay.png', gravity: 'southeast' }]) * .sharpen() * .withMetadata() - * .quality(90) - * .webp() + * .webp( { quality: 90 } ) * .toBuffer() * .then(function(outputBuffer) { * // outputBuffer contains upside down, 300px wide, alpha channel flattened @@ -28,70 +92,121 @@ const is = require('./is'); * // sharpened, with metadata, 90% quality WebP image data. Phew! * }); * - * @param {(Buffer|String)} overlay - Buffer containing image data or String containing the path to an image file. - * @param {Object} [options] - * @param {String} [options.gravity='centre'] - gravity at which to place the overlay. - * @param {Number} [options.top] - the pixel offset from the top edge. - * @param {Number} [options.left] - the pixel offset from the left edge. - * @param {Boolean} [options.tile=false] - set to true to repeat the overlay image across the entire image with the given `gravity`. - * @param {Boolean} [options.cutout=false] - set to true to apply only the alpha channel of the overlay image to the input image, giving the appearance of one image being cut out of another. - * @param {Number} [options.density=72] - integral number representing the DPI for vector overlay image. 
- * @param {Object} [options.raw] - describes overlay when using raw pixel data. - * @param {Number} [options.raw.width] - * @param {Number} [options.raw.height] - * @param {Number} [options.raw.channels] - * @param {Object} [options.create] - describes a blank overlay to be created. - * @param {Number} [options.create.width] - * @param {Number} [options.create.height] - * @param {Number} [options.create.channels] - 3-4 - * @param {String|Object} [options.create.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. + * @param {Object[]} images - Ordered list of images to composite + * @param {Buffer|String} [images[].input] - Buffer containing image data, String containing the path to an image file, or Create object (see below) + * @param {Object} [images[].input.create] - describes a blank overlay to be created. + * @param {Number} [images[].input.create.width] + * @param {Number} [images[].input.create.height] + * @param {Number} [images[].input.create.channels] - 3-4 + * @param {String|Object} [images[].input.create.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. + * @param {Object} [images[].input.text] - describes a new text image to be created. + * @param {string} [images[].input.text.text] - text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`. + * @param {string} [images[].input.text.font] - font name to render with. + * @param {string} [images[].input.text.fontfile] - absolute filesystem path to a font file that can be used by `font`. + * @param {number} [images[].input.text.width=0] - integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. + * @param {number} [images[].input.text.height=0] - integral number of pixels high.
When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. + * @param {string} [images[].input.text.align='left'] - text alignment (`'left'`, `'centre'`, `'center'`, `'right'`). + * @param {boolean} [images[].input.text.justify=false] - set this to true to apply justification to the text. + * @param {number} [images[].input.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified. + * @param {boolean} [images[].input.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`. + * @param {number} [images[].input.text.spacing=0] - text line height in points. Will use the font line height if none is specified. + * @param {Boolean} [images[].autoOrient=false] - set to true to use EXIF orientation data, if present, to orient the image. + * @param {String} [images[].blend='over'] - how to blend this image with the image below. + * @param {String} [images[].gravity='centre'] - gravity at which to place the overlay. + * @param {Number} [images[].top] - the pixel offset from the top edge. + * @param {Number} [images[].left] - the pixel offset from the left edge. + * @param {Boolean} [images[].tile=false] - set to true to repeat the overlay image across the entire image with the given `gravity`. + * @param {Boolean} [images[].premultiplied=false] - set to true to avoid premultiplying the image below. Equivalent to the `--premultiplied` vips option. + * @param {Number} [images[].density=72] - number representing the DPI for vector overlay image. + * @param {Object} [images[].raw] - describes overlay when using raw pixel data.
+ * @param {Number} [images[].raw.width] + * @param {Number} [images[].raw.height] + * @param {Number} [images[].raw.channels] + * @param {boolean} [images[].animated=false] - Set to `true` to read all frames/pages of an animated image. + * @param {string} [images[].failOn='warning'] - @see {@link /api-constructor/ constructor parameters} + * @param {number|boolean} [images[].limitInputPixels=268402689] - @see {@link /api-constructor/ constructor parameters} * @returns {Sharp} * @throws {Error} Invalid parameters */ -function overlayWith (overlay, options) { - this.options.overlay = this._createInputDescriptor(overlay, options, { - allowStream: false - }); - if (is.object(options)) { - if (is.defined(options.tile)) { - if (is.bool(options.tile)) { - this.options.overlayTile = options.tile; +function composite (images) { + if (!Array.isArray(images)) { + throw is.invalidParameterError('images to composite', 'array', images); + } + this.options.composite = images.map(image => { + if (!is.object(image)) { + throw is.invalidParameterError('image to composite', 'object', image); + } + const inputOptions = this._inputOptionsFromObject(image); + const composite = { + input: this._createInputDescriptor(image.input, inputOptions, { allowStream: false }), + blend: 'over', + tile: false, + left: 0, + top: 0, + hasOffset: false, + gravity: 0, + premultiplied: false + }; + if (is.defined(image.blend)) { + if (is.string(blend[image.blend])) { + composite.blend = blend[image.blend]; } else { - throw new Error('Invalid overlay tile ' + options.tile); + throw is.invalidParameterError('blend', 'valid blend name', image.blend); } } - if (is.defined(options.cutout)) { - if (is.bool(options.cutout)) { - this.options.overlayCutout = options.cutout; + if (is.defined(image.tile)) { + if (is.bool(image.tile)) { + composite.tile = image.tile; } else { - throw new Error('Invalid overlay cutout ' + options.cutout); + throw is.invalidParameterError('tile', 'boolean', image.tile); } } - if 
(is.defined(options.left) || is.defined(options.top)) { - if (is.integer(options.left) && options.left >= 0 && is.integer(options.top) && options.top >= 0) { - this.options.overlayXOffset = options.left; - this.options.overlayYOffset = options.top; + if (is.defined(image.left)) { + if (is.integer(image.left)) { + composite.left = image.left; } else { - throw new Error('Invalid overlay left ' + options.left + ' and/or top ' + options.top); + throw is.invalidParameterError('left', 'integer', image.left); } } - if (is.defined(options.gravity)) { - if (is.integer(options.gravity) && is.inRange(options.gravity, 0, 8)) { - this.options.overlayGravity = options.gravity; - } else if (is.string(options.gravity) && is.integer(this.constructor.gravity[options.gravity])) { - this.options.overlayGravity = this.constructor.gravity[options.gravity]; + if (is.defined(image.top)) { + if (is.integer(image.top)) { + composite.top = image.top; } else { - throw new Error('Unsupported overlay gravity ' + options.gravity); + throw is.invalidParameterError('top', 'integer', image.top); } } - } + if (is.defined(image.top) !== is.defined(image.left)) { + throw new Error('Expected both left and top to be set'); + } else { + composite.hasOffset = is.integer(image.top) && is.integer(image.left); + } + if (is.defined(image.gravity)) { + if (is.integer(image.gravity) && is.inRange(image.gravity, 0, 8)) { + composite.gravity = image.gravity; + } else if (is.string(image.gravity) && is.integer(this.constructor.gravity[image.gravity])) { + composite.gravity = this.constructor.gravity[image.gravity]; + } else { + throw is.invalidParameterError('gravity', 'valid gravity', image.gravity); + } + } + if (is.defined(image.premultiplied)) { + if (is.bool(image.premultiplied)) { + composite.premultiplied = image.premultiplied; + } else { + throw is.invalidParameterError('premultiplied', 'boolean', image.premultiplied); + } + } + return composite; + }); return this; } /** * Decorate the Sharp prototype with 
composite-related functions. + * @module Sharp * @private */ -module.exports = function (Sharp) { - Sharp.prototype.overlayWith = overlayWith; +module.exports = (Sharp) => { + Sharp.prototype.composite = composite; + Sharp.blend = blend; }; diff --git a/lib/constructor.js b/lib/constructor.js index b4364c6ad..9aac8105c 100644 --- a/lib/constructor.js +++ b/lib/constructor.js @@ -1,43 +1,40 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const path = require('path'); -const util = require('util'); -const stream = require('stream'); -const events = require('events'); -const semver = require('semver'); +const util = require('node:util'); +const stream = require('node:stream'); const is = require('./is'); -const sharp = require('../build/Release/sharp.node'); -// Versioning -let versions = { - vips: sharp.libvipsVersion() -}; -(function () { - // Does libvips meet minimum requirement? - const libvipsVersionMin = require('../package.json').config.libvips; - /* istanbul ignore if */ - if (semver.lt(versions.vips, libvipsVersionMin)) { - throw new Error('Found libvips ' + versions.vips + ' but require at least ' + libvipsVersionMin); - } - // Include versions of dependencies, if present - try { - versions = require('../vendor/lib/versions.json'); - } catch (err) {} -})(); +require('./sharp'); // Use NODE_DEBUG=sharp to enable libvips warnings const debuglog = util.debuglog('sharp'); +const queueListener = (queueLength) => { + Sharp.queue.emit('change', queueLength); +}; + /** - * @class Sharp - * * Constructor factory to create an instance of `sharp`, to which further methods are chained. * - * JPEG, PNG, WebP or TIFF format image data can be streamed out from this object. + * JPEG, PNG, WebP, GIF, AVIF or TIFF format image data can be streamed out from this object. * When using Stream based output, derived attributes are available from the `info` event. 
 * + * Non-critical problems encountered during processing are emitted as `warning` events. + * * Implements the [stream.Duplex](http://nodejs.org/api/stream.html#stream_class_stream_duplex) class. * + * When loading more than one page/frame of an animated image, + * these are combined as a vertically-stacked "toilet roll" image + * where the overall height is the `pageHeight` multiplied by the number of `pages`. + * + * @constructs Sharp + * + * @emits Sharp#info + * @emits Sharp#warning + * * @example * sharp('input.jpg') * .resize(300, 200) @@ -47,50 +44,172 @@ const debuglog = util.debuglog('sharp'); * }); * * @example - * // Read image data from readableStream, + * // Read image data from remote URL, * // resize to 300 pixels wide, * // emit an 'info' event with calculated dimensions * // and finally write image data to writableStream - * var transformer = sharp() + * const { body } = await fetch('https://...'); + * const readableStream = Readable.fromWeb(body); + * const transformer = sharp() * .resize(300) - * .on('info', function(info) { - * console.log('Image height is ' + info.height); + * .on('info', ({ height }) => { + * console.log(`Image height is ${height}`); * }); * readableStream.pipe(transformer).pipe(writableStream); * * @example - * // Create a blank 300x200 PNG image of semi-transluent red pixels + * // Create a blank 300x200 PNG image of semi-translucent red pixels * sharp({ * create: { * width: 300, * height: 200, * channels: 4, - * background: { r: 255, g: 0, b: 0, alpha: 128 } + * background: { r: 255, g: 0, b: 0, alpha: 0.5 } * } * }) * .png() * .toBuffer() * .then( ... ); * - * @param {(Buffer|String)} [input] - if present, can be - * a Buffer containing JPEG, PNG, WebP, GIF, SVG, TIFF or raw pixel image data, or - * a String containing the path to an JPEG, PNG, WebP, GIF, SVG or TIFF image file. - * JPEG, PNG, WebP, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present.
+ * @example + * // Convert an animated GIF to an animated WebP + * await sharp('in.gif', { animated: true }).toFile('out.webp'); + * + * @example + * // Read a raw array of pixels and save it to a png + * const input = Uint8Array.from([255, 255, 255, 0, 0, 0]); // or Uint8ClampedArray + * const image = sharp(input, { + * // because the input does not contain its dimensions or how many channels it has + * // we need to specify it in the constructor options + * raw: { + * width: 2, + * height: 1, + * channels: 3 + * } + * }); + * await image.toFile('my-two-pixels.png'); + * + * @example + * // Generate RGB Gaussian noise + * await sharp({ + * create: { + * width: 300, + * height: 200, + * channels: 3, + * noise: { + * type: 'gaussian', + * mean: 128, + * sigma: 30 + * } + * } + * }).toFile('noise.png'); + * + * @example + * // Generate an image from text + * await sharp({ + * text: { + * text: 'Hello, world!', + * width: 400, // max width + * height: 300 // max height + * } + * }).toFile('text_bw.png'); + * + * @example + * // Generate an rgba image from text using pango markup and font + * await sharp({ + * text: { + * text: 'Red!blue', + * font: 'sans', + * rgba: true, + * dpi: 300 + * } + * }).toFile('text_rgba.png'); + * + * @example + * // Join four input images as a 2x2 grid with a 4 pixel gutter + * const data = await sharp( + * [image1, image2, image3, image4], + * { join: { across: 2, shim: 4 } } + * ).toBuffer(); + * + * @example + * // Generate a two-frame animated image from emoji + * const images = ['😀', '😛'].map(text => ({ + * text: { text, width: 64, height: 64, channels: 4, rgba: true } + * })); + * await sharp(images, { join: { animated: true } }).toFile('out.gif'); + * + * @param {(Buffer|ArrayBuffer|Uint8Array|Uint8ClampedArray|Int8Array|Uint16Array|Int16Array|Uint32Array|Int32Array|Float32Array|Float64Array|string|Array)} [input] - if present, can be + * a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, 
GIF, SVG or TIFF image data, or + * a TypedArray containing raw pixel image data, or + * a String containing the filesystem path to a JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file. + * An array of inputs can be provided, and these will be joined together. + * JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present. * @param {Object} [options] - if present, is an Object with optional attributes. - * @param {Number} [options.density=72] - integral number representing the DPI for vector images. + * @param {string} [options.failOn='warning'] - When to abort processing of invalid pixel data, one of (in order of sensitivity, least to most): 'none', 'truncated', 'error', 'warning'. Higher levels imply lower levels. Invalid metadata will always abort. + * @param {number|boolean} [options.limitInputPixels=268402689] - Do not process input images where the number of pixels + * (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted. + * An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF). + * @param {boolean} [options.unlimited=false] - Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF). + * @param {boolean} [options.autoOrient=false] - Set this to `true` to rotate/flip the image to match EXIF `Orientation`, if any. + * @param {boolean} [options.sequentialRead=true] - Set this to `false` to use random access rather than sequential read. Some operations will do this automatically. + * @param {number} [options.density=72] - number representing the DPI for vector images in the range 1 to 100000. + * @param {boolean} [options.ignoreIcc=false] - should the embedded ICC profile, if any, be ignored. + * @param {number} [options.pages=1] - Number of pages to extract for multi-page input (GIF, WebP, TIFF), use -1 for all pages.
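As the `pages` and `animated` options above note, `animated: true` is the equivalent of setting `pages` to `-1`. A hypothetical option-normalising helper illustrating that rule (not sharp's internal code):

```javascript
// Mirrors the documented rule: `animated: true` is shorthand for `pages: -1`.
function normalisePages ({ pages = 1, animated = false } = {}) {
  if (animated) return -1; // read all frames/pages
  if (!Number.isInteger(pages) || pages < -1) {
    throw new Error('Expected integer >= -1 for pages');
  }
  return pages; // -1 also means all pages
}
```

With no options this yields the default of a single page; `{ animated: true }` and `{ pages: -1 }` both yield `-1`.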
+ * @param {number} [options.page=0] - Page number to start extracting from for multi-page input (GIF, WebP, TIFF), zero based. + * @param {boolean} [options.animated=false] - Set to `true` to read all frames/pages of an animated image (GIF, WebP, TIFF), equivalent of setting `pages` to `-1`. * @param {Object} [options.raw] - describes raw pixel input image data. See `raw()` for pixel ordering. - * @param {Number} [options.raw.width] - * @param {Number} [options.raw.height] - * @param {Number} [options.raw.channels] - 1-4 + * @param {number} [options.raw.width] - integral number of pixels wide. + * @param {number} [options.raw.height] - integral number of pixels high. + * @param {number} [options.raw.channels] - integral number of channels, between 1 and 4. + * @param {boolean} [options.raw.premultiplied] - specifies that the raw input has already been premultiplied, set to `true` + * to avoid sharp premultiplying the image. (optional, default `false`) + * @param {number} [options.raw.pageHeight] - The pixel height of each page/frame for animated images, must be an integral factor of `raw.height`. * @param {Object} [options.create] - describes a new image to be created. - * @param {Number} [options.create.width] - * @param {Number} [options.create.height] - * @param {Number} [options.create.channels] - 3-4 - * @param {String|Object} [options.create.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. + * @param {number} [options.create.width] - integral number of pixels wide. + * @param {number} [options.create.height] - integral number of pixels high. + * @param {number} [options.create.channels] - integral number of channels, either 3 (RGB) or 4 (RGBA). + * @param {string|Object} [options.create.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. 
+ * @param {number} [options.create.pageHeight] - The pixel height of each page/frame for animated images, must be an integral factor of `create.height`. + * @param {Object} [options.create.noise] - describes a noise to be created. + * @param {string} [options.create.noise.type] - type of generated noise, currently only `gaussian` is supported. + * @param {number} [options.create.noise.mean=128] - Mean value of pixels in the generated noise. + * @param {number} [options.create.noise.sigma=30] - Standard deviation of pixel values in the generated noise. + * @param {Object} [options.text] - describes a new text image to be created. + * @param {string} [options.text.text] - text to render as a UTF-8 string. It can contain Pango markup, for example `LeMonde`. + * @param {string} [options.text.font] - font name to render with. + * @param {string} [options.text.fontfile] - absolute filesystem path to a font file that can be used by `font`. + * @param {number} [options.text.width=0] - Integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. + * @param {number} [options.text.height=0] - Maximum integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. + * @param {string} [options.text.align='left'] - Alignment style for multi-line text (`'left'`, `'centre'`, `'center'`, `'right'`). + * @param {boolean} [options.text.justify=false] - set this to true to apply justification to the text. + * @param {number} [options.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified. + * @param {boolean} [options.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for pango markup features like `Red!`. 
+ * @param {number} [options.text.spacing=0] - text line height in points. Will use the font line height if none is specified. + * @param {string} [options.text.wrap='word'] - word wrapping style when width is provided, one of: 'word', 'char', 'word-char' (prefer word, fallback to char) or 'none'. + * @param {Object} [options.join] - describes how an array of input images should be joined. + * @param {number} [options.join.across=1] - number of images to join horizontally. + * @param {boolean} [options.join.animated=false] - set this to `true` to join the images as an animated image. + * @param {number} [options.join.shim=0] - number of pixels to insert between joined images. + * @param {string|Object} [options.join.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. + * @param {string} [options.join.halign='left'] - horizontal alignment style for images joined horizontally (`'left'`, `'centre'`, `'center'`, `'right'`). + * @param {string} [options.join.valign='top'] - vertical alignment style for images joined vertically (`'top'`, `'centre'`, `'center'`, `'bottom'`). + * @param {Object} [options.tiff] - Describes TIFF specific options. + * @param {number} [options.tiff.subifd=-1] - Sub Image File Directory to extract for OME-TIFF, defaults to main image. + * @param {Object} [options.svg] - Describes SVG specific options. + * @param {string} [options.svg.stylesheet] - Custom CSS for SVG input, applied with a User Origin during the CSS cascade. + * @param {boolean} [options.svg.highBitdepth=false] - Set to `true` to render SVG input at 32-bits per channel (128-bit) instead of 8-bits per channel (32-bit) RGBA. + * @param {Object} [options.pdf] - Describes PDF specific options. Requires the use of a globally-installed libvips compiled with support for PDFium, Poppler, ImageMagick or GraphicsMagick. 
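The `join.across` and `join.shim` options above determine the joined canvas size, as in the earlier "2x2 grid with a 4 pixel gutter" example. A hypothetical back-of-envelope calculation for equally-sized inputs (`joinedSize` is not part of sharp's API):

```javascript
// Expected output dimensions when joining `count` images of width x height
// in a grid that is `across` images wide, with `shim` pixels between neighbours.
function joinedSize (count, { across = 1, shim = 0 } = {}, width, height) {
  const down = Math.ceil(count / across); // number of rows
  return {
    width: across * width + (across - 1) * shim,
    height: down * height + (down - 1) * shim
  };
}
```

For four 100x80 inputs joined `{ across: 2, shim: 4 }`, this gives a 204x164 canvas.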
+ * @param {string|Object} [options.pdf.background] - Background colour to use when PDF is partially transparent. Parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. + * @param {Object} [options.openSlide] - Describes OpenSlide specific options. Requires the use of a globally-installed libvips compiled with support for OpenSlide. + * @param {number} [options.openSlide.level=0] - Level to extract from a multi-level input, zero based. + * @param {Object} [options.jp2] - Describes JPEG 2000 specific options. Requires the use of a globally-installed libvips compiled with support for OpenJPEG. + * @param {boolean} [options.jp2.oneshot=false] - Set to `true` to decode tiled JPEG 2000 images in a single operation, improving compatibility. * @returns {Sharp} * @throws {Error} Invalid parameters */ const Sharp = function (input, options) { + // biome-ignore lint/complexity/noArguments: constructor factory if (arguments.length === 1 && !is.defined(input)) { throw new Error('Invalid input'); } @@ -99,11 +218,6 @@ const Sharp = function (input, options) { } stream.Duplex.call(this); this.options = { - // input options - sequentialRead: false, - limitInputPixels: Math.pow(0x3FFF, 2), - // ICC profiles - iccProfilePath: path.join(__dirname, 'icc') + path.sep, // resize options topOffsetPre: -1, leftOffsetPre: -1, @@ -116,52 +230,92 @@ const Sharp = function (input, options) { width: -1, height: -1, canvas: 'crop', - crop: 0, - useExifOrientation: false, + position: 0, + resizeBackground: [0, 0, 0, 255], angle: 0, - rotateBeforePreExtract: false, + rotationAngle: 0, + rotationBackground: [0, 0, 0, 255], + rotateBefore: false, + orientBefore: false, flip: false, flop: false, extendTop: 0, extendBottom: 0, extendLeft: 0, extendRight: 0, + extendBackground: [0, 0, 0, 255], + extendWith: 'background', withoutEnlargement: false, + withoutReduction: false, + affineMatrix: [], + affineBackground: [0, 0, 0, 255], + 
affineIdx: 0, + affineIdy: 0, + affineOdx: 0, + affineOdy: 0, + affineInterpolator: this.constructor.interpolators.bilinear, kernel: 'lanczos3', - interpolator: 'bicubic', - centreSampling: false, + fastShrinkOnLoad: true, // operations - background: [0, 0, 0, 255], + tint: [-1, 0, 0, 0], flatten: false, + flattenBackground: [0, 0, 0], + unflatten: false, negate: false, + negateAlpha: true, + medianSize: 0, blurSigma: 0, + precision: 'integer', + minAmpl: 0.2, sharpenSigma: 0, - sharpenFlat: 1, - sharpenJagged: 2, + sharpenM1: 1, + sharpenM2: 2, + sharpenX1: 2, + sharpenY2: 10, + sharpenY3: 20, threshold: 0, thresholdGrayscale: true, - trimTolerance: 0, + trimBackground: [], + trimThreshold: -1, + trimLineArt: false, + dilateWidth: 0, + erodeWidth: 0, gamma: 0, + gammaOut: 0, greyscale: false, - normalise: 0, + normalise: false, + normaliseLower: 1, + normaliseUpper: 99, + claheWidth: 0, + claheHeight: 0, + claheMaxSlope: 3, + brightness: 1, + saturation: 1, + hue: 0, + lightness: 0, booleanBufferIn: null, booleanFileIn: '', joinChannelIn: [], extractChannel: -1, + removeAlpha: false, + ensureAlpha: -1, colourspace: 'srgb', - // overlay - overlayGravity: 0, - overlayXOffset: -1, - overlayYOffset: -1, - overlayTile: false, - overlayCutout: false, + colourspacePipeline: 'last', + composite: [], // output fileOut: '', formatOut: 'input', streamOut: false, - withMetadata: false, + keepMetadata: 0, withMetadataOrientation: -1, + withMetadataDensity: 0, + withIccProfile: '', + withExif: {}, + withExifMerge: true, + withXmp: '', resolveWithObject: false, + loop: -1, + delay: [], // output format jpegQuality: 80, jpegProgressive: false, @@ -169,64 +323,177 @@ const Sharp = function (input, options) { jpegTrellisQuantisation: false, jpegOvershootDeringing: false, jpegOptimiseScans: false, + jpegOptimiseCoding: true, + jpegQuantisationTable: 0, pngProgressive: false, pngCompressionLevel: 6, - pngAdaptiveFiltering: true, + pngAdaptiveFiltering: false, + pngPalette: false, + 
pngQuality: 100, + pngEffort: 7, + pngBitdepth: 8, + pngDither: 1, + jp2Quality: 80, + jp2TileHeight: 512, + jp2TileWidth: 512, + jp2Lossless: false, + jp2ChromaSubsampling: '4:4:4', webpQuality: 80, webpAlphaQuality: 100, webpLossless: false, webpNearLossless: false, + webpSmartSubsample: false, + webpSmartDeblock: false, + webpPreset: 'default', + webpEffort: 4, + webpMinSize: false, + webpMixed: false, + gifBitdepth: 8, + gifEffort: 7, + gifDither: 1, + gifInterFrameMaxError: 0, + gifInterPaletteMaxError: 3, + gifKeepDuplicateFrames: false, + gifReuse: true, + gifProgressive: false, tiffQuality: 80, tiffCompression: 'jpeg', - tiffPredictor: 'none', - tiffSquash: false, + tiffBigtiff: false, + tiffPredictor: 'horizontal', + tiffPyramid: false, + tiffMiniswhite: false, + tiffBitdepth: 8, + tiffTile: false, + tiffTileHeight: 256, + tiffTileWidth: 256, tiffXres: 1.0, tiffYres: 1.0, + tiffResolutionUnit: 'inch', + heifQuality: 50, + heifLossless: false, + heifCompression: 'av1', + heifEffort: 4, + heifChromaSubsampling: '4:4:4', + heifBitdepth: 8, + jxlDistance: 1, + jxlDecodingTier: 0, + jxlEffort: 7, + jxlLossless: false, + rawDepth: 'uchar', tileSize: 256, tileOverlap: 0, + tileContainer: 'fs', + tileLayout: 'dz', + tileFormat: 'last', + tileDepth: 'last', + tileAngle: 0, + tileSkipBlanks: -1, + tileBackground: [255, 255, 255, 255], + tileCentre: false, + tileId: 'https://example.com/iiif', + tileBasename: '', + timeoutSeconds: 0, + linearA: [], + linearB: [], + pdfBackground: [255, 255, 255, 255], // Function to notify of libvips warnings - debuglog: debuglog, + debuglog: warning => { + this.emit('warning', warning); + debuglog(warning); + }, // Function to notify of queue length changes - queueListener: function (queueLength) { - queue.emit('change', queueLength); - } + queueListener }; this.options.input = this._createInputDescriptor(input, options, { allowStream: true }); return this; }; -util.inherits(Sharp, stream.Duplex); 
+Object.setPrototypeOf(Sharp.prototype, stream.Duplex.prototype); +Object.setPrototypeOf(Sharp, stream.Duplex); /** - * An EventEmitter that emits a `change` event when a task is either: - * - queued, waiting for _libuv_ to provide a worker thread - * - complete - * @member - * @example - * sharp.queue.on('change', function(queueLength) { - * console.log('Queue contains ' + queueLength + ' task(s)'); - * }); - */ -const queue = new events.EventEmitter(); -Sharp.queue = queue; - -/** - * An Object containing nested boolean values representing the available input and output formats/methods. + * Take a "snapshot" of the Sharp instance, returning a new instance. + * Cloned instances inherit the input of their parent instance. + * This allows multiple output Streams and therefore multiple processing pipelines to share a single input Stream. + * * @example - * console.log(sharp.format); - * @returns {Object} - */ -Sharp.format = sharp.format(); - -/** - * An Object containing the version numbers of libvips and its dependencies. 
- * @member + * const pipeline = sharp().rotate(); + * pipeline.clone().resize(800, 600).pipe(firstWritableStream); + * pipeline.clone().extract({ left: 20, top: 20, width: 100, height: 100 }).pipe(secondWritableStream); + * readableStream.pipe(pipeline); + * // firstWritableStream receives auto-rotated, resized readableStream + * // secondWritableStream receives auto-rotated, extracted region of readableStream + * * @example - * console.log(sharp.versions); + * // Create a pipeline that will download an image, resize it and format it to different files + * // Using Promises to know when the pipeline is complete + * const fs = require("fs"); + * const got = require("got"); + * const sharpStream = sharp({ failOn: 'none' }); + * + * const promises = []; + * + * promises.push( + * sharpStream + * .clone() + * .jpeg({ quality: 100 }) + * .toFile("originalFile.jpg") + * ); + * + * promises.push( + * sharpStream + * .clone() + * .resize({ width: 500 }) + * .jpeg({ quality: 80 }) + * .toFile("optimized-500.jpg") + * ); + * + * promises.push( + * sharpStream + * .clone() + * .resize({ width: 500 }) + * .webp({ quality: 80 }) + * .toFile("optimized-500.webp") + * ); + * + * // https://github.com/sindresorhus/got/blob/main/documentation/3-streams.md + * got.stream("https://www.example.com/some-file.jpg").pipe(sharpStream); + * + * Promise.all(promises) + * .then(res => { console.log("Done!", res); }) + * .catch(err => { + * console.error("Error processing files, let's clean it up", err); + * try { + * fs.unlinkSync("originalFile.jpg"); + * fs.unlinkSync("optimized-500.jpg"); + * fs.unlinkSync("optimized-500.webp"); + * } catch (e) {} + * }); + * + * @returns {Sharp} */ -Sharp.versions = versions; +function clone () { + // Clone existing options + const clone = this.constructor.call(); + const { debuglog, queueListener, ...options } = this.options; + clone.options = structuredClone(options); + clone.options.debuglog = debuglog; + clone.options.queueListener = 
queueListener; + // Pass 'finish' event to clone for Stream-based input + if (this._isStreamInput()) { + this.on('finish', () => { + // Clone inherits input data + this._flattenBufferIn(); + clone.options.input.buffer = this.options.input.buffer; + clone.emit('finish'); + }); + } + return clone; +} +Object.assign(Sharp.prototype, { clone }); /** * Export constructor. + * @module Sharp * @private */ module.exports = Sharp; diff --git a/lib/icc/sRGB.icc b/lib/icc/sRGB.icc deleted file mode 100644 index 95e249b85..000000000 Binary files a/lib/icc/sRGB.icc and /dev/null differ diff --git a/lib/index.d.ts b/lib/index.d.ts new file mode 100644 index 000000000..13714c947 --- /dev/null +++ b/lib/index.d.ts @@ -0,0 +1,1971 @@ +/** + * Copyright 2017 François Nguyen and others. + * + * Billy Kwok + * Bradley Odell + * Espen Hovlandsdal + * Floris de Bijl + * François Nguyen + * Jamie Woodbury + * Wooseop Kim + * + * Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated + * documentation files (the "Software"), to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to + * permit persons to whom the Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in all copies or substantial portions of + * the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE + * WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR + * COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR + * OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
+ */ + +// SPDX-License-Identifier: MIT + +/// <reference types="node" /> + +import type { Duplex } from 'node:stream'; + +//#region Constructor functions + +/** + * Creates a sharp instance from an image + * @param input Buffer containing JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data, or String containing the path to a JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file. + * @param options Object with optional attributes. + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ +declare function sharp(options?: sharp.SharpOptions): sharp.Sharp; +declare function sharp( + input?: sharp.SharpInput | Array<sharp.SharpInput>, + options?: sharp.SharpOptions, +): sharp.Sharp; + +declare namespace sharp { + /** Object containing nested boolean values representing the available input and output formats/methods. */ + const format: FormatEnum; + + /** An Object containing the version numbers of sharp, libvips and its dependencies. */ + const versions: { + aom?: string | undefined; + archive?: string | undefined; + cairo?: string | undefined; + cgif?: string | undefined; + exif?: string | undefined; + expat?: string | undefined; + ffi?: string | undefined; + fontconfig?: string | undefined; + freetype?: string | undefined; + fribidi?: string | undefined; + glib?: string | undefined; + harfbuzz?: string | undefined; + heif?: string | undefined; + highway?: string | undefined; + imagequant?: string | undefined; + lcms?: string | undefined; + mozjpeg?: string | undefined; + pango?: string | undefined; + pixman?: string | undefined; + png?: string | undefined; + "proxy-libintl"?: string | undefined; + rsvg?: string | undefined; + sharp: string; + spng?: string | undefined; + tiff?: string | undefined; + vips: string; + webp?: string | undefined; + xml?: string | undefined; + "zlib-ng"?: string | undefined; + }; + + /** An Object containing the available interpolators and their proper values */ + const interpolators: Interpolators; + + /** An EventEmitter 
that emits a change event when a task is either queued, waiting for libuv to provide a worker thread, complete */ + const queue: NodeJS.EventEmitter; + + //#endregion + + //#region Utility functions + + /** + * Gets or, when options are provided, sets the limits of libvips' operation cache. + * Existing entries in the cache will be trimmed after any change in limits. + * This method always returns cache statistics, useful for determining how much working memory is required for a particular task. + * @param options Object with the following attributes, or Boolean where true uses default cache settings and false removes all caching (optional, default true) + * @returns The cache results. + */ + function cache(options?: boolean | CacheOptions): CacheResult; + + /** + * Gets or sets the number of threads libvips' should create to process each image. + * The default value is the number of CPU cores. A value of 0 will reset to this default. + * The maximum number of images that can be processed in parallel is limited by libuv's UV_THREADPOOL_SIZE environment variable. + * @param concurrency The new concurrency value. + * @returns The current concurrency value. + */ + function concurrency(concurrency?: number): number; + + /** + * Provides access to internal task counters. + * @returns Object containing task counters + */ + function counters(): SharpCounters; + + /** + * Get and set use of SIMD vector unit instructions. Requires libvips to have been compiled with highway support. + * Improves the performance of resize, blur and sharpen operations by taking advantage of the SIMD vector unit of the CPU, e.g. Intel SSE and ARM NEON. + * @param enable enable or disable use of SIMD vector unit instructions + * @returns true if usage of SIMD vector unit instructions is enabled + */ + function simd(enable?: boolean): boolean; + + /** + * Block libvips operations at runtime. 
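`sharp.concurrency()` described above is a gets-or-sets accessor: calling it with no argument reads the current value, and passing `0` resets to the default. A standalone sketch of that accessor pattern — the default value of 4 is purely illustrative (sharp's real default is the number of CPU cores), and `makeConcurrency` is not part of sharp:

```javascript
// Gets-or-sets accessor in the style of sharp.concurrency():
// no argument reads the value, 0 resets to the default,
// any other non-negative integer sets it.
const makeConcurrency = (defaultValue) => {
  let current = defaultValue;
  return (value) => {
    if (Number.isInteger(value) && value >= 0) {
      current = value === 0 ? defaultValue : value;
    }
    return current; // always report the current value
  };
};
```

This single-function get/set shape keeps the API surface small, at the cost of overloading one call signature.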
+ * + * This is in addition to the `VIPS_BLOCK_UNTRUSTED` environment variable, + * which when set will block all "untrusted" operations. + * + * @since 0.32.4 + * + * @example Block all TIFF input. + * sharp.block({ + * operation: ['VipsForeignLoadTiff'] + * }); + * + * @param {Object} options + * @param {Array} options.operation - List of libvips low-level operation names to block. + */ + function block(options: { operation: string[] }): void; + + /** + * Unblock libvips operations at runtime. + * + * This is useful for defining a list of allowed operations. + * + * @since 0.32.4 + * + * @example Block all input except WebP from the filesystem. + * sharp.block({ + * operation: ['VipsForeignLoad'] + * }); + * sharp.unblock({ + * operation: ['VipsForeignLoadWebpFile'] + * }); + * + * @example Block all input except JPEG and PNG from a Buffer or Stream. + * sharp.block({ + * operation: ['VipsForeignLoad'] + * }); + * sharp.unblock({ + * operation: ['VipsForeignLoadJpegBuffer', 'VipsForeignLoadPngBuffer'] + * }); + * + * @param {Object} options + * @param {Array} options.operation - List of libvips low-level operation names to unblock. + */ + function unblock(options: { operation: string[] }): void; + + //#endregion + + const gravity: GravityEnum; + const strategy: StrategyEnum; + const kernel: KernelEnum; + const fit: FitEnum; + const bool: BoolEnum; + + interface Sharp extends Duplex { + //#region Channel functions + + /** + * Remove alpha channel, if any. This is a no-op if the image does not have an alpha channel. + * @returns A sharp instance that can be used to chain operations + */ + removeAlpha(): Sharp; + + /** + * Ensure alpha channel, if missing. The added alpha channel will be fully opaque. This is a no-op if the image already has an alpha channel. + * @param alpha transparency level (0=fully-transparent, 1=fully-opaque) (optional, default 1). 
 * @returns A sharp instance that can be used to chain operations + */ + ensureAlpha(alpha?: number): Sharp; + + /** + * Extract a single channel from a multi-channel image. + * @param channel zero-indexed channel/band number to extract, or red, green, blue or alpha. + * @throws {Error} Invalid channel + * @returns A sharp instance that can be used to chain operations + */ + extractChannel(channel: 0 | 1 | 2 | 3 | 'red' | 'green' | 'blue' | 'alpha'): Sharp; + + /** + * Join one or more channels to the image. The meaning of the added channels depends on the output colourspace, set with toColourspace(). + * By default the output image will be web-friendly sRGB, with additional channels interpreted as alpha channels. Channel ordering follows vips convention: + * - sRGB: 0: Red, 1: Green, 2: Blue, 3: Alpha. + * - CMYK: 0: Magenta, 1: Cyan, 2: Yellow, 3: Black, 4: Alpha. + * + * Buffers may be any of the image formats supported by sharp. + * For raw pixel input, the options object should contain a raw attribute, which follows the format of the attribute of the same name in the sharp() constructor. + * @param images one or more images (file paths, Buffers). + * @param options image options, see sharp() constructor. + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + joinChannel(images: string | Buffer | ArrayLike<string | Buffer>, options?: SharpOptions): Sharp; + + /** + * Perform a bitwise boolean operation on all input image channels (bands) to produce a single channel output image. + * @param boolOp one of "and", "or" or "eor" to perform that bitwise operation, like the C logic operators &, | and ^ respectively. + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + bandbool(boolOp: keyof BoolEnum): Sharp; + + //#endregion + + //#region Color functions + + /** + * Tint the image using the provided colour. 
+ * An alpha channel may be present and will be unchanged by the operation. + * @param tint Parsed by the color module. + * @returns A sharp instance that can be used to chain operations + */ + tint(tint: Colour | Color): Sharp; + + /** + * Convert to 8-bit greyscale; 256 shades of grey. + * This is a linear operation. + * If the input image is in a non-linear colour space such as sRGB, use gamma() with greyscale() for the best results. + * By default the output image will be web-friendly sRGB and contain three (identical) colour channels. + * This may be overridden by other sharp operations such as toColourspace('b-w'), which will produce an output image containing one colour channel. + * An alpha channel may be present, and will be unchanged by the operation. + * @param greyscale true to enable and false to disable (defaults to true) + * @returns A sharp instance that can be used to chain operations + */ + greyscale(greyscale?: boolean): Sharp; + + /** + * Alternative spelling of greyscale(). + * @param grayscale true to enable and false to disable (defaults to true) + * @returns A sharp instance that can be used to chain operations + */ + grayscale(grayscale?: boolean): Sharp; + + /** + * Set the pipeline colourspace. + * The input image will be converted to the provided colourspace at the start of the pipeline. + * All operations will use this colourspace before converting to the output colourspace, as defined by toColourspace. + * This feature is experimental and has not yet been fully-tested with all operations. + * + * @param colourspace pipeline colourspace e.g. rgb16, scrgb, lab, grey16 ... + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + pipelineColourspace(colourspace?: string): Sharp; + + /** + * Alternative spelling of pipelineColourspace + * @param colorspace pipeline colourspace e.g. rgb16, scrgb, lab, grey16 ... 
+ * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + pipelineColorspace(colorspace?: string): Sharp; + + /** + * Set the output colourspace. + * By default output image will be web-friendly sRGB, with additional channels interpreted as alpha channels. + * @param colourspace output colourspace e.g. srgb, rgb, cmyk, lab, b-w ... + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + toColourspace(colourspace?: string): Sharp; + + /** + * Alternative spelling of toColourspace(). + * @param colorspace output colorspace e.g. srgb, rgb, cmyk, lab, b-w ... + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + toColorspace(colorspace: string): Sharp; + + //#endregion + + //#region Composite functions + + /** + * Composite image(s) over the processed (resized, extracted etc.) image. + * + * The images to composite must be the same size or smaller than the processed image. + * If both `top` and `left` options are provided, they take precedence over `gravity`. + * @param images - Ordered list of images to composite + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + composite(images: OverlayOptions[]): Sharp; + + //#endregion + + //#region Input functions + + /** + * Take a "snapshot" of the Sharp instance, returning a new instance. + * Cloned instances inherit the input of their parent instance. + * This allows multiple output Streams and therefore multiple processing pipelines to share a single input Stream. + * @returns A sharp instance that can be used to chain operations + */ + clone(): Sharp; + + /** + * Fast access to (uncached) image metadata without decoding any compressed image data. 
+ * @returns A sharp instance that can be used to chain operations + */ + metadata(callback: (err: Error, metadata: Metadata) => void): Sharp; + + /** + * Fast access to (uncached) image metadata without decoding any compressed image data. + * @returns A promise that resolves with a metadata object + */ + metadata(): Promise<Metadata>; + + /** + * Keep all metadata (EXIF, ICC, XMP, IPTC) from the input image in the output image. + * @returns A sharp instance that can be used to chain operations + */ + keepMetadata(): Sharp; + + /** + * Access to pixel-derived image statistics for every channel in the image. + * @returns A sharp instance that can be used to chain operations + */ + stats(callback: (err: Error, stats: Stats) => void): Sharp; + + /** + * Access to pixel-derived image statistics for every channel in the image. + * @returns A promise that resolves with a stats object + */ + stats(): Promise<Stats>; + + //#endregion + + //#region Operation functions + + /** + * Rotate the output image by either an explicit angle + * or auto-orient based on the EXIF `Orientation` tag. + * + * If an angle is provided, it is converted to a valid positive degree rotation. + * For example, `-450` will produce a 270 degree rotation. + * + * When rotating by an angle other than a multiple of 90, + * the background colour can be provided with the `background` option. + * + * If no angle is provided, it is determined from the EXIF data. + * Mirroring is supported and may infer the use of a flip operation. + * + * The use of `rotate` without an angle will remove the EXIF `Orientation` tag, if any. + * + * Only one rotation can occur per pipeline (aside from an initial call without + * arguments to orient via EXIF data). Previous calls to `rotate` in the same + * pipeline will be ignored. + * + * Multi-page images can only be rotated by 180 degrees. 
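The "converted to a valid positive degree rotation" behaviour described above can be sketched as a pure function; `normaliseAngle` is a hypothetical helper written for illustration, not part of sharp's API.

```javascript
// Reduce any integer angle to an equivalent rotation in [0, 360).
// JavaScript's % keeps the sign of the dividend, hence the double modulo.
function normaliseAngle (angle) {
  return ((angle % 360) + 360) % 360;
}

// As in the docs above: -450 degrees produces a 270 degree rotation.
console.log(normaliseAngle(-450)); // 270
```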
+ * + * Method order is important when rotating, resizing and/or extracting regions, + * for example `.rotate(x).extract(y)` will produce a different result to `.extract(y).rotate(x)`. + * + * @example + * const pipeline = sharp() + * .rotate() + * .resize(null, 200) + * .toBuffer(function (err, outputBuffer, info) { + * // outputBuffer contains 200px high JPEG image data, + * // auto-rotated using EXIF Orientation tag + * // info.width and info.height contain the dimensions of the resized image + * }); + * readableStream.pipe(pipeline); + * + * @example + * const rotateThenResize = await sharp(input) + * .rotate(90) + * .resize({ width: 16, height: 8, fit: 'fill' }) + * .toBuffer(); + * const resizeThenRotate = await sharp(input) + * .resize({ width: 16, height: 8, fit: 'fill' }) + * .rotate(90) + * .toBuffer(); + * + * @param {number} [angle=auto] angle of rotation. + * @param {Object} [options] - if present, is an Object with optional attributes. + * @param {string|Object} [options.background="#000000"] parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ + rotate(angle?: number, options?: RotateOptions): Sharp; + + /** + * Alias for calling `rotate()` with no arguments, which orients the image based + * on EXIF orientation. + * + * This operation is aliased to emphasize its purpose, helping to remove any + * confusion between rotation and orientation. + * + * @example + * const output = await sharp(input).autoOrient().toBuffer(); + * + * @returns {Sharp} + */ + autoOrient(): Sharp; + + /** + * Flip the image about the vertical Y axis. This always occurs after rotation, if any. + * The use of flip implies the removal of the EXIF Orientation tag, if any. 
+ * @param flip true to enable and false to disable (defaults to true) + * @returns A sharp instance that can be used to chain operations + */ + flip(flip?: boolean): Sharp; + + /** + * Flop the image about the horizontal X axis. This always occurs after rotation, if any. + * The use of flop implies the removal of the EXIF Orientation tag, if any. + * @param flop true to enable and false to disable (defaults to true) + * @returns A sharp instance that can be used to chain operations + */ + flop(flop?: boolean): Sharp; + + /** + * Perform an affine transform on an image. This operation will always occur after resizing, extraction and rotation, if any. + * You must provide an array of length 4 or a 2x2 affine transformation matrix. + * By default, new pixels are filled with a black background. You can provide a background colour with the `background` option. + * A particular interpolator may also be specified. Set the `interpolator` option to an attribute of the `sharp.interpolators` Object e.g. `sharp.interpolators.nohalo`. + * + * In the case of a 2x2 matrix, the transform is: + * X = matrix[0, 0] * (x + idx) + matrix[0, 1] * (y + idy) + odx + * Y = matrix[1, 0] * (x + idx) + matrix[1, 1] * (y + idy) + ody + * + * where: + * + * x and y are the coordinates in the input image. + * X and Y are the coordinates in the output image. + * (0,0) is the upper left corner. + * + * @param matrix Affine transformation matrix, may either be an array of length four or a 2x2 matrix array + * @param options if present, is an Object with optional attributes. + * + * @returns A sharp instance that can be used to chain operations + */ + affine(matrix: [number, number, number, number] | Matrix2x2, options?: AffineOptions): Sharp; + + /** + * Sharpen the image. + * When used without parameters, performs a fast, mild sharpen of the output image. + * When a sigma is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space. 
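The 2x2 affine mapping documented above can be checked with a small pure function; `applyAffine` is a hypothetical helper for illustration only, not sharp's implementation.

```javascript
// Map an input coordinate (x, y) through a 2x2 affine matrix, following
// X = m00*(x+idx) + m01*(y+idy) + odx and Y = m10*(x+idx) + m11*(y+idy) + ody,
// as in the affine() docs. idx/idy are input offsets, odx/ody output offsets.
function applyAffine (matrix, x, y, { idx = 0, idy = 0, odx = 0, ody = 0 } = {}) {
  const X = matrix[0][0] * (x + idx) + matrix[0][1] * (y + idy) + odx;
  const Y = matrix[1][0] * (x + idx) + matrix[1][1] * (y + idy) + ody;
  return [X, Y];
}
```

With the identity matrix `[[1, 0], [0, 1]]` a point maps to itself; `[[0, -1], [1, 0]]` is a 90 degree rotation about the origin.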
+ * Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available. + * @param options if present, is an Object with optional attributes + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + sharpen(options?: SharpenOptions): Sharp; + + /** + * Sharpen the image. + * When used without parameters, performs a fast, mild sharpen of the output image. + * When a sigma is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space. + * Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available. + * @param sigma the sigma of the Gaussian mask, where sigma = 1 + radius / 2. + * @param flat the level of sharpening to apply to "flat" areas. (optional, default 1.0) + * @param jagged the level of sharpening to apply to "jagged" areas. (optional, default 2.0) + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + * + * @deprecated Use the object parameter `sharpen({sigma, m1, m2, x1, y2, y3})` instead + */ + sharpen(sigma?: number, flat?: number, jagged?: number): Sharp; + + /** + * Apply median filter. When used without parameters the default window is 3x3. + * @param size square mask size: size x size (optional, default 3) + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + median(size?: number): Sharp; + + /** + * Blur the image. + * When used without parameters, performs a fast, mild blur of the output image. + * When a sigma is provided, performs a slower, more accurate Gaussian blur. + * When a boolean sigma is provided, either apply a mild blur (true) or disable blurring (false). + * @param sigma a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where sigma = 1 + radius / 2. 
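Both the sharpen() and blur() docs above express the Gaussian mask size as sigma = 1 + radius / 2. These hypothetical one-line converters (not part of sharp's API) make the relationship explicit.

```javascript
// sigma = 1 + radius / 2, and its inverse, as stated in the sharpen/blur docs.
const sigmaFromRadius = (radius) => 1 + radius / 2;
const radiusFromSigma = (sigma) => (sigma - 1) * 2;
```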
+ * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + blur(sigma?: number | boolean | BlurOptions): Sharp; + + /** + * Expand foreground objects using the dilate morphological operator. + * @param {Number} [width=1] dilation width in pixels. + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + dilate(width?: number): Sharp; + + /** + * Shrink foreground objects using the erode morphological operator. + * @param {Number} [width=1] erosion width in pixels. + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + erode(width?: number): Sharp; + + /** + * Merge alpha transparency channel, if any, with background. + * @param flatten true to enable and false to disable (defaults to true) + * @returns A sharp instance that can be used to chain operations + */ + flatten(flatten?: boolean | FlattenOptions): Sharp; + + /** + * Ensure the image has an alpha channel with all white pixel values made fully transparent. + * Existing alpha channel values for non-white pixels remain unchanged. + * @returns A sharp instance that can be used to chain operations + */ + unflatten(): Sharp; + + /** + * Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of 1/gamma then increasing the encoding (brighten) post-resize at a factor of gamma. + * This can improve the perceived brightness of a resized image in non-linear colour spaces. + * JPEG and WebP input images will not take advantage of the shrink-on-load performance optimisation when applying a gamma correction. + * Supply a second argument to use a different output gamma value, otherwise the first value is used in both cases. + * @param gamma value between 1.0 and 3.0. (optional, default 2.2) + * @param gammaOut value between 1.0 and 3.0. 
(optional, defaults to same as gamma) + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + gamma(gamma?: number, gammaOut?: number): Sharp; + + /** + * Produce the "negative" of the image. + * @param negate true to enable and false to disable, or an object of options (defaults to true) + * @returns A sharp instance that can be used to chain operations + */ + negate(negate?: boolean | NegateOptions): Sharp; + + /** + * Enhance output image contrast by stretching its luminance to cover a full dynamic range. + * + * Uses a histogram-based approach, taking a default range of 1% to 99% to reduce sensitivity to noise at the extremes. + * + * Luminance values below the `lower` percentile will be underexposed by clipping to zero. + * Luminance values above the `upper` percentile will be overexposed by clipping to the max pixel value. + * + * @param normalise options + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + normalise(normalise?: NormaliseOptions): Sharp; + + /** + * Alternative spelling of normalise. + * @param normalize options + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + normalize(normalize?: NormaliseOptions): Sharp; + + /** + * Perform contrast limiting adaptive histogram equalization (CLAHE) + * + * This will, in general, enhance the clarity of the image by bringing out + * darker details. Please read more about CLAHE here: + * https://en.wikipedia.org/wiki/Adaptive_histogram_equalization#Contrast_Limited_AHE + * + * @param options clahe options + */ + clahe(options: ClaheOptions): Sharp; + + /** + * Convolve the image with the specified kernel. 
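The two-step gamma correction described above (reduce the encoding pre-resize at a factor of 1/gamma, then increase it post-resize at a factor of gamma) can be sketched on a single normalised value; `gammaRoundTrip` is a hypothetical helper for illustration, not sharp's implementation.

```javascript
// Apply the documented gamma pair to a value in the range 0..1:
// darken with exponent gamma before resizing, then brighten with
// exponent 1/gammaOut afterwards. With gammaOut === gamma the round
// trip is (mathematically) the identity.
function gammaRoundTrip (value, gamma = 2.2, gammaOut = gamma) {
  const darkened = Math.pow(value, gamma);   // pre-resize step
  return Math.pow(darkened, 1 / gammaOut);   // post-resize step
}
```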
+ * @param kernel the specified kernel + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + convolve(kernel: Kernel): Sharp; + + /** + * Any pixel value greater than or equal to the threshold value will be set to 255, otherwise it will be set to 0. + * @param threshold a value in the range 0-255 representing the level at which the threshold will be applied. (optional, default 128) + * @param options threshold options + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + threshold(threshold?: number, options?: ThresholdOptions): Sharp; + + /** + * Perform a bitwise boolean operation with operand image. + * This operation creates an output image where each pixel is the result of the selected bitwise boolean operation between the corresponding pixels of the input images. + * @param operand Buffer containing image data or String containing the path to an image file. + * @param operator one of "and", "or" or "eor" to perform that bitwise operation, like the C logic operators &, | and ^ respectively. + * @param options describes operand when using raw pixel data. + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + boolean(operand: string | Buffer, operator: keyof BoolEnum, options?: { raw: Raw }): Sharp; + + /** + * Apply the linear formula a * input + b to the image (levels adjustment) + * @param a multiplier (optional, default 1.0) + * @param b offset (optional, default 0.0) + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + linear(a?: number | number[] | null, b?: number | number[]): Sharp; + + /** + * Recomb the image with the specified matrix. 
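The levels adjustment a * input + b documented for linear() can be sketched per 8-bit channel value; the clamping to 0..255 is an assumption of this illustration (to keep the result a valid 8-bit value), not taken from the docs, and `linearLevel` is a hypothetical helper rather than sharp's implementation.

```javascript
// Apply a * value + b to one 8-bit channel value, clamped to 0..255
// (clamping assumed for illustration). Defaults mirror the documented
// defaults: multiplier a = 1.0, offset b = 0.0.
function linearLevel (value, a = 1.0, b = 0.0) {
  return Math.min(255, Math.max(0, a * value + b));
}
```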
+ * @param inputMatrix 3x3 Recombination matrix or 4x4 Recombination matrix + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + recomb(inputMatrix: Matrix3x3 | Matrix4x4): Sharp; + + /** + * Transforms the image using brightness, saturation, hue rotation and lightness. + * Brightness and lightness both operate on luminance, with the difference being that brightness is multiplicative whereas lightness is additive. + * @param options describes the modulation + * @returns A sharp instance that can be used to chain operations + */ + modulate(options?: { + brightness?: number | undefined; + saturation?: number | undefined; + hue?: number | undefined; + lightness?: number | undefined; + }): Sharp; + + //#endregion + + //#region Output functions + + /** + * Write output image data to a file. + * If an explicit output format is not selected, it will be inferred from the extension, with JPEG, PNG, WebP, AVIF, TIFF, DZI, and libvips' V format supported. + * Note that raw pixel data is only supported for buffer output. + * @param fileOut The path to write the image data to. + * @param callback Callback function called on completion with two arguments (err, info). info contains the output image format, size (bytes), width, height and channels. + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + toFile(fileOut: string, callback: (err: Error, info: OutputInfo) => void): Sharp; + + /** + * Write output image data to a file. + * @param fileOut The path to write the image data to. + * @throws {Error} Invalid parameters + * @returns A promise that fulfills with an object containing information on the resulting file + */ + toFile(fileOut: string): Promise<OutputInfo>; + + /** + * Write output to a Buffer. JPEG, PNG, WebP, AVIF, TIFF, GIF and RAW output are supported. + * By default, the format will match the input image, except SVG input which becomes PNG output. 
+ * @param callback Callback function called on completion with three arguments (err, buffer, info). + * @returns A sharp instance that can be used to chain operations + */ + toBuffer(callback: (err: Error, buffer: Buffer, info: OutputInfo) => void): Sharp; + + /** + * Write output to a Buffer. JPEG, PNG, WebP, AVIF, TIFF, GIF and RAW output are supported. + * By default, the format will match the input image, except SVG input which becomes PNG output. + * @param options resolve options + * @param options.resolveWithObject Resolve the Promise with an Object containing data and info properties instead of resolving only with data. + * @returns A promise that resolves with the Buffer data. + */ + toBuffer(options?: { resolveWithObject: false }): Promise<Buffer>; + + /** + * Write output to a Buffer. JPEG, PNG, WebP, AVIF, TIFF, GIF and RAW output are supported. + * By default, the format will match the input image, except SVG input which becomes PNG output. + * @param options resolve options + * @param options.resolveWithObject Resolve the Promise with an Object containing data and info properties instead of resolving only with data. + * @returns A promise that resolves with an object containing the Buffer data and an info object containing the output image format, size (bytes), width, height and channels + */ + toBuffer(options: { resolveWithObject: true }): Promise<{ data: Buffer; info: OutputInfo }>; + + /** + * Keep all EXIF metadata from the input image in the output image. + * EXIF metadata is unsupported for TIFF output. + * @returns A sharp instance that can be used to chain operations + */ + keepExif(): Sharp; + + /** + * Set EXIF metadata in the output image, ignoring any EXIF in the input image. + * @param {Exif} exif Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. 
+ * @returns A sharp instance that can be used to chain operations + * @throws {Error} Invalid parameters + */ + withExif(exif: Exif): Sharp; + + /** + * Update EXIF metadata from the input image in the output image. + * @param {Exif} exif Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. + * @returns A sharp instance that can be used to chain operations + * @throws {Error} Invalid parameters + */ + withExifMerge(exif: Exif): Sharp; + + /** + * Keep ICC profile from the input image in the output image where possible. + * @returns A sharp instance that can be used to chain operations + */ + keepIccProfile(): Sharp; + + /** + * Transform using an ICC profile and attach to the output image. + * @param {string} icc - Absolute filesystem path to output ICC profile or built-in profile name (srgb, p3, cmyk). + * @returns A sharp instance that can be used to chain operations + * @throws {Error} Invalid parameters + */ + withIccProfile(icc: string, options?: WithIccProfileOptions): Sharp; + + /** + * Keep all XMP metadata from the input image in the output image. + * @returns A sharp instance that can be used to chain operations + */ + keepXmp(): Sharp; + + /** + * Set XMP metadata in the output image. + * @param {string} xmp - String containing XMP metadata to be embedded in the output image. + * @returns A sharp instance that can be used to chain operations + * @throws {Error} Invalid parameters + */ + withXmp(xmp: string): Sharp; + + /** + * Include all metadata (EXIF, XMP, IPTC) from the input image in the output image. + * The default behaviour, when withMetadata is not used, is to strip all metadata and convert to the device-independent sRGB colour space. + * This will also convert to and add a web-friendly sRGB ICC profile. + * @param withMetadata + * @throws {Error} Invalid parameters. + */ + withMetadata(withMetadata?: WriteableMetadata): Sharp; + + /** + * Use these JPEG options for output image. + * @param options Output options. 
+ * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + jpeg(options?: JpegOptions): Sharp; + + /** + * Use these JP2 (JPEG 2000) options for output image. + * @param options Output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + jp2(options?: Jp2Options): Sharp; + + /** + * Use these JPEG-XL (JXL) options for output image. + * This feature is experimental, please do not use in production systems. + * Requires libvips compiled with support for libjxl. + * The prebuilt binaries do not include this. + * Image metadata (EXIF, XMP) is unsupported. + * @param options Output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + jxl(options?: JxlOptions): Sharp; + + /** + * Use these PNG options for output image. + * PNG output is always full colour at 8 or 16 bits per pixel. + * Indexed PNG input at 1, 2 or 4 bits per pixel is converted to 8 bits per pixel. + * @param options Output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + png(options?: PngOptions): Sharp; + + /** + * Use these WebP options for output image. + * @param options Output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + webp(options?: WebpOptions): Sharp; + + /** + * Use these GIF options for output image. + * Requires libvips compiled with support for ImageMagick or GraphicsMagick. The prebuilt binaries do not include this - see installing a custom libvips. + * @param options Output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + gif(options?: GifOptions): Sharp; + + /** + * Use these AVIF options for output image. + * @param options Output options. 
+ * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + avif(options?: AvifOptions): Sharp; + + /** + * Use these HEIF options for output image. + * Support for patent-encumbered HEIC images requires the use of a globally-installed libvips compiled with support for libheif, libde265 and x265. + * @param options Output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + heif(options?: HeifOptions): Sharp; + + /** + * Use these TIFF options for output image. + * @param options Output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + tiff(options?: TiffOptions): Sharp; + + /** + * Force output to be raw, uncompressed uint8 pixel data. + * @param options Raw output options. + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + raw(options?: RawOptions): Sharp; + + /** + * Force output to a given format. + * @param format a String or an Object with an 'id' attribute + * @param options output options + * @throws {Error} Unsupported format or options + * @returns A sharp instance that can be used to chain operations + */ + toFormat( + format: keyof FormatEnum | AvailableFormatInfo, + options?: + | OutputOptions + | JpegOptions + | PngOptions + | WebpOptions + | AvifOptions + | HeifOptions + | JxlOptions + | GifOptions + | Jp2Options + | RawOptions + | TiffOptions, + ): Sharp; + + /** + * Use tile-based deep zoom (image pyramid) output. + * Set the format and options for tile images via the toFormat, jpeg, png or webp functions. + * Use a .zip or .szi file extension with toFile to write to a compressed archive file format. 
+ * @param tile tile options + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + tile(tile?: TileOptions): Sharp; + + /** + * Set a timeout for processing, in seconds. Use a value of zero to continue processing indefinitely, the default behaviour. + * The clock starts when libvips opens an input image for processing. Time spent waiting for a libuv thread to become available is not included. + * @param options Object with a `seconds` attribute between 0 and 3600 (number) + * @throws {Error} Invalid options + * @returns A sharp instance that can be used to chain operations + */ + timeout(options: TimeoutOptions): Sharp; + + //#endregion + + //#region Resize functions + + /** + * Resize image to width, height or width x height. + * + * When both a width and height are provided, the possible methods by which the image should fit these are: + * - cover: Crop to cover both provided dimensions (the default). + * - contain: Embed within both provided dimensions. + * - fill: Ignore the aspect ratio of the input and stretch to both provided dimensions. + * - inside: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified. + * - outside: Preserving aspect ratio, resize the image to be as small as possible while ensuring its dimensions are greater than or equal to both those specified. + * Some of these values are based on the object-fit CSS property. + * + * When using a fit of cover or contain, the default position is centre. Other options are: + * - sharp.position: top, right top, right, right bottom, bottom, left bottom, left, left top. + * - sharp.gravity: north, northeast, east, southeast, south, southwest, west, northwest, center or centre. + * - sharp.strategy: cover only, dynamically crop using either the entropy or attention strategy. Some of these values are based on the object-position CSS property. 
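The aspect-preserving `inside` and `outside` fits described above amount to scaling by one of the two width/height ratios: `inside` uses the smaller ratio so both dimensions fit within the target, `outside` the larger so both dimensions cover it. `fitDimensions` below is a hypothetical sketch of that arithmetic, not sharp's implementation.

```javascript
// Compute output dimensions for the 'inside' and 'outside' fits:
// scale by min(ratios) to fit within the target, max(ratios) to cover it.
function fitDimensions (width, height, targetWidth, targetHeight, fit) {
  const ratios = [targetWidth / width, targetHeight / height];
  const scale = fit === 'inside' ? Math.min(...ratios) : Math.max(...ratios);
  return { width: Math.round(width * scale), height: Math.round(height * scale) };
}
```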
+ * + * The experimental strategy-based approach resizes so one dimension is at its target length then repeatedly ranks edge regions, + * discarding the edge with the lowest score based on the selected strategy. + * - entropy: focus on the region with the highest Shannon entropy. + * - attention: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones. + * + * Possible interpolation kernels are: + * - nearest: Use nearest neighbour interpolation. + * - cubic: Use a Catmull-Rom spline. + * - lanczos2: Use a Lanczos kernel with a=2. + * - lanczos3: Use a Lanczos kernel with a=3 (the default). + * + * @param width pixels wide the resultant image should be. Use null or undefined to auto-scale the width to match the height. + * @param height pixels high the resultant image should be. Use null or undefined to auto-scale the height to match the width. + * @param options resize options + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + resize(widthOrOptions?: number | ResizeOptions | null, height?: number | null, options?: ResizeOptions): Sharp; + + /** + * Shorthand for resize(null, null, options); + * + * @param options resize options + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + resize(options: ResizeOptions): Sharp; + + /** + * Extend / pad / extrude one or more edges of the image with either + * the provided background colour or pixels derived from the image. + * This operation will always occur after resizing and extraction, if any. + * @param extend single pixel count to add to all edges or an Object with per-edge counts + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + extend(extend: number | ExtendOptions): Sharp; + + /** + * Extract a region of the image. + * - Use extract() before resize() for pre-resize extraction. 
+ * - Use extract() after resize() for post-resize extraction. + * - Use extract() before and after for both. + * + * @param region The region to extract + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + extract(region: Region): Sharp; + + /** + * Trim pixels from all edges that contain values similar to the given background colour, which defaults to that of the top-left pixel. + * Images with an alpha channel will use the combined bounding box of alpha and non-alpha channels. + * The info response Object will contain trimOffsetLeft and trimOffsetTop properties. + * @param options trim options + * @throws {Error} Invalid parameters + * @returns A sharp instance that can be used to chain operations + */ + trim(options?: TrimOptions): Sharp; + + //#endregion + } + + type SharpInput = Buffer + | ArrayBuffer + | Uint8Array + | Uint8ClampedArray + | Int8Array + | Uint16Array + | Int16Array + | Uint32Array + | Int32Array + | Float32Array + | Float64Array + | string; + + interface SharpOptions { + /** + * Auto-orient based on the EXIF `Orientation` tag, if present. + * Mirroring is supported and may infer the use of a flip operation. + * + * Using this option will remove the EXIF `Orientation` tag, if any. + */ + autoOrient?: boolean | undefined; + /** + * When to abort processing of invalid pixel data, one of (in order of sensitivity): + * 'none' (least), 'truncated', 'error' or 'warning' (most); higher levels imply lower levels, and invalid metadata will always abort. (optional, default 'warning') + */ + failOn?: FailOnOptions | undefined; + /** + * By default halt processing and raise an error when loading invalid images. + * Set this flag to false if you'd rather apply a "best effort" to decode images, + * even if the data is corrupt or invalid. 
(optional, default true) + * + * @deprecated Use `failOn` instead + */ + failOnError?: boolean | undefined; + /** + * Do not process input images where the number of pixels (width x height) exceeds this limit. + * Assumes image dimensions contained in the input metadata can be trusted. + * An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF). (optional, default 268402689) + */ + limitInputPixels?: number | boolean | undefined; + /** Set this to true to remove safety features that help prevent memory exhaustion (SVG, PNG). (optional, default false) */ + unlimited?: boolean | undefined; + /** Set this to false to use random access rather than sequential read. Some operations will do this automatically. */ + sequentialRead?: boolean | undefined; + /** Number representing the DPI for vector images in the range 1 to 100000. (optional, default 72) */ + density?: number | undefined; + /** Should the embedded ICC profile, if any, be ignored. */ + ignoreIcc?: boolean | undefined; + /** Number of pages to extract for multi-page input (GIF, TIFF, PDF), use -1 for all pages */ + pages?: number | undefined; + /** Page number to start extracting from for multi-page input (GIF, TIFF, PDF), zero based. 
(optional, default 0) */ + page?: number | undefined; + /** TIFF specific input options */ + tiff?: TiffInputOptions | undefined; + /** SVG specific input options */ + svg?: SvgInputOptions | undefined; + /** PDF specific input options */ + pdf?: PdfInputOptions | undefined; + /** OpenSlide specific input options */ + openSlide?: OpenSlideInputOptions | undefined; + /** JPEG 2000 specific input options */ + jp2?: Jp2InputOptions | undefined; + /** @deprecated Use {@link SharpOptions.tiff} instead */ + subifd?: number | undefined; + /** @deprecated Use {@link SharpOptions.pdf} instead */ + pdfBackground?: Colour | Color | undefined; + /** @deprecated Use {@link SharpOptions.openSlide} instead */ + level?: number | undefined; + /** Set to `true` to read all frames/pages of an animated image (equivalent of setting `pages` to `-1`). (optional, default false) */ + animated?: boolean | undefined; + /** Describes raw pixel input image data. See raw() for pixel ordering. */ + raw?: CreateRaw | undefined; + /** Describes a new image to be created. */ + create?: Create | undefined; + /** Describes a new text image to be created. */ + text?: CreateText | undefined; + /** Describes how an array of input images should be joined. */ + join?: Join | undefined; + } + + interface CacheOptions { + /** Is the maximum memory in MB to use for this cache (optional, default 50) */ + memory?: number | undefined; + /** Is the maximum number of files to hold open (optional, default 20) */ + files?: number | undefined; + /** Is the maximum number of operations to cache (optional, default 100) */ + items?: number | undefined; + } + + interface TimeoutOptions { + /** Number of seconds after which processing will be stopped (default 0, i.e. disabled) */ + seconds: number; + } + + interface SharpCounters { + /** The number of tasks this module has queued waiting for libuv to provide a worker thread from its pool. */ + queue: number; + /** The number of resize tasks currently being processed. 
*/ + process: number; + } + + interface Raw { + width: number; + height: number; + channels: Channels; + } + + interface CreateRaw extends Raw { + /** Specifies that the raw input has already been premultiplied, set to true to avoid sharp premultiplying the image. (optional, default false) */ + premultiplied?: boolean | undefined; + /** The height of each page/frame for animated images, must be an integral factor of the overall image height. */ + pageHeight?: number | undefined; + } + + type CreateChannels = 3 | 4; + + interface Create { + /** Number of pixels wide. */ + width: number; + /** Number of pixels high. */ + height: number; + /** Number of bands, 3 for RGB, 4 for RGBA */ + channels: CreateChannels; + /** Parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. */ + background: Colour | Color; + /** Describes a noise to be created. */ + noise?: Noise | undefined; + /** The height of each page/frame for animated images, must be an integral factor of the overall image height. */ + pageHeight?: number | undefined; + } + + interface CreateText { + /** Text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`. */ + text: string; + /** Font name to render with. */ + font?: string; + /** Absolute filesystem path to a font file that can be used by `font`. */ + fontfile?: string; + /** Integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. (optional, default `0`) */ + width?: number; + /** + * Integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution + * defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. (optional, default `0`) + */ + height?: number; + /** Text alignment ('left', 'centre', 'center', 'right'). (optional, default 'left') */ + align?: TextAlign; + /** Set this to true to apply justification to the text.
(optional, default `false`) */ + justify?: boolean; + /** The resolution (size) at which to render the text. Does not take effect if `height` is specified. (optional, default `72`) */ + dpi?: number; + /** + * Set this to true to enable RGBA output. This is useful for colour emoji rendering, + * or support for pango markup features like `<span foreground="red">Red!</span>`. (optional, default `false`) + */ + rgba?: boolean; + /** Text line height in points. Will use the font line height if none is specified. (optional, default `0`) */ + spacing?: number; + /** Word wrapping style when width is provided, one of: 'word', 'char', 'word-char' (prefer word, fallback to char) or 'none' */ + wrap?: TextWrap; + } + + interface Join { + /** Number of images per row. */ + across?: number | undefined; + /** Treat input as frames of an animated image. */ + animated?: boolean | undefined; + /** Space between images, in pixels. */ + shim?: number | undefined; + /** Background colour. */ + background?: Colour | Color | undefined; + /** Horizontal alignment. */ + halign?: HorizontalAlignment | undefined; + /** Vertical alignment. */ + valign?: VerticalAlignment | undefined; + } + + interface TiffInputOptions { + /** Sub Image File Directory to extract, defaults to main image. Use -1 for all subifds. */ + subifd?: number | undefined; + } + + interface SvgInputOptions { + /** Custom CSS for SVG input, applied with a User Origin during the CSS cascade. */ + stylesheet?: string | undefined; + /** Set to `true` to render SVG input at 32-bits per channel (128-bit) instead of 8-bits per channel (32-bit) RGBA. */ + highBitdepth?: boolean | undefined; + } + + interface PdfInputOptions { + /** Background colour to use when PDF is partially transparent. Requires the use of a globally-installed libvips compiled with support for PDFium, Poppler, ImageMagick or GraphicsMagick. */ + background?: Colour | Color | undefined; + } + + interface OpenSlideInputOptions { + /** Level to extract from a multi-level input, zero based.
(optional, default 0) */ + level?: number | undefined; + } + + interface Jp2InputOptions { + /** Set to `true` to load JPEG 2000 images using [oneshot mode](https://github.com/libvips/libvips/issues/4205) */ + oneshot?: boolean | undefined; + } + + interface ExifDir { + [k: string]: string; + } + + interface Exif { + 'IFD0'?: ExifDir; + 'IFD1'?: ExifDir; + 'IFD2'?: ExifDir; + 'IFD3'?: ExifDir; + } + + type HeifCompression = 'av1' | 'hevc'; + + type Unit = 'inch' | 'cm'; + + interface WriteableMetadata { + /** Number of pixels per inch (DPI) */ + density?: number | undefined; + /** Value between 1 and 8, used to update the EXIF Orientation tag. */ + orientation?: number | undefined; + /** + * Filesystem path to output ICC profile, defaults to sRGB. + * @deprecated Use `withIccProfile()` instead. + */ + icc?: string | undefined; + /** + * Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. + * @deprecated Use `withExif()` or `withExifMerge()` instead. + */ + exif?: Exif | undefined; + } + + interface Metadata { + /** Number value of the EXIF Orientation header, if present */ + orientation?: number | undefined; + /** Name of decoder used to decompress image data e.g. jpeg, png, webp, gif, svg */ + format: keyof FormatEnum; + /** Total size of image in bytes, for Stream and Buffer input only */ + size?: number | undefined; + /** Number of pixels wide (EXIF orientation is not taken into consideration) */ + width: number; + /** Number of pixels high (EXIF orientation is not taken into consideration) */ + height: number; + /** Any changed metadata after the image orientation is applied. */ + autoOrient: { + /** Number of pixels wide (EXIF orientation is taken into consideration) */ + width: number; + /** Number of pixels high (EXIF orientation is taken into consideration) */ + height: number; + }; + /** Name of colour space interpretation */ + space: keyof ColourspaceEnum; + /** Number of bands e.g. 
3 for sRGB, 4 for CMYK */ + channels: Channels; + /** Name of pixel depth format e.g. uchar, char, ushort, float ... */ + depth: keyof DepthEnum; + /** Number of pixels per inch (DPI), if present */ + density?: number | undefined; + /** String containing JPEG chroma subsampling, 4:2:0 or 4:4:4 for RGB, 4:2:0:4 or 4:4:4:4 for CMYK */ + chromaSubsampling?: string | undefined; + /** Boolean indicating whether the image is interlaced using a progressive scan */ + isProgressive: boolean; + /** Boolean indicating whether the image is palette-based (GIF, PNG). */ + isPalette: boolean; + /** Number of bits per sample for each channel (GIF, PNG). */ + bitsPerSample?: number | undefined; + /** Number of pages/frames contained within the image, with support for TIFF, HEIF, PDF, animated GIF and animated WebP */ + pages?: number | undefined; + /** Number of pixels high each page in a multi-page image will be. */ + pageHeight?: number | undefined; + /** Number of times to loop an animated image, zero refers to a continuous loop. */ + loop?: number | undefined; + /** Delay in ms between each page in an animated image, provided as an array of integers. 
*/ + delay?: number[] | undefined; + /** Number of the primary page in a HEIF image */ + pagePrimary?: number | undefined; + /** Boolean indicating the presence of an embedded ICC profile */ + hasProfile: boolean; + /** Boolean indicating the presence of an alpha transparency channel */ + hasAlpha: boolean; + /** Buffer containing raw EXIF data, if present */ + exif?: Buffer | undefined; + /** Buffer containing raw ICC profile data, if present */ + icc?: Buffer | undefined; + /** Buffer containing raw IPTC data, if present */ + iptc?: Buffer | undefined; + /** Buffer containing raw XMP data, if present */ + xmp?: Buffer | undefined; + /** String containing XMP data, if valid UTF-8 */ + xmpAsString?: string | undefined; + /** Buffer containing raw TIFFTAG_PHOTOSHOP data, if present */ + tifftagPhotoshop?: Buffer | undefined; + /** The encoder used to compress an HEIF file, `av1` (AVIF) or `hevc` (HEIC) */ + compression?: HeifCompression | undefined; + /** Default background colour, if present, for PNG (bKGD) and GIF images */ + background?: { r: number; g: number; b: number } | { gray: number }; + /** Details of each level in a multi-level image provided as an array of objects, requires libvips compiled with support for OpenSlide */ + levels?: LevelMetadata[] | undefined; + /** Number of Sub Image File Directories in an OME-TIFF image */ + subifds?: number | undefined; + /** The unit of resolution (density) */ + resolutionUnit?: Unit | undefined; + /** String containing format for images loaded via *magick */ + formatMagick?: string | undefined; + /** Array of keyword/text pairs representing PNG text blocks, if present. */ + comments?: CommentsMetadata[] | undefined; + } + + interface LevelMetadata { + width: number; + height: number; + } + + interface CommentsMetadata { + keyword: string; + text: string; + } + + interface Stats { + /** Array of channel statistics for each channel in the image. 
*/ + channels: ChannelStats[]; + /** Value to identify if the image is opaque or transparent, based on the presence and use of alpha channel */ + isOpaque: boolean; + /** Histogram-based estimation of greyscale entropy, discarding alpha channel if any (experimental) */ + entropy: number; + /** Estimation of greyscale sharpness based on the standard deviation of a Laplacian convolution, discarding alpha channel if any (experimental) */ + sharpness: number; + /** Object containing most dominant sRGB colour based on a 4096-bin 3D histogram (experimental) */ + dominant: { r: number; g: number; b: number }; + } + + interface ChannelStats { + /** minimum value in the channel */ + min: number; + /** maximum value in the channel */ + max: number; + /** sum of all values in a channel */ + sum: number; + /** sum of squared values in a channel */ + squaresSum: number; + /** mean of the values in a channel */ + mean: number; + /** standard deviation for the values in a channel */ + stdev: number; + /** x-coordinate of one of the pixels where the minimum lies */ + minX: number; + /** y-coordinate of one of the pixels where the minimum lies */ + minY: number; + /** x-coordinate of one of the pixels where the maximum lies */ + maxX: number; + /** y-coordinate of one of the pixels where the maximum lies */ + maxY: number; + } + + interface OutputOptions { + /** Force format output, otherwise attempt to use input format (optional, default true) */ + force?: boolean | undefined; + } + + interface WithIccProfileOptions { + /** Should the ICC profile be included in the output image metadata?
(optional, default true) */ + attach?: boolean | undefined; + } + + interface JpegOptions extends OutputOptions { + /** Quality, integer 1-100 (optional, default 80) */ + quality?: number | undefined; + /** Use progressive (interlace) scan (optional, default false) */ + progressive?: boolean | undefined; + /** Set to '4:4:4' to prevent chroma subsampling when quality <= 90 (optional, default '4:2:0') */ + chromaSubsampling?: string | undefined; + /** Apply trellis quantisation (optional, default false) */ + trellisQuantisation?: boolean | undefined; + /** Apply overshoot deringing (optional, default false) */ + overshootDeringing?: boolean | undefined; + /** Optimise progressive scans, forces progressive (optional, default false) */ + optimiseScans?: boolean | undefined; + /** Alternative spelling of optimiseScans (optional, default false) */ + optimizeScans?: boolean | undefined; + /** Optimise Huffman coding tables (optional, default true) */ + optimiseCoding?: boolean | undefined; + /** Alternative spelling of optimiseCoding (optional, default true) */ + optimizeCoding?: boolean | undefined; + /** Quantization table to use, integer 0-8 (optional, default 0) */ + quantisationTable?: number | undefined; + /** Alternative spelling of quantisationTable (optional, default 0) */ + quantizationTable?: number | undefined; + /** Use mozjpeg defaults (optional, default false) */ + mozjpeg?: boolean | undefined; + } + + interface Jp2Options extends OutputOptions { + /** Quality, integer 1-100 (optional, default 80) */ + quality?: number; + /** Use lossless compression mode (optional, default false) */ + lossless?: boolean; + /** Horizontal tile size (optional, default 512) */ + tileWidth?: number; + /** Vertical tile size (optional, default 512) */ + tileHeight?: number; + /** Set to '4:2:0' to enable chroma subsampling (optional, default '4:4:4') */ + chromaSubsampling?: '4:4:4' | '4:2:0'; + } + + interface JxlOptions extends OutputOptions { + /** Maximum encoding error, 
between 0 (highest quality) and 15 (lowest quality) (optional, default 1.0) */ + distance?: number; + /** Calculate distance based on JPEG-like quality, between 1 and 100, overrides distance if specified */ + quality?: number; + /** Target decode speed tier, between 0 (highest quality) and 4 (lowest quality) (optional, default 0) */ + decodingTier?: number; + /** Use lossless compression (optional, default false) */ + lossless?: boolean; + /** CPU effort, between 3 (fastest) and 9 (slowest) (optional, default 7) */ + effort?: number | undefined; + } + + interface WebpOptions extends OutputOptions, AnimationOptions { + /** Quality, integer 1-100 (optional, default 80) */ + quality?: number | undefined; + /** Quality of alpha layer, number from 0-100 (optional, default 100) */ + alphaQuality?: number | undefined; + /** Use lossless compression mode (optional, default false) */ + lossless?: boolean | undefined; + /** Use near_lossless compression mode (optional, default false) */ + nearLossless?: boolean | undefined; + /** Use high quality chroma subsampling (optional, default false) */ + smartSubsample?: boolean | undefined; + /** Auto-adjust the deblocking filter, slow but can improve low contrast edges (optional, default false) */ + smartDeblock?: boolean | undefined; + /** Level of CPU effort to reduce file size, integer 0-6 (optional, default 4) */ + effort?: number | undefined; + /** Prevent use of animation key frames to minimise file size (slow) (optional, default false) */ + minSize?: boolean; + /** Allow mixture of lossy and lossless animation frames (slow) (optional, default false) */ + mixed?: boolean; + /** Preset options: one of default, photo, picture, drawing, icon, text (optional, default 'default') */ + preset?: keyof PresetEnum | undefined; + } + + interface AvifOptions extends OutputOptions { + /** quality, integer 1-100 (optional, default 50) */ + quality?: number | undefined; + /** use lossless compression (optional, default false) */ + 
lossless?: boolean | undefined; + /** Level of CPU effort to reduce file size, between 0 (fastest) and 9 (slowest) (optional, default 4) */ + effort?: number | undefined; + /** set to '4:2:0' to use chroma subsampling, requires libvips v8.11.0 (optional, default '4:4:4') */ + chromaSubsampling?: string | undefined; + /** Set bitdepth to 8, 10 or 12 bit (optional, default 8) */ + bitdepth?: 8 | 10 | 12 | undefined; + } + + interface HeifOptions extends OutputOptions { + /** quality, integer 1-100 (optional, default 50) */ + quality?: number | undefined; + /** compression format: av1, hevc (optional, default 'av1') */ + compression?: HeifCompression | undefined; + /** use lossless compression (optional, default false) */ + lossless?: boolean | undefined; + /** Level of CPU effort to reduce file size, between 0 (fastest) and 9 (slowest) (optional, default 4) */ + effort?: number | undefined; + /** set to '4:2:0' to use chroma subsampling (optional, default '4:4:4') */ + chromaSubsampling?: string | undefined; + /** Set bitdepth to 8, 10 or 12 bit (optional, default 8) */ + bitdepth?: 8 | 10 | 12 | undefined; + } + + interface GifOptions extends OutputOptions, AnimationOptions { + /** Re-use existing palette, otherwise generate new (slow) */ + reuse?: boolean | undefined; + /** Use progressive (interlace) scan */ + progressive?: boolean | undefined; + /** Maximum number of palette entries, including transparency, between 2 and 256 (optional, default 256) */ + colours?: number | undefined; + /** Alternative spelling of "colours". 
Maximum number of palette entries, including transparency, between 2 and 256 (optional, default 256) */ + colors?: number | undefined; + /** Level of CPU effort to reduce file size, between 1 (fastest) and 10 (slowest) (optional, default 7) */ + effort?: number | undefined; + /** Level of Floyd-Steinberg error diffusion, between 0 (least) and 1 (most) (optional, default 1.0) */ + dither?: number | undefined; + /** Maximum inter-frame error for transparency, between 0 (lossless) and 32 (optional, default 0) */ + interFrameMaxError?: number | undefined; + /** Maximum inter-palette error for palette reuse, between 0 and 256 (optional, default 3) */ + interPaletteMaxError?: number | undefined; + /** Keep duplicate frames in the output instead of combining them (optional, default false) */ + keepDuplicateFrames?: boolean | undefined; + } + + interface TiffOptions extends OutputOptions { + /** Quality, integer 1-100 (optional, default 80) */ + quality?: number | undefined; + /** Compression options: none, jpeg, deflate, packbits, ccittfax4, lzw, webp, zstd, jp2k (optional, default 'jpeg') */ + compression?: string | undefined; + /** Use BigTIFF variant (has no effect when compression is none) (optional, default false) */ + bigtiff?: boolean | undefined; + /** Compression predictor options: none, horizontal, float (optional, default 'horizontal') */ + predictor?: string | undefined; + /** Write an image pyramid (optional, default false) */ + pyramid?: boolean | undefined; + /** Write a tiled tiff (optional, default false) */ + tile?: boolean | undefined; + /** Horizontal tile size (optional, default 256) */ + tileWidth?: number | undefined; + /** Vertical tile size (optional, default 256) */ + tileHeight?: number | undefined; + /** Horizontal resolution in pixels/mm (optional, default 1.0) */ + xres?: number | undefined; + /** Vertical resolution in pixels/mm (optional, default 1.0) */ + yres?: number | undefined; + /** Reduce bitdepth to 1, 2 or 4 bit (optional, default 
8) */ + bitdepth?: 1 | 2 | 4 | 8 | undefined; + /** Write 1-bit images as miniswhite (optional, default false) */ + miniswhite?: boolean | undefined; + /** Resolution unit options: inch, cm (optional, default 'inch') */ + resolutionUnit?: Unit | undefined; + } + + interface PngOptions extends OutputOptions { + /** Use progressive (interlace) scan (optional, default false) */ + progressive?: boolean | undefined; + /** zlib compression level, 0-9 (optional, default 6) */ + compressionLevel?: number | undefined; + /** Use adaptive row filtering (optional, default false) */ + adaptiveFiltering?: boolean | undefined; + /** Use the lowest number of colours needed to achieve given quality (optional, default `100`) */ + quality?: number | undefined; + /** Level of CPU effort to reduce file size, between 1 (fastest) and 10 (slowest), sets palette to true (optional, default 7) */ + effort?: number | undefined; + /** Quantise to a palette-based image with alpha transparency support (optional, default false) */ + palette?: boolean | undefined; + /** Maximum number of palette entries (optional, default 256) */ + colours?: number | undefined; + /** Alternative spelling of "colours". Maximum number of palette entries (optional, default 256) */ + colors?: number | undefined; + /** Level of Floyd-Steinberg error diffusion (optional, default 1.0) */ + dither?: number | undefined; + } + + interface RotateOptions { + /** parsed by the color module to extract values for red, green, blue and alpha. (optional, default "#000000") */ + background?: Colour | Color | undefined; + } + + type Precision = 'integer' | 'float' | 'approximate'; + + interface BlurOptions { + /** A value between 0.3 and 1000 representing the sigma of the Gaussian mask, where `sigma = 1 + radius / 2` */ + sigma: number; + /** A value between 0.001 and 1. A smaller value will generate a larger, more accurate mask.
*/ + minAmplitude?: number; + /** How accurate the operation should be, one of: integer, float, approximate. (optional, default "integer") */ + precision?: Precision | undefined; + } + + interface FlattenOptions { + /** background colour, parsed by the color module, defaults to black. (optional, default {r:0,g:0,b:0}) */ + background?: Colour | Color | undefined; + } + + interface NegateOptions { + /** whether or not to negate any alpha channel. (optional, default true) */ + alpha?: boolean | undefined; + } + + interface NormaliseOptions { + /** Percentile below which luminance values will be underexposed. */ + lower?: number | undefined; + /** Percentile above which luminance values will be overexposed. */ + upper?: number | undefined; + } + + interface ResizeOptions { + /** Alternative means of specifying width. If both are present this takes priority. */ + width?: number | undefined; + /** Alternative means of specifying height. If both are present this takes priority. */ + height?: number | undefined; + /** How the image should be resized to fit both provided dimensions, one of cover, contain, fill, inside or outside. (optional, default 'cover') */ + fit?: keyof FitEnum | undefined; + /** Position, gravity or strategy to use when fit is cover or contain. (optional, default 'centre') */ + position?: number | string | undefined; + /** Background colour when using a fit of contain, parsed by the color module, defaults to black without transparency. (optional, default {r:0,g:0,b:0,alpha:1}) */ + background?: Colour | Color | undefined; + /** The kernel to use for image reduction. (optional, default 'lanczos3') */ + kernel?: keyof KernelEnum | undefined; + /** Do not enlarge if the width or height are already less than the specified dimensions, equivalent to GraphicsMagick's > geometry option. 
(optional, default false) */ + withoutEnlargement?: boolean | undefined; + /** Do not reduce if the width or height are already greater than the specified dimensions, equivalent to GraphicsMagick's < geometry option. (optional, default false) */ + withoutReduction?: boolean | undefined; + /** Take greater advantage of the JPEG and WebP shrink-on-load feature, which can lead to a slight moiré pattern on some images. (optional, default true) */ + fastShrinkOnLoad?: boolean | undefined; + } + + interface Region { + /** zero-indexed offset from left edge */ + left: number; + /** zero-indexed offset from top edge */ + top: number; + /** dimension of extracted image */ + width: number; + /** dimension of extracted image */ + height: number; + } + + interface Noise { + /** type of generated noise, currently only gaussian is supported. */ + type: 'gaussian'; + /** mean of pixels in generated noise. */ + mean?: number | undefined; + /** standard deviation of pixels in generated noise. */ + sigma?: number | undefined; + } + + type ExtendWith = 'background' | 'copy' | 'repeat' | 'mirror'; + + interface ExtendOptions { + /** single pixel count to top edge (optional, default 0) */ + top?: number | undefined; + /** single pixel count to left edge (optional, default 0) */ + left?: number | undefined; + /** single pixel count to bottom edge (optional, default 0) */ + bottom?: number | undefined; + /** single pixel count to right edge (optional, default 0) */ + right?: number | undefined; + /** background colour, parsed by the color module, defaults to black without transparency. (optional, default {r:0,g:0,b:0,alpha:1}) */ + background?: Colour | Color | undefined; + /** how the extension is done, one of: "background", "copy", "repeat", "mirror" (optional, default `'background'`) */ + extendWith?: ExtendWith | undefined; + } + + interface TrimOptions { + /** Background colour, parsed by the color module, defaults to that of the top-left pixel. 
(optional) */ + background?: Colour | Color | undefined; + /** Allowed difference from the above colour, a positive number. (optional, default 10) */ + threshold?: number | undefined; + /** Does the input more closely resemble line art (e.g. vector) rather than being photographic? (optional, default false) */ + lineArt?: boolean | undefined; + } + + interface RawOptions { + depth?: keyof DepthEnum; + } + + /** 1 for grayscale, 2 for grayscale + alpha, 3 for sRGB, 4 for CMYK or RGBA */ + type Channels = 1 | 2 | 3 | 4; + + interface RGBA { + r?: number | undefined; + g?: number | undefined; + b?: number | undefined; + alpha?: number | undefined; + } + + type Colour = string | RGBA; + type Color = Colour; + + interface Kernel { + /** width of the kernel in pixels. */ + width: number; + /** height of the kernel in pixels. */ + height: number; + /** Array of length width*height containing the kernel values. */ + kernel: ArrayLike; + /** the scale of the kernel in pixels. (optional, default sum) */ + scale?: number | undefined; + /** the offset of the kernel in pixels. (optional, default 0) */ + offset?: number | undefined; + } + + interface ClaheOptions { + /** width of the region */ + width: number; + /** height of the region */ + height: number; + /** max slope of the cumulative contrast. A value of 0 disables contrast limiting. Valid values are integers in the range 0-100 (inclusive) (optional, default 3) */ + maxSlope?: number | undefined; + } + + interface ThresholdOptions { + /** convert to single channel greyscale. (optional, default true) */ + greyscale?: boolean | undefined; + /** alternative spelling for greyscale. 
(optional, default true) */ + grayscale?: boolean | undefined; + } + + interface OverlayOptions extends SharpOptions { + /** Buffer containing image data, String containing the path to an image file, or Create object */ + input?: string | Buffer | { create: Create } | { text: CreateText } | { raw: CreateRaw } | undefined; + /** how to blend this image with the image below. (optional, default `'over'`) */ + blend?: Blend | undefined; + /** gravity at which to place the overlay. (optional, default 'centre') */ + gravity?: Gravity | undefined; + /** the pixel offset from the top edge. */ + top?: number | undefined; + /** the pixel offset from the left edge. */ + left?: number | undefined; + /** set to true to repeat the overlay image across the entire image with the given gravity. (optional, default false) */ + tile?: boolean | undefined; + /** Set to true to avoid premultiplying the image below. Equivalent to the --premultiplied vips option. */ + premultiplied?: boolean | undefined; + /** number representing the DPI for vector overlay image. (optional, default 72) */ + density?: number | undefined; + /** Set to true to read all frames/pages of an animated image. (optional, default false) */ + animated?: boolean | undefined; + /** see sharp() constructor, (optional, default 'warning') */ + failOn?: FailOnOptions | undefined; + /** see sharp() constructor, (optional, default 268402689) */ + limitInputPixels?: number | boolean | undefined; + /** see sharp() constructor, (optional, default false) */ + autoOrient?: boolean | undefined; + } + + interface TileOptions { + /** Tile size in pixels, a value between 1 and 8192. (optional, default 256) */ + size?: number | undefined; + /** Tile overlap in pixels, a value between 0 and 8192. (optional, default 0) */ + overlap?: number | undefined; + /** Tile angle of rotation, must be a multiple of 90.
(optional, default 0) */ + angle?: number | undefined; + /** background colour, parsed by the color module, defaults to white without transparency. (optional, default {r:255,g:255,b:255,alpha:1}) */ + background?: string | RGBA | undefined; + /** How deep to make the pyramid, possible values are "onepixel", "onetile" or "one" (default based on layout) */ + depth?: string | undefined; + /** Threshold to skip tile generation, a value 0 - 255 for 8-bit images or 0 - 65535 for 16-bit images */ + skipBlanks?: number | undefined; + /** Tile container, with value fs (filesystem) or zip (compressed file). (optional, default 'fs') */ + container?: TileContainer | undefined; + /** Filesystem layout, possible values are dz, iiif, iiif3, zoomify or google. (optional, default 'dz') */ + layout?: TileLayout | undefined; + /** Centre image in tile. (optional, default false) */ + centre?: boolean | undefined; + /** Alternative spelling of centre. (optional, default false) */ + center?: boolean | undefined; + /** When layout is iiif/iiif3, sets the @id/id attribute of info.json (optional, default 'https://example.com/iiif') */ + id?: string | undefined; + /** The name of the directory within the zip file when container is `zip`. */ + basename?: string | undefined; + } + + interface AnimationOptions { + /** Number of animation iterations, a value between 0 and 65535. Use 0 for infinite animation. (optional, default 0) */ + loop?: number | undefined; + /** delay(s) between animation frames (in milliseconds), each value between 0 and 65535. 
(optional) */ + delay?: number | number[] | undefined; + } + + interface SharpenOptions { + /** The sigma of the Gaussian mask, where sigma = 1 + radius / 2, between 0.000001 and 10000 */ + sigma: number; + /** The level of sharpening to apply to "flat" areas, between 0 and 1000000 (optional, default 1.0) */ + m1?: number | undefined; + /** The level of sharpening to apply to "jagged" areas, between 0 and 1000000 (optional, default 2.0) */ + m2?: number | undefined; + /** Threshold between "flat" and "jagged", between 0 and 1000000 (optional, default 2.0) */ + x1?: number | undefined; + /** Maximum amount of brightening, between 0 and 1000000 (optional, default 10.0) */ + y2?: number | undefined; + /** Maximum amount of darkening, between 0 and 1000000 (optional, default 20.0) */ + y3?: number | undefined; + } + + interface AffineOptions { + /** Parsed by the color module to extract values for red, green, blue and alpha. (optional, default "#000000") */ + background?: string | object | undefined; + /** Input horizontal offset (optional, default 0) */ + idx?: number | undefined; + /** Input vertical offset (optional, default 0) */ + idy?: number | undefined; + /** Output horizontal offset (optional, default 0) */ + odx?: number | undefined; + /** Output vertical offset (optional, default 0) */ + ody?: number | undefined; + /** Interpolator (optional, default sharp.interpolators.bicubic) */ + interpolator?: Interpolators[keyof Interpolators] | undefined; + } + + interface OutputInfo { + format: string; + size: number; + width: number; + height: number; + channels: Channels; + /** indicating if premultiplication was used */ + premultiplied: boolean; + /** Only defined when using a crop strategy */ + cropOffsetLeft?: number | undefined; + /** Only defined when using a crop strategy */ + cropOffsetTop?: number | undefined; + /** Only defined when using a trim method */ + trimOffsetLeft?: number | undefined; + /** Only defined when using a trim method */ +
trimOffsetTop?: number | undefined; + /** DPI the font was rendered at, only defined when using `text` input */ + textAutofitDpi?: number | undefined; + /** When using the attention crop strategy, the focal point of the cropped region */ + attentionX?: number | undefined; + attentionY?: number | undefined; + /** Number of pages/frames contained within the image, with support for TIFF, HEIF, PDF, animated GIF and animated WebP */ + pages?: number | undefined; + /** Number of pixels high each page in a multi-page image will be. */ + pageHeight?: number | undefined; + } + + interface AvailableFormatInfo { + id: string; + input: { file: boolean; buffer: boolean; stream: boolean; fileSuffix?: string[] }; + output: { file: boolean; buffer: boolean; stream: boolean; alias?: string[] }; + } + + interface FitEnum { + contain: 'contain'; + cover: 'cover'; + fill: 'fill'; + inside: 'inside'; + outside: 'outside'; + } + + interface KernelEnum { + nearest: 'nearest'; + cubic: 'cubic'; + linear: 'linear'; + mitchell: 'mitchell'; + lanczos2: 'lanczos2'; + lanczos3: 'lanczos3'; + mks2013: 'mks2013'; + mks2021: 'mks2021'; + } + + interface PresetEnum { + default: 'default'; + picture: 'picture'; + photo: 'photo'; + drawing: 'drawing'; + icon: 'icon'; + text: 'text'; + } + + interface BoolEnum { + and: 'and'; + or: 'or'; + eor: 'eor'; + } + + interface ColourspaceEnum { + 'b-w': string; + cmc: string; + cmyk: string; + fourier: string; + grey16: string; + histogram: string; + hsv: string; + lab: string; + labq: string; + labs: string; + lch: string; + matrix: string; + multiband: string; + rgb: string; + rgb16: string; + scrgb: string; + srgb: string; + xyz: string; + yxy: string; + } + + interface DepthEnum { + char: string; + complex: string; + double: string; + dpcomplex: string; + float: string; + int: string; + short: string; + uchar: string; + uint: string; + ushort: string; + } + + type FailOnOptions = 'none' | 'truncated' | 'error' | 'warning'; + + type TextAlign = 'left' | 
'centre' | 'center' | 'right'; + + type TextWrap = 'word' | 'char' | 'word-char' | 'none'; + + type HorizontalAlignment = 'left' | 'centre' | 'center' | 'right'; + + type VerticalAlignment = 'top' | 'centre' | 'center' | 'bottom'; + + type TileContainer = 'fs' | 'zip'; + + type TileLayout = 'dz' | 'iiif' | 'iiif3' | 'zoomify' | 'google'; + + type Blend = + | 'clear' + | 'source' + | 'over' + | 'in' + | 'out' + | 'atop' + | 'dest' + | 'dest-over' + | 'dest-in' + | 'dest-out' + | 'dest-atop' + | 'xor' + | 'add' + | 'saturate' + | 'multiply' + | 'screen' + | 'overlay' + | 'darken' + | 'lighten' + | 'color-dodge' + | 'colour-dodge' + | 'color-burn' + | 'colour-burn' + | 'hard-light' + | 'soft-light' + | 'difference' + | 'exclusion'; + + type Gravity = number | string; + + interface GravityEnum { + north: number; + northeast: number; + southeast: number; + south: number; + southwest: number; + west: number; + northwest: number; + east: number; + center: number; + centre: number; + } + + interface StrategyEnum { + entropy: number; + attention: number; + } + + interface FormatEnum { + avif: AvailableFormatInfo; + dcraw: AvailableFormatInfo; + dz: AvailableFormatInfo; + exr: AvailableFormatInfo; + fits: AvailableFormatInfo; + gif: AvailableFormatInfo; + heif: AvailableFormatInfo; + input: AvailableFormatInfo; + jpeg: AvailableFormatInfo; + jpg: AvailableFormatInfo; + jp2: AvailableFormatInfo; + jxl: AvailableFormatInfo; + magick: AvailableFormatInfo; + openslide: AvailableFormatInfo; + pdf: AvailableFormatInfo; + png: AvailableFormatInfo; + ppm: AvailableFormatInfo; + rad: AvailableFormatInfo; + raw: AvailableFormatInfo; + svg: AvailableFormatInfo; + tiff: AvailableFormatInfo; + tif: AvailableFormatInfo; + v: AvailableFormatInfo; + webp: AvailableFormatInfo; + } + + interface CacheResult { + memory: { current: number; high: number; max: number }; + files: { current: number; max: number }; + items: { current: number; max: number }; + } + + interface Interpolators { + /** 
[Nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). Suitable for image enlargement only. */ + nearest: 'nearest'; + /** [Bilinear interpolation](http://en.wikipedia.org/wiki/Bilinear_interpolation). Faster than bicubic but with less smooth results. */ + bilinear: 'bilinear'; + /** [Bicubic interpolation](http://en.wikipedia.org/wiki/Bicubic_interpolation) (the default). */ + bicubic: 'bicubic'; + /** + * [LBB interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/lbb.cpp#L100). + * Prevents some "[acutance](http://en.wikipedia.org/wiki/Acutance)" but typically reduces performance by a factor of 2. + */ + locallyBoundedBicubic: 'lbb'; + /** [Nohalo interpolation](http://eprints.soton.ac.uk/268086/). Prevents acutance but typically reduces performance by a factor of 3. */ + nohalo: 'nohalo'; + /** [VSQBS interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/vsqbs.cpp#L48). Prevents "staircasing" when enlarging. */ + vertexSplitQuadraticBasisSpline: 'vsqbs'; + } + + type Matrix2x2 = [[number, number], [number, number]]; + type Matrix3x3 = [[number, number, number], [number, number, number], [number, number, number]]; + type Matrix4x4 = [[number, number, number, number], [number, number, number, number], [number, number, number, number], [number, number, number, number]]; +} + +export = sharp; diff --git a/lib/index.js b/lib/index.js index 0976cbb6e..b80191d71 100644 --- a/lib/index.js +++ b/lib/index.js @@ -1,17 +1,16 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ const Sharp = require('./constructor'); -[ - 'input', - 'resize', - 'composite', - 'operation', - 'colour', - 'channel', - 'output', - 'utility' -].forEach(function (decorator) { - require('./' + decorator)(Sharp); -}); +require('./input')(Sharp); +require('./resize')(Sharp); +require('./composite')(Sharp); +require('./operation')(Sharp); +require('./colour')(Sharp); +require('./channel')(Sharp); +require('./output')(Sharp); +require('./utility')(Sharp); module.exports = Sharp; diff --git a/lib/input.js b/lib/input.js index 3c434a338..48388a1d2 100644 --- a/lib/input.js +++ b/lib/input.js @@ -1,37 +1,177 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const color = require('color'); const is = require('./is'); -const sharp = require('../build/Release/sharp.node'); +const sharp = require('./sharp'); + +/** + * Justification alignment + * @member + * @private + */ +const align = { + left: 'low', + top: 'low', + low: 'low', + center: 'centre', + centre: 'centre', + right: 'high', + bottom: 'high', + high: 'high' +}; + +const inputStreamParameters = [ + // Limits and error handling + 'failOn', 'limitInputPixels', 'unlimited', + // Format-generic + 'animated', 'autoOrient', 'density', 'ignoreIcc', 'page', 'pages', 'sequentialRead', + // Format-specific + 'jp2', 'openSlide', 'pdf', 'raw', 'svg', 'tiff', + // Deprecated + 'failOnError', 'openSlideLevel', 'pdfBackground', 'tiffSubifd' +]; + +/** + * Extract input options, if any, from an object. + * @private + */ +function _inputOptionsFromObject (obj) { + const params = inputStreamParameters + .filter(p => is.defined(obj[p])) + .map(p => ([p, obj[p]])); + return params.length + ? Object.fromEntries(params) + : undefined; +} /** * Create Object containing input and input-related options. 
* @private */ function _createInputDescriptor (input, inputOptions, containerOptions) { - const inputDescriptor = {}; + const inputDescriptor = { + autoOrient: false, + failOn: 'warning', + limitInputPixels: 0x3FFF ** 2, + ignoreIcc: false, + unlimited: false, + sequentialRead: true + }; if (is.string(input)) { // filesystem inputDescriptor.file = input; } else if (is.buffer(input)) { // Buffer + if (input.length === 0) { + throw Error('Input Buffer is empty'); + } inputDescriptor.buffer = input; + } else if (is.arrayBuffer(input)) { + if (input.byteLength === 0) { + throw Error('Input bit Array is empty'); + } + inputDescriptor.buffer = Buffer.from(input, 0, input.byteLength); + } else if (is.typedArray(input)) { + if (input.length === 0) { + throw Error('Input Bit Array is empty'); + } + inputDescriptor.buffer = Buffer.from(input.buffer, input.byteOffset, input.byteLength); } else if (is.plainObject(input) && !is.defined(inputOptions)) { // Plain Object descriptor, e.g. create inputOptions = input; - } else if (!is.defined(input) && is.object(containerOptions) && containerOptions.allowStream) { - // Stream + if (_inputOptionsFromObject(inputOptions)) { + // Stream with options + inputDescriptor.buffer = []; + } + } else if (!is.defined(input) && !is.defined(inputOptions) && is.object(containerOptions) && containerOptions.allowStream) { + // Stream without options inputDescriptor.buffer = []; + } else if (Array.isArray(input)) { + if (input.length > 1) { + // Join images together + if (!this.options.joining) { + this.options.joining = true; + this.options.join = input.map(i => this._createInputDescriptor(i)); + } else { + throw new Error('Recursive join is unsupported'); + } + } else { + throw new Error('Expected at least two images to join'); + } } else { - throw new Error('Unsupported input ' + typeof input); + throw new Error(`Unsupported input '${input}' of type ${typeof input}${ + is.defined(inputOptions) ? 
` when also providing options of type ${typeof inputOptions}` : '' + }`); } if (is.object(inputOptions)) { + // Deprecated: failOnError + if (is.defined(inputOptions.failOnError)) { + if (is.bool(inputOptions.failOnError)) { + inputDescriptor.failOn = inputOptions.failOnError ? 'warning' : 'none'; + } else { + throw is.invalidParameterError('failOnError', 'boolean', inputOptions.failOnError); + } + } + // failOn + if (is.defined(inputOptions.failOn)) { + if (is.string(inputOptions.failOn) && is.inArray(inputOptions.failOn, ['none', 'truncated', 'error', 'warning'])) { + inputDescriptor.failOn = inputOptions.failOn; + } else { + throw is.invalidParameterError('failOn', 'one of: none, truncated, error, warning', inputOptions.failOn); + } + } + // autoOrient + if (is.defined(inputOptions.autoOrient)) { + if (is.bool(inputOptions.autoOrient)) { + inputDescriptor.autoOrient = inputOptions.autoOrient; + } else { + throw is.invalidParameterError('autoOrient', 'boolean', inputOptions.autoOrient); + } + } // Density if (is.defined(inputOptions.density)) { - if (is.integer(inputOptions.density) && is.inRange(inputOptions.density, 1, 2400)) { + if (is.inRange(inputOptions.density, 1, 100000)) { inputDescriptor.density = inputOptions.density; } else { - throw new Error('Invalid density (1 to 2400) ' + inputOptions.density); + throw is.invalidParameterError('density', 'number between 1 and 100000', inputOptions.density); + } + } + // Ignore embedded ICC profile + if (is.defined(inputOptions.ignoreIcc)) { + if (is.bool(inputOptions.ignoreIcc)) { + inputDescriptor.ignoreIcc = inputOptions.ignoreIcc; + } else { + throw is.invalidParameterError('ignoreIcc', 'boolean', inputOptions.ignoreIcc); + } + } + // limitInputPixels + if (is.defined(inputOptions.limitInputPixels)) { + if (is.bool(inputOptions.limitInputPixels)) { + inputDescriptor.limitInputPixels = inputOptions.limitInputPixels + ? 
0x3FFF ** 2 + : 0; + } else if (is.integer(inputOptions.limitInputPixels) && is.inRange(inputOptions.limitInputPixels, 0, Number.MAX_SAFE_INTEGER)) { + inputDescriptor.limitInputPixels = inputOptions.limitInputPixels; + } else { + throw is.invalidParameterError('limitInputPixels', 'positive integer', inputOptions.limitInputPixels); + } + } + // unlimited + if (is.defined(inputOptions.unlimited)) { + if (is.bool(inputOptions.unlimited)) { + inputDescriptor.unlimited = inputOptions.unlimited; + } else { + throw is.invalidParameterError('unlimited', 'boolean', inputOptions.unlimited); + } + } + // sequentialRead + if (is.defined(inputOptions.sequentialRead)) { + if (is.bool(inputOptions.sequentialRead)) { + inputDescriptor.sequentialRead = inputOptions.sequentialRead; + } else { + throw is.invalidParameterError('sequentialRead', 'boolean', inputOptions.sequentialRead); } } // Raw pixel input @@ -45,9 +185,142 @@ function _createInputDescriptor (input, inputOptions, containerOptions) { inputDescriptor.rawWidth = inputOptions.raw.width; inputDescriptor.rawHeight = inputOptions.raw.height; inputDescriptor.rawChannels = inputOptions.raw.channels; + switch (input.constructor) { + case Uint8Array: + case Uint8ClampedArray: + inputDescriptor.rawDepth = 'uchar'; + break; + case Int8Array: + inputDescriptor.rawDepth = 'char'; + break; + case Uint16Array: + inputDescriptor.rawDepth = 'ushort'; + break; + case Int16Array: + inputDescriptor.rawDepth = 'short'; + break; + case Uint32Array: + inputDescriptor.rawDepth = 'uint'; + break; + case Int32Array: + inputDescriptor.rawDepth = 'int'; + break; + case Float32Array: + inputDescriptor.rawDepth = 'float'; + break; + case Float64Array: + inputDescriptor.rawDepth = 'double'; + break; + default: + inputDescriptor.rawDepth = 'uchar'; + break; + } } else { throw new Error('Expected width, height and channels for raw pixel input'); } + inputDescriptor.rawPremultiplied = false; + if (is.defined(inputOptions.raw.premultiplied)) { + if 
(is.bool(inputOptions.raw.premultiplied)) { + inputDescriptor.rawPremultiplied = inputOptions.raw.premultiplied; + } else { + throw is.invalidParameterError('raw.premultiplied', 'boolean', inputOptions.raw.premultiplied); + } + } + inputDescriptor.rawPageHeight = 0; + if (is.defined(inputOptions.raw.pageHeight)) { + if (is.integer(inputOptions.raw.pageHeight) && inputOptions.raw.pageHeight > 0 && inputOptions.raw.pageHeight <= inputOptions.raw.height) { + if (inputOptions.raw.height % inputOptions.raw.pageHeight !== 0) { + throw new Error(`Expected raw.height ${inputOptions.raw.height} to be a multiple of raw.pageHeight ${inputOptions.raw.pageHeight}`); + } + inputDescriptor.rawPageHeight = inputOptions.raw.pageHeight; + } else { + throw is.invalidParameterError('raw.pageHeight', 'positive integer', inputOptions.raw.pageHeight); + } + } + } + // Multi-page input (GIF, TIFF, PDF) + if (is.defined(inputOptions.animated)) { + if (is.bool(inputOptions.animated)) { + inputDescriptor.pages = inputOptions.animated ? 
-1 : 1; + } else { + throw is.invalidParameterError('animated', 'boolean', inputOptions.animated); + } + } + if (is.defined(inputOptions.pages)) { + if (is.integer(inputOptions.pages) && is.inRange(inputOptions.pages, -1, 100000)) { + inputDescriptor.pages = inputOptions.pages; + } else { + throw is.invalidParameterError('pages', 'integer between -1 and 100000', inputOptions.pages); + } + } + if (is.defined(inputOptions.page)) { + if (is.integer(inputOptions.page) && is.inRange(inputOptions.page, 0, 100000)) { + inputDescriptor.page = inputOptions.page; + } else { + throw is.invalidParameterError('page', 'integer between 0 and 100000', inputOptions.page); + } + } + // OpenSlide specific options + if (is.object(inputOptions.openSlide) && is.defined(inputOptions.openSlide.level)) { + if (is.integer(inputOptions.openSlide.level) && is.inRange(inputOptions.openSlide.level, 0, 256)) { + inputDescriptor.openSlideLevel = inputOptions.openSlide.level; + } else { + throw is.invalidParameterError('openSlide.level', 'integer between 0 and 256', inputOptions.openSlide.level); + } + } else if (is.defined(inputOptions.level)) { + // Deprecated + if (is.integer(inputOptions.level) && is.inRange(inputOptions.level, 0, 256)) { + inputDescriptor.openSlideLevel = inputOptions.level; + } else { + throw is.invalidParameterError('level', 'integer between 0 and 256', inputOptions.level); + } + } + // TIFF specific options + if (is.object(inputOptions.tiff) && is.defined(inputOptions.tiff.subifd)) { + if (is.integer(inputOptions.tiff.subifd) && is.inRange(inputOptions.tiff.subifd, -1, 100000)) { + inputDescriptor.tiffSubifd = inputOptions.tiff.subifd; + } else { + throw is.invalidParameterError('tiff.subifd', 'integer between -1 and 100000', inputOptions.tiff.subifd); + } + } else if (is.defined(inputOptions.subifd)) { + // Deprecated + if (is.integer(inputOptions.subifd) && is.inRange(inputOptions.subifd, -1, 100000)) { + inputDescriptor.tiffSubifd = inputOptions.subifd; + } else { + 
throw is.invalidParameterError('subifd', 'integer between -1 and 100000', inputOptions.subifd); + } + } + // SVG specific options + if (is.object(inputOptions.svg)) { + if (is.defined(inputOptions.svg.stylesheet)) { + if (is.string(inputOptions.svg.stylesheet)) { + inputDescriptor.svgStylesheet = inputOptions.svg.stylesheet; + } else { + throw is.invalidParameterError('svg.stylesheet', 'string', inputOptions.svg.stylesheet); + } + } + if (is.defined(inputOptions.svg.highBitdepth)) { + if (is.bool(inputOptions.svg.highBitdepth)) { + inputDescriptor.svgHighBitdepth = inputOptions.svg.highBitdepth; + } else { + throw is.invalidParameterError('svg.highBitdepth', 'boolean', inputOptions.svg.highBitdepth); + } + } + } + // PDF specific options + if (is.object(inputOptions.pdf) && is.defined(inputOptions.pdf.background)) { + inputDescriptor.pdfBackground = this._getBackgroundColourOption(inputOptions.pdf.background); + } else if (is.defined(inputOptions.pdfBackground)) { + // Deprecated + inputDescriptor.pdfBackground = this._getBackgroundColourOption(inputOptions.pdfBackground); + } + // JPEG 2000 specific options + if (is.object(inputOptions.jp2) && is.defined(inputOptions.jp2.oneshot)) { + if (is.bool(inputOptions.jp2.oneshot)) { + inputDescriptor.jp2Oneshot = inputOptions.jp2.oneshot; + } else { + throw is.invalidParameterError('jp2.oneshot', 'boolean', inputOptions.jp2.oneshot); + } } // Create new image if (is.defined(inputOptions.create)) { @@ -55,26 +328,192 @@ function _createInputDescriptor (input, inputOptions, containerOptions) { is.object(inputOptions.create) && is.integer(inputOptions.create.width) && inputOptions.create.width > 0 && is.integer(inputOptions.create.height) && inputOptions.create.height > 0 && - is.integer(inputOptions.create.channels) && is.inRange(inputOptions.create.channels, 3, 4) && - is.defined(inputOptions.create.background) + is.integer(inputOptions.create.channels) ) { inputDescriptor.createWidth = inputOptions.create.width; 
inputDescriptor.createHeight = inputOptions.create.height; inputDescriptor.createChannels = inputOptions.create.channels; - const background = color(inputOptions.create.background); - inputDescriptor.createBackground = [ - background.red(), - background.green(), - background.blue(), - Math.round(background.alpha() * 255) - ]; + inputDescriptor.createPageHeight = 0; + if (is.defined(inputOptions.create.pageHeight)) { + if (is.integer(inputOptions.create.pageHeight) && inputOptions.create.pageHeight > 0 && inputOptions.create.pageHeight <= inputOptions.create.height) { + if (inputOptions.create.height % inputOptions.create.pageHeight !== 0) { + throw new Error(`Expected create.height ${inputOptions.create.height} to be a multiple of create.pageHeight ${inputOptions.create.pageHeight}`); + } + inputDescriptor.createPageHeight = inputOptions.create.pageHeight; + } else { + throw is.invalidParameterError('create.pageHeight', 'positive integer', inputOptions.create.pageHeight); + } + } + // Noise + if (is.defined(inputOptions.create.noise)) { + if (!is.object(inputOptions.create.noise)) { + throw new Error('Expected noise to be an object'); + } + if (inputOptions.create.noise.type !== 'gaussian') { + throw new Error('Only gaussian noise is supported at the moment'); + } + inputDescriptor.createNoiseType = inputOptions.create.noise.type; + if (!is.inRange(inputOptions.create.channels, 1, 4)) { + throw is.invalidParameterError('create.channels', 'number between 1 and 4', inputOptions.create.channels); + } + inputDescriptor.createNoiseMean = 128; + if (is.defined(inputOptions.create.noise.mean)) { + if (is.number(inputOptions.create.noise.mean) && is.inRange(inputOptions.create.noise.mean, 0, 10000)) { + inputDescriptor.createNoiseMean = inputOptions.create.noise.mean; + } else { + throw is.invalidParameterError('create.noise.mean', 'number between 0 and 10000', inputOptions.create.noise.mean); + } + } + inputDescriptor.createNoiseSigma = 30; + if 
(is.defined(inputOptions.create.noise.sigma)) { + if (is.number(inputOptions.create.noise.sigma) && is.inRange(inputOptions.create.noise.sigma, 0, 10000)) { + inputDescriptor.createNoiseSigma = inputOptions.create.noise.sigma; + } else { + throw is.invalidParameterError('create.noise.sigma', 'number between 0 and 10000', inputOptions.create.noise.sigma); + } + } + } else if (is.defined(inputOptions.create.background)) { + if (!is.inRange(inputOptions.create.channels, 3, 4)) { + throw is.invalidParameterError('create.channels', 'number between 3 and 4', inputOptions.create.channels); + } + inputDescriptor.createBackground = this._getBackgroundColourOption(inputOptions.create.background); + } else { + throw new Error('Expected valid noise or background to create a new input image'); + } delete inputDescriptor.buffer; } else { - throw new Error('Expected width, height, channels and background to create a new input image'); + throw new Error('Expected valid width, height and channels to create a new input image'); + } + } + // Create a new image with text + if (is.defined(inputOptions.text)) { + if (is.object(inputOptions.text) && is.string(inputOptions.text.text)) { + inputDescriptor.textValue = inputOptions.text.text; + if (is.defined(inputOptions.text.height) && is.defined(inputOptions.text.dpi)) { + throw new Error('Expected only one of dpi or height'); + } + if (is.defined(inputOptions.text.font)) { + if (is.string(inputOptions.text.font)) { + inputDescriptor.textFont = inputOptions.text.font; + } else { + throw is.invalidParameterError('text.font', 'string', inputOptions.text.font); + } + } + if (is.defined(inputOptions.text.fontfile)) { + if (is.string(inputOptions.text.fontfile)) { + inputDescriptor.textFontfile = inputOptions.text.fontfile; + } else { + throw is.invalidParameterError('text.fontfile', 'string', inputOptions.text.fontfile); + } + } + if (is.defined(inputOptions.text.width)) { + if (is.integer(inputOptions.text.width) && inputOptions.text.width > 
0) { + inputDescriptor.textWidth = inputOptions.text.width; + } else { + throw is.invalidParameterError('text.width', 'positive integer', inputOptions.text.width); + } + } + if (is.defined(inputOptions.text.height)) { + if (is.integer(inputOptions.text.height) && inputOptions.text.height > 0) { + inputDescriptor.textHeight = inputOptions.text.height; + } else { + throw is.invalidParameterError('text.height', 'positive integer', inputOptions.text.height); + } + } + if (is.defined(inputOptions.text.align)) { + if (is.string(inputOptions.text.align) && is.string(this.constructor.align[inputOptions.text.align])) { + inputDescriptor.textAlign = this.constructor.align[inputOptions.text.align]; + } else { + throw is.invalidParameterError('text.align', 'valid alignment', inputOptions.text.align); + } + } + if (is.defined(inputOptions.text.justify)) { + if (is.bool(inputOptions.text.justify)) { + inputDescriptor.textJustify = inputOptions.text.justify; + } else { + throw is.invalidParameterError('text.justify', 'boolean', inputOptions.text.justify); + } + } + if (is.defined(inputOptions.text.dpi)) { + if (is.integer(inputOptions.text.dpi) && is.inRange(inputOptions.text.dpi, 1, 1000000)) { + inputDescriptor.textDpi = inputOptions.text.dpi; + } else { + throw is.invalidParameterError('text.dpi', 'integer between 1 and 1000000', inputOptions.text.dpi); + } + } + if (is.defined(inputOptions.text.rgba)) { + if (is.bool(inputOptions.text.rgba)) { + inputDescriptor.textRgba = inputOptions.text.rgba; + } else { + throw is.invalidParameterError('text.rgba', 'bool', inputOptions.text.rgba); + } + } + if (is.defined(inputOptions.text.spacing)) { + if (is.integer(inputOptions.text.spacing) && is.inRange(inputOptions.text.spacing, -1000000, 1000000)) { + inputDescriptor.textSpacing = inputOptions.text.spacing; + } else { + throw is.invalidParameterError('text.spacing', 'integer between -1000000 and 1000000', inputOptions.text.spacing); + } + } + if (is.defined(inputOptions.text.wrap)) 
{ + if (is.string(inputOptions.text.wrap) && is.inArray(inputOptions.text.wrap, ['word', 'char', 'word-char', 'none'])) { + inputDescriptor.textWrap = inputOptions.text.wrap; + } else { + throw is.invalidParameterError('text.wrap', 'one of: word, char, word-char, none', inputOptions.text.wrap); + } + } + delete inputDescriptor.buffer; + } else { + throw new Error('Expected a valid string to create an image with text.'); + } + } + // Join images together + if (is.defined(inputOptions.join)) { + if (is.defined(this.options.join)) { + if (is.defined(inputOptions.join.animated)) { + if (is.bool(inputOptions.join.animated)) { + inputDescriptor.joinAnimated = inputOptions.join.animated; + } else { + throw is.invalidParameterError('join.animated', 'boolean', inputOptions.join.animated); + } + } + if (is.defined(inputOptions.join.across)) { + if (is.integer(inputOptions.join.across) && is.inRange(inputOptions.join.across, 1, 1000000)) { + inputDescriptor.joinAcross = inputOptions.join.across; + } else { + throw is.invalidParameterError('join.across', 'integer between 1 and 1000000', inputOptions.join.across); + } + } + if (is.defined(inputOptions.join.shim)) { + if (is.integer(inputOptions.join.shim) && is.inRange(inputOptions.join.shim, 0, 1000000)) { + inputDescriptor.joinShim = inputOptions.join.shim; + } else { + throw is.invalidParameterError('join.shim', 'integer between 0 and 1000000', inputOptions.join.shim); + } + } + if (is.defined(inputOptions.join.background)) { + inputDescriptor.joinBackground = this._getBackgroundColourOption(inputOptions.join.background); + } + if (is.defined(inputOptions.join.halign)) { + if (is.string(inputOptions.join.halign) && is.string(this.constructor.align[inputOptions.join.halign])) { + inputDescriptor.joinHalign = this.constructor.align[inputOptions.join.halign]; + } else { + throw is.invalidParameterError('join.halign', 'valid alignment', inputOptions.join.halign); + } + } + if (is.defined(inputOptions.join.valign)) { + if 
(is.string(inputOptions.join.valign) && is.string(this.constructor.align[inputOptions.join.valign])) { + inputDescriptor.joinValign = this.constructor.align[inputOptions.join.valign]; + } else { + throw is.invalidParameterError('join.valign', 'valid alignment', inputOptions.join.valign); + } + } + } else { + throw new Error('Expected input to be an array of images to join'); } } } else if (is.defined(inputOptions)) { - throw new Error('Invalid input options ' + inputOptions); + throw new Error(`Invalid input options ${inputOptions}`); } return inputDescriptor; } @@ -83,18 +522,15 @@ function _createInputDescriptor (input, inputOptions, containerOptions) { * Handle incoming Buffer chunk on Writable Stream. * @private * @param {Buffer} chunk - * @param {String} encoding - unused + * @param {string} encoding - unused * @param {Function} callback */ -function _write (chunk, encoding, callback) { - /* istanbul ignore else */ +function _write (chunk, _encoding, callback) { if (Array.isArray(this.options.input.buffer)) { - /* istanbul ignore else */ if (is.buffer(chunk)) { if (this.options.input.buffer.length === 0) { - const that = this; - this.on('finish', function () { - that.streamInFinished = true; + this.on('finish', () => { + this.streamInFinished = true; }); } this.options.input.buffer.push(chunk); @@ -120,58 +556,60 @@ function _flattenBufferIn () { /** * Are we expecting Stream-based input? * @private - * @returns {Boolean} + * @returns {boolean} */ function _isStreamInput () { return Array.isArray(this.options.input.buffer); } /** - * Take a "snapshot" of the Sharp instance, returning a new instance. - * Cloned instances inherit the input of their parent instance. - * This allows multiple output Streams and therefore multiple processing pipelines to share a single input Stream. + * Fast access to (uncached) image metadata without decoding any compressed pixel data. 
* - * @example - * const pipeline = sharp().rotate(); - * pipeline.clone().resize(800, 600).pipe(firstWritableStream); - * pipeline.clone().extract({ left: 20, top: 20, width: 100, height: 100 }).pipe(secondWritableStream); - * readableStream.pipe(pipeline); - * // firstWritableStream receives auto-rotated, resized readableStream - * // secondWritableStream receives auto-rotated, extracted region of readableStream + * This is read from the header of the input image. + * It does not take into consideration any operations to be applied to the output image, + * such as resize or rotate. * - * @returns {Sharp} - */ -function clone () { - const that = this; - // Clone existing options - const clone = this.constructor.call(); - clone.options = Object.assign({}, this.options); - // Pass 'finish' event to clone for Stream-based input - this.on('finish', function () { - // Clone inherits input data - that._flattenBufferIn(); - clone.options.bufferIn = that.options.bufferIn; - clone.emit('finish'); - }); - return clone; -} - -/** - * Fast access to (uncached) image metadata without decoding any compressed image data. - * A Promises/A+ promise is returned when `callback` is not provided. + * Dimensions in the response will respect the `page` and `pages` properties of the + * {@link /api-constructor/ constructor parameters}. * - * - `format`: Name of decoder used to decompress image data e.g. `jpeg`, `png`, `webp`, `gif`, `svg` - * - `width`: Number of pixels wide - * - `height`: Number of pixels high - * - `space`: Name of colour space interpretation e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://github.com/jcupitt/libvips/blob/master/libvips/iofuncs/enumtypes.c#L636) + * A `Promise` is returned when `callback` is not provided. + * + * - `format`: Name of decoder used to parse image e.g. 
`jpeg`, `png`, `webp`, `gif`, `svg`, `heif`, `tiff` + * - `size`: Total size of image in bytes, for Stream and Buffer input only + * - `width`: Number of pixels wide (EXIF orientation is not taken into consideration, see example below) + * - `height`: Number of pixels high (EXIF orientation is not taken into consideration, see example below) + * - `space`: Name of colour space interpretation e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://www.libvips.org/API/current/enum.Interpretation.html) * - `channels`: Number of bands e.g. `3` for sRGB, `4` for CMYK - * - `depth`: Name of pixel depth format e.g. `uchar`, `char`, `ushort`, `float` [...](https://github.com/jcupitt/libvips/blob/master/libvips/iofuncs/enumtypes.c#L672) + * - `depth`: Name of pixel depth format e.g. `uchar`, `char`, `ushort`, `float` [...](https://www.libvips.org/API/current/enum.BandFormat.html) * - `density`: Number of pixels per inch (DPI), if present + * - `chromaSubsampling`: String containing JPEG chroma subsampling, `4:2:0` or `4:4:4` for RGB, `4:2:0:4` or `4:4:4:4` for CMYK + * - `isProgressive`: Boolean indicating whether the image is interlaced using a progressive scan + * - `isPalette`: Boolean indicating whether the image is palette-based (GIF, PNG). + * - `bitsPerSample`: Number of bits per sample for each channel (GIF, PNG, HEIF). + * - `pages`: Number of pages/frames contained within the image, with support for TIFF, HEIF, PDF, animated GIF and animated WebP + * - `pageHeight`: Number of pixels high each page in a multi-page image will be. + * - `loop`: Number of times to loop an animated image, zero refers to a continuous loop. + * - `delay`: Delay in ms between each page in an animated image, provided as an array of integers. 
+ * - `pagePrimary`: Number of the primary page in a HEIF image + * - `levels`: Details of each level in a multi-level image provided as an array of objects, requires libvips compiled with support for OpenSlide + * - `subifds`: Number of Sub Image File Directories in an OME-TIFF image + * - `background`: Default background colour, if present, for PNG (bKGD) and GIF images + * - `compression`: The encoder used to compress an HEIF file, `av1` (AVIF) or `hevc` (HEIC) + * - `resolutionUnit`: The unit of resolution (density), either `inch` or `cm`, if present * - `hasProfile`: Boolean indicating the presence of an embedded ICC profile * - `hasAlpha`: Boolean indicating the presence of an alpha transparency channel * - `orientation`: Number value of the EXIF Orientation header, if present * - `exif`: Buffer containing raw EXIF data, if present * - `icc`: Buffer containing raw [ICC](https://www.npmjs.com/package/icc) profile data, if present + * - `iptc`: Buffer containing raw IPTC data, if present + * - `xmp`: Buffer containing raw XMP data, if present + * - `xmpAsString`: String containing XMP data, if valid UTF-8. + * - `tifftagPhotoshop`: Buffer containing raw TIFFTAG_PHOTOSHOP data, if present + * - `formatMagick`: String containing format for images loaded via *magick + * - `comments`: Array of keyword/text pairs representing PNG text blocks, if present. + * + * @example + * const metadata = await sharp(input).metadata(); * * @example * const image = sharp(inputJpg); @@ -187,40 +625,62 @@ function clone () { * // data contains a WebP image half the width and height of the original JPEG * }); * + * @example + * // Get dimensions taking EXIF Orientation into account. 
+ * const { autoOrient } = await sharp(input).metadata(); + * const { width, height } = autoOrient; + * * @param {Function} [callback] - called with the arguments `(err, metadata)` * @returns {Promise|Sharp} */ function metadata (callback) { - const that = this; + const stack = Error(); if (is.fn(callback)) { if (this._isStreamInput()) { - this.on('finish', function () { - that._flattenBufferIn(); - sharp.metadata(that.options, callback); + this.on('finish', () => { + this._flattenBufferIn(); + sharp.metadata(this.options, (err, metadata) => { + if (err) { + callback(is.nativeError(err, stack)); + } else { + callback(null, metadata); + } + }); }); } else { - sharp.metadata(this.options, callback); + sharp.metadata(this.options, (err, metadata) => { + if (err) { + callback(is.nativeError(err, stack)); + } else { + callback(null, metadata); + } + }); } return this; } else { if (this._isStreamInput()) { - return new Promise(function (resolve, reject) { - that.on('finish', function () { - that._flattenBufferIn(); - sharp.metadata(that.options, function (err, metadata) { + return new Promise((resolve, reject) => { + const finished = () => { + this._flattenBufferIn(); + sharp.metadata(this.options, (err, metadata) => { if (err) { - reject(err); + reject(is.nativeError(err, stack)); } else { resolve(metadata); } }); - }); + }; + if (this.writableFinished) { + finished(); + } else { + this.once('finish', finished); + } }); } else { - return new Promise(function (resolve, reject) { - sharp.metadata(that.options, function (err, metadata) { + return new Promise((resolve, reject) => { + sharp.metadata(this.options, (err, metadata) => { if (err) { - reject(err); + reject(is.nativeError(err, stack)); } else { resolve(metadata); } @@ -231,56 +691,119 @@ function metadata (callback) { } /** - * Do not process input images where the number of pixels (width * height) exceeds this limit. - * Assumes image dimensions contained in the input metadata can be trusted. 
- * The default limit is 268402689 (0x3FFF * 0x3FFF) pixels. - * @param {(Number|Boolean)} limit - an integral Number of pixels, zero or false to remove limit, true to use default limit. - * @returns {Sharp} - * @throws {Error} Invalid limit -*/ -function limitInputPixels (limit) { - // if we pass in false we represent the integer as 0 to disable - if (limit === false) { - limit = 0; - } else if (limit === true) { - limit = Math.pow(0x3FFF, 2); - } - if (is.integer(limit) && limit >= 0) { - this.options.limitInputPixels = limit; + * Access to pixel-derived image statistics for every channel in the image. + * A `Promise` is returned when `callback` is not provided. + * + * - `channels`: Array of channel statistics for each channel in the image. Each channel statistic contains + * - `min` (minimum value in the channel) + * - `max` (maximum value in the channel) + * - `sum` (sum of all values in a channel) + * - `squaresSum` (sum of squared values in a channel) + * - `mean` (mean of the values in a channel) + * - `stdev` (standard deviation for the values in a channel) + * - `minX` (x-coordinate of one of the pixels where the minimum lies) + * - `minY` (y-coordinate of one of the pixels where the minimum lies) + * - `maxX` (x-coordinate of one of the pixels where the maximum lies) + * - `maxY` (y-coordinate of one of the pixels where the maximum lies) + * - `isOpaque`: Is the image fully opaque? Will be `true` if the image has no alpha channel or if every pixel is fully opaque. + * - `entropy`: Histogram-based estimation of greyscale entropy, discarding alpha channel if any. + * - `sharpness`: Estimation of greyscale sharpness based on the standard deviation of a Laplacian convolution, discarding alpha channel if any. + * - `dominant`: Object containing most dominant sRGB colour based on a 4096-bin 3D histogram. + * + * **Note**: Statistics are derived from the original input image.
Any operations performed on the image must first be + * written to a buffer in order to run `stats` on the result (see third example). + * + * @example + * const image = sharp(inputJpg); + * image + * .stats() + * .then(function(stats) { + * // stats contains the channel-wise statistics array and the isOpaque value + * }); + * + * @example + * const { entropy, sharpness, dominant } = await sharp(input).stats(); + * const { r, g, b } = dominant; + * + * @example + * const image = sharp(input); + * // store intermediate result + * const part = await image.extract(region).toBuffer(); + * // create new instance to obtain statistics of extracted region + * const stats = await sharp(part).stats(); + * + * @param {Function} [callback] - called with the arguments `(err, stats)` + * @returns {Promise} + */ +function stats (callback) { + const stack = Error(); + if (is.fn(callback)) { + if (this._isStreamInput()) { + this.on('finish', () => { + this._flattenBufferIn(); + sharp.stats(this.options, (err, stats) => { + if (err) { + callback(is.nativeError(err, stack)); + } else { + callback(null, stats); + } + }); + }); + } else { + sharp.stats(this.options, (err, stats) => { + if (err) { + callback(is.nativeError(err, stack)); + } else { + callback(null, stats); + } + }); + } + return this; } else { - throw is.invalidParameterError('limitInputPixels', 'integer', limit); + if (this._isStreamInput()) { + return new Promise((resolve, reject) => { + this.on('finish', function () { + this._flattenBufferIn(); + sharp.stats(this.options, (err, stats) => { + if (err) { + reject(is.nativeError(err, stack)); + } else { + resolve(stats); + } + }); + }); + }); + } else { + return new Promise((resolve, reject) => { + sharp.stats(this.options, (err, stats) => { + if (err) { + reject(is.nativeError(err, stack)); + } else { + resolve(stats); + } + }); + }); + } } - return this; -} - -/** - * An advanced setting that switches the libvips access method to `VIPS_ACCESS_SEQUENTIAL`. 
- * This will reduce memory usage and can improve performance on some systems. - * @param {Boolean} [sequentialRead=true] - * @returns {Sharp} - */ -function sequentialRead (sequentialRead) { - this.options.sequentialRead = is.bool(sequentialRead) ? sequentialRead : true; - return this; } /** * Decorate the Sharp prototype with input-related functions. + * @module Sharp * @private */ -module.exports = function (Sharp) { - [ +module.exports = (Sharp) => { + Object.assign(Sharp.prototype, { // Private + _inputOptionsFromObject, _createInputDescriptor, _write, _flattenBufferIn, _isStreamInput, // Public - clone, metadata, - limitInputPixels, - sequentialRead - ].forEach(function (f) { - Sharp.prototype[f.name] = f; + stats }); + // Class attributes + Sharp.align = align; }; diff --git a/lib/is.js b/lib/is.js index eaabc7cf6..3ac9a1a35 100644 --- a/lib/is.js +++ b/lib/is.js @@ -1,119 +1,143 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ /** * Is this value defined and not null? * @private */ -const defined = function (val) { - return typeof val !== 'undefined' && val !== null; -}; +const defined = (val) => typeof val !== 'undefined' && val !== null; /** * Is this value an object? * @private */ -const object = function (val) { - return typeof val === 'object'; -}; +const object = (val) => typeof val === 'object'; /** * Is this value a plain object? * @private */ -const plainObject = function (val) { - return object(val) && Object.prototype.toString.call(val) === '[object Object]'; -}; +const plainObject = (val) => Object.prototype.toString.call(val) === '[object Object]'; /** * Is this value a function? * @private */ -const fn = function (val) { - return typeof val === 'function'; -}; +const fn = (val) => typeof val === 'function'; /** * Is this value a boolean? 
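The rewritten `module.exports` above mixes methods into `Sharp.prototype` with one `Object.assign` of explicit keys, rather than iterating over `f.name` (which is fragile when minifiers mangle function names). A minimal sketch of the decorator pattern, with hypothetical names:

```javascript
// Each lib/*.js module exports a decorator that receives the constructor
// and mixes its methods in; explicit Object.assign keys survive minification.
function Pipeline () {
  this.options = {};
}

function metadata () { return Promise.resolve({ format: 'unknown' }); }
function stats () { return Promise.resolve({ channels: [] }); }

const decorateWithInput = (Pipeline) => {
  Object.assign(Pipeline.prototype, { metadata, stats });
};

decorateWithInput(Pipeline);
console.log(typeof new Pipeline().metadata); // function
```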
* @private */ -const bool = function (val) { - return typeof val === 'boolean'; -}; +const bool = (val) => typeof val === 'boolean'; /** * Is this value a Buffer object? * @private */ -const buffer = function (val) { - return object(val) && val instanceof Buffer; +const buffer = (val) => val instanceof Buffer; + +/** + * Is this value a typed array object, e.g. Uint8Array or Uint8ClampedArray? + * @private + */ +const typedArray = (val) => { + if (defined(val)) { + switch (val.constructor) { + case Uint8Array: + case Uint8ClampedArray: + case Int8Array: + case Uint16Array: + case Int16Array: + case Uint32Array: + case Int32Array: + case Float32Array: + case Float64Array: + return true; + } + } + + return false; }; +/** + * Is this value an ArrayBuffer object? + * @private + */ +const arrayBuffer = (val) => val instanceof ArrayBuffer; + /** * Is this value a non-empty string? * @private */ -const string = function (val) { - return typeof val === 'string' && val.length > 0; -}; +const string = (val) => typeof val === 'string' && val.length > 0; /** * Is this value a real number? * @private */ -const number = function (val) { - return typeof val === 'number' && !Number.isNaN(val); -}; +const number = (val) => typeof val === 'number' && !Number.isNaN(val); /** * Is this value an integer? * @private */ -const integer = function (val) { - return number(val) && val % 1 === 0; -}; +const integer = (val) => Number.isInteger(val); /** * Is this value within an inclusive given range? * @private */ -const inRange = function (val, min, max) { - return val >= min && val <= max; -}; +const inRange = (val, min, max) => val >= min && val <= max; /** * Is this value within the elements of an array? * @private */ -const inArray = function (val, list) { - return list.indexOf(val) !== -1; -}; +const inArray = (val, list) => list.includes(val); /** * Create an Error with a message relating to an invalid parameter. * - * @param {String} name - parameter name.
- * @param {String} expected - description of the type/value/range expected. + * @param {string} name - parameter name. + * @param {string} expected - description of the type/value/range expected. * @param {*} actual - the value received. * @returns {Error} Containing the formatted message. * @private */ -const invalidParameterError = function (name, expected, actual) { - return new Error( +const invalidParameterError = (name, expected, actual) => new Error( `Expected ${expected} for ${name} but received ${actual} of type ${typeof actual}` ); + +/** + * Ensures an Error from C++ contains a JS stack. + * + * @param {Error} native - Error with message from C++. + * @param {Error} context - Error with stack from JS. + * @returns {Error} Error with message and stack. + * @private + */ +const nativeError = (native, context) => { + context.message = native.message; + return context; }; module.exports = { - defined: defined, - object: object, - plainObject: plainObject, - fn: fn, - bool: bool, - buffer: buffer, - string: string, - number: number, - integer: integer, - inRange: inRange, - inArray: inArray, - invalidParameterError: invalidParameterError + defined, + object, + plainObject, + fn, + bool, + buffer, + typedArray, + arrayBuffer, + string, + number, + integer, + inRange, + inArray, + invalidParameterError, + nativeError }; diff --git a/lib/libvips.js b/lib/libvips.js new file mode 100644 index 000000000..881dc5c13 --- /dev/null +++ b/lib/libvips.js @@ -0,0 +1,207 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { spawnSync } = require('node:child_process'); +const { createHash } = require('node:crypto'); +const semverCoerce = require('semver/functions/coerce'); +const semverGreaterThanOrEqualTo = require('semver/functions/gte'); +const semverSatisfies = require('semver/functions/satisfies'); +const detectLibc = require('detect-libc'); + +const { config, engines, optionalDependencies } = require('../package.json'); + +/* node:coverage ignore next */ +const minimumLibvipsVersionLabelled = process.env.npm_package_config_libvips || config.libvips; +const minimumLibvipsVersion = semverCoerce(minimumLibvipsVersionLabelled).version; + +const prebuiltPlatforms = [ + 'darwin-arm64', 'darwin-x64', + 'linux-arm', 'linux-arm64', 'linux-ppc64', 'linux-riscv64', 'linux-s390x', 'linux-x64', + 'linuxmusl-arm64', 'linuxmusl-x64', + 'win32-arm64', 'win32-ia32', 'win32-x64' +]; + +const spawnSyncOptions = { + encoding: 'utf8', + shell: true +}; + +const log = (item) => { + if (item instanceof Error) { + console.error(`sharp: Installation error: ${item.message}`); + } else { + console.log(`sharp: ${item}`); + } +}; + +/* node:coverage ignore next */ +const runtimeLibc = () => detectLibc.isNonGlibcLinuxSync() ? detectLibc.familySync() : ''; + +const runtimePlatformArch = () => `${process.platform}${runtimeLibc()}-${process.arch}`; + +const buildPlatformArch = () => { + /* node:coverage ignore next 3 */ + if (isEmscripten()) { + return 'wasm32'; + } + const { npm_config_arch, npm_config_platform, npm_config_libc } = process.env; + const libc = typeof npm_config_libc === 'string' ? 
npm_config_libc : runtimeLibc(); + return `${npm_config_platform || process.platform}${libc}-${npm_config_arch || process.arch}`; +}; + +const buildSharpLibvipsIncludeDir = () => { + try { + return require(`@img/sharp-libvips-dev-${buildPlatformArch()}/include`); + } catch { + /* node:coverage ignore next 5 */ + try { + return require('@img/sharp-libvips-dev/include'); + } catch {} + } + return ''; +}; + +const buildSharpLibvipsCPlusPlusDir = () => { + /* node:coverage ignore next 4 */ + try { + return require('@img/sharp-libvips-dev/cplusplus'); + } catch {} + return ''; +}; + +const buildSharpLibvipsLibDir = () => { + try { + return require(`@img/sharp-libvips-dev-${buildPlatformArch()}/lib`); + } catch { + /* node:coverage ignore next 5 */ + try { + return require(`@img/sharp-libvips-${buildPlatformArch()}/lib`); + } catch {} + } + return ''; +}; + +/* node:coverage disable */ + +const isUnsupportedNodeRuntime = () => { + if (process.release?.name === 'node' && process.versions) { + if (!semverSatisfies(process.versions.node, engines.node)) { + return { found: process.versions.node, expected: engines.node }; + } + } +}; + +const isEmscripten = () => { + const { CC } = process.env; + return Boolean(CC?.endsWith('/emcc')); +}; + +const isRosetta = () => { + if (process.platform === 'darwin' && process.arch === 'x64') { + const translated = spawnSync('sysctl sysctl.proc_translated', spawnSyncOptions).stdout; + return (translated || '').trim() === 'sysctl.proc_translated: 1'; + } + return false; +}; + +/* node:coverage enable */ + +const sha512 = (s) => createHash('sha512').update(s).digest('hex'); + +const yarnLocator = () => { + try { + const identHash = sha512(`imgsharp-libvips-${buildPlatformArch()}`); + const npmVersion = semverCoerce(optionalDependencies[`@img/sharp-libvips-${buildPlatformArch()}`], { + includePrerelease: true + }).version; + return sha512(`${identHash}npm:${npmVersion}`).slice(0, 10); + } catch {} + return ''; +}; + +/* node:coverage disable 
*/ + +const spawnRebuild = () => + spawnSync(`node-gyp rebuild --directory=src ${isEmscripten() ? '--nodedir=emscripten' : ''}`, { + ...spawnSyncOptions, + stdio: 'inherit' + }).status; + +const globalLibvipsVersion = () => { + if (process.platform !== 'win32') { + const globalLibvipsVersion = spawnSync('pkg-config --modversion vips-cpp', { + ...spawnSyncOptions, + env: { + ...process.env, + PKG_CONFIG_PATH: pkgConfigPath() + } + }).stdout; + return (globalLibvipsVersion || '').trim(); + } else { + return ''; + } +}; + +/* node:coverage enable */ + +const pkgConfigPath = () => { + if (process.platform !== 'win32') { + /* node:coverage ignore next 4 */ + const brewPkgConfigPath = spawnSync( + 'which brew >/dev/null 2>&1 && brew environment --plain | grep PKG_CONFIG_LIBDIR | cut -d" " -f2', + spawnSyncOptions + ).stdout || ''; + return [ + brewPkgConfigPath.trim(), + process.env.PKG_CONFIG_PATH, + '/usr/local/lib/pkgconfig', + '/usr/lib/pkgconfig', + '/usr/local/libdata/pkgconfig', + '/usr/libdata/pkgconfig' + ].filter(Boolean).join(':'); + } else { + return ''; + } +}; + +const skipSearch = (status, reason, logger) => { + if (logger) { + logger(`Detected ${reason}, skipping search for globally-installed libvips`); + } + return status; +}; + +const useGlobalLibvips = (logger) => { + if (Boolean(process.env.SHARP_IGNORE_GLOBAL_LIBVIPS) === true) { + return skipSearch(false, 'SHARP_IGNORE_GLOBAL_LIBVIPS', logger); + } + if (Boolean(process.env.SHARP_FORCE_GLOBAL_LIBVIPS) === true) { + return skipSearch(true, 'SHARP_FORCE_GLOBAL_LIBVIPS', logger); + } + /* node:coverage ignore next 3 */ + if (isRosetta()) { + return skipSearch(false, 'Rosetta', logger); + } + const globalVipsVersion = globalLibvipsVersion(); + /* node:coverage ignore next */ + return !!globalVipsVersion && semverGreaterThanOrEqualTo(globalVipsVersion, minimumLibvipsVersion); +}; + +module.exports = { + minimumLibvipsVersion, + prebuiltPlatforms, + buildPlatformArch, + buildSharpLibvipsIncludeDir, + 
buildSharpLibvipsCPlusPlusDir, + buildSharpLibvipsLibDir, + isUnsupportedNodeRuntime, + runtimePlatformArch, + log, + yarnLocator, + spawnRebuild, + globalLibvipsVersion, + pkgConfigPath, + useGlobalLibvips +}; diff --git a/lib/operation.js b/lib/operation.js index 692b68eb9..ebbf54e9c 100644 --- a/lib/operation.js +++ b/lib/operation.js @@ -1,98 +1,116 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ const is = require('./is'); /** - * Rotate the output image by either an explicit angle - * or auto-orient based on the EXIF `Orientation` tag. + * How accurate an operation should be. + * @member + * @private + */ +const vipsPrecision = { + integer: 'integer', + float: 'float', + approximate: 'approximate' +}; + +/** + * Rotate the output image. * - * If an angle is provided, it is converted to a valid 90/180/270deg rotation. - * For example, `-450` will produce a 270deg rotation. + * The provided angle is converted to a valid positive degree rotation. + * For example, `-450` will produce a 270 degree rotation. * - * If no angle is provided, it is determined from the EXIF data. - * Mirroring is supported and may infer the use of a flip operation. + * When rotating by an angle other than a multiple of 90, + * the background colour can be provided with the `background` option. + * + * For backwards compatibility, if no angle is provided, `.autoOrient()` will be called. * - * The use of `rotate` implies the removal of the EXIF `Orientation` tag, if any. + * Only one rotation can occur per pipeline (aside from an initial call without + * arguments to orient via EXIF data). Previous calls to `rotate` in the same + * pipeline will be ignored. * - * Method order is important when both rotating and extracting regions, - * for example `rotate(x).extract(y)` will produce a different result to `extract(y).rotate(x)`. + * Multi-page images can only be rotated by 180 degrees. 
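The conversion to "a valid positive degree rotation" described above (for example, `-450` becoming `270`) can be sketched in one line; the helper name is ours, and sharp performs the equivalent conversion internally:

```javascript
// Map any integer multiple of 90, positive or negative, onto 0/90/180/270.
const normaliseAngle = (angle) => ((angle % 360) + 360) % 360;

console.log(normaliseAngle(-450)); // 270
console.log(normaliseAngle(450));  // 90
console.log(normaliseAngle(180));  // 180
```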
+ * + * Method order is important when rotating, resizing and/or extracting regions, + * for example `.rotate(x).extract(y)` will produce a different result to `.extract(y).rotate(x)`. * * @example - * const pipeline = sharp() - * .rotate() - * .resize(null, 200) - * .toBuffer(function (err, outputBuffer, info) { - * // outputBuffer contains 200px high JPEG image data, - * // auto-rotated using EXIF Orientation tag - * // info.width and info.height contain the dimensions of the resized image - * }); - * readableStream.pipe(pipeline); + * const rotateThenResize = await sharp(input) + * .rotate(90) + * .resize({ width: 16, height: 8, fit: 'fill' }) + * .toBuffer(); + * const resizeThenRotate = await sharp(input) + * .resize({ width: 16, height: 8, fit: 'fill' }) + * .rotate(90) + * .toBuffer(); * - * @param {Number} [angle=auto] angle of rotation, must be a multiple of 90. + * @param {number} [angle=auto] angle of rotation. + * @param {Object} [options] - if present, is an Object with optional attributes. + * @param {string|Object} [options.background="#000000"] parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. 
* @returns {Sharp} * @throws {Error} Invalid parameters */ -function rotate (angle) { +function rotate (angle, options) { if (!is.defined(angle)) { - this.options.useExifOrientation = true; - } else if (is.integer(angle) && !(angle % 90)) { + return this.autoOrient(); + } + if (this.options.angle || this.options.rotationAngle) { + this.options.debuglog('ignoring previous rotate options'); + this.options.angle = 0; + this.options.rotationAngle = 0; + } + if (is.integer(angle) && !(angle % 90)) { this.options.angle = angle; + } else if (is.number(angle)) { + this.options.rotationAngle = angle; + if (is.object(options) && options.background) { + this._setBackgroundColourOption('rotationBackground', options.background); + } } else { - throw new Error('Unsupported angle: angle must be a positive/negative multiple of 90 ' + angle); + throw is.invalidParameterError('angle', 'numeric', angle); } return this; } /** - * Extract a region of the image. + * Auto-orient based on the EXIF `Orientation` tag, then remove the tag. + * Mirroring is supported and may infer the use of a flip operation. * - * - Use `extract` before `resize` for pre-resize extraction. - * - Use `extract` after `resize` for post-resize extraction. - * - Use `extract` before and after for both. + * Previous or subsequent use of `rotate(angle)` and either `flip()` or `flop()` + * will logically occur after auto-orientation, regardless of call order. * * @example - * sharp(input) - * .extract({ left: left, top: top, width: width, height: height }) - * .toFile(output, function(err) { - * // Extract a region of the input image, saving in the same format. 
- * }); + * const output = await sharp(input).autoOrient().toBuffer(); + * * @example - * sharp(input) - * .extract({ left: leftOffsetPre, top: topOffsetPre, width: widthPre, height: heightPre }) - * .resize(width, height) - * .extract({ left: leftOffsetPost, top: topOffsetPost, width: widthPost, height: heightPost }) - * .toFile(output, function(err) { - * // Extract a region, resize, then extract from the resized image + * const pipeline = sharp() + * .autoOrient() + * .resize(null, 200) + * .toBuffer(function (err, outputBuffer, info) { + * // outputBuffer contains 200px high JPEG image data, + * // auto-oriented using EXIF Orientation tag + * // info.width and info.height contain the dimensions of the resized image * }); + * readableStream.pipe(pipeline); * - * @param {Object} options - * @param {Number} options.left - zero-indexed offset from left edge - * @param {Number} options.top - zero-indexed offset from top edge - * @param {Number} options.width - dimension of extracted image - * @param {Number} options.height - dimension of extracted image * @returns {Sharp} - * @throws {Error} Invalid parameters */ -function extract (options) { - const suffix = this.options.width === -1 && this.options.height === -1 ? 'Pre' : 'Post'; - ['left', 'top', 'width', 'height'].forEach(function (name) { - const value = options[name]; - if (is.integer(value) && value >= 0) { - this.options[name + (name === 'left' || name === 'top' ? 'Offset' : '') + suffix] = value; - } else { - throw new Error('Non-integer value for ' + name + ' of ' + value); - } - }, this); - // Ensure existing rotation occurs before pre-resize extraction - if (suffix === 'Pre' && ((this.options.angle % 360) !== 0 || this.options.useExifOrientation === true)) { - this.options.rotateBeforePreExtract = true; - } +function autoOrient () { + this.options.input.autoOrient = true; return this; } /** - * Flip the image about the vertical Y axis. This always occurs after rotation, if any. 
- * The use of `flip` implies the removal of the EXIF `Orientation` tag, if any. + * Mirror the image vertically (up-down) about the x-axis. + * This always occurs before rotation, if any. + * + * This operation does not work correctly with multi-page images. + * + * @example + * const output = await sharp(input).flip().toBuffer(); + * * @param {Boolean} [flip=true] * @returns {Sharp} */ @@ -102,8 +120,12 @@ function flip (flip) { } /** - * Flop the image about the horizontal X axis. This always occurs after rotation, if any. - * The use of `flop` implies the removal of the EXIF `Orientation` tag, if any. + * Mirror the image horizontally (left-right) about the y-axis. + * This always occurs before rotation, if any. + * + * @example + * const output = await sharp(input).flop().toBuffer(); + * * @param {Boolean} [flop=true] * @returns {Sharp} */ @@ -112,194 +134,576 @@ function flop (flop) { return this; } +/** + * Perform an affine transform on an image. This operation will always occur after resizing, extraction and rotation, if any. + * + * You must provide an array of length 4 or a 2x2 affine transformation matrix. + * By default, new pixels are filled with a black background. You can provide a background colour with the `background` option. + * A particular interpolator may also be specified. Set the `interpolator` option to an attribute of the `sharp.interpolators` Object e.g. `sharp.interpolators.nohalo`. + * + * In the case of a 2x2 matrix, the transform is: + * - X = `matrix[0, 0]` \* (x + `idx`) + `matrix[0, 1]` \* (y + `idy`) + `odx` + * - Y = `matrix[1, 0]` \* (x + `idx`) + `matrix[1, 1]` \* (y + `idy`) + `ody` + * + * where: + * - x and y are the coordinates in input image. + * - X and Y are the coordinates in output image. + * - (0,0) is the upper left corner. 
+ * + * @since 0.27.0 + * + * @example + * const pipeline = sharp() + * .affine([[1, 0.3], [0.1, 0.7]], { + * background: 'white', + * interpolator: sharp.interpolators.nohalo + * }) + * .toBuffer((err, outputBuffer, info) => { + * // outputBuffer contains the transformed image + * // info.width and info.height contain the new dimensions + * }); + * + * inputStream + * .pipe(pipeline); + * + * @param {Array<Array<number>>|Array<number>} matrix - affine transformation matrix + * @param {Object} [options] - if present, is an Object with optional attributes. + * @param {String|Object} [options.background="#000000"] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. + * @param {Number} [options.idx=0] - input horizontal offset + * @param {Number} [options.idy=0] - input vertical offset + * @param {Number} [options.odx=0] - output horizontal offset + * @param {Number} [options.ody=0] - output vertical offset + * @param {String} [options.interpolator=sharp.interpolators.bicubic] - interpolator + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function affine (matrix, options) { + const flatMatrix = [].concat(...matrix); + if (flatMatrix.length === 4 && flatMatrix.every(is.number)) { + this.options.affineMatrix = flatMatrix; + } else { + throw is.invalidParameterError('matrix', '1x4 or 2x2 array', matrix); + } + + if (is.defined(options)) { + if (is.object(options)) { + this._setBackgroundColourOption('affineBackground', options.background); + if (is.defined(options.idx)) { + if (is.number(options.idx)) { + this.options.affineIdx = options.idx; + } else { + throw is.invalidParameterError('options.idx', 'number', options.idx); + } + } + if (is.defined(options.idy)) { + if (is.number(options.idy)) { + this.options.affineIdy = options.idy; + } else { + throw is.invalidParameterError('options.idy', 'number', options.idy); + } + } + if (is.defined(options.odx)) { + if (is.number(options.odx)) {
this.options.affineOdx = options.odx; + } else { + throw is.invalidParameterError('options.odx', 'number', options.odx); + } + } + if (is.defined(options.ody)) { + if (is.number(options.ody)) { + this.options.affineOdy = options.ody; + } else { + throw is.invalidParameterError('options.ody', 'number', options.ody); + } + } + if (is.defined(options.interpolator)) { + if (is.inArray(options.interpolator, Object.values(this.constructor.interpolators))) { + this.options.affineInterpolator = options.interpolator; + } else { + throw is.invalidParameterError('options.interpolator', 'valid interpolator name', options.interpolator); + } + } + } else { + throw is.invalidParameterError('options', 'object', options); + } + } + + return this; +} + /** * Sharpen the image. + * * When used without parameters, performs a fast, mild sharpen of the output image. + * * When a `sigma` is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space. - * Separate control over the level of sharpening in "flat" and "jagged" areas is available. + * Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available. + * + * See {@link https://www.libvips.org/API/current/method.Image.sharpen.html libvips sharpen} operation. + * + * @example + * const data = await sharp(input).sharpen().toBuffer(); + * + * @example + * const data = await sharp(input).sharpen({ sigma: 2 }).toBuffer(); * - * @param {Number} [sigma] - the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`. - * @param {Number} [flat=1.0] - the level of sharpening to apply to "flat" areas. - * @param {Number} [jagged=2.0] - the level of sharpening to apply to "jagged" areas. 
+ * @example + * const data = await sharp(input) + * .sharpen({ + * sigma: 2, + * m1: 0, + * m2: 3, + * x1: 3, + * y2: 15, + * y3: 15, + * }) + * .toBuffer(); + * + * @param {Object|number} [options] - if present, is an Object with attributes + * @param {number} [options.sigma] - the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10 + * @param {number} [options.m1=1.0] - the level of sharpening to apply to "flat" areas, between 0 and 1000000 + * @param {number} [options.m2=2.0] - the level of sharpening to apply to "jagged" areas, between 0 and 1000000 + * @param {number} [options.x1=2.0] - threshold between "flat" and "jagged", between 0 and 1000000 + * @param {number} [options.y2=10.0] - maximum amount of brightening, between 0 and 1000000 + * @param {number} [options.y3=20.0] - maximum amount of darkening, between 0 and 1000000 + * @param {number} [flat] - (deprecated) see `options.m1`. + * @param {number} [jagged] - (deprecated) see `options.m2`. * @returns {Sharp} * @throws {Error} Invalid parameters */ -function sharpen (sigma, flat, jagged) { - if (!is.defined(sigma)) { +function sharpen (options, flat, jagged) { + if (!is.defined(options)) { // No arguments: default to mild sharpen this.options.sharpenSigma = -1; - } else if (is.bool(sigma)) { - // Boolean argument: apply mild sharpen? - this.options.sharpenSigma = sigma ? -1 : 0; - } else if (is.number(sigma) && is.inRange(sigma, 0.01, 10000)) { - // Numeric argument: specific sigma - this.options.sharpenSigma = sigma; - // Control over flat areas + } else if (is.bool(options)) { + // Deprecated boolean argument: apply mild sharpen? + this.options.sharpenSigma = options ? 
-1 : 0; + } else if (is.number(options) && is.inRange(options, 0.01, 10000)) { + // Deprecated numeric argument: specific sigma + this.options.sharpenSigma = options; + // Deprecated control over flat areas if (is.defined(flat)) { if (is.number(flat) && is.inRange(flat, 0, 10000)) { - this.options.sharpenFlat = flat; + this.options.sharpenM1 = flat; } else { - throw new Error('Invalid sharpen level for flat areas (0.0 - 10000.0) ' + flat); + throw is.invalidParameterError('flat', 'number between 0 and 10000', flat); } } - // Control over jagged areas + // Deprecated control over jagged areas if (is.defined(jagged)) { if (is.number(jagged) && is.inRange(jagged, 0, 10000)) { - this.options.sharpenJagged = jagged; + this.options.sharpenM2 = jagged; } else { - throw new Error('Invalid sharpen level for jagged areas (0.0 - 10000.0) ' + jagged); + throw is.invalidParameterError('jagged', 'number between 0 and 10000', jagged); + } + } + } else if (is.plainObject(options)) { + if (is.number(options.sigma) && is.inRange(options.sigma, 0.000001, 10)) { + this.options.sharpenSigma = options.sigma; + } else { + throw is.invalidParameterError('options.sigma', 'number between 0.000001 and 10', options.sigma); + } + if (is.defined(options.m1)) { + if (is.number(options.m1) && is.inRange(options.m1, 0, 1000000)) { + this.options.sharpenM1 = options.m1; + } else { + throw is.invalidParameterError('options.m1', 'number between 0 and 1000000', options.m1); + } + } + if (is.defined(options.m2)) { + if (is.number(options.m2) && is.inRange(options.m2, 0, 1000000)) { + this.options.sharpenM2 = options.m2; + } else { + throw is.invalidParameterError('options.m2', 'number between 0 and 1000000', options.m2); + } + } + if (is.defined(options.x1)) { + if (is.number(options.x1) && is.inRange(options.x1, 0, 1000000)) { + this.options.sharpenX1 = options.x1; + } else { + throw is.invalidParameterError('options.x1', 'number between 0 and 1000000', options.x1); + } + } + if 
(is.defined(options.y2)) { + if (is.number(options.y2) && is.inRange(options.y2, 0, 1000000)) { + this.options.sharpenY2 = options.y2; + } else { + throw is.invalidParameterError('options.y2', 'number between 0 and 1000000', options.y2); + } + } + if (is.defined(options.y3)) { + if (is.number(options.y3) && is.inRange(options.y3, 0, 1000000)) { + this.options.sharpenY3 = options.y3; + } else { + throw is.invalidParameterError('options.y3', 'number between 0 and 1000000', options.y3); } } } else { - throw new Error('Invalid sharpen sigma (0.01 - 10000) ' + sigma); + throw is.invalidParameterError('sigma', 'number between 0.01 and 10000', options); + } + return this; +} + +/** + * Apply median filter. + * When used without parameters the default window is 3x3. + * + * @example + * const output = await sharp(input).median().toBuffer(); + * + * @example + * const output = await sharp(input).median(5).toBuffer(); + * + * @param {number} [size=3] square mask size: size x size + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function median (size) { + if (!is.defined(size)) { + // No arguments: default to 3x3 + this.options.medianSize = 3; + } else if (is.integer(size) && is.inRange(size, 1, 1000)) { + // Numeric argument: specific sigma + this.options.medianSize = size; + } else { + throw is.invalidParameterError('size', 'integer between 1 and 1000', size); } return this; } /** * Blur the image. - * When used without parameters, performs a fast, mild blur of the output image. + * + * When used without parameters, performs a fast 3x3 box blur (equivalent to a box linear filter). + * * When a `sigma` is provided, performs a slower, more accurate Gaussian blur. - * @param {Number} [sigma] a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`. 
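The reworked `blur()` accepts either a bare sigma or an options object, validating `precision` by membership in the `vipsPrecision` lookup table. A simplified sketch of that accept-number-or-object pattern (plain JavaScript, not the actual method; it omits the boolean and no-argument branches):

```javascript
// Normalise blur() style input: a bare number becomes { sigma },
// and precision must be a key of a fixed lookup table.
const vipsPrecision = { integer: 'integer', float: 'float', approximate: 'approximate' };

const parseBlurOptions = (options) => {
  const opts = typeof options === 'number' ? { sigma: options } : { ...options };
  if (typeof opts.sigma !== 'number' || opts.sigma < 0.3 || opts.sigma > 1000) {
    throw new Error(`Expected number between 0.3 and 1000 for options.sigma but received ${opts.sigma}`);
  }
  if ('precision' in opts && typeof vipsPrecision[opts.precision] !== 'string') {
    throw new Error('Expected one of: integer, float, approximate for precision');
  }
  return opts;
};

console.log(parseBlurOptions(5)); // { sigma: 5 }
console.log(parseBlurOptions({ sigma: 2, precision: 'approximate' }).precision); // approximate
```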
+ * + * @example + * const boxBlurred = await sharp(input) + * .blur() + * .toBuffer(); + * + * @example + * const gaussianBlurred = await sharp(input) + * .blur(5) + * .toBuffer(); + * + * @param {Object|number|Boolean} [options] + * @param {number} [options.sigma] a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`. + * @param {string} [options.precision='integer'] How accurate the operation should be, one of: integer, float, approximate. + * @param {number} [options.minAmplitude=0.2] A value between 0.001 and 1. A smaller value will generate a larger, more accurate mask. * @returns {Sharp} * @throws {Error} Invalid parameters */ -function blur (sigma) { - if (!is.defined(sigma)) { +function blur (options) { + let sigma; + if (is.number(options)) { + sigma = options; + } else if (is.plainObject(options)) { + if (!is.number(options.sigma)) { + throw is.invalidParameterError('options.sigma', 'number between 0.3 and 1000', options.sigma); + } + sigma = options.sigma; + if ('precision' in options) { + if (is.string(vipsPrecision[options.precision])) { + this.options.precision = vipsPrecision[options.precision]; + } else { + throw is.invalidParameterError('precision', 'one of: integer, float, approximate', options.precision); + } + } + if ('minAmplitude' in options) { + if (is.number(options.minAmplitude) && is.inRange(options.minAmplitude, 0.001, 1)) { + this.options.minAmpl = options.minAmplitude; + } else { + throw is.invalidParameterError('minAmplitude', 'number between 0.001 and 1', options.minAmplitude); + } + } + } + + if (!is.defined(options)) { // No arguments: default to mild blur this.options.blurSigma = -1; - } else if (is.bool(sigma)) { + } else if (is.bool(options)) { // Boolean argument: apply mild blur? - this.options.blurSigma = sigma ? -1 : 0; + this.options.blurSigma = options ?
-1 : 0; } else if (is.number(sigma) && is.inRange(sigma, 0.3, 1000)) { // Numeric argument: specific sigma this.options.blurSigma = sigma; } else { - throw new Error('Invalid blur sigma (0.3 - 1000.0) ' + sigma); + throw is.invalidParameterError('sigma', 'number between 0.3 and 1000', sigma); } + return this; } /** - * Extends/pads the edges of the image with the colour provided to the `background` method. - * This operation will always occur after resizing and extraction, if any. + * Expand foreground objects using the dilate morphological operator. * * @example - * // Resize to 140 pixels wide, then add 10 transparent pixels - * // to the top, left and right edges and 20 to the bottom edge - * sharp(input) - * .resize(140) - * .background({r: 0, g: 0, b: 0, alpha: 0}) - * .extend({top: 10, bottom: 20, left: 10, right: 10}) - * ... - * - * @param {(Number|Object)} extend - single pixel count to add to all edges or an Object with per-edge counts - * @param {Number} [extend.top] - * @param {Number} [extend.left] - * @param {Number} [extend.bottom] - * @param {Number} [extend.right] + * const output = await sharp(input) + * .dilate() + * .toBuffer(); + * + * @param {Number} [width=1] dilation width in pixels. 
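To illustrate conceptually what the new `dilate` operator does, here is a hypothetical one-dimensional sketch on a binary mask (sharp performs the real operation in libvips on 2D images; `dilate1d` exists only for illustration):

```javascript
// Hypothetical 1-D dilation: a pixel becomes foreground (1) if any
// neighbour within `width` pixels is foreground. Erosion is the dual:
// a pixel stays 1 only if all neighbours within `width` are 1.
function dilate1d (mask, width = 1) {
  return mask.map((_, i) =>
    mask.slice(Math.max(0, i - width), i + width + 1).some((v) => v === 1) ? 1 : 0
  );
}
```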
* @returns {Sharp} * @throws {Error} Invalid parameters -*/ -function extend (extend) { - if (is.integer(extend) && extend > 0) { - this.options.extendTop = extend; - this.options.extendBottom = extend; - this.options.extendLeft = extend; - this.options.extendRight = extend; - } else if ( - is.object(extend) && - is.integer(extend.top) && extend.top >= 0 && - is.integer(extend.bottom) && extend.bottom >= 0 && - is.integer(extend.left) && extend.left >= 0 && - is.integer(extend.right) && extend.right >= 0 - ) { - this.options.extendTop = extend.top; - this.options.extendBottom = extend.bottom; - this.options.extendLeft = extend.left; - this.options.extendRight = extend.right; + */ +function dilate (width) { + if (!is.defined(width)) { + this.options.dilateWidth = 1; + } else if (is.integer(width) && width > 0) { + this.options.dilateWidth = width; } else { - throw new Error('Invalid edge extension ' + extend); + throw is.invalidParameterError('width', 'positive integer', width); } return this; } /** - * Merge alpha transparency channel, if any, with `background`. - * @param {Boolean} [flatten=true] + * Shrink foreground objects using the erode morphological operator. + * + * @example + * const output = await sharp(input) + * .erode() + * .toBuffer(); + * + * @param {Number} [width=1] erosion width in pixels. * @returns {Sharp} + * @throws {Error} Invalid parameters */ -function flatten (flatten) { - this.options.flatten = is.bool(flatten) ? flatten : true; +function erode (width) { + if (!is.defined(width)) { + this.options.erodeWidth = 1; + } else if (is.integer(width) && width > 0) { + this.options.erodeWidth = width; + } else { + throw is.invalidParameterError('width', 'positive integer', width); + } return this; } /** - * Trim "boring" pixels from all edges that contain values within a percentage similarity of the top-left pixel. - * @param {Number} [tolerance=10] value between 1 and 99 representing the percentage similarity.
+ * Merge alpha transparency channel, if any, with a background, then remove the alpha channel. + * + * See also {@link /api-channel#removealpha removeAlpha}. + * + * @example + * await sharp(rgbaInput) + * .flatten({ background: '#F0A703' }) + * .toBuffer(); + * + * @param {Object} [options] + * @param {string|Object} [options.background={r: 0, g: 0, b: 0}] - background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black. * @returns {Sharp} - * @throws {Error} Invalid parameters */ -function trim (tolerance) { - if (!is.defined(tolerance)) { - this.options.trimTolerance = 10; - } else if (is.integer(tolerance) && is.inRange(tolerance, 1, 99)) { - this.options.trimTolerance = tolerance; - } else { - throw new Error('Invalid trim tolerance (1 to 99) ' + tolerance); +function flatten (options) { + this.options.flatten = is.bool(options) ? options : true; + if (is.object(options)) { + this._setBackgroundColourOption('flattenBackground', options.background); } return this; } +/** + * Ensure the image has an alpha channel + * with all white pixel values made fully transparent. + * + * Existing alpha channel values for non-white pixels remain unchanged. + * + * This feature is experimental and the API may change. + * + * @since 0.32.1 + * + * @example + * await sharp(rgbInput) + * .unflatten() + * .toBuffer(); + * + * @example + * await sharp(rgbInput) + * .threshold(128, { grayscale: false }) // convert bright pixels to white + * .unflatten() + * .toBuffer(); + */ +function unflatten () { + this.options.unflatten = true; + return this; +} + /** * Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of `1/gamma` * then increasing the encoding (brighten) post-resize at a factor of `gamma`. * This can improve the perceived brightness of a resized image in non-linear colour spaces.
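The updated `flatten` merges any alpha channel with a background colour before removing it. A hypothetical plain-JavaScript sketch of the per-pixel maths (sharp does this inside libvips; `flattenPixel` is not a real API):

```javascript
// Hypothetical per-pixel "flatten": composite an RGBA pixel over an
// opaque RGB background, then drop the alpha channel.
function flattenPixel ([r, g, b, a], [br, bg, bb]) {
  const alpha = a / 255;
  const blend = (fg, bk) => Math.round(alpha * fg + (1 - alpha) * bk);
  return [blend(r, br), blend(g, bg), blend(b, bb)];
}
```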
* JPEG and WebP input images will not take advantage of the shrink-on-load performance optimisation * when applying a gamma correction. - * @param {Number} [gamma=2.2] value between 1.0 and 3.0. + * + * Supply a second argument to use a different output gamma value, otherwise the first value is used in both cases. + * + * @param {number} [gamma=2.2] value between 1.0 and 3.0. + * @param {number} [gammaOut] value between 1.0 and 3.0. (optional, defaults to same as `gamma`) * @returns {Sharp} * @throws {Error} Invalid parameters */ -function gamma (gamma) { +function gamma (gamma, gammaOut) { if (!is.defined(gamma)) { // Default gamma correction of 2.2 (sRGB) this.options.gamma = 2.2; } else if (is.number(gamma) && is.inRange(gamma, 1, 3)) { this.options.gamma = gamma; } else { - throw new Error('Invalid gamma correction (1.0 to 3.0) ' + gamma); + throw is.invalidParameterError('gamma', 'number between 1.0 and 3.0', gamma); + } + if (!is.defined(gammaOut)) { + // Default gamma correction for output is same as input + this.options.gammaOut = this.options.gamma; + } else if (is.number(gammaOut) && is.inRange(gammaOut, 1, 3)) { + this.options.gammaOut = gammaOut; + } else { + throw is.invalidParameterError('gammaOut', 'number between 1.0 and 3.0', gammaOut); } return this; } /** * Produce the "negative" of the image. - * @param {Boolean} [negate=true] + * + * @example + * const output = await sharp(input) + * .negate() + * .toBuffer(); + * + * @example + * const output = await sharp(input) + * .negate({ alpha: false }) + * .toBuffer(); + * + * @param {Object} [options] + * @param {Boolean} [options.alpha=true] Whether or not to negate any alpha channel * @returns {Sharp} */ -function negate (negate) { - this.options.negate = is.bool(negate) ? negate : true; +function negate (options) { + this.options.negate = is.bool(options) ? 
options : true; + if (is.plainObject(options) && 'alpha' in options) { + if (!is.bool(options.alpha)) { + throw is.invalidParameterError('alpha', 'should be boolean value', options.alpha); + } else { + this.options.negateAlpha = options.alpha; + } + } return this; } /** - * Enhance output image contrast by stretching its luminance to cover the full dynamic range. - * @param {Boolean} [normalise=true] + * Enhance output image contrast by stretching its luminance to cover a full dynamic range. + * + * Uses a histogram-based approach, taking a default range of 1% to 99% to reduce sensitivity to noise at the extremes. + * + * Luminance values below the `lower` percentile will be underexposed by clipping to zero. + * Luminance values above the `upper` percentile will be overexposed by clipping to the max pixel value. + * + * @example + * const output = await sharp(input) + * .normalise() + * .toBuffer(); + * + * @example + * const output = await sharp(input) + * .normalise({ lower: 0, upper: 100 }) + * .toBuffer(); + * + * @param {Object} [options] + * @param {number} [options.lower=1] - Percentile below which luminance values will be underexposed. + * @param {number} [options.upper=99] - Percentile above which luminance values will be overexposed. * @returns {Sharp} */ -function normalise (normalise) { - this.options.normalise = is.bool(normalise) ? 
normalise : true; +function normalise (options) { + if (is.plainObject(options)) { + if (is.defined(options.lower)) { + if (is.number(options.lower) && is.inRange(options.lower, 0, 99)) { + this.options.normaliseLower = options.lower; + } else { + throw is.invalidParameterError('lower', 'number between 0 and 99', options.lower); + } + } + if (is.defined(options.upper)) { + if (is.number(options.upper) && is.inRange(options.upper, 1, 100)) { + this.options.normaliseUpper = options.upper; + } else { + throw is.invalidParameterError('upper', 'number between 1 and 100', options.upper); + } + } + } + if (this.options.normaliseLower >= this.options.normaliseUpper) { + throw is.invalidParameterError('range', 'lower to be less than upper', + `${this.options.normaliseLower} >= ${this.options.normaliseUpper}`); + } + this.options.normalise = true; return this; } /** * Alternative spelling of normalise. - * @param {Boolean} [normalize=true] + * + * @example + * const output = await sharp(input) + * .normalize() + * .toBuffer(); + * + * @param {Object} [options] + * @param {number} [options.lower=1] - Percentile below which luminance values will be underexposed. + * @param {number} [options.upper=99] - Percentile above which luminance values will be overexposed. + * @returns {Sharp} + */ +function normalize (options) { + return this.normalise(options); +} + +/** + * Perform contrast limiting adaptive histogram equalization + * {@link https://en.wikipedia.org/wiki/Adaptive_histogram_equalization#Contrast_Limited_AHE CLAHE}. + * + * This will, in general, enhance the clarity of the image by bringing out darker details. + * + * @since 0.28.3 + * + * @example + * const output = await sharp(input) + * .clahe({ + * width: 3, + * height: 3, + * }) + * .toBuffer(); + * + * @param {Object} options + * @param {number} options.width - Integral width of the search window, in pixels. + * @param {number} options.height - Integral height of the search window, in pixels. 
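The behaviour described for `normalise` amounts to a percentile-based contrast stretch. A hypothetical sketch on a flat array of 8-bit luminance values, clipping below `lower` and above `upper` (assumption: this mirrors, but is not, the libvips implementation):

```javascript
// Hypothetical percentile stretch: values at or below the `lower`
// percentile clip to 0, values at or above `upper` clip to 255.
function stretch (values, lower = 1, upper = 99) {
  const sorted = [...values].sort((a, b) => a - b);
  const at = (p) => sorted[Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length))];
  const lo = at(lower);
  const hi = at(upper);
  return values.map((v) =>
    Math.max(0, Math.min(255, Math.round(((v - lo) / (hi - lo)) * 255)))
  );
}
```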
+ * @param {number} [options.maxSlope=3] - Integral level of brightening, between 0 and 100, where 0 disables contrast limiting. * @returns {Sharp} + * @throws {Error} Invalid parameters */ -function normalize (normalize) { - return this.normalise(normalize); +function clahe (options) { + if (is.plainObject(options)) { + if (is.integer(options.width) && options.width > 0) { + this.options.claheWidth = options.width; + } else { + throw is.invalidParameterError('width', 'integer greater than zero', options.width); + } + if (is.integer(options.height) && options.height > 0) { + this.options.claheHeight = options.height; + } else { + throw is.invalidParameterError('height', 'integer greater than zero', options.height); + } + if (is.defined(options.maxSlope)) { + if (is.integer(options.maxSlope) && is.inRange(options.maxSlope, 0, 100)) { + this.options.claheMaxSlope = options.maxSlope; + } else { + throw is.invalidParameterError('maxSlope', 'integer between 0 and 100', options.maxSlope); + } + } + } else { + throw is.invalidParameterError('options', 'plain object', options); + } + return this; } /** @@ -319,11 +723,11 @@ function normalize (normalize) { * }); * * @param {Object} kernel - * @param {Number} kernel.width - width of the kernel in pixels. - * @param {Number} kernel.height - width of the kernel in pixels. - * @param {Array} kernel.kernel - Array of length `width*height` containing the kernel values. - * @param {Number} [kernel.scale=sum] - the scale of the kernel in pixels. - * @param {Number} [kernel.offset=0] - the offset of the kernel in pixels. + * @param {number} kernel.width - width of the kernel in pixels. + * @param {number} kernel.height - height of the kernel in pixels. + * @param {Array} kernel.kernel - Array of length `width*height` containing the kernel values. + * @param {number} [kernel.scale=sum] - the scale of the kernel in pixels. + * @param {number} [kernel.offset=0] - the offset of the kernel in pixels. 
* @returns {Sharp} * @throws {Error} Invalid parameters */ @@ -332,15 +736,13 @@ function convolve (kernel) { !is.integer(kernel.width) || !is.integer(kernel.height) || !is.inRange(kernel.width, 3, 1001) || !is.inRange(kernel.height, 3, 1001) || kernel.height * kernel.width !== kernel.kernel.length - ) { + ) { // must pass in a kernel throw new Error('Invalid convolution kernel'); } // Default scale is sum of kernel values if (!is.integer(kernel.scale)) { - kernel.scale = kernel.kernel.reduce(function (a, b) { - return a + b; - }, 0); + kernel.scale = kernel.kernel.reduce((a, b) => a + b, 0); } // Clip scale to a minimum value of 1 if (kernel.scale < 1) { @@ -354,8 +756,8 @@ function convolve (kernel) { } /** - * Any pixel value greather than or equal to the threshold value will be set to 255, otherwise it will be set to 0. - * @param {Number} [threshold=128] - a value in the range 0-255 representing the level at which the threshold will be applied. + * Any pixel value greater than or equal to the threshold value will be set to 255, otherwise it will be set to 0. + * @param {number} [threshold=128] - a value in the range 0-255 representing the level at which the threshold will be applied. * @param {Object} [options] * @param {Boolean} [options.greyscale=true] - convert to single channel greyscale. * @param {Boolean} [options.grayscale=true] - alternative spelling for greyscale. 
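The default-scale rule in `convolve` above (sum of the kernel values, clipped to a minimum of 1) can be restated as a one-liner; the helper name here is illustrative only:

```javascript
// Default convolution scale: sum of the kernel values, but never below 1
// (so zero-sum edge-detection kernels do not divide by zero).
const defaultScale = (kernel) => Math.max(1, kernel.reduce((a, b) => a + b, 0));
```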
@@ -370,7 +772,7 @@ function threshold (threshold, options) { } else if (is.integer(threshold) && is.inRange(threshold, 0, 255)) { this.options.threshold = threshold; } else { - throw new Error('Invalid threshold (0 to 255) ' + threshold); + throw is.invalidParameterError('threshold', 'integer between 0 and 255', threshold); } if (!is.object(options) || options.greyscale === true || options.grayscale === true) { this.options.thresholdGrayscale = true; @@ -386,13 +788,13 @@ function threshold (threshold, options) { * This operation creates an output image where each pixel is the result of * the selected bitwise boolean `operation` between the corresponding pixels of the input images. * - * @param {Buffer|String} operand - Buffer containing image data or String containing the path to an image file. - * @param {String} operator - one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. + * @param {Buffer|string} operand - Buffer containing image data or string containing the path to an image file. + * @param {string} operator - one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. * @param {Object} [options] * @param {Object} [options.raw] - describes operand when using raw pixel data. 
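The threshold rule documented above is simple arithmetic; in plain JavaScript (illustrative helper, not sharp's implementation, which runs in libvips):

```javascript
// Threshold rule: any value greater than or equal to the threshold
// becomes 255, everything else becomes 0.
const applyThreshold = (value, threshold = 128) => (value >= threshold ? 255 : 0);
```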
- * @param {Number} [options.raw.width] - * @param {Number} [options.raw.height] - * @param {Number} [options.raw.channels] + * @param {number} [options.raw.width] + * @param {number} [options.raw.height] + * @param {number} [options.raw.channels] * @returns {Sharp} * @throws {Error} Invalid parameters */ @@ -401,34 +803,214 @@ function boolean (operand, operator, options) { if (is.string(operator) && is.inArray(operator, ['and', 'or', 'eor'])) { this.options.booleanOp = operator; } else { - throw new Error('Invalid boolean operator ' + operator); + throw is.invalidParameterError('operator', 'one of: and, or, eor', operator); + } + return this; +} + +/** + * Apply the linear formula `a` * input + `b` to the image to adjust image levels. + * + * When a single number is provided, it will be used for all image channels. + * When an array of numbers is provided, the array length must match the number of channels. + * + * @example + * await sharp(input) + * .linear(0.5, 2) + * .toBuffer(); + * + * @example + * await sharp(rgbInput) + * .linear( + * [0.25, 0.5, 0.75], + * [150, 100, 50] + * ) + * .toBuffer(); + * + * @param {(number|number[])} [a=[]] multiplier + * @param {(number|number[])} [b=[]] offset + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function linear (a, b) { + if (!is.defined(a) && is.number(b)) { + a = 1.0; + } else if (is.number(a) && !is.defined(b)) { + b = 0.0; + } + if (!is.defined(a)) { + this.options.linearA = []; + } else if (is.number(a)) { + this.options.linearA = [a]; + } else if (Array.isArray(a) && a.length && a.every(is.number)) { + this.options.linearA = a; + } else { + throw is.invalidParameterError('a', 'number or array of numbers', a); + } + if (!is.defined(b)) { + this.options.linearB = []; + } else if (is.number(b)) { + this.options.linearB = [b]; + } else if (Array.isArray(b) && b.length && b.every(is.number)) { + this.options.linearB = b; + } else { + throw is.invalidParameterError('b', 'number or array of 
numbers', b); + } + if (this.options.linearA.length !== this.options.linearB.length) { + throw new Error('Expected a and b to be arrays of the same length'); + } + return this; +} + +/** + * Recombine the image with the specified matrix. + * + * @since 0.21.1 + * + * @example + * sharp(input) + * .recomb([ + * [0.3588, 0.7044, 0.1368], + * [0.2990, 0.5870, 0.1140], + * [0.2392, 0.4696, 0.0912], + * ]) + * .raw() + * .toBuffer(function(err, data, info) { + * // data contains the raw pixel data after applying the matrix + * // With this example input, a sepia filter has been applied + * }); + * + * @param {Array<Array<number>>} inputMatrix - 3x3 or 4x4 Recombination matrix + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function recomb (inputMatrix) { + if (!Array.isArray(inputMatrix)) { + throw is.invalidParameterError('inputMatrix', 'array', inputMatrix); + } + if (inputMatrix.length !== 3 && inputMatrix.length !== 4) { + throw is.invalidParameterError('inputMatrix', '3x3 or 4x4 array', inputMatrix.length); + } + const recombMatrix = inputMatrix.flat().map(Number); + if (recombMatrix.length !== 9 && recombMatrix.length !== 16) { + throw is.invalidParameterError('inputMatrix', 'cardinality of 9 or 16', recombMatrix.length); + } + this.options.recombMatrix = recombMatrix; + return this; +} + +/** + * Transforms the image using brightness, saturation, hue rotation, and lightness. + * Brightness and lightness both operate on luminance, with the difference being that + * brightness is multiplicative whereas lightness is additive.
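`recomb` applies the matrix to each pixel's channel vector. A sketch of the per-pixel arithmetic for the 3x3 case (`recombPixel` is a hypothetical helper; libvips performs the real recombination):

```javascript
// Hypothetical per-pixel recombination: each output channel is the dot
// product of one matrix row with the input [r, g, b] vector.
function recombPixel (matrix, [r, g, b]) {
  return matrix.map(([mr, mg, mb]) => mr * r + mg * g + mb * b);
}
```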
+ * + * @since 0.22.1 + * + * @example + * // increase brightness by a factor of 2 + * const output = await sharp(input) + * .modulate({ + * brightness: 2 + * }) + * .toBuffer(); + * + * @example + * // hue-rotate by 180 degrees + * const output = await sharp(input) + * .modulate({ + * hue: 180 + * }) + * .toBuffer(); + * + * @example + * // increase lightness by +50 + * const output = await sharp(input) + * .modulate({ + * lightness: 50 + * }) + * .toBuffer(); + * + * @example + * // decrease brightness and saturation while also hue-rotating by 90 degrees + * const output = await sharp(input) + * .modulate({ + * brightness: 0.5, + * saturation: 0.5, + * hue: 90, + * }) + * .toBuffer(); + * + * @param {Object} [options] + * @param {number} [options.brightness] Brightness multiplier + * @param {number} [options.saturation] Saturation multiplier + * @param {number} [options.hue] Degrees for hue rotation + * @param {number} [options.lightness] Lightness addend + * @returns {Sharp} + */ +function modulate (options) { + if (!is.plainObject(options)) { + throw is.invalidParameterError('options', 'plain object', options); + } + if ('brightness' in options) { + if (is.number(options.brightness) && options.brightness >= 0) { + this.options.brightness = options.brightness; + } else { + throw is.invalidParameterError('brightness', 'number above zero', options.brightness); + } + } + if ('saturation' in options) { + if (is.number(options.saturation) && options.saturation >= 0) { + this.options.saturation = options.saturation; + } else { + throw is.invalidParameterError('saturation', 'number above zero', options.saturation); + } + } + if ('hue' in options) { + if (is.integer(options.hue)) { + this.options.hue = options.hue % 360; + } else { + throw is.invalidParameterError('hue', 'number', options.hue); + } + } + if ('lightness' in options) { + if (is.number(options.lightness)) { + this.options.lightness = options.lightness; + } else { + throw 
is.invalidParameterError('lightness', 'number', options.lightness); + } } return this; } /** * Decorate the Sharp prototype with operation-related functions. + * @module Sharp * @private */ -module.exports = function (Sharp) { - [ +module.exports = (Sharp) => { + Object.assign(Sharp.prototype, { + autoOrient, rotate, - extract, flip, flop, + affine, sharpen, + erode, + dilate, + median, blur, - extend, flatten, - trim, + unflatten, gamma, negate, normalise, normalize, + clahe, convolve, threshold, - boolean - ].forEach(function (f) { - Sharp.prototype[f.name] = f; + boolean, + linear, + recomb, + modulate }); }; diff --git a/lib/output.js b/lib/output.js index 881cf3743..27a6ac470 100644 --- a/lib/output.js +++ b/lib/output.js @@ -1,307 +1,1313 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ +const path = require('node:path'); const is = require('./is'); -const sharp = require('../build/Release/sharp.node'); +const sharp = require('./sharp'); + +const formats = new Map([ + ['heic', 'heif'], + ['heif', 'heif'], + ['avif', 'avif'], + ['jpeg', 'jpeg'], + ['jpg', 'jpeg'], + ['jpe', 'jpeg'], + ['tile', 'tile'], + ['dz', 'tile'], + ['png', 'png'], + ['raw', 'raw'], + ['tiff', 'tiff'], + ['tif', 'tiff'], + ['webp', 'webp'], + ['gif', 'gif'], + ['jp2', 'jp2'], + ['jpx', 'jp2'], + ['j2k', 'jp2'], + ['j2c', 'jp2'], + ['jxl', 'jxl'] +]); + +const jp2Regex = /\.(jp[2x]|j2[kc])$/i; + +const errJp2Save = () => new Error('JP2 output requires libvips with support for OpenJPEG'); + +const bitdepthFromColourCount = (colours) => 1 << 31 - Math.clz32(Math.ceil(Math.log2(colours))); /** * Write output image data to a file. * * If an explicit output format is not selected, it will be inferred from the extension, - * with JPEG, PNG, WebP, TIFF, DZI, and libvips' V format supported. + * with JPEG, PNG, WebP, AVIF, TIFF, GIF, DZI, and libvips' V format supported. * Note that raw pixel data is only supported for buffer output. 
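The `bitdepthFromColourCount` helper added to output.js above is easy to misread: `-` binds tighter than `<<`, so the expression is `1 << (31 - Math.clz32(...))`. Restated for a worked check:

```javascript
// The helper from the diff above: map a palette colour count to a PNG
// bit depth. 31 - Math.clz32(n) is floor(log2(n)), so the result is the
// largest power of two not exceeding the number of index bits needed.
const bitdepthFromColourCount = (colours) =>
  1 << 31 - Math.clz32(Math.ceil(Math.log2(colours)));
```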
* - * A Promises/A+ promise is returned when `callback` is not provided. + * By default all metadata will be removed, which includes EXIF-based orientation. + * See {@link #withmetadata withMetadata} for control over this. * - * @param {String} fileOut - the path to write the image data to. + * The caller is responsible for ensuring directory structures and permissions exist. + * + * A `Promise` is returned when `callback` is not provided. + * + * @example + * sharp(input) + * .toFile('output.png', (err, info) => { ... }); + * + * @example + * sharp(input) + * .toFile('output.png') + * .then(info => { ... }) + * .catch(err => { ... }); + * + * @param {string} fileOut - the path to write the image data to. * @param {Function} [callback] - called on completion with two arguments `(err, info)`. * `info` contains the output image `format`, `size` (bytes), `width`, `height`, * `channels` and `premultiplied` (indicating if premultiplication was used). + * When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`. + * When using the attention crop strategy also contains `attentionX` and `attentionY`, the focal point of the cropped region. + * Animated output will also contain `pageHeight` and `pages`. + * May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text. 
* @returns {Promise} - when no callback is provided * @throws {Error} Invalid parameters */ function toFile (fileOut, callback) { - if (!fileOut || fileOut.length === 0) { - const errOutputInvalid = new Error('Invalid output'); + let err; + if (!is.string(fileOut)) { + err = new Error('Missing output file path'); + } else if (is.string(this.options.input.file) && path.resolve(this.options.input.file) === path.resolve(fileOut)) { + err = new Error('Cannot use same file for input and output'); + } else if (jp2Regex.test(path.extname(fileOut)) && !this.constructor.format.jp2k.output.file) { + err = errJp2Save(); + } + if (err) { if (is.fn(callback)) { - callback(errOutputInvalid); + callback(err); } else { - return Promise.reject(errOutputInvalid); + return Promise.reject(err); + } + } else { + this.options.fileOut = fileOut; + const stack = Error(); + return this._pipeline(callback, stack); + } + return this; +} + +/** + * Write output to a Buffer. + * JPEG, PNG, WebP, AVIF, TIFF, GIF and raw pixel data output are supported. + * + * Use {@link #toformat toFormat} or one of the format-specific functions such as {@link #jpeg jpeg}, {@link #png png} etc. to set the output format. + * + * If no explicit format is set, the output format will match the input image, except SVG input which becomes PNG output. + * + * By default all metadata will be removed, which includes EXIF-based orientation. + * See {@link #withmetadata withMetadata} for control over this. + * + * `callback`, if present, gets three arguments `(err, data, info)` where: + * - `err` is an error, if any. + * - `data` is the output image data. + * - `info` contains the output image `format`, `size` (bytes), `width`, `height`, + * `channels` and `premultiplied` (indicating if premultiplication was used). + * When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`. + * Animated output will also contain `pageHeight` and `pages`. 
+ * May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text. + * + * A `Promise` is returned when `callback` is not provided. + * + * @example + * sharp(input) + * .toBuffer((err, data, info) => { ... }); + * + * @example + * sharp(input) + * .toBuffer() + * .then(data => { ... }) + * .catch(err => { ... }); + * + * @example + * sharp(input) + * .png() + * .toBuffer({ resolveWithObject: true }) + * .then(({ data, info }) => { ... }) + * .catch(err => { ... }); + * + * @example + * const { data, info } = await sharp('my-image.jpg') + * // output the raw pixels + * .raw() + * .toBuffer({ resolveWithObject: true }); + * + * // create a more type safe way to work with the raw pixel data + * // this will not copy the data, instead it will change `data`s underlying ArrayBuffer + * // so `data` and `pixelArray` point to the same memory location + * const pixelArray = new Uint8ClampedArray(data.buffer); + * + * // When you are done changing the pixelArray, sharp takes the `pixelArray` as an input + * const { width, height, channels } = info; + * await sharp(pixelArray, { raw: { width, height, channels } }) + * .toFile('my-changed-image.jpg'); + * + * @param {Object} [options] + * @param {boolean} [options.resolveWithObject] Resolve the Promise with an Object containing `data` and `info` properties instead of resolving only with `data`. + * @param {Function} [callback] + * @returns {Promise} - when no callback is provided + */ +function toBuffer (options, callback) { + if (is.object(options)) { + this._setBooleanOption('resolveWithObject', options.resolveWithObject); + } else if (this.options.resolveWithObject) { + this.options.resolveWithObject = false; + } + this.options.fileOut = ''; + const stack = Error(); + return this._pipeline(is.fn(options) ? options : callback, stack); +} + +/** + * Keep all EXIF metadata from the input image in the output image. + * + * EXIF metadata is unsupported for TIFF output. 
+ * + * @since 0.33.0 + * + * @example + * const outputWithExif = await sharp(inputWithExif) + * .keepExif() + * .toBuffer(); + * + * @returns {Sharp} + */ +function keepExif () { + this.options.keepMetadata |= 0b00001; + return this; +} + +/** + * Set EXIF metadata in the output image, ignoring any EXIF in the input image. + * + * @since 0.33.0 + * + * @example + * const dataWithExif = await sharp(input) + * .withExif({ + * IFD0: { + * Copyright: 'The National Gallery' + * }, + * IFD3: { + * GPSLatitudeRef: 'N', + * GPSLatitude: '51/1 30/1 3230/100', + * GPSLongitudeRef: 'W', + * GPSLongitude: '0/1 7/1 4366/100' + * } + * }) + * .toBuffer(); + * + * @param {Object<string, Object<string, string>>} exif Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function withExif (exif) { + if (is.object(exif)) { + for (const [ifd, entries] of Object.entries(exif)) { + if (is.object(entries)) { + for (const [k, v] of Object.entries(entries)) { + if (is.string(v)) { + this.options.withExif[`exif-${ifd.toLowerCase()}-${k}`] = v; + } else { + throw is.invalidParameterError(`${ifd}.${k}`, 'string', v); + } + } + } else { + throw is.invalidParameterError(ifd, 'object', entries); + } + } + } else { + throw is.invalidParameterError('exif', 'object', exif); + } + this.options.withExifMerge = false; + return this.keepExif(); +} + +/** + * Update EXIF metadata from the input image in the output image. + * + * @since 0.33.0 + * + * @example + * const dataWithMergedExif = await sharp(inputWithExif) + * .withExifMerge({ + * IFD0: { + * Copyright: 'The National Gallery' + * } + * }) + * .toBuffer(); + * + * @param {Object<string, Object<string, string>>} exif Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data.
+ * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function withExifMerge (exif) { + this.withExif(exif); + this.options.withExifMerge = true; + return this; +} + +/** + * Keep ICC profile from the input image in the output image. + * + * When input and output colour spaces differ, use with {@link /api-colour/#tocolourspace toColourspace} and optionally {@link /api-colour/#pipelinecolourspace pipelineColourspace}. + * + * @since 0.33.0 + * + * @example + * const outputWithIccProfile = await sharp(inputWithIccProfile) + * .keepIccProfile() + * .toBuffer(); + * + * @example + * const cmykOutputWithIccProfile = await sharp(cmykInputWithIccProfile) + * .pipelineColourspace('cmyk') + * .toColourspace('cmyk') + * .keepIccProfile() + * .toBuffer(); + * + * @returns {Sharp} + */ +function keepIccProfile () { + this.options.keepMetadata |= 0b01000; + return this; +} + +/** + * Transform using an ICC profile and attach to the output image. + * + * This can either be an absolute filesystem path or + * built-in profile name (`srgb`, `p3`, `cmyk`). + * + * @since 0.33.0 + * + * @example + * const outputWithP3 = await sharp(input) + * .withIccProfile('p3') + * .toBuffer(); + * + * @param {string} icc - Absolute filesystem path to output ICC profile or built-in profile name (srgb, p3, cmyk). + * @param {Object} [options] + * @param {boolean} [options.attach=true] Should the ICC profile be included in the output image metadata?
+ * @returns {Sharp}
+ * @throws {Error} Invalid parameters
+ */
+function withIccProfile (icc, options) {
+  if (is.string(icc)) {
+    this.options.withIccProfile = icc;
+  } else {
+    throw is.invalidParameterError('icc', 'string', icc);
+  }
+  this.keepIccProfile();
+  if (is.object(options)) {
+    if (is.defined(options.attach)) {
+      if (is.bool(options.attach)) {
+        if (!options.attach) {
+          this.options.keepMetadata &= ~0b01000;
+        }
+      } else {
+        throw is.invalidParameterError('attach', 'boolean', options.attach);
+      }
+    }
+  }
+  return this;
+}
+
+/**
+ * Keep XMP metadata from the input image in the output image.
+ *
+ * @since 0.34.3
+ *
+ * @example
+ * const outputWithXmp = await sharp(inputWithXmp)
+ *   .keepXmp()
+ *   .toBuffer();
+ *
+ * @returns {Sharp}
+ */
+function keepXmp () {
+  this.options.keepMetadata |= 0b00010;
+  return this;
+}
+
+/**
+ * Set XMP metadata in the output image.
+ *
+ * Supported by PNG, JPEG, WebP, and TIFF output.
+ *
+ * @since 0.34.3
+ *
+ * @example
+ * const xmpString = `
+ *   <?xml version="1.0"?>
+ *   <x:xmpmeta xmlns:x="adobe:ns:meta/">
+ *     <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
+ *       <rdf:Description rdf:about="">
+ *         <dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">John Doe</dc:creator>
+ *       </rdf:Description>
+ *     </rdf:RDF>
+ *   </x:xmpmeta>
+ * `;
+ *
+ * const data = await sharp(input)
+ *   .withXmp(xmpString)
+ *   .toBuffer();
+ *
+ * @param {string} xmp String containing XMP metadata to be embedded in the output image.
+ * @returns {Sharp}
+ * @throws {Error} Invalid parameters
+ */
+function withXmp (xmp) {
+  if (is.string(xmp) && xmp.length > 0) {
+    this.options.withXmp = xmp;
+    this.options.keepMetadata |= 0b00010;
+  } else {
+    throw is.invalidParameterError('xmp', 'non-empty string', xmp);
+  }
+  return this;
+}
+
+/**
+ * Keep all metadata (EXIF, ICC, XMP, IPTC) from the input image in the output image.
+ *
+ * The default behaviour, when `keepMetadata` is not used, is to convert to the device-independent
+ * sRGB colour space and strip all metadata, including the removal of any ICC profile.
+ * + * @since 0.33.0 + * + * @example + * const outputWithMetadata = await sharp(inputWithMetadata) + * .keepMetadata() + * .toBuffer(); + * + * @returns {Sharp} + */ +function keepMetadata () { + this.options.keepMetadata = 0b11111; + return this; +} + +/** + * Keep most metadata (EXIF, XMP, IPTC) from the input image in the output image. + * + * This will also convert to and add a web-friendly sRGB ICC profile if appropriate. + * + * Allows orientation and density to be set or updated. + * + * @example + * const outputSrgbWithMetadata = await sharp(inputRgbWithMetadata) + * .withMetadata() + * .toBuffer(); + * + * @example + * // Set output metadata to 96 DPI + * const data = await sharp(input) + * .withMetadata({ density: 96 }) + * .toBuffer(); + * + * @param {Object} [options] + * @param {number} [options.orientation] Used to update the EXIF `Orientation` tag, integer between 1 and 8. + * @param {number} [options.density] Number of pixels per inch (DPI). + * @returns {Sharp} + * @throws {Error} Invalid parameters + */ +function withMetadata (options) { + this.keepMetadata(); + this.withIccProfile('srgb'); + if (is.object(options)) { + if (is.defined(options.orientation)) { + if (is.integer(options.orientation) && is.inRange(options.orientation, 1, 8)) { + this.options.withMetadataOrientation = options.orientation; + } else { + throw is.invalidParameterError('orientation', 'integer between 1 and 8', options.orientation); + } + } + if (is.defined(options.density)) { + if (is.number(options.density) && options.density > 0) { + this.options.withMetadataDensity = options.density; + } else { + throw is.invalidParameterError('density', 'positive number', options.density); + } + } + if (is.defined(options.icc)) { + this.withIccProfile(options.icc); + } + if (is.defined(options.exif)) { + this.withExifMerge(options.exif); + } + } + return this; +} + +/** + * Force output to a given format. 
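The `toFormat` implementation that follows resolves its argument through a `formats` map (defined elsewhere in sharp) before delegating to the matching output method. A standalone sketch of that lookup, using a hypothetical subset of the map:

```javascript
// Hypothetical subset of the formats map; the real one lives elsewhere in sharp.
const formats = new Map([
  ['jpeg', 'jpeg'],
  ['jpg', 'jpeg'],
  ['png', 'png'],
  ['webp', 'webp']
]);

// Mirrors the lookup in toFormat: accept a string or an { id } object,
// normalise to lower case, then resolve via the map.
function resolveFormat (format) {
  const id = (typeof format === 'object' && format !== null && typeof format.id === 'string'
    ? format.id
    : format).toLowerCase();
  const method = formats.get(id);
  if (!method) {
    throw new Error(`Expected one of: ${[...formats.keys()].join(', ')}, received ${id}`);
  }
  return method;
}

console.log(resolveFormat('PNG')); // 'png'
console.log(resolveFormat({ id: 'jpg' })); // 'jpeg'
```

The map indirection is what lets aliases such as `jpg` and `jpeg` share one implementation, and an unknown name fails fast with the list of supported formats.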
+ * + * @example + * // Convert any input to PNG output + * const data = await sharp(input) + * .toFormat('png') + * .toBuffer(); + * + * @param {(string|Object)} format - as a string or an Object with an 'id' attribute + * @param {Object} options - output options + * @returns {Sharp} + * @throws {Error} unsupported format or options + */ +function toFormat (format, options) { + const actualFormat = formats.get((is.object(format) && is.string(format.id) ? format.id : format).toLowerCase()); + if (!actualFormat) { + throw is.invalidParameterError('format', `one of: ${[...formats.keys()].join(', ')}`, format); + } + return this[actualFormat](options); +} + +/** + * Use these JPEG options for output image. + * + * @example + * // Convert any input to very high quality JPEG output + * const data = await sharp(input) + * .jpeg({ + * quality: 100, + * chromaSubsampling: '4:4:4' + * }) + * .toBuffer(); + * + * @example + * // Use mozjpeg to reduce output JPEG file size (slower) + * const data = await sharp(input) + * .jpeg({ mozjpeg: true }) + * .toBuffer(); + * + * @param {Object} [options] - output options + * @param {number} [options.quality=80] - quality, integer 1-100 + * @param {boolean} [options.progressive=false] - use progressive (interlace) scan + * @param {string} [options.chromaSubsampling='4:2:0'] - set to '4:4:4' to prevent chroma subsampling otherwise defaults to '4:2:0' chroma subsampling + * @param {boolean} [options.optimiseCoding=true] - optimise Huffman coding tables + * @param {boolean} [options.optimizeCoding=true] - alternative spelling of optimiseCoding + * @param {boolean} [options.mozjpeg=false] - use mozjpeg defaults, equivalent to `{ trellisQuantisation: true, overshootDeringing: true, optimiseScans: true, quantisationTable: 3 }` + * @param {boolean} [options.trellisQuantisation=false] - apply trellis quantisation + * @param {boolean} [options.overshootDeringing=false] - apply overshoot deringing + * @param {boolean} 
[options.optimiseScans=false] - optimise progressive scans, forces progressive + * @param {boolean} [options.optimizeScans=false] - alternative spelling of optimiseScans + * @param {number} [options.quantisationTable=0] - quantization table to use, integer 0-8 + * @param {number} [options.quantizationTable=0] - alternative spelling of quantisationTable + * @param {boolean} [options.force=true] - force JPEG output, otherwise attempt to use input format + * @returns {Sharp} + * @throws {Error} Invalid options + */ +function jpeg (options) { + if (is.object(options)) { + if (is.defined(options.quality)) { + if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) { + this.options.jpegQuality = options.quality; + } else { + throw is.invalidParameterError('quality', 'integer between 1 and 100', options.quality); + } + } + if (is.defined(options.progressive)) { + this._setBooleanOption('jpegProgressive', options.progressive); + } + if (is.defined(options.chromaSubsampling)) { + if (is.string(options.chromaSubsampling) && is.inArray(options.chromaSubsampling, ['4:2:0', '4:4:4'])) { + this.options.jpegChromaSubsampling = options.chromaSubsampling; + } else { + throw is.invalidParameterError('chromaSubsampling', 'one of: 4:2:0, 4:4:4', options.chromaSubsampling); + } + } + const optimiseCoding = is.bool(options.optimizeCoding) ? options.optimizeCoding : options.optimiseCoding; + if (is.defined(optimiseCoding)) { + this._setBooleanOption('jpegOptimiseCoding', optimiseCoding); + } + if (is.defined(options.mozjpeg)) { + if (is.bool(options.mozjpeg)) { + if (options.mozjpeg) { + this.options.jpegTrellisQuantisation = true; + this.options.jpegOvershootDeringing = true; + this.options.jpegOptimiseScans = true; + this.options.jpegProgressive = true; + this.options.jpegQuantisationTable = 3; + } + } else { + throw is.invalidParameterError('mozjpeg', 'boolean', options.mozjpeg); + } + } + const trellisQuantisation = is.bool(options.trellisQuantization) ? 
options.trellisQuantization : options.trellisQuantisation; + if (is.defined(trellisQuantisation)) { + this._setBooleanOption('jpegTrellisQuantisation', trellisQuantisation); + } + if (is.defined(options.overshootDeringing)) { + this._setBooleanOption('jpegOvershootDeringing', options.overshootDeringing); + } + const optimiseScans = is.bool(options.optimizeScans) ? options.optimizeScans : options.optimiseScans; + if (is.defined(optimiseScans)) { + this._setBooleanOption('jpegOptimiseScans', optimiseScans); + if (optimiseScans) { + this.options.jpegProgressive = true; + } + } + const quantisationTable = is.number(options.quantizationTable) ? options.quantizationTable : options.quantisationTable; + if (is.defined(quantisationTable)) { + if (is.integer(quantisationTable) && is.inRange(quantisationTable, 0, 8)) { + this.options.jpegQuantisationTable = quantisationTable; + } else { + throw is.invalidParameterError('quantisationTable', 'integer between 0 and 8', quantisationTable); + } + } + } + return this._updateFormatOut('jpeg', options); +} + +/** + * Use these PNG options for output image. + * + * By default, PNG output is full colour at 8 bits per pixel. + * + * Indexed PNG input at 1, 2 or 4 bits per pixel is converted to 8 bits per pixel. + * Set `palette` to `true` for slower, indexed PNG output. + * + * For 16 bits per pixel output, convert to `rgb16` via + * {@link /api-colour/#tocolourspace toColourspace}. 
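The `png()` and `gif()` implementations below call a `bitdepthFromColourCount` helper that is not shown in this chunk. A plausible sketch of what it computes, assuming it maps a palette size (2-256) to the smallest PNG bit depth of 1, 2, 4 or 8 that can index that many entries; sharp's actual implementation may differ:

```javascript
// Assumed behaviour: smallest of the PNG bit depths 1, 2, 4, 8
// whose 2^depth palette entries can hold `colours` colours.
const bitdepthFromColourCount = (colours) =>
  [1, 2, 4, 8].find((depth) => 2 ** depth >= colours);

console.log(bitdepthFromColourCount(2)); // 1
console.log(bitdepthFromColourCount(4)); // 2
console.log(bitdepthFromColourCount(5)); // 4
console.log(bitdepthFromColourCount(256)); // 8
```

This matches the validation in the diff, which accepts `colours` between 2 and 256 before converting it to a bit depth.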
+ * + * @example + * // Convert any input to full colour PNG output + * const data = await sharp(input) + * .png() + * .toBuffer(); + * + * @example + * // Convert any input to indexed PNG output (slower) + * const data = await sharp(input) + * .png({ palette: true }) + * .toBuffer(); + * + * @example + * // Output 16 bits per pixel RGB(A) + * const data = await sharp(input) + * .toColourspace('rgb16') + * .png() + * .toBuffer(); + * + * @param {Object} [options] + * @param {boolean} [options.progressive=false] - use progressive (interlace) scan + * @param {number} [options.compressionLevel=6] - zlib compression level, 0 (fastest, largest) to 9 (slowest, smallest) + * @param {boolean} [options.adaptiveFiltering=false] - use adaptive row filtering + * @param {boolean} [options.palette=false] - quantise to a palette-based image with alpha transparency support + * @param {number} [options.quality=100] - use the lowest number of colours needed to achieve given quality, sets `palette` to `true` + * @param {number} [options.effort=7] - CPU effort, between 1 (fastest) and 10 (slowest), sets `palette` to `true` + * @param {number} [options.colours=256] - maximum number of palette entries, sets `palette` to `true` + * @param {number} [options.colors=256] - alternative spelling of `options.colours`, sets `palette` to `true` + * @param {number} [options.dither=1.0] - level of Floyd-Steinberg error diffusion, sets `palette` to `true` + * @param {boolean} [options.force=true] - force PNG output, otherwise attempt to use input format + * @returns {Sharp} + * @throws {Error} Invalid options + */ +function png (options) { + if (is.object(options)) { + if (is.defined(options.progressive)) { + this._setBooleanOption('pngProgressive', options.progressive); + } + if (is.defined(options.compressionLevel)) { + if (is.integer(options.compressionLevel) && is.inRange(options.compressionLevel, 0, 9)) { + this.options.pngCompressionLevel = options.compressionLevel; + } else { + throw 
is.invalidParameterError('compressionLevel', 'integer between 0 and 9', options.compressionLevel); + } + } + if (is.defined(options.adaptiveFiltering)) { + this._setBooleanOption('pngAdaptiveFiltering', options.adaptiveFiltering); + } + const colours = options.colours || options.colors; + if (is.defined(colours)) { + if (is.integer(colours) && is.inRange(colours, 2, 256)) { + this.options.pngBitdepth = bitdepthFromColourCount(colours); + } else { + throw is.invalidParameterError('colours', 'integer between 2 and 256', colours); + } + } + if (is.defined(options.palette)) { + this._setBooleanOption('pngPalette', options.palette); + } else if ([options.quality, options.effort, options.colours, options.colors, options.dither].some(is.defined)) { + this._setBooleanOption('pngPalette', true); + } + if (this.options.pngPalette) { + if (is.defined(options.quality)) { + if (is.integer(options.quality) && is.inRange(options.quality, 0, 100)) { + this.options.pngQuality = options.quality; + } else { + throw is.invalidParameterError('quality', 'integer between 0 and 100', options.quality); + } + } + if (is.defined(options.effort)) { + if (is.integer(options.effort) && is.inRange(options.effort, 1, 10)) { + this.options.pngEffort = options.effort; + } else { + throw is.invalidParameterError('effort', 'integer between 1 and 10', options.effort); + } + } + if (is.defined(options.dither)) { + if (is.number(options.dither) && is.inRange(options.dither, 0, 1)) { + this.options.pngDither = options.dither; + } else { + throw is.invalidParameterError('dither', 'number between 0.0 and 1.0', options.dither); + } + } + } + } + return this._updateFormatOut('png', options); +} + +/** + * Use these WebP options for output image. 
+ * + * @example + * // Convert any input to lossless WebP output + * const data = await sharp(input) + * .webp({ lossless: true }) + * .toBuffer(); + * + * @example + * // Optimise the file size of an animated WebP + * const outputWebp = await sharp(inputWebp, { animated: true }) + * .webp({ effort: 6 }) + * .toBuffer(); + * + * @param {Object} [options] - output options + * @param {number} [options.quality=80] - quality, integer 1-100 + * @param {number} [options.alphaQuality=100] - quality of alpha layer, integer 0-100 + * @param {boolean} [options.lossless=false] - use lossless compression mode + * @param {boolean} [options.nearLossless=false] - use near_lossless compression mode + * @param {boolean} [options.smartSubsample=false] - use high quality chroma subsampling + * @param {boolean} [options.smartDeblock=false] - auto-adjust the deblocking filter, can improve low contrast edges (slow) + * @param {string} [options.preset='default'] - named preset for preprocessing/filtering, one of: default, photo, picture, drawing, icon, text + * @param {number} [options.effort=4] - CPU effort, between 0 (fastest) and 6 (slowest) + * @param {number} [options.loop=0] - number of animation iterations, use 0 for infinite animation + * @param {number|number[]} [options.delay] - delay(s) between animation frames (in milliseconds) + * @param {boolean} [options.minSize=false] - prevent use of animation key frames to minimise file size (slow) + * @param {boolean} [options.mixed=false] - allow mixture of lossy and lossless animation frames (slow) + * @param {boolean} [options.force=true] - force WebP output, otherwise attempt to use input format + * @returns {Sharp} + * @throws {Error} Invalid options + */ +function webp (options) { + if (is.object(options)) { + if (is.defined(options.quality)) { + if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) { + this.options.webpQuality = options.quality; + } else { + throw is.invalidParameterError('quality', 'integer 
between 1 and 100', options.quality); + } + } + if (is.defined(options.alphaQuality)) { + if (is.integer(options.alphaQuality) && is.inRange(options.alphaQuality, 0, 100)) { + this.options.webpAlphaQuality = options.alphaQuality; + } else { + throw is.invalidParameterError('alphaQuality', 'integer between 0 and 100', options.alphaQuality); + } + } + if (is.defined(options.lossless)) { + this._setBooleanOption('webpLossless', options.lossless); + } + if (is.defined(options.nearLossless)) { + this._setBooleanOption('webpNearLossless', options.nearLossless); + } + if (is.defined(options.smartSubsample)) { + this._setBooleanOption('webpSmartSubsample', options.smartSubsample); + } + if (is.defined(options.smartDeblock)) { + this._setBooleanOption('webpSmartDeblock', options.smartDeblock); + } + if (is.defined(options.preset)) { + if (is.string(options.preset) && is.inArray(options.preset, ['default', 'photo', 'picture', 'drawing', 'icon', 'text'])) { + this.options.webpPreset = options.preset; + } else { + throw is.invalidParameterError('preset', 'one of: default, photo, picture, drawing, icon, text', options.preset); + } + } + if (is.defined(options.effort)) { + if (is.integer(options.effort) && is.inRange(options.effort, 0, 6)) { + this.options.webpEffort = options.effort; + } else { + throw is.invalidParameterError('effort', 'integer between 0 and 6', options.effort); + } + } + if (is.defined(options.minSize)) { + this._setBooleanOption('webpMinSize', options.minSize); + } + if (is.defined(options.mixed)) { + this._setBooleanOption('webpMixed', options.mixed); + } + } + trySetAnimationOptions(options, this.options); + return this._updateFormatOut('webp', options); +} + +/** + * Use these GIF options for the output image. + * + * The first entry in the palette is reserved for transparency. + * + * The palette of the input image will be re-used if possible. 
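Animation `loop` and `delay` for WebP and GIF output are validated by the `trySetAnimationOptions` helper further down this diff, which also promotes a single `delay` integer to a one-element array. A standalone sketch of that delay normalisation:

```javascript
// Mirrors the delay handling in trySetAnimationOptions: a single integer
// becomes a one-element array; arrays are validated element-wise.
function normaliseDelay (delay) {
  const valid = (v) => Number.isInteger(v) && v >= 0 && v <= 65535;
  if (valid(delay)) {
    return [delay];
  }
  if (Array.isArray(delay) && delay.every(valid)) {
    return delay;
  }
  throw new Error('Expected integer or an array of integers between 0 and 65535');
}

console.log(normaliseDelay(100)); // [ 100 ]
console.log(normaliseDelay([40, 80, 40])); // [ 40, 80, 40 ]
```

Accepting both shapes keeps the common case (one delay for every frame) terse while still allowing per-frame timing.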
+ * + * @since 0.30.0 + * + * @example + * // Convert PNG to GIF + * await sharp(pngBuffer) + * .gif() + * .toBuffer(); + * + * @example + * // Convert animated WebP to animated GIF + * await sharp('animated.webp', { animated: true }) + * .toFile('animated.gif'); + * + * @example + * // Create a 128x128, cropped, non-dithered, animated thumbnail of an animated GIF + * const out = await sharp('in.gif', { animated: true }) + * .resize({ width: 128, height: 128 }) + * .gif({ dither: 0 }) + * .toBuffer(); + * + * @example + * // Lossy file size reduction of animated GIF + * await sharp('in.gif', { animated: true }) + * .gif({ interFrameMaxError: 8 }) + * .toFile('optim.gif'); + * + * @param {Object} [options] - output options + * @param {boolean} [options.reuse=true] - re-use existing palette, otherwise generate new (slow) + * @param {boolean} [options.progressive=false] - use progressive (interlace) scan + * @param {number} [options.colours=256] - maximum number of palette entries, including transparency, between 2 and 256 + * @param {number} [options.colors=256] - alternative spelling of `options.colours` + * @param {number} [options.effort=7] - CPU effort, between 1 (fastest) and 10 (slowest) + * @param {number} [options.dither=1.0] - level of Floyd-Steinberg error diffusion, between 0 (least) and 1 (most) + * @param {number} [options.interFrameMaxError=0] - maximum inter-frame error for transparency, between 0 (lossless) and 32 + * @param {number} [options.interPaletteMaxError=3] - maximum inter-palette error for palette reuse, between 0 and 256 + * @param {boolean} [options.keepDuplicateFrames=false] - keep duplicate frames in the output instead of combining them + * @param {number} [options.loop=0] - number of animation iterations, use 0 for infinite animation + * @param {number|number[]} [options.delay] - delay(s) between animation frames (in milliseconds) + * @param {boolean} [options.force=true] - force GIF output, otherwise attempt to use input format + * 
@returns {Sharp} + * @throws {Error} Invalid options + */ +function gif (options) { + if (is.object(options)) { + if (is.defined(options.reuse)) { + this._setBooleanOption('gifReuse', options.reuse); + } + if (is.defined(options.progressive)) { + this._setBooleanOption('gifProgressive', options.progressive); + } + const colours = options.colours || options.colors; + if (is.defined(colours)) { + if (is.integer(colours) && is.inRange(colours, 2, 256)) { + this.options.gifBitdepth = bitdepthFromColourCount(colours); + } else { + throw is.invalidParameterError('colours', 'integer between 2 and 256', colours); + } + } + if (is.defined(options.effort)) { + if (is.number(options.effort) && is.inRange(options.effort, 1, 10)) { + this.options.gifEffort = options.effort; + } else { + throw is.invalidParameterError('effort', 'integer between 1 and 10', options.effort); + } + } + if (is.defined(options.dither)) { + if (is.number(options.dither) && is.inRange(options.dither, 0, 1)) { + this.options.gifDither = options.dither; + } else { + throw is.invalidParameterError('dither', 'number between 0.0 and 1.0', options.dither); + } + } + if (is.defined(options.interFrameMaxError)) { + if (is.number(options.interFrameMaxError) && is.inRange(options.interFrameMaxError, 0, 32)) { + this.options.gifInterFrameMaxError = options.interFrameMaxError; + } else { + throw is.invalidParameterError('interFrameMaxError', 'number between 0.0 and 32.0', options.interFrameMaxError); + } + } + if (is.defined(options.interPaletteMaxError)) { + if (is.number(options.interPaletteMaxError) && is.inRange(options.interPaletteMaxError, 0, 256)) { + this.options.gifInterPaletteMaxError = options.interPaletteMaxError; + } else { + throw is.invalidParameterError('interPaletteMaxError', 'number between 0.0 and 256.0', options.interPaletteMaxError); + } + } + if (is.defined(options.keepDuplicateFrames)) { + if (is.bool(options.keepDuplicateFrames)) { + this._setBooleanOption('gifKeepDuplicateFrames', 
options.keepDuplicateFrames); + } else { + throw is.invalidParameterError('keepDuplicateFrames', 'boolean', options.keepDuplicateFrames); + } + } + } + trySetAnimationOptions(options, this.options); + return this._updateFormatOut('gif', options); +} + +/** + * Use these JP2 options for output image. + * + * Requires libvips compiled with support for OpenJPEG. + * The prebuilt binaries do not include this - see + * {@link /install#custom-libvips installing a custom libvips}. + * + * @example + * // Convert any input to lossless JP2 output + * const data = await sharp(input) + * .jp2({ lossless: true }) + * .toBuffer(); + * + * @example + * // Convert any input to very high quality JP2 output + * const data = await sharp(input) + * .jp2({ + * quality: 100, + * chromaSubsampling: '4:4:4' + * }) + * .toBuffer(); + * + * @since 0.29.1 + * + * @param {Object} [options] - output options + * @param {number} [options.quality=80] - quality, integer 1-100 + * @param {boolean} [options.lossless=false] - use lossless compression mode + * @param {number} [options.tileWidth=512] - horizontal tile size + * @param {number} [options.tileHeight=512] - vertical tile size + * @param {string} [options.chromaSubsampling='4:4:4'] - set to '4:2:0' to use chroma subsampling + * @returns {Sharp} + * @throws {Error} Invalid options + */ +function jp2 (options) { + /* node:coverage ignore next 41 */ + if (!this.constructor.format.jp2k.output.buffer) { + throw errJp2Save(); + } + if (is.object(options)) { + if (is.defined(options.quality)) { + if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) { + this.options.jp2Quality = options.quality; + } else { + throw is.invalidParameterError('quality', 'integer between 1 and 100', options.quality); + } } - } else { - if (this.options.input.file === fileOut) { - const errOutputIsInput = new Error('Cannot use same file for input and output'); - if (is.fn(callback)) { - callback(errOutputIsInput); + if (is.defined(options.lossless)) { 
+ if (is.bool(options.lossless)) { + this.options.jp2Lossless = options.lossless; } else { - return Promise.reject(errOutputIsInput); + throw is.invalidParameterError('lossless', 'boolean', options.lossless); + } + } + if (is.defined(options.tileWidth)) { + if (is.integer(options.tileWidth) && is.inRange(options.tileWidth, 1, 32768)) { + this.options.jp2TileWidth = options.tileWidth; + } else { + throw is.invalidParameterError('tileWidth', 'integer between 1 and 32768', options.tileWidth); + } + } + if (is.defined(options.tileHeight)) { + if (is.integer(options.tileHeight) && is.inRange(options.tileHeight, 1, 32768)) { + this.options.jp2TileHeight = options.tileHeight; + } else { + throw is.invalidParameterError('tileHeight', 'integer between 1 and 32768', options.tileHeight); + } + } + if (is.defined(options.chromaSubsampling)) { + if (is.string(options.chromaSubsampling) && is.inArray(options.chromaSubsampling, ['4:2:0', '4:4:4'])) { + this.options.jp2ChromaSubsampling = options.chromaSubsampling; + } else { + throw is.invalidParameterError('chromaSubsampling', 'one of: 4:2:0, 4:4:4', options.chromaSubsampling); } - } else { - this.options.fileOut = fileOut; - return this._pipeline(callback); } } - return this; + return this._updateFormatOut('jp2', options); } /** - * Write output to a Buffer. - * JPEG, PNG, WebP, TIFF and RAW output are supported. - * By default, the format will match the input image, except GIF and SVG input which become PNG output. - * - * `callback`, if present, gets three arguments `(err, data, info)` where: - * - `err` is an error, if any. - * - `data` is the output image data. - * - `info` contains the output image `format`, `size` (bytes), `width`, `height`, - * `channels` and `premultiplied` (indicating if premultiplication was used). - * A Promise is returned when `callback` is not provided. + * Set animation options if available. 
+ * @private * - * @param {Object} [options] - * @param {Boolean} [options.resolveWithObject] Resolve the Promise with an Object containing `data` and `info` properties instead of resolving only with `data`. - * @param {Function} [callback] - * @returns {Promise} - when no callback is provided + * @param {Object} [source] - output options + * @param {number} [source.loop=0] - number of animation iterations, use 0 for infinite animation + * @param {number[]} [source.delay] - list of delays between animation frames (in milliseconds) + * @param {Object} [target] - target object for valid options + * @throws {Error} Invalid options */ -function toBuffer (options, callback) { - if (is.object(options)) { - if (is.bool(options.resolveWithObject)) { - this.options.resolveWithObject = options.resolveWithObject; +function trySetAnimationOptions (source, target) { + if (is.object(source) && is.defined(source.loop)) { + if (is.integer(source.loop) && is.inRange(source.loop, 0, 65535)) { + target.loop = source.loop; + } else { + throw is.invalidParameterError('loop', 'integer between 0 and 65535', source.loop); } } - return this._pipeline(is.fn(options) ? options : callback); -} - -/** - * Include all metadata (EXIF, XMP, IPTC) from the input image in the output image. - * The default behaviour, when `withMetadata` is not used, is to strip all metadata and convert to the device-independent sRGB colour space. - * This will also convert to and add a web-friendly sRGB ICC profile. - * @param {Object} [withMetadata] - * @param {Number} [withMetadata.orientation] value between 1 and 8, used to update the EXIF `Orientation` tag. - * @returns {Sharp} - * @throws {Error} Invalid parameters - */ -function withMetadata (withMetadata) { - this.options.withMetadata = is.bool(withMetadata) ? 
withMetadata : true; - if (is.object(withMetadata)) { - if (is.defined(withMetadata.orientation)) { - if (is.integer(withMetadata.orientation) && is.inRange(withMetadata.orientation, 1, 8)) { - this.options.withMetadataOrientation = withMetadata.orientation; - } else { - throw new Error('Invalid orientation (1 to 8) ' + withMetadata.orientation); - } + if (is.object(source) && is.defined(source.delay)) { + // We allow singular values as well + if (is.integer(source.delay) && is.inRange(source.delay, 0, 65535)) { + target.delay = [source.delay]; + } else if ( + Array.isArray(source.delay) && + source.delay.every(is.integer) && + source.delay.every(v => is.inRange(v, 0, 65535))) { + target.delay = source.delay; + } else { + throw is.invalidParameterError('delay', 'integer or an array of integers between 0 and 65535', source.delay); } } - return this; } /** - * Use these JPEG options for output image. + * Use these TIFF options for output image. + * + * The `density` can be set in pixels/inch via {@link #withmetadata withMetadata} + * instead of providing `xres` and `yres` in pixels/mm. + * + * @example + * // Convert SVG input to LZW-compressed, 1 bit per pixel TIFF output + * sharp('input.svg') + * .tiff({ + * compression: 'lzw', + * bitdepth: 1 + * }) + * .toFile('1-bpp-output.tiff') + * .then(info => { ... 
}); + * * @param {Object} [options] - output options - * @param {Number} [options.quality=80] - quality, integer 1-100 - * @param {Boolean} [options.progressive=false] - use progressive (interlace) scan - * @param {String} [options.chromaSubsampling='4:2:0'] - set to '4:4:4' to prevent chroma subsampling when quality <= 90 - * @param {Boolean} [options.trellisQuantisation=false] - apply trellis quantisation, requires mozjpeg - * @param {Boolean} [options.overshootDeringing=false] - apply overshoot deringing, requires mozjpeg - * @param {Boolean} [options.optimiseScans=false] - optimise progressive scans, forces progressive, requires mozjpeg - * @param {Boolean} [options.optimizeScans=false] - alternative spelling of optimiseScans - * @param {Boolean} [options.force=true] - force JPEG output, otherwise attempt to use input format + * @param {number} [options.quality=80] - quality, integer 1-100 + * @param {boolean} [options.force=true] - force TIFF output, otherwise attempt to use input format + * @param {string} [options.compression='jpeg'] - compression options: none, jpeg, deflate, packbits, ccittfax4, lzw, webp, zstd, jp2k + * @param {boolean} [options.bigtiff=false] - use BigTIFF variant (has no effect when compression is none) + * @param {string} [options.predictor='horizontal'] - compression predictor options: none, horizontal, float + * @param {boolean} [options.pyramid=false] - write an image pyramid + * @param {boolean} [options.tile=false] - write a tiled tiff + * @param {number} [options.tileWidth=256] - horizontal tile size + * @param {number} [options.tileHeight=256] - vertical tile size + * @param {number} [options.xres=1.0] - horizontal resolution in pixels/mm + * @param {number} [options.yres=1.0] - vertical resolution in pixels/mm + * @param {string} [options.resolutionUnit='inch'] - resolution unit options: inch, cm + * @param {number} [options.bitdepth=8] - reduce bitdepth to 1, 2 or 4 bit + * @param {boolean} [options.miniswhite=false] - write 
1-bit images as miniswhite * @returns {Sharp} * @throws {Error} Invalid options */ -function jpeg (options) { +function tiff (options) { if (is.object(options)) { if (is.defined(options.quality)) { if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) { - this.options.jpegQuality = options.quality; + this.options.tiffQuality = options.quality; } else { - throw new Error('Invalid quality (integer, 1-100) ' + options.quality); + throw is.invalidParameterError('quality', 'integer between 1 and 100', options.quality); } } - if (is.defined(options.progressive)) { - this._setBooleanOption('jpegProgressive', options.progressive); + if (is.defined(options.bitdepth)) { + if (is.integer(options.bitdepth) && is.inArray(options.bitdepth, [1, 2, 4, 8])) { + this.options.tiffBitdepth = options.bitdepth; + } else { + throw is.invalidParameterError('bitdepth', '1, 2, 4 or 8', options.bitdepth); + } } - if (is.defined(options.chromaSubsampling)) { - if (is.string(options.chromaSubsampling) && is.inArray(options.chromaSubsampling, ['4:2:0', '4:4:4'])) { - this.options.jpegChromaSubsampling = options.chromaSubsampling; + // tiling + if (is.defined(options.tile)) { + this._setBooleanOption('tiffTile', options.tile); + } + if (is.defined(options.tileWidth)) { + if (is.integer(options.tileWidth) && options.tileWidth > 0) { + this.options.tiffTileWidth = options.tileWidth; + } else { + throw is.invalidParameterError('tileWidth', 'integer greater than zero', options.tileWidth); + } + } + if (is.defined(options.tileHeight)) { + if (is.integer(options.tileHeight) && options.tileHeight > 0) { + this.options.tiffTileHeight = options.tileHeight; } else { - throw new Error('Invalid chromaSubsampling (4:2:0, 4:4:4) ' + options.chromaSubsampling); + throw is.invalidParameterError('tileHeight', 'integer greater than zero', options.tileHeight); } } - options.trellisQuantisation = is.bool(options.trellisQuantization) ? 
options.trellisQuantization : options.trellisQuantisation; - if (is.defined(options.trellisQuantisation)) { - this._setBooleanOption('jpegTrellisQuantisation', options.trellisQuantisation); + // miniswhite + if (is.defined(options.miniswhite)) { + this._setBooleanOption('tiffMiniswhite', options.miniswhite); } - if (is.defined(options.overshootDeringing)) { - this._setBooleanOption('jpegOvershootDeringing', options.overshootDeringing); + // pyramid + if (is.defined(options.pyramid)) { + this._setBooleanOption('tiffPyramid', options.pyramid); } - options.optimiseScans = is.bool(options.optimizeScans) ? options.optimizeScans : options.optimiseScans; - if (is.defined(options.optimiseScans)) { - this._setBooleanOption('jpegOptimiseScans', options.optimiseScans); - if (options.optimiseScans) { - this.options.jpegProgressive = true; + // resolution + if (is.defined(options.xres)) { + if (is.number(options.xres) && options.xres > 0) { + this.options.tiffXres = options.xres; + } else { + throw is.invalidParameterError('xres', 'number greater than zero', options.xres); } } - } - return this._updateFormatOut('jpeg', options); -} - -/** - * Use these PNG options for output image. 
- * @param {Object} [options]
- * @param {Boolean} [options.progressive=false] - use progressive (interlace) scan
- * @param {Number} [options.compressionLevel=6] - zlib compression level
- * @param {Boolean} [options.adaptiveFiltering=true] - use adaptive row filtering
- * @param {Boolean} [options.force=true] - force PNG output, otherwise attempt to use input format
- * @returns {Sharp}
- * @throws {Error} Invalid options
- */
-function png (options) {
-  if (is.object(options)) {
-    if (is.defined(options.progressive)) {
-      this._setBooleanOption('pngProgressive', options.progressive);
+    if (is.defined(options.yres)) {
+      if (is.number(options.yres) && options.yres > 0) {
+        this.options.tiffYres = options.yres;
+      } else {
+        throw is.invalidParameterError('yres', 'number greater than zero', options.yres);
+      }
     }
-    if (is.defined(options.compressionLevel)) {
-      if (is.integer(options.compressionLevel) && is.inRange(options.compressionLevel, 0, 9)) {
-        this.options.pngCompressionLevel = options.compressionLevel;
+    // compression
+    if (is.defined(options.compression)) {
+      if (is.string(options.compression) && is.inArray(options.compression, ['none', 'jpeg', 'deflate', 'packbits', 'ccittfax4', 'lzw', 'webp', 'zstd', 'jp2k'])) {
+        this.options.tiffCompression = options.compression;
       } else {
-        throw new Error('Invalid compressionLevel (integer, 0-9) ' + options.compressionLevel);
+        throw is.invalidParameterError('compression', 'one of: none, jpeg, deflate, packbits, ccittfax4, lzw, webp, zstd, jp2k', options.compression);
       }
     }
-    if (is.defined(options.adaptiveFiltering)) {
-      this._setBooleanOption('pngAdaptiveFiltering', options.adaptiveFiltering);
+    // bigtiff
+    if (is.defined(options.bigtiff)) {
+      this._setBooleanOption('tiffBigtiff', options.bigtiff);
+    }
+    // predictor
+    if (is.defined(options.predictor)) {
+      if (is.string(options.predictor) && is.inArray(options.predictor, ['none', 'horizontal', 'float'])) {
+        this.options.tiffPredictor = options.predictor;
+      } else {
+        throw is.invalidParameterError('predictor', 'one of: none, horizontal, float', options.predictor);
+      }
+    }
+    // resolutionUnit
+    if (is.defined(options.resolutionUnit)) {
+      if (is.string(options.resolutionUnit) && is.inArray(options.resolutionUnit, ['inch', 'cm'])) {
+        this.options.tiffResolutionUnit = options.resolutionUnit;
+      } else {
+        throw is.invalidParameterError('resolutionUnit', 'one of: inch, cm', options.resolutionUnit);
+      }
     }
   }
-  return this._updateFormatOut('png', options);
+  return this._updateFormatOut('tiff', options);
 }
 
 /**
- * Use these WebP options for output image.
+ * Use these AVIF options for output image.
+ *
+ * AVIF image sequences are not supported.
+ * Prebuilt binaries support a bitdepth of 8 only.
+ *
+ * This feature is experimental on the Windows ARM64 platform
+ * and requires a CPU with ARM64v8.4 or later.
+ *
+ * @example
+ * const data = await sharp(input)
+ *   .avif({ effort: 2 })
+ *   .toBuffer();
+ *
+ * @example
+ * const data = await sharp(input)
+ *   .avif({ lossless: true })
+ *   .toBuffer();
+ *
+ * @since 0.27.0
+ *
 * @param {Object} [options] - output options
- * @param {Number} [options.quality=80] - quality, integer 1-100
- * @param {Number} [options.alphaQuality=100] - quality of alpha layer, integer 0-100
- * @param {Boolean} [options.lossless=false] - use lossless compression mode
- * @param {Boolean} [options.nearLossless=false] - use near_lossless compression mode
- * @param {Boolean} [options.force=true] - force WebP output, otherwise attempt to use input format
+ * @param {number} [options.quality=50] - quality, integer 1-100
+ * @param {boolean} [options.lossless=false] - use lossless compression
+ * @param {number} [options.effort=4] - CPU effort, between 0 (fastest) and 9 (slowest)
+ * @param {string} [options.chromaSubsampling='4:4:4'] - set to '4:2:0' to use chroma subsampling
+ * @param {number} [options.bitdepth=8] - set bitdepth to 8, 10 or 12 bit
 * @returns {Sharp}
 * @throws {Error} Invalid options
 */
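Reviewer note: the `bitdepth` whitelist added to `tiff()` above accepts only 1, 2, 4 or 8. A dependency-free sketch of that check, with `isValidTiffBitdepth` as a hypothetical name used for illustration only (the diff itself inlines this via `is.integer`/`is.inArray`):

```javascript
// Mirrors the tiff() bitdepth validation above: only 1, 2, 4 or 8 pass.
// `isValidTiffBitdepth` is a hypothetical helper name, not part of this diff.
const isValidTiffBitdepth = (value) =>
  Number.isInteger(value) && [1, 2, 4, 8].includes(value);
```

Any other value makes `tiff()` throw via `is.invalidParameterError`.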
-function webp (options) {
-  if (is.object(options) && is.defined(options.quality)) {
-    if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) {
-      this.options.webpQuality = options.quality;
-    } else {
-      throw new Error('Invalid quality (integer, 1-100) ' + options.quality);
-    }
-  }
-  if (is.object(options) && is.defined(options.alphaQuality)) {
-    if (is.integer(options.alphaQuality) && is.inRange(options.alphaQuality, 1, 100)) {
-      this.options.webpAlphaQuality = options.alphaQuality;
-    } else {
-      throw new Error('Invalid webp alpha quality (integer, 1-100) ' + options.alphaQuality);
-    }
-  }
-  if (is.object(options) && is.defined(options.lossless)) {
-    this._setBooleanOption('webpLossless', options.lossless);
-  }
-  if (is.object(options) && is.defined(options.nearLossless)) {
-    this._setBooleanOption('webpNearLossless', options.nearLossless);
-  }
-  return this._updateFormatOut('webp', options);
+function avif (options) {
+  return this.heif({ ...options, compression: 'av1' });
 }
 
 /**
- * Use these TIFF options for output image.
- * @param {Object} [options] - output options
- * @param {Number} [options.quality=80] - quality, integer 1-100
- * @param {Boolean} [options.force=true] - force TIFF output, otherwise attempt to use input format
- * @param {Boolean} [options.compression='jpeg'] - compression options: lzw, deflate, jpeg
- * @param {Boolean} [options.predictor='none'] - compression predictor options: none, horizontal, float
- * @param {Number} [options.xres=1.0] - horizontal resolution in pixels/mm
- * @param {Number} [options.yres=1.0] - vertical resolution in pixels/mm
- * @param {Boolean} [options.squash=false] - squash 8-bit images down to 1 bit
+ * Use these HEIF options for output image.
+ *
+ * Support for patent-encumbered HEIC images using `hevc` compression requires the use of a
+ * globally-installed libvips compiled with support for libheif, libde265 and x265.
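Reviewer note: the new `avif()` above is a one-line delegate to `heif()`. A dependency-free sketch of the option forwarding it relies on (plain object spread, where later keys win); `toHeifOptions` is a hypothetical name for illustration:

```javascript
// avif() forwards its options to heif() with compression pinned to 'av1'.
// Because `compression: 'av1'` comes after the spread, it always wins,
// even if the caller passed their own compression value.
const toHeifOptions = (options) => ({ ...options, compression: 'av1' });
```

Consequently `avif({ compression: 'hevc' })` still selects AV1 compression.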
+ *
+ * @example
+ * const data = await sharp(input)
+ *   .heif({ compression: 'hevc' })
+ *   .toBuffer();
+ *
+ * @since 0.23.0
+ *
+ * @param {Object} options - output options
+ * @param {string} options.compression - compression format: av1, hevc
+ * @param {number} [options.quality=50] - quality, integer 1-100
+ * @param {boolean} [options.lossless=false] - use lossless compression
+ * @param {number} [options.effort=4] - CPU effort, between 0 (fastest) and 9 (slowest)
+ * @param {string} [options.chromaSubsampling='4:4:4'] - set to '4:2:0' to use chroma subsampling
+ * @param {number} [options.bitdepth=8] - set bitdepth to 8, 10 or 12 bit
 * @returns {Sharp}
 * @throws {Error} Invalid options
 */
-function tiff (options) {
-  if (is.object(options) && is.defined(options.quality)) {
-    if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) {
-      this.options.tiffQuality = options.quality;
+function heif (options) {
+  if (is.object(options)) {
+    if (is.string(options.compression) && is.inArray(options.compression, ['av1', 'hevc'])) {
+      this.options.heifCompression = options.compression;
     } else {
-      throw new Error('Invalid quality (integer, 1-100) ' + options.quality);
+      throw is.invalidParameterError('compression', 'one of: av1, hevc', options.compression);
     }
-  }
-  if (is.object(options) && is.defined(options.squash)) {
-    if (is.bool(options.squash)) {
-      this.options.tiffSquash = options.squash;
-    } else {
-      throw new Error('Invalid Value for squash ' + options.squash + ' Only Boolean Values allowed for options.squash.');
+    if (is.defined(options.quality)) {
+      if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) {
+        this.options.heifQuality = options.quality;
+      } else {
+        throw is.invalidParameterError('quality', 'integer between 1 and 100', options.quality);
+      }
     }
-  }
-  // resolution
-  if (is.object(options) && is.defined(options.xres)) {
-    if (is.number(options.xres)) {
-      this.options.tiffXres = options.xres;
-    } else {
-      throw new Error('Invalid Value for xres ' + options.xres + ' Only numeric values allowed for options.xres');
+    if (is.defined(options.lossless)) {
+      if (is.bool(options.lossless)) {
+        this.options.heifLossless = options.lossless;
+      } else {
+        throw is.invalidParameterError('lossless', 'boolean', options.lossless);
+      }
     }
-  }
-  if (is.object(options) && is.defined(options.yres)) {
-    if (is.number(options.yres)) {
-      this.options.tiffYres = options.yres;
-    } else {
-      throw new Error('Invalid Value for yres ' + options.yres + ' Only numeric values allowed for options.yres');
+    if (is.defined(options.effort)) {
+      if (is.integer(options.effort) && is.inRange(options.effort, 0, 9)) {
+        this.options.heifEffort = options.effort;
+      } else {
+        throw is.invalidParameterError('effort', 'integer between 0 and 9', options.effort);
+      }
     }
-  }
-  // compression
-  if (is.defined(options) && is.defined(options.compression)) {
-    if (is.string(options.compression) && is.inArray(options.compression, ['lzw', 'deflate', 'jpeg', 'none'])) {
-      this.options.tiffCompression = options.compression;
-    } else {
-      const message = `Invalid compression option "${options.compression}". Should be one of: lzw, deflate, jpeg, none`;
-      throw new Error(message);
+    if (is.defined(options.chromaSubsampling)) {
+      if (is.string(options.chromaSubsampling) && is.inArray(options.chromaSubsampling, ['4:2:0', '4:4:4'])) {
+        this.options.heifChromaSubsampling = options.chromaSubsampling;
+      } else {
+        throw is.invalidParameterError('chromaSubsampling', 'one of: 4:2:0, 4:4:4', options.chromaSubsampling);
+      }
     }
-  }
-  // predictor
-  if (is.defined(options) && is.defined(options.predictor)) {
-    if (is.string(options.predictor) && is.inArray(options.predictor, ['none', 'horizontal', 'float'])) {
-      this.options.tiffPredictor = options.predictor;
-    } else {
-      const message = `Invalid predictor option "${options.predictor}". Should be one of: none, horizontal, float`;
-      throw new Error(message);
+    if (is.defined(options.bitdepth)) {
+      if (is.integer(options.bitdepth) && is.inArray(options.bitdepth, [8, 10, 12])) {
+        if (options.bitdepth !== 8 && this.constructor.versions.heif) {
+          throw is.invalidParameterError('bitdepth when using prebuilt binaries', 8, options.bitdepth);
+        }
+        this.options.heifBitdepth = options.bitdepth;
+      } else {
+        throw is.invalidParameterError('bitdepth', '8, 10 or 12', options.bitdepth);
+      }
     }
+  } else {
+    throw is.invalidParameterError('options', 'Object', options);
   }
-  return this._updateFormatOut('tiff', options);
+  return this._updateFormatOut('heif', options);
 }
 
 /**
- * Force output to be raw, uncompressed uint8 pixel data.
+ * Use these JPEG-XL (JXL) options for output image.
+ *
+ * This feature is experimental, please do not use in production systems.
+ *
+ * Requires libvips compiled with support for libjxl.
+ * The prebuilt binaries do not include this - see
+ * {@link /install/#custom-libvips installing a custom libvips}.
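Reviewer note: the `heif()` bitdepth handling above layers two checks: a general 8/10/12 whitelist, then a stricter guard that rejects anything but 8 when a bundled libheif is present. A standalone sketch, with a hypothetical `versions` argument standing in for `this.constructor.versions`:

```javascript
// Sketch of the heif() bitdepth guard above: 8, 10 or 12 are valid in general,
// but only 8 when a bundled libheif is detected (versions.heif is set).
// `checkHeifBitdepth` is a hypothetical name for illustration.
const checkHeifBitdepth = (bitdepth, versions) => {
  if (!Number.isInteger(bitdepth) || ![8, 10, 12].includes(bitdepth)) {
    throw new Error(`Expected 8, 10 or 12 for bitdepth but received ${bitdepth}`);
  }
  if (bitdepth !== 8 && versions.heif) {
    throw new Error(`Expected 8 for bitdepth when using prebuilt binaries but received ${bitdepth}`);
  }
  return bitdepth;
};
```

With a globally-installed libvips (no bundled libheif version recorded), 10- and 12-bit output passes the guard.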
+ *
+ * @since 0.31.3
+ *
+ * @param {Object} [options] - output options
+ * @param {number} [options.distance=1.0] - maximum encoding error, between 0 (highest quality) and 15 (lowest quality)
+ * @param {number} [options.quality] - calculate `distance` based on JPEG-like quality, between 1 and 100, overrides distance if specified
+ * @param {number} [options.decodingTier=0] - target decode speed tier, between 0 (highest quality) and 4 (lowest quality)
+ * @param {boolean} [options.lossless=false] - use lossless compression
+ * @param {number} [options.effort=7] - CPU effort, between 1 (fastest) and 9 (slowest)
+ * @param {number} [options.loop=0] - number of animation iterations, use 0 for infinite animation
+ * @param {number|number[]} [options.delay] - delay(s) between animation frames (in milliseconds)
 * @returns {Sharp}
+ * @throws {Error} Invalid options
 */
-function raw () {
-  return this._updateFormatOut('raw');
+function jxl (options) {
+  if (is.object(options)) {
+    if (is.defined(options.quality)) {
+      if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) {
+        // https://github.com/libjxl/libjxl/blob/0aeea7f180bafd6893c1db8072dcb67d2aa5b03d/tools/cjxl_main.cc#L640-L644
+        this.options.jxlDistance = options.quality >= 30
+          ? 0.1 + (100 - options.quality) * 0.09
+          : 53 / 3000 * options.quality * options.quality - 23 / 20 * options.quality + 25;
+      } else {
+        throw is.invalidParameterError('quality', 'integer between 1 and 100', options.quality);
+      }
+    } else if (is.defined(options.distance)) {
+      if (is.number(options.distance) && is.inRange(options.distance, 0, 15)) {
+        this.options.jxlDistance = options.distance;
+      } else {
+        throw is.invalidParameterError('distance', 'number between 0.0 and 15.0', options.distance);
+      }
+    }
+    if (is.defined(options.decodingTier)) {
+      if (is.integer(options.decodingTier) && is.inRange(options.decodingTier, 0, 4)) {
+        this.options.jxlDecodingTier = options.decodingTier;
+      } else {
+        throw is.invalidParameterError('decodingTier', 'integer between 0 and 4', options.decodingTier);
+      }
+    }
+    if (is.defined(options.lossless)) {
+      if (is.bool(options.lossless)) {
+        this.options.jxlLossless = options.lossless;
+      } else {
+        throw is.invalidParameterError('lossless', 'boolean', options.lossless);
+      }
+    }
+    if (is.defined(options.effort)) {
+      if (is.integer(options.effort) && is.inRange(options.effort, 1, 9)) {
+        this.options.jxlEffort = options.effort;
+      } else {
+        throw is.invalidParameterError('effort', 'integer between 1 and 9', options.effort);
+      }
+    }
+  }
+  trySetAnimationOptions(options, this.options);
+  return this._updateFormatOut('jxl', options);
 }
 
 /**
- * Force output to a given format.
- * @param {(String|Object)} format - as a String or an Object with an 'id' attribute
- * @param {Object} options - output options
+ * Force output to be raw, uncompressed pixel data.
+ * Pixel ordering is left-to-right, top-to-bottom, without padding.
+ * Channel ordering will be RGB or RGBA for non-greyscale colourspaces.
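Reviewer note: the `quality` branch in `jxl()` above converts a JPEG-like quality into a libjxl distance using a piecewise formula borrowed from the cjxl tool (linked in the code comment). Extracted as a pure function so the behaviour is easy to eyeball:

```javascript
// The JPEG-like `quality` to libjxl `distance` mapping used by jxl() above.
// quality >= 30 maps linearly; below 30 a quadratic gives a steeper ramp.
const qualityToDistance = (quality) =>
  quality >= 30
    ? 0.1 + (100 - quality) * 0.09
    : 53 / 3000 * quality * quality - 23 / 20 * quality + 25;
```

So quality 100 yields distance 0.1 (near-lossless) and lower qualities yield progressively larger distances.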
+ *
+ * @example
+ * // Extract raw, unsigned 8-bit RGB pixel data from JPEG input
+ * const { data, info } = await sharp('input.jpg')
+ *   .raw()
+ *   .toBuffer({ resolveWithObject: true });
+ *
+ * @example
+ * // Extract alpha channel as raw, unsigned 16-bit pixel data from PNG input
+ * const data = await sharp('input.png')
+ *   .ensureAlpha()
+ *   .extractChannel(3)
+ *   .toColourspace('b-w')
+ *   .raw({ depth: 'ushort' })
+ *   .toBuffer();
+ *
+ * @param {Object} [options] - output options
+ * @param {string} [options.depth='uchar'] - bit depth, one of: char, uchar (default), short, ushort, int, uint, float, complex, double, dpcomplex
 * @returns {Sharp}
- * @throws {Error} unsupported format or options
+ * @throws {Error} Invalid options
 */
-function toFormat (format, options) {
-  if (is.object(format) && is.string(format.id)) {
-    format = format.id;
-  }
-  if (format === 'jpg') format = 'jpeg';
-  if (!is.inArray(format, ['jpeg', 'png', 'webp', 'tiff', 'raw'])) {
-    throw new Error('Unsupported output format ' + format);
+function raw (options) {
+  if (is.object(options)) {
+    if (is.defined(options.depth)) {
+      if (is.string(options.depth) && is.inArray(options.depth,
+        ['char', 'uchar', 'short', 'ushort', 'int', 'uint', 'float', 'complex', 'double', 'dpcomplex']
+      )) {
+        this.options.rawDepth = options.depth;
+      } else {
+        throw is.invalidParameterError('depth', 'one of: char, uchar, short, ushort, int, uint, float, complex, double, dpcomplex', options.depth);
+      }
+    }
   }
-  return this[format](options);
+  return this._updateFormatOut('raw');
 }
 
 /**
  * Use tile-based deep zoom (image pyramid) output.
+ *
  * Set the format and options for tile images via the `toFormat`, `jpeg`, `png` or `webp` functions.
  * Use a `.zip` or `.szi` file extension with `toFile` to write to a compressed archive file format.
  *
+ * The container will be set to `zip` when the output is a Buffer or Stream, otherwise it will default to `fs`.
+ *
 * @example
 * sharp('input.tiff')
 *   .png()
@@ -313,49 +1319,117 @@ function toFormat (format, options) {
 *     // output_files contains 512x512 tiles grouped by zoom level
 *   });
 *
- * @param {Object} [tile]
- * @param {Number} [tile.size=256] tile size in pixels, a value between 1 and 8192.
- * @param {Number} [tile.overlap=0] tile overlap in pixels, a value between 0 and 8192.
- * @param {String} [tile.container='fs'] tile container, with value `fs` (filesystem) or `zip` (compressed file).
- * @param {String} [tile.layout='dz'] filesystem layout, possible values are `dz`, `zoomify` or `google`.
+ * @example
+ * const zipFileWithTiles = await sharp(input)
+ *   .tile({ basename: "tiles" })
+ *   .toBuffer();
+ *
+ * @example
+ * const iiififier = sharp().tile({ layout: "iiif" });
+ * readableStream
+ *   .pipe(iiififier)
+ *   .pipe(writeableStream);
+ *
+ * @param {Object} [options]
+ * @param {number} [options.size=256] tile size in pixels, a value between 1 and 8192.
+ * @param {number} [options.overlap=0] tile overlap in pixels, a value between 0 and 8192.
+ * @param {number} [options.angle=0] tile angle of rotation, must be a multiple of 90.
+ * @param {string|Object} [options.background={r: 255, g: 255, b: 255, alpha: 1}] - background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to white without transparency.
+ * @param {string} [options.depth] how deep to make the pyramid, possible values are `onepixel`, `onetile` or `one`, default based on layout.
+ * @param {number} [options.skipBlanks=-1] Threshold to skip tile generation. Range is 0-255 for 8-bit images, 0-65535 for 16-bit images. Default is 5 for `google` layout, -1 (no skip) otherwise.
+ * @param {string} [options.container='fs'] tile container, with value `fs` (filesystem) or `zip` (compressed file).
+ * @param {string} [options.layout='dz'] filesystem layout, possible values are `dz`, `iiif`, `iiif3`, `zoomify` or `google`.
+ * @param {boolean} [options.centre=false] centre image in tile.
+ * @param {boolean} [options.center=false] alternative spelling of centre.
+ * @param {string} [options.id='https://example.com/iiif'] when `layout` is `iiif`/`iiif3`, sets the `@id`/`id` attribute of `info.json`
+ * @param {string} [options.basename] the name of the directory within the zip file when container is `zip`.
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
-function tile (tile) {
-  if (is.object(tile)) {
+function tile (options) {
+  if (is.object(options)) {
     // Size of square tiles, in pixels
-    if (is.defined(tile.size)) {
-      if (is.integer(tile.size) && is.inRange(tile.size, 1, 8192)) {
-        this.options.tileSize = tile.size;
+    if (is.defined(options.size)) {
+      if (is.integer(options.size) && is.inRange(options.size, 1, 8192)) {
+        this.options.tileSize = options.size;
       } else {
-        throw new Error('Invalid tile size (1 to 8192) ' + tile.size);
+        throw is.invalidParameterError('size', 'integer between 1 and 8192', options.size);
      }
    }
    // Overlap of tiles, in pixels
-    if (is.defined(tile.overlap)) {
-      if (is.integer(tile.overlap) && is.inRange(tile.overlap, 0, 8192)) {
-        if (tile.overlap > this.options.tileSize) {
-          throw new Error('Tile overlap ' + tile.overlap + ' cannot be larger than tile size ' + this.options.tileSize);
+    if (is.defined(options.overlap)) {
+      if (is.integer(options.overlap) && is.inRange(options.overlap, 0, 8192)) {
+        if (options.overlap > this.options.tileSize) {
+          throw is.invalidParameterError('overlap', `<= size (${this.options.tileSize})`, options.overlap);
        }
-        this.options.tileOverlap = tile.overlap;
+        this.options.tileOverlap = options.overlap;
      } else {
-        throw new Error('Invalid tile overlap (0 to 8192) ' + tile.overlap);
+        throw is.invalidParameterError('overlap', 'integer between 0 and 8192', options.overlap);
      }
    }
    // Container
-    if (is.defined(tile.container)) {
-      if (is.string(tile.container) && is.inArray(tile.container, ['fs', 'zip'])) {
-        this.options.tileContainer = tile.container;
+    if (is.defined(options.container)) {
+      if (is.string(options.container) && is.inArray(options.container, ['fs', 'zip'])) {
+        this.options.tileContainer = options.container;
      } else {
-        throw new Error('Invalid tile container ' + tile.container);
+        throw is.invalidParameterError('container', 'one of: fs, zip', options.container);
      }
    }
    // Layout
-    if (is.defined(tile.layout)) {
-      if (is.string(tile.layout) && is.inArray(tile.layout, ['dz', 'google', 'zoomify'])) {
-        this.options.tileLayout = tile.layout;
+    if (is.defined(options.layout)) {
+      if (is.string(options.layout) && is.inArray(options.layout, ['dz', 'google', 'iiif', 'iiif3', 'zoomify'])) {
+        this.options.tileLayout = options.layout;
+      } else {
+        throw is.invalidParameterError('layout', 'one of: dz, google, iiif, iiif3, zoomify', options.layout);
+      }
+    }
+    // Angle of rotation,
+    if (is.defined(options.angle)) {
+      if (is.integer(options.angle) && !(options.angle % 90)) {
+        this.options.tileAngle = options.angle;
      } else {
-        throw new Error('Invalid tile layout ' + tile.layout);
+        throw is.invalidParameterError('angle', 'positive/negative multiple of 90', options.angle);
+      }
+    }
+    // Background colour
+    this._setBackgroundColourOption('tileBackground', options.background);
+    // Depth of tiles
+    if (is.defined(options.depth)) {
+      if (is.string(options.depth) && is.inArray(options.depth, ['onepixel', 'onetile', 'one'])) {
+        this.options.tileDepth = options.depth;
+      } else {
+        throw is.invalidParameterError('depth', 'one of: onepixel, onetile, one', options.depth);
+      }
+    }
+    // Threshold to skip blank tiles
+    if (is.defined(options.skipBlanks)) {
+      if (is.integer(options.skipBlanks) && is.inRange(options.skipBlanks, -1, 65535)) {
+        this.options.tileSkipBlanks = options.skipBlanks;
+      } else {
+        throw is.invalidParameterError('skipBlanks', 'integer between -1 and 255/65535', options.skipBlanks);
+      }
+    } else if (is.defined(options.layout) && options.layout === 'google') {
+      this.options.tileSkipBlanks = 5;
+    }
+    // Center image in tile
+    const centre = is.bool(options.center) ? options.center : options.centre;
+    if (is.defined(centre)) {
+      this._setBooleanOption('tileCentre', centre);
+    }
+    // @id attribute for IIIF layout
+    if (is.defined(options.id)) {
+      if (is.string(options.id)) {
+        this.options.tileId = options.id;
+      } else {
+        throw is.invalidParameterError('id', 'string', options.id);
+      }
+    }
+    // Basename for zip container
+    if (is.defined(options.basename)) {
+      if (is.string(options.basename)) {
+        this.options.tileBasename = options.basename;
+      } else {
+        throw is.invalidParameterError('basename', 'string', options.basename);
      }
    }
  }
@@ -363,37 +1437,75 @@ function tile (tile) {
   if (is.inArray(this.options.formatOut, ['jpeg', 'png', 'webp'])) {
     this.options.tileFormat = this.options.formatOut;
   } else if (this.options.formatOut !== 'input') {
-    throw new Error('Invalid tile format ' + this.options.formatOut);
+    throw is.invalidParameterError('format', 'one of: jpeg, png, webp', this.options.formatOut);
   }
   return this._updateFormatOut('dz');
 }
 
+/**
+ * Set a timeout for processing, in seconds.
+ * Use a value of zero to continue processing indefinitely, the default behaviour.
+ *
+ * The clock starts when libvips opens an input image for processing.
+ * Time spent waiting for a libuv thread to become available is not included.
+ *
+ * @example
+ * // Ensure processing takes no longer than 3 seconds
+ * try {
+ *   const data = await sharp(input)
+ *     .blur(1000)
+ *     .timeout({ seconds: 3 })
+ *     .toBuffer();
+ * } catch (err) {
+ *   if (err.message.includes('timeout')) { ... }
+ * }
+ *
+ * @since 0.29.2
+ *
+ * @param {Object} options
+ * @param {number} options.seconds - Number of seconds after which processing will be stopped
+ * @returns {Sharp}
+ */
+function timeout (options) {
+  if (!is.plainObject(options)) {
+    throw is.invalidParameterError('options', 'object', options);
+  }
+  if (is.integer(options.seconds) && is.inRange(options.seconds, 0, 3600)) {
+    this.options.timeoutSeconds = options.seconds;
+  } else {
+    throw is.invalidParameterError('seconds', 'integer between 0 and 3600', options.seconds);
+  }
+  return this;
+}
+
 /**
  * Update the output format unless options.force is false,
  * in which case revert to input format.
  * @private
- * @param {String} formatOut
+ * @param {string} formatOut
  * @param {Object} [options]
- * @param {Boolean} [options.force=true] - force output format, otherwise attempt to use input format
+ * @param {boolean} [options.force=true] - force output format, otherwise attempt to use input format
  * @returns {Sharp}
  */
 function _updateFormatOut (formatOut, options) {
-  this.options.formatOut = (is.object(options) && options.force === false) ? 'input' : formatOut;
+  if (!(is.object(options) && options.force === false)) {
+    this.options.formatOut = formatOut;
+  }
   return this;
 }
 
 /**
- * Update a Boolean attribute of the this.options Object.
+ * Update a boolean attribute of the this.options Object.
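Reviewer note: the new `timeout()` above accepts only a plain object carrying an integer `seconds` in 0-3600, with zero meaning "no timeout". A dependency-free sketch of that validation; `parseTimeoutSeconds` is a hypothetical name standing in for the inline `is.*` checks:

```javascript
// Zero means "no timeout"; non-integers and values outside 0-3600 are rejected,
// as are non-object inputs (mirroring the is.plainObject guard above).
const parseTimeoutSeconds = (options) => {
  if (typeof options !== 'object' || options === null || Array.isArray(options)) {
    throw new Error(`Expected object for options but received ${options}`);
  }
  const { seconds } = options;
  if (!Number.isInteger(seconds) || seconds < 0 || seconds > 3600) {
    throw new Error(`Expected integer between 0 and 3600 for seconds but received ${seconds}`);
  }
  return seconds;
};
```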
 * @private
- * @param {String} key
- * @param {Boolean} val
+ * @param {string} key
+ * @param {boolean} val
 * @throws {Error} Invalid key
 */
 function _setBooleanOption (key, val) {
   if (is.bool(val)) {
     this.options[key] = val;
   } else {
-    throw new Error('Invalid ' + key + ' (boolean) ' + val);
+    throw is.invalidParameterError(key, 'boolean', val);
   }
 }
 
@@ -404,7 +1516,8 @@ function _setBooleanOption (key, val) {
 function _read () {
   if (!this.options.streamOut) {
     this.options.streamOut = true;
-    this._pipeline();
+    const stack = Error();
+    this._pipeline(undefined, stack);
   }
 }
 
@@ -413,60 +1526,63 @@
  * Supports callback, stream and promise variants
  * @private
  */
-function _pipeline (callback) {
-  const that = this;
+function _pipeline (callback, stack) {
   if (typeof callback === 'function') {
     // output=file/buffer
     if (this._isStreamInput()) {
       // output=file/buffer, input=stream
-      this.on('finish', function () {
-        that._flattenBufferIn();
-        sharp.pipeline(that.options, callback);
+      this.on('finish', () => {
+        this._flattenBufferIn();
+        sharp.pipeline(this.options, (err, data, info) => {
+          if (err) {
+            callback(is.nativeError(err, stack));
+          } else {
+            callback(null, data, info);
+          }
+        });
       });
     } else {
       // output=file/buffer, input=file/buffer
-      sharp.pipeline(this.options, callback);
+      sharp.pipeline(this.options, (err, data, info) => {
+        if (err) {
+          callback(is.nativeError(err, stack));
+        } else {
+          callback(null, data, info);
+        }
+      });
     }
     return this;
   } else if (this.options.streamOut) {
     // output=stream
     if (this._isStreamInput()) {
       // output=stream, input=stream
-      if (this.streamInFinished) {
+      this.once('finish', () => {
         this._flattenBufferIn();
-        sharp.pipeline(this.options, function (err, data, info) {
+        sharp.pipeline(this.options, (err, data, info) => {
           if (err) {
-            that.emit('error', err);
+            this.emit('error', is.nativeError(err, stack));
           } else {
-            that.emit('info', info);
-            that.push(data);
+            this.emit('info', info);
+            this.push(data);
           }
-          that.push(null);
-        });
-      } else {
-        this.on('finish', function () {
-          that._flattenBufferIn();
-          sharp.pipeline(that.options, function (err, data, info) {
-            if (err) {
-              that.emit('error', err);
-            } else {
-              that.emit('info', info);
-              that.push(data);
-            }
-            that.push(null);
-          });
+          this.push(null);
+          this.on('end', () => this.emit('close'));
         });
+      });
+      if (this.streamInFinished) {
+        this.emit('finish');
       }
     } else {
       // output=stream, input=file/buffer
-      sharp.pipeline(this.options, function (err, data, info) {
+      sharp.pipeline(this.options, (err, data, info) => {
         if (err) {
-          that.emit('error', err);
+          this.emit('error', is.nativeError(err, stack));
         } else {
-          that.emit('info', info);
-          that.push(data);
+          this.emit('info', info);
+          this.push(data);
         }
-        that.push(null);
+        this.push(null);
+        this.on('end', () => this.emit('close'));
       });
     }
     return this;
@@ -474,15 +1590,15 @@ function _pipeline (callback) {
     // output=promise
     if (this._isStreamInput()) {
       // output=promise, input=stream
-      return new Promise(function (resolve, reject) {
-        that.on('finish', function () {
-          that._flattenBufferIn();
-          sharp.pipeline(that.options, function (err, data, info) {
+      return new Promise((resolve, reject) => {
+        this.once('finish', () => {
+          this._flattenBufferIn();
+          sharp.pipeline(this.options, (err, data, info) => {
            if (err) {
-              reject(err);
+              reject(is.nativeError(err, stack));
            } else {
-              if (that.options.resolveWithObject) {
-                resolve({ data: data, info: info });
+              if (this.options.resolveWithObject) {
+                resolve({ data, info });
              } else {
                resolve(data);
              }
@@ -492,13 +1608,13 @@ function _pipeline (callback) {
       });
     } else {
       // output=promise, input=file/buffer
-      return new Promise(function (resolve, reject) {
-        sharp.pipeline(that.options, function (err, data, info) {
+      return new Promise((resolve, reject) => {
+        sharp.pipeline(this.options, (err, data, info) => {
          if (err) {
-            reject(err);
+            reject(is.nativeError(err, stack));
          } else {
-            if (that.options.resolveWithObject) {
-              resolve({ data: data, info: info });
+            if (this.options.resolveWithObject) {
+              resolve({ data, info });
            } else {
              resolve(data);
            }
@@ -511,27 +1627,40 @@
 
 /**
  * Decorate the Sharp prototype with output-related functions.
+ * @module Sharp
  * @private
 */
-module.exports = function (Sharp) {
-  [
+module.exports = (Sharp) => {
+  Object.assign(Sharp.prototype, {
    // Public
    toFile,
    toBuffer,
+    keepExif,
+    withExif,
+    withExifMerge,
+    keepIccProfile,
+    withIccProfile,
+    keepXmp,
+    withXmp,
+    keepMetadata,
    withMetadata,
+    toFormat,
    jpeg,
+    jp2,
    png,
    webp,
    tiff,
+    avif,
+    heif,
+    jxl,
+    gif,
    raw,
-    toFormat,
    tile,
+    timeout,
    // Private
    _updateFormatOut,
    _setBooleanOption,
    _read,
    _pipeline
-  ].forEach(function (f) {
-    Sharp.prototype[f.name] = f;
  });
 };
diff --git a/lib/resize.js b/lib/resize.js
index dcb0710c6..544fbba3a 100644
--- a/lib/resize.js
+++ b/lib/resize.js
@@ -1,9 +1,12 @@
-'use strict';
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
 const is = require('./is');
 
 /**
- * Weighting to apply to image crop.
+ * Weighting to apply when using contain/cover fit.
 * @member
 * @private
 */
@@ -21,7 +24,35 @@
 };
 
 /**
- * Strategies for automagic crop behaviour.
+ * Position to apply when using contain/cover fit.
+ * @member
+ * @private
+ */
+const position = {
+  top: 1,
+  right: 2,
+  bottom: 3,
+  left: 4,
+  'right top': 5,
+  'right bottom': 6,
+  'left bottom': 7,
+  'left top': 8
+};
+
+/**
+ * How to extend the image.
+ * @member
+ * @private
+ */
+const extendWith = {
+  background: 'background',
+  copy: 'copy',
+  repeat: 'repeat',
+  mirror: 'mirror'
+};
+
+/**
+ * Strategies for automagic cover behaviour.
 * @member
 * @private
 */
@@ -37,76 +68,207 @@
 */
 const kernel = {
   nearest: 'nearest',
+  linear: 'linear',
   cubic: 'cubic',
+  mitchell: 'mitchell',
   lanczos2: 'lanczos2',
-  lanczos3: 'lanczos3'
+  lanczos3: 'lanczos3',
+  mks2013: 'mks2013',
+  mks2021: 'mks2021'
 };
 
 /**
- * Enlargement interpolators.
+ * Methods by which an image can be resized to fit the provided dimensions.
 * @member
 * @private
 */
-const interpolator = {
-  nearest: 'nearest',
-  bilinear: 'bilinear',
-  bicubic: 'bicubic',
-  nohalo: 'nohalo',
-  lbb: 'lbb',
-  locallyBoundedBicubic: 'lbb',
-  vsqbs: 'vsqbs',
-  vertexSplitQuadraticBasisSpline: 'vsqbs'
+const fit = {
+  contain: 'contain',
+  cover: 'cover',
+  fill: 'fill',
+  inside: 'inside',
+  outside: 'outside'
+};
+
+/**
+ * Map external fit property to internal canvas property.
+ * @member
+ * @private
+ */
+const mapFitToCanvas = {
+  contain: 'embed',
+  cover: 'crop',
+  fill: 'ignore_aspect',
+  inside: 'max',
+  outside: 'min'
 };
 
 /**
- * Resize image to `width` x `height`.
- * By default, the resized image is centre cropped to the exact size specified.
+ * @private
+ */
+function isRotationExpected (options) {
+  return (options.angle % 360) !== 0 || options.rotationAngle !== 0;
+}
+
+/**
+ * @private
+ */
+function isResizeExpected (options) {
+  return options.width !== -1 || options.height !== -1;
+}
+
+/**
+ * Resize image to `width`, `height` or `width x height`.
+ *
+ * When both a `width` and `height` are provided, the possible methods by which the image should **fit** these are:
+ * - `cover`: (default) Preserving aspect ratio, attempt to ensure the image covers both provided dimensions by cropping/clipping to fit.
+ * - `contain`: Preserving aspect ratio, contain within both provided dimensions using "letterboxing" where necessary.
+ * - `fill`: Ignore the aspect ratio of the input and stretch to both provided dimensions.
+ * - `inside`: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified. + * - `outside`: Preserving aspect ratio, resize the image to be as small as possible while ensuring its dimensions are greater than or equal to both those specified. + * + * Some of these values are based on the [object-fit](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit) CSS property. * - * Possible reduction kernels are: + * Examples of various values for the fit property when resizing + * + * When using a **fit** of `cover` or `contain`, the default **position** is `centre`. Other options are: + * - `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`. + * - `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`. + * - `sharp.strategy`: `cover` only, dynamically crop using either the `entropy` or `attention` strategy. + * + * Some of these values are based on the [object-position](https://developer.mozilla.org/en-US/docs/Web/CSS/object-position) CSS property. + * + * The strategy-based approach initially resizes so one dimension is at its target length + * then repeatedly ranks edge regions, discarding the edge with the lowest score based on the selected strategy. + * - `entropy`: focus on the region with the highest [Shannon entropy](https://en.wikipedia.org/wiki/Entropy_%28information_theory%29). + * - `attention`: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones. + * + * Possible downsizing kernels are: * - `nearest`: Use [nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). + * - `linear`: Use a [triangle filter](https://en.wikipedia.org/wiki/Triangular_function). 
* - `cubic`: Use a [Catmull-Rom spline](https://en.wikipedia.org/wiki/Centripetal_Catmull%E2%80%93Rom_spline). + * - `mitchell`: Use a [Mitchell-Netravali spline](https://www.cs.utexas.edu/~fussell/courses/cs384g-fall2013/lectures/mitchell/Mitchell.pdf). * - `lanczos2`: Use a [Lanczos kernel](https://en.wikipedia.org/wiki/Lanczos_resampling#Lanczos_kernel) with `a=2`. * - `lanczos3`: Use a Lanczos kernel with `a=3` (the default). + * - `mks2013`: Use a [Magic Kernel Sharp](https://johncostella.com/magic/mks.pdf) 2013 kernel, as adopted by Facebook. + * - `mks2021`: Use a Magic Kernel Sharp 2021 kernel, with more accurate (reduced) sharpening than the 2013 version. * - * Possible enlargement interpolators are: - * - `nearest`: Use [nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). - * - `bilinear`: Use [bilinear interpolation](http://en.wikipedia.org/wiki/Bilinear_interpolation), faster than bicubic but with less smooth results. - * - `vertexSplitQuadraticBasisSpline`: Use the smoother [VSQBS interpolation](https://github.com/jcupitt/libvips/blob/master/libvips/resample/vsqbs.cpp#L48) to prevent "staircasing" when enlarging. - * - `bicubic`: Use [bicubic interpolation](http://en.wikipedia.org/wiki/Bicubic_interpolation) (the default). - * - `locallyBoundedBicubic`: Use [LBB interpolation](https://github.com/jcupitt/libvips/blob/master/libvips/resample/lbb.cpp#L100), which prevents some "[acutance](http://en.wikipedia.org/wiki/Acutance)" but typically reduces performance by a factor of 2. - * - `nohalo`: Use [Nohalo interpolation](http://eprints.soton.ac.uk/268086/), which prevents acutance but typically reduces performance by a factor of 3. + * When upsampling, these kernels map to `nearest`, `linear` and `cubic` interpolators. + * Downsampling kernels without a matching upsampling interpolator map to `cubic`. + * + * Only one resize can occur per pipeline. 
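The kernel table above doubles as an allow-list: `resize()` accepts a kernel name only if it appears there. A standalone sketch of that validation, outside the sharp pipeline (the function name and error text are illustrative, not the library's exact internals):

```javascript
// Allow-list of resampling kernels, as documented above.
const kernel = {
  nearest: 'nearest',
  linear: 'linear',
  cubic: 'cubic',
  mitchell: 'mitchell',
  lanczos2: 'lanczos2',
  lanczos3: 'lanczos3',
  mks2013: 'mks2013',
  mks2021: 'mks2021'
};

// Resolve a kernel option to its canonical value, rejecting unknown names.
function resolveKernel (name) {
  if (typeof kernel[name] === 'string') {
    return kernel[name];
  }
  throw new Error(`Expected valid kernel name but received ${name}`);
}
```

This mirrors the `is.string(kernel[options.kernel])` check performed inside `resize()`.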
+ * Previous calls to `resize` in the same pipeline will be ignored. * * @example - * sharp(inputBuffer) + * sharp(input) + * .resize({ width: 100 }) + * .toBuffer() + * .then(data => { + * // 100 pixels wide, auto-scaled height + * }); + * + * @example + * sharp(input) + * .resize({ height: 100 }) + * .toBuffer() + * .then(data => { + * // 100 pixels high, auto-scaled width + * }); + * + * @example + * sharp(input) * .resize(200, 300, { - * kernel: sharp.kernel.lanczos2, - * interpolator: sharp.interpolator.nohalo + * kernel: sharp.kernel.nearest, + * fit: 'contain', + * position: 'right top', + * background: { r: 255, g: 255, b: 255, alpha: 0.5 } * }) - * .background('white') - * .embed() - * .toFile('output.tiff') - * .then(function() { - * // output.tiff is a 200 pixels wide and 300 pixels high image - * // containing a lanczos2/nohalo scaled version, embedded on a white canvas, - * // of the image data in inputBuffer + * .toFile('output.png') + * .then(() => { + * // output.png is a 200 pixels wide and 300 pixels high image + * // containing a nearest-neighbour scaled version + * // contained within the north-east corner of a semi-transparent white canvas * }); * - * @param {Number} [width] - pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height. - * @param {Number} [height] - pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width. 
+ * @example + * const transformer = sharp() + * .resize({ + * width: 200, + * height: 200, + * fit: sharp.fit.cover, + * position: sharp.strategy.entropy + * }); + * // Read image data from readableStream + * // Write 200px square auto-cropped image data to writableStream + * readableStream + * .pipe(transformer) + * .pipe(writableStream); + * + * @example + * sharp(input) + * .resize(200, 200, { + * fit: sharp.fit.inside, + * withoutEnlargement: true + * }) + * .toFormat('jpeg') + * .toBuffer() + * .then(function(outputBuffer) { + * // outputBuffer contains JPEG image data + * // no wider and no higher than 200 pixels + * // and no larger than the input image + * }); + * + * @example + * sharp(input) + * .resize(200, 200, { + * fit: sharp.fit.outside, + * withoutReduction: true + * }) + * .toFormat('jpeg') + * .toBuffer() + * .then(function(outputBuffer) { + * // outputBuffer contains JPEG image data + * // of at least 200 pixels wide and 200 pixels high while maintaining aspect ratio + * // and no smaller than the input image + * }); + * + * @example + * const scaleByHalf = await sharp(input) + * .metadata() + * .then(({ width }) => sharp(input) + * .resize(Math.round(width * 0.5)) + * .toBuffer() + * ); + * + * @param {number} [width] - How many pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height. + * @param {number} [height] - How many pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width. * @param {Object} [options] - * @param {String} [options.kernel='lanczos3'] - the kernel to use for image reduction. - * @param {String} [options.interpolator='bicubic'] - the interpolator to use for image enlargement. - * @param {Boolean} [options.centreSampling=false] - use *magick centre sampling convention instead of corner sampling. - * @param {Boolean} [options.centerSampling=false] - alternative spelling of centreSampling. 
+ * @param {number} [options.width] - An alternative means of specifying `width`. If both are present this takes priority. + * @param {number} [options.height] - An alternative means of specifying `height`. If both are present this takes priority. + * @param {String} [options.fit='cover'] - How the image should be resized/cropped to fit the target dimension(s), one of `cover`, `contain`, `fill`, `inside` or `outside`. + * @param {String} [options.position='centre'] - A position, gravity or strategy to use when `fit` is `cover` or `contain`. + * @param {String|Object} [options.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. + * @param {String} [options.kernel='lanczos3'] - The kernel to use for image reduction and the inferred interpolator to use for upsampling. Use the `fastShrinkOnLoad` option to control kernel vs shrink-on-load. + * @param {Boolean} [options.withoutEnlargement=false] - Do not scale up if the width *or* height are already less than the target dimensions, equivalent to GraphicsMagick's `>` geometry option. This may result in output dimensions smaller than the target dimensions. + * @param {Boolean} [options.withoutReduction=false] - Do not scale down if the width *or* height are already greater than the target dimensions, equivalent to GraphicsMagick's `<` geometry option. This may still result in a crop to reach the target dimensions. + * @param {Boolean} [options.fastShrinkOnLoad=true] - Take greater advantage of the JPEG and WebP shrink-on-load feature, which can lead to a slight moiré pattern or round-down of an auto-scaled dimension. 
* @returns {Sharp} * @throws {Error} Invalid parameters */ -function resize (width, height, options) { - if (is.defined(width)) { - if (is.integer(width) && width > 0) { - this.options.width = width; +function resize (widthOrOptions, height, options) { + if (isResizeExpected(this.options)) { + this.options.debuglog('ignoring previous resize options'); + } + if (this.options.widthPost !== -1) { + this.options.debuglog('operation order will be: extract, resize, extract'); + } + if (is.defined(widthOrOptions)) { + if (is.object(widthOrOptions) && !is.defined(options)) { + options = widthOrOptions; + } else if (is.integer(widthOrOptions) && widthOrOptions > 0) { + this.options.width = widthOrOptions; } else { - throw is.invalidParameterError('width', 'positive integer', width); + throw is.invalidParameterError('width', 'positive integer', widthOrOptions); } } else { this.options.width = -1; @@ -121,6 +283,44 @@ function resize (width, height, options) { this.options.height = -1; } if (is.object(options)) { + // Width + if (is.defined(options.width)) { + if (is.integer(options.width) && options.width > 0) { + this.options.width = options.width; + } else { + throw is.invalidParameterError('width', 'positive integer', options.width); + } + } + // Height + if (is.defined(options.height)) { + if (is.integer(options.height) && options.height > 0) { + this.options.height = options.height; + } else { + throw is.invalidParameterError('height', 'positive integer', options.height); + } + } + // Fit + if (is.defined(options.fit)) { + const canvas = mapFitToCanvas[options.fit]; + if (is.string(canvas)) { + this.options.canvas = canvas; + } else { + throw is.invalidParameterError('fit', 'valid fit', options.fit); + } + } + // Position + if (is.defined(options.position)) { + const pos = is.integer(options.position) + ? 
options.position + : strategy[options.position] || position[options.position] || gravity[options.position]; + if (is.integer(pos) && (is.inRange(pos, 0, 8) || is.inRange(pos, 16, 17))) { + this.options.position = pos; + } else { + throw is.invalidParameterError('position', 'valid position/gravity/strategy', options.position); + } + } + // Background + this._setBackgroundColourOption('resizeBackground', options.background); // Kernel if (is.defined(options.kernel)) { if (is.string(kernel[options.kernel])) { @@ -129,178 +329,267 @@ function resize (width, height, options) { throw is.invalidParameterError('kernel', 'valid kernel name', options.kernel); } } - // Interpolator - if (is.defined(options.interpolator)) { - if (is.string(interpolator[options.interpolator])) { - this.options.interpolator = interpolator[options.interpolator]; - } else { - throw is.invalidParameterError('interpolator', 'valid interpolator name', options.interpolator); - } + // Without enlargement + if (is.defined(options.withoutEnlargement)) { + this._setBooleanOption('withoutEnlargement', options.withoutEnlargement); + } + // Without reduction + if (is.defined(options.withoutReduction)) { + this._setBooleanOption('withoutReduction', options.withoutReduction); } - // Centre sampling - options.centreSampling = is.bool(options.centerSampling) ? options.centerSampling : options.centreSampling; - if (is.defined(options.centreSampling)) { - this._setBooleanOption('centreSampling', options.centreSampling); + // Shrink on load + if (is.defined(options.fastShrinkOnLoad)) { + this._setBooleanOption('fastShrinkOnLoad', options.fastShrinkOnLoad); } } + if (isRotationExpected(this.options) && isResizeExpected(this.options)) { + this.options.rotateBefore = true; + } return this; } /** - * Crop the resized image to the exact size specified, the default behaviour. + * Extend / pad / extrude one or more edges of the image with either + * the provided background colour or pixels derived from the image. 
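The position lookup above can be exercised in isolation. A minimal sketch using the same lookup order as `resize()` (strategy, then position, then gravity) and the numeric ranges it accepts (0–8 for gravity/position, 16–17 for strategies); the table values reproduce the constants defined earlier in this file and should be treated as illustrative:

```javascript
// Internal numeric ids, mirroring the tables at the top of lib/resize.js.
const gravity = {
  center: 0, centre: 0, north: 1, east: 2, south: 3, west: 4,
  northeast: 5, southeast: 6, southwest: 7, northwest: 8
};
const position = {
  top: 1, right: 2, bottom: 3, left: 4,
  'right top': 5, 'right bottom': 6, 'left bottom': 7, 'left top': 8
};
const strategy = { entropy: 16, attention: 17 };

// Resolve a name (or pass through a numeric id) to the internal position id.
function resolvePosition (value) {
  const pos = Number.isInteger(value)
    ? value
    : strategy[value] || position[value] || gravity[value];
  if (Number.isInteger(pos) && ((pos >= 0 && pos <= 8) || pos === 16 || pos === 17)) {
    return pos;
  }
  throw new Error(`Expected valid position/gravity/strategy but received ${value}`);
}
```

Note that `'centre'` resolves to `0` via the gravity table, which is why the range check starts at zero rather than one.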
+ * This operation will always occur after resizing and extraction, if any. * - * Possible attributes of the optional `sharp.gravity` are `north`, `northeast`, `east`, `southeast`, `south`, - * `southwest`, `west`, `northwest`, `center` and `centre`. + * @example + * // Resize to 140 pixels wide, then add 10 transparent pixels + * // to the top, left and right edges and 20 to the bottom edge + * sharp(input) + * .resize(140) + * .extend({ + * top: 10, + * bottom: 20, + * left: 10, + * right: 10, + * background: { r: 0, g: 0, b: 0, alpha: 0 } + * }) + * ... * - * The experimental strategy-based approach resizes so one dimension is at its target length - * then repeatedly ranks edge regions, discarding the edge with the lowest score based on the selected strategy. - * - `entropy`: focus on the region with the highest [Shannon entropy](https://en.wikipedia.org/wiki/Entropy_%28information_theory%29). - * - `attention`: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones. +* @example + * // Add a row of 10 red pixels to the bottom + * sharp(input) + * .extend({ + * bottom: 10, + * background: 'red' + * }) + * ... * * @example - * const transformer = sharp() - * .resize(200, 200) - * .crop(sharp.strategy.entropy) - * .on('error', function(err) { - * console.log(err); - * }); - * // Read image data from readableStream - * // Write 200px square auto-cropped image data to writableStream - * readableStream.pipe(transformer).pipe(writableStream); + * // Extrude image by 8 pixels to the right, mirroring existing right hand edge + * sharp(input) + * .extend({ + * right: 8, + * background: 'mirror' + * }) + * ... * - * @param {String} [crop='centre'] - A member of `sharp.gravity` to crop to an edge/corner or `sharp.strategy` to crop dynamically. 
+ * @param {(number|Object)} extend - single pixel count to add to all edges or an Object with per-edge counts + * @param {number} [extend.top=0] + * @param {number} [extend.left=0] + * @param {number} [extend.bottom=0] + * @param {number} [extend.right=0] + * @param {String} [extend.extendWith='background'] - populate new pixels using this method, one of: background, copy, repeat, mirror. + * @param {String|Object} [extend.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. * @returns {Sharp} * @throws {Error} Invalid parameters - */ -function crop (crop) { - this.options.canvas = 'crop'; - if (!is.defined(crop)) { - // Default - this.options.crop = gravity.center; - } else if (is.integer(crop) && is.inRange(crop, 0, 8)) { - // Gravity (numeric) - this.options.crop = crop; - } else if (is.string(crop) && is.integer(gravity[crop])) { - // Gravity (string) - this.options.crop = gravity[crop]; - } else if (is.integer(crop) && crop >= strategy.entropy) { - // Strategy - this.options.crop = crop; - } else if (is.string(crop) && is.integer(strategy[crop])) { - // Strategy (string) - this.options.crop = strategy[crop]; +*/ +function extend (extend) { + if (is.integer(extend) && extend > 0) { + this.options.extendTop = extend; + this.options.extendBottom = extend; + this.options.extendLeft = extend; + this.options.extendRight = extend; + } else if (is.object(extend)) { + if (is.defined(extend.top)) { + if (is.integer(extend.top) && extend.top >= 0) { + this.options.extendTop = extend.top; + } else { + throw is.invalidParameterError('top', 'positive integer', extend.top); + } + } + if (is.defined(extend.bottom)) { + if (is.integer(extend.bottom) && extend.bottom >= 0) { + this.options.extendBottom = extend.bottom; + } else { + throw is.invalidParameterError('bottom', 'positive integer', extend.bottom); + } + } + if (is.defined(extend.left)) { + if 
(is.integer(extend.left) && extend.left >= 0) { + this.options.extendLeft = extend.left; + } else { + throw is.invalidParameterError('left', 'positive integer', extend.left); + } + } + if (is.defined(extend.right)) { + if (is.integer(extend.right) && extend.right >= 0) { + this.options.extendRight = extend.right; + } else { + throw is.invalidParameterError('right', 'positive integer', extend.right); + } + } + this._setBackgroundColourOption('extendBackground', extend.background); + if (is.defined(extend.extendWith)) { + if (is.string(extendWith[extend.extendWith])) { + this.options.extendWith = extendWith[extend.extendWith]; + } else { + throw is.invalidParameterError('extendWith', 'one of: background, copy, repeat, mirror', extend.extendWith); + } + } } else { - throw is.invalidParameterError('crop', 'valid crop id/name/strategy', crop); + throw is.invalidParameterError('extend', 'integer or object', extend); } return this; } /** - * Preserving aspect ratio, resize the image to the maximum `width` or `height` specified - * then embed on a background of the exact `width` and `height` specified. + * Extract/crop a region of the image. * - * If the background contains an alpha value then WebP and PNG format output images will - * contain an alpha channel, even when the input image does not. + * - Use `extract` before `resize` for pre-resize extraction. + * - Use `extract` after `resize` for post-resize extraction. + * - Use `extract` twice and `resize` once for extract-then-resize-then-extract in a fixed operation order. 
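The pre/post ordering rule for `extract` can be summarised as a small decision function. A sketch of the same condition used inside `extract()` (where `-1` means "not yet set" in the internal options object; the function name is illustrative):

```javascript
// Decide whether an extract region applies before ('Pre') or after ('Post')
// the resize step: once a resize or an earlier pre-resize extraction is
// pending, any further extraction must happen post-resize.
function extractPhase (options) {
  const resizeExpected = options.width !== -1 || options.height !== -1;
  const alreadyExtractedPre = options.widthPre !== -1;
  return (resizeExpected || alreadyExtractedPre) ? 'Post' : 'Pre';
}
```

This is why calling `extract` twice with a single `resize` yields the fixed extract-then-resize-then-extract order described above.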
* * @example - * sharp('input.gif') - * .resize(200, 300) - * .background({r: 0, g: 0, b: 0, alpha: 0}) - * .embed() - * .toFormat(sharp.format.webp) - * .toBuffer(function(err, outputBuffer) { - * if (err) { - * throw err; - * } - * // outputBuffer contains WebP image data of a 200 pixels wide and 300 pixels high - * // containing a scaled version, embedded on a transparent canvas, of input.gif + * sharp(input) + * .extract({ left: left, top: top, width: width, height: height }) + * .toFile(output, function(err) { + * // Extract a region of the input image, saving in the same format. + * }); + * @example + * sharp(input) + * .extract({ left: leftOffsetPre, top: topOffsetPre, width: widthPre, height: heightPre }) + * .resize(width, height) + * .extract({ left: leftOffsetPost, top: topOffsetPost, width: widthPost, height: heightPost }) + * .toFile(output, function(err) { + * // Extract a region, resize, then extract from the resized image * }); * + * @param {Object} options - describes the region to extract using integral pixel values + * @param {number} options.left - zero-indexed offset from left edge + * @param {number} options.top - zero-indexed offset from top edge + * @param {number} options.width - width of region to extract + * @param {number} options.height - height of region to extract * @returns {Sharp} + * @throws {Error} Invalid parameters */ -function embed () { - this.options.canvas = 'embed'; +function extract (options) { + const suffix = isResizeExpected(this.options) || this.options.widthPre !== -1 ? 'Post' : 'Pre'; + if (this.options[`width${suffix}`] !== -1) { + this.options.debuglog('ignoring previous extract options'); + } + ['left', 'top', 'width', 'height'].forEach(function (name) { + const value = options[name]; + if (is.integer(value) && value >= 0) { + this.options[name + (name === 'left' || name === 'top' ? 
'Offset' : '') + suffix] = value; + } else { + throw is.invalidParameterError(name, 'integer', value); + } + }, this); + // Ensure existing rotation occurs before pre-resize extraction + if (isRotationExpected(this.options) && !isResizeExpected(this.options)) { + if (this.options.widthPre === -1 || this.options.widthPost === -1) { + this.options.rotateBefore = true; + } + } + if (this.options.input.autoOrient) { + this.options.orientBefore = true; + } return this; } /** - * Preserving aspect ratio, resize the image to be as large as possible - * while ensuring its dimensions are less than or equal to the `width` and `height` specified. + * Trim pixels from all edges that contain values similar to the given background colour, which defaults to that of the top-left pixel. + * + * Images with an alpha channel will use the combined bounding box of alpha and non-alpha channels. + * + * If the result of this operation would trim an image to nothing then no change is made. * - * Both `width` and `height` must be provided via `resize` otherwise the behaviour will default to `crop`. + * The `info` response Object will contain `trimOffsetLeft` and `trimOffsetTop` properties. * * @example - * sharp(inputBuffer) - * .resize(200, 200) - * .max() - * .toFormat('jpeg') - * .toBuffer() - * .then(function(outputBuffer) { - * // outputBuffer contains JPEG image data no wider than 200 pixels and no higher - * // than 200 pixels regardless of the inputBuffer image dimensions - * }); + * // Trim pixels with a colour similar to that of the top-left pixel. + * await sharp(input) + * .trim() + * .toFile(output); * - * @returns {Sharp} - */ -function max () { - this.options.canvas = 'max'; - return this; -} - -/** - * Preserving aspect ratio, resize the image to be as small as possible - * while ensuring its dimensions are greater than or equal to the `width` and `height` specified. + * @example + * // Trim pixels with the exact same colour as that of the top-left pixel. 
+ * await sharp(input) + * .trim({ + * threshold: 0 + * }) + * .toFile(output); * - * Both `width` and `height` must be provided via `resize` otherwise the behaviour will default to `crop`. + * @example + * // Assume input is line art and trim only pixels with a similar colour to red. + * const output = await sharp(input) + * .trim({ + * background: "#FF0000", + * lineArt: true + * }) + * .toBuffer(); * + * @example + * // Trim all "yellow-ish" pixels, being more lenient with the higher threshold. + * const output = await sharp(input) + * .trim({ + * background: "yellow", + * threshold: 42, + * }) + * .toBuffer(); + * + * @param {Object} [options] + * @param {string|Object} [options.background='top-left pixel'] - Background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to that of the top-left pixel. + * @param {number} [options.threshold=10] - Allowed difference from the above colour, a positive number. + * @param {boolean} [options.lineArt=false] - Does the input more closely resemble line art (e.g. vector) rather than being photographic? * @returns {Sharp} + * @throws {Error} Invalid parameters */ -function min () { - this.options.canvas = 'min'; - return this; -} - -/** - * Ignoring the aspect ratio of the input, stretch the image to - * the exact `width` and/or `height` provided via `resize`. - * @returns {Sharp} - */ -function ignoreAspectRatio () { - this.options.canvas = 'ignore_aspect'; - return this; -} - -/** - * Do not enlarge the output image if the input image width *or* height are already less than the required dimensions. - * This is equivalent to GraphicsMagick's `>` geometry option: - * "*change the dimensions of the image only if its width or height exceeds the geometry specification*". - * @param {Boolean} [withoutEnlargement=true] - * @returns {Sharp} -*/ -function withoutEnlargement (withoutEnlargement) { - this.options.withoutEnlargement = is.bool(withoutEnlargement) ? 
withoutEnlargement : true; +function trim (options) { + this.options.trimThreshold = 10; + if (is.defined(options)) { + if (is.object(options)) { + if (is.defined(options.background)) { + this._setBackgroundColourOption('trimBackground', options.background); + } + if (is.defined(options.threshold)) { + if (is.number(options.threshold) && options.threshold >= 0) { + this.options.trimThreshold = options.threshold; + } else { + throw is.invalidParameterError('threshold', 'positive number', options.threshold); + } + } + if (is.defined(options.lineArt)) { + this._setBooleanOption('trimLineArt', options.lineArt); + } + } else { + throw is.invalidParameterError('trim', 'object', options); + } + } + if (isRotationExpected(this.options)) { + this.options.rotateBefore = true; + } return this; } /** * Decorate the Sharp prototype with resize-related functions. + * @module Sharp * @private */ -module.exports = function (Sharp) { - [ +module.exports = (Sharp) => { + Object.assign(Sharp.prototype, { resize, - crop, - embed, - max, - min, - ignoreAspectRatio, - withoutEnlargement - ].forEach(function (f) { - Sharp.prototype[f.name] = f; + extend, + extract, + trim }); // Class attributes Sharp.gravity = gravity; Sharp.strategy = strategy; Sharp.kernel = kernel; - Sharp.interpolator = interpolator; + Sharp.fit = fit; + Sharp.position = position; }; diff --git a/lib/sharp.js b/lib/sharp.js new file mode 100644 index 000000000..1081c9314 --- /dev/null +++ b/lib/sharp.js @@ -0,0 +1,121 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +// Inspects the runtime environment and exports the relevant sharp.node binary + +const { familySync, versionSync } = require('detect-libc'); + +const { runtimePlatformArch, isUnsupportedNodeRuntime, prebuiltPlatforms, minimumLibvipsVersion } = require('./libvips'); +const runtimePlatform = runtimePlatformArch(); + +const paths = [ + `../src/build/Release/sharp-${runtimePlatform}.node`, + '../src/build/Release/sharp-wasm32.node', + `@img/sharp-${runtimePlatform}/sharp.node`, + '@img/sharp-wasm32/sharp.node' +]; + +/* node:coverage disable */ + +let path, sharp; +const errors = []; +for (path of paths) { + try { + sharp = require(path); + break; + } catch (err) { + errors.push(err); + } +} + +if (sharp && path.startsWith('@img/sharp-linux-x64') && !sharp._isUsingX64V2()) { + const err = new Error('Prebuilt binaries for linux-x64 require v2 microarchitecture'); + err.code = 'Unsupported CPU'; + errors.push(err); + sharp = null; +} + +if (sharp) { + module.exports = sharp; +} else { + const [isLinux, isMacOs, isWindows] = ['linux', 'darwin', 'win32'].map(os => runtimePlatform.startsWith(os)); + + const help = [`Could not load the "sharp" module using the ${runtimePlatform} runtime`]; + errors.forEach(err => { + if (err.code !== 'MODULE_NOT_FOUND') { + help.push(`${err.code}: ${err.message}`); + } + }); + const messages = errors.map(err => err.message).join(' '); + help.push('Possible solutions:'); + // Common error messages + if (isUnsupportedNodeRuntime()) { + const { found, expected } = isUnsupportedNodeRuntime(); + help.push( + '- Please upgrade Node.js:', + ` Found ${found}`, + ` Requires ${expected}` + ); + } else if (prebuiltPlatforms.includes(runtimePlatform)) { + const [os, cpu] = runtimePlatform.split('-'); + const libc = os.endsWith('musl') ? 
' --libc=musl' : ''; + help.push( + '- Ensure optional dependencies can be installed:', + ' npm install --include=optional sharp', + '- Ensure your package manager supports multi-platform installation:', + ' See https://sharp.pixelplumbing.com/install#cross-platform', + '- Add platform-specific dependencies:', + ` npm install --os=${os.replace('musl', '')}${libc} --cpu=${cpu} sharp` + ); + } else { + help.push( + `- Manually install libvips >= ${minimumLibvipsVersion}`, + '- Add experimental WebAssembly-based dependencies:', + ' npm install --cpu=wasm32 sharp', + ' npm install @img/sharp-wasm32' + ); + } + if (isLinux && /(symbol not found|CXXABI_)/i.test(messages)) { + try { + const { config } = require(`@img/sharp-libvips-${runtimePlatform}/package`); + const libcFound = `${familySync()} ${versionSync()}`; + const libcRequires = `${config.musl ? 'musl' : 'glibc'} ${config.musl || config.glibc}`; + help.push( + '- Update your OS:', + ` Found ${libcFound}`, + ` Requires ${libcRequires}` + ); + } catch (_errEngines) {} + } + if (isLinux && /\/snap\/core[0-9]{2}/.test(messages)) { + help.push( + '- Remove the Node.js Snap, which does not support native modules', + ' snap remove node' + ); + } + if (isMacOs && /Incompatible library version/.test(messages)) { + help.push( + '- Update Homebrew:', + ' brew update && brew upgrade vips' + ); + } + if (errors.some(err => err.code === 'ERR_DLOPEN_DISABLED')) { + help.push('- Run Node.js without using the --no-addons flag'); + } + // Link to installation docs + if (isWindows && /The specified procedure could not be found/.test(messages)) { + help.push( + '- Using the canvas package on Windows?', + ' See https://sharp.pixelplumbing.com/install#canvas-and-windows', + '- Check for outdated versions of sharp in the dependency tree:', + ' npm ls sharp' + ); + } + help.push( + '- Consult the installation documentation:', + ' See https://sharp.pixelplumbing.com/install' + ); + throw new Error(help.join('\n')); +} diff --git 
a/lib/utility.js b/lib/utility.js index 67aaafa82..c0ad39f86 100644 --- a/lib/utility.js +++ b/lib/utility.js @@ -1,10 +1,89 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const events = require('node:events'); +const detectLibc = require('detect-libc'); const is = require('./is'); -const sharp = require('../build/Release/sharp.node'); +const { runtimePlatformArch } = require('./libvips'); +const sharp = require('./sharp'); + +const runtimePlatform = runtimePlatformArch(); +const libvipsVersion = sharp.libvipsVersion(); + +/** + * An Object containing nested boolean values representing the available input and output formats/methods. + * @member + * @example + * console.log(sharp.format); + * @returns {Object} + */ +const format = sharp.format(); +format.heif.output.alias = ['avif', 'heic']; +format.jpeg.output.alias = ['jpe', 'jpg']; +format.tiff.output.alias = ['tif']; +format.jp2k.output.alias = ['j2c', 'j2k', 'jp2', 'jpx']; + +/** + * An Object containing the available interpolators and their proper values + * @readonly + * @enum {string} + */ +const interpolators = { + /** [Nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). Suitable for image enlargement only. */ + nearest: 'nearest', + /** [Bilinear interpolation](http://en.wikipedia.org/wiki/Bilinear_interpolation). Faster than bicubic but with less smooth results. */ + bilinear: 'bilinear', + /** [Bicubic interpolation](http://en.wikipedia.org/wiki/Bicubic_interpolation) (the default). */ + bicubic: 'bicubic', + /** [LBB interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/lbb.cpp#L100). Prevents some "[acutance](http://en.wikipedia.org/wiki/Acutance)" but typically reduces performance by a factor of 2. */ + locallyBoundedBicubic: 'lbb', + /** [Nohalo interpolation](http://eprints.soton.ac.uk/268086/). 
Prevents acutance but typically reduces performance by a factor of 3. */ + nohalo: 'nohalo', + /** [VSQBS interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/vsqbs.cpp#L48). Prevents "staircasing" when enlarging. */ + vertexSplitQuadraticBasisSpline: 'vsqbs' +}; + +/** + * An Object containing the version numbers of sharp, libvips + * and (when using prebuilt binaries) its dependencies. + * + * @member + * @example + * console.log(sharp.versions); + */ +let versions = { + vips: libvipsVersion.semver +}; +/* node:coverage ignore next 15 */ +if (!libvipsVersion.isGlobal) { + if (!libvipsVersion.isWasm) { + try { + versions = require(`@img/sharp-${runtimePlatform}/versions`); + } catch (_) { + try { + versions = require(`@img/sharp-libvips-${runtimePlatform}/versions`); + } catch (_) {} + } + } else { + try { + versions = require('@img/sharp-wasm32/versions'); + } catch (_) {} + } +} +versions.sharp = require('../package.json').version; + +/* node:coverage ignore next 5 */ +if (versions.heif && format.heif) { + // Prebuilt binaries provide AV1 + format.heif.input.fileSuffix = ['.avif']; + format.heif.output.alias = ['avif']; +} /** - * Gets, or when options are provided sets, the limits of _libvips'_ operation cache. + * Gets or, when options are provided, sets the limits of _libvips'_ operation cache. * Existing entries in the cache will be trimmed after any change in limits. * This method always returns cache statistics, * useful for determining how much working memory is required for a particular task. @@ -16,10 +95,10 @@ const sharp = require('../build/Release/sharp.node'); * sharp.cache( { files: 0 } ); * sharp.cache(false); * - * @param {Object|Boolean} options - Object with the following attributes, or Boolean where true uses default cache settings and false removes all caching. 
- * @param {Number} [options.memory=50] - is the maximum memory in MB to use for this cache - * @param {Number} [options.files=20] - is the maximum number of files to hold open - * @param {Number} [options.items=100] - is the maximum number of operations to cache + * @param {Object|boolean} [options=true] - Object with the following attributes, or boolean where true uses default cache settings and false removes all caching + * @param {number} [options.memory=50] - is the maximum memory in MB to use for this cache + * @param {number} [options.files=20] - is the maximum number of files to hold open + * @param {number} [options.items=100] - is the maximum number of operations to cache * @returns {Object} */ function cache (options) { @@ -39,27 +118,58 @@ function cache (options) { cache(true); /** - * Gets, or when a concurrency is provided sets, - * the number of threads _libvips'_ should create to process each image. - * The default value is the number of CPU cores. - * A value of `0` will reset to this default. - * - * The maximum number of images that can be processed in parallel - * is limited by libuv's `UV_THREADPOOL_SIZE` environment variable. + * Gets or, when a concurrency is provided, sets + * the maximum number of threads _libvips_ should use to process _each image_. + * These are from a thread pool managed by glib, + * which helps avoid the overhead of creating new threads. * * This method always returns the current concurrency. * + * The default value is the number of CPU cores, + * except when using glibc-based Linux without jemalloc, + * where the default is `1` to help reduce memory fragmentation. + * + * A value of `0` will reset this to the number of CPU cores. + * + * Some image format libraries spawn additional threads, + * e.g. libaom manages its own 4 threads when encoding AVIF images, + * and these are independent of the value set here. + * + * :::note + * Further {@link /performance/ control over performance} is available. 
+ * ::: + * * @example * const threads = sharp.concurrency(); // 4 * sharp.concurrency(2); // 2 * sharp.concurrency(0); // 4 * - * @param {Number} [concurrency] - * @returns {Number} concurrency + * @param {number} [concurrency] + * @returns {number} concurrency */ function concurrency (concurrency) { return sharp.concurrency(is.integer(concurrency) ? concurrency : null); } +/* node:coverage ignore next 7 */ +if (detectLibc.familySync() === detectLibc.GLIBC && !sharp._isUsingJemalloc()) { + // Reduce default concurrency to 1 when using glibc memory allocator + sharp.concurrency(1); +} else if (detectLibc.familySync() === detectLibc.MUSL && sharp.concurrency() === 1024) { + // Reduce default concurrency when musl thread over-subscription detected + sharp.concurrency(require('node:os').availableParallelism()); +} + +/** + * An EventEmitter that emits a `change` event when a task is either: + * - queued, waiting for _libuv_ to provide a worker thread + * - complete + * @member + * @example + * sharp.queue.on('change', function(queueLength) { + * console.log('Queue contains ' + queueLength + ' task(s)'); + * }); + */ +const queue = new events.EventEmitter(); /** * Provides access to internal task counters. @@ -77,40 +187,105 @@ function counters () { /** * Get and set use of SIMD vector unit instructions. - * Requires libvips to have been compiled with liborc support. + * Requires libvips to have been compiled with highway support. * * Improves the performance of `resize`, `blur` and `sharpen` operations * by taking advantage of the SIMD vector unit of the CPU, e.g. Intel SSE and ARM NEON. * - * This feature is currently off by default but future versions may reverse this. - * Versions of liborc prior to 0.4.25 are known to segfault under heavy load. 
- * * @example * const simd = sharp.simd(); - * // simd is `true` if SIMD is currently enabled + * // simd is `true` if the runtime use of highway is currently enabled * @example - * const simd = sharp.simd(true); - * // attempts to enable the use of SIMD, returning true if available + * const simd = sharp.simd(false); + * // prevent libvips from using highway at runtime * - * @param {Boolean} [simd=false] - * @returns {Boolean} + * @param {boolean} [simd=true] + * @returns {boolean} */ function simd (simd) { return sharp.simd(is.bool(simd) ? simd : null); } -simd(false); + +/** + * Block libvips operations at runtime. + * + * This is in addition to the `VIPS_BLOCK_UNTRUSTED` environment variable, + * which when set will block all "untrusted" operations. + * + * @since 0.32.4 + * + * @example Block all TIFF input. + * sharp.block({ + * operation: ['VipsForeignLoadTiff'] + * }); + * + * @param {Object} options + * @param {Array} options.operation - List of libvips low-level operation names to block. + */ +function block (options) { + if (is.object(options)) { + if (Array.isArray(options.operation) && options.operation.every(is.string)) { + sharp.block(options.operation, true); + } else { + throw is.invalidParameterError('operation', 'Array', options.operation); + } + } else { + throw is.invalidParameterError('options', 'object', options); + } +} + +/** + * Unblock libvips operations at runtime. + * + * This is useful for defining a list of allowed operations. + * + * @since 0.32.4 + * + * @example Block all input except WebP from the filesystem. + * sharp.block({ + * operation: ['VipsForeignLoad'] + * }); + * sharp.unblock({ + * operation: ['VipsForeignLoadWebpFile'] + * }); + * + * @example Block all input except JPEG and PNG from a Buffer or Stream. 
+ * sharp.block({ + * operation: ['VipsForeignLoad'] + * }); + * sharp.unblock({ + * operation: ['VipsForeignLoadJpegBuffer', 'VipsForeignLoadPngBuffer'] + * }); + * + * @param {Object} options + * @param {Array} options.operation - List of libvips low-level operation names to unblock. + */ +function unblock (options) { + if (is.object(options)) { + if (Array.isArray(options.operation) && options.operation.every(is.string)) { + sharp.block(options.operation, false); + } else { + throw is.invalidParameterError('operation', 'Array', options.operation); + } + } else { + throw is.invalidParameterError('options', 'object', options); + } +} /** * Decorate the Sharp class with utility-related functions. + * @module Sharp * @private */ -module.exports = function (Sharp) { - [ - cache, - concurrency, - counters, - simd - ].forEach(function (f) { - Sharp[f.name] = f; - }); +module.exports = (Sharp) => { + Sharp.cache = cache; + Sharp.concurrency = concurrency; + Sharp.counters = counters; + Sharp.simd = simd; + Sharp.format = format; + Sharp.interpolators = interpolators; + Sharp.versions = versions; + Sharp.queue = queue; + Sharp.block = block; + Sharp.unblock = unblock; }; diff --git a/mkdocs.yml b/mkdocs.yml deleted file mode 100644 index 21aaa06a2..000000000 --- a/mkdocs.yml +++ /dev/null @@ -1,25 +0,0 @@ -site_name: sharp -site_url: http://sharp.pixelplumbing.com/ -repo_url: https://github.com/lovell/sharp -site_description: High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP and TIFF images -copyright: pixelplumbing.com -google_analytics: ['UA-13034748-12', 'sharp.pixelplumbing.com'] -theme: readthedocs -markdown_extensions: - - toc: - permalink: True -pages: - - Home: index.md - - Installation: install.md - - API: - - Constructor: api-constructor.md - - Input: api-input.md - - Output: api-output.md - - "Resizing images": api-resize.md - - "Compositing images": api-composite.md - - "Image operations": api-operation.md - 
- "Colour manipulation": api-colour.md - - "Channel manipulation": api-channel.md - - Utilities: api-utility.md - - Performance: performance.md - - Changelog: changelog.md diff --git a/npm/darwin-arm64/package.json b/npm/darwin-arm64/package.json new file mode 100644 index 000000000..df6b34a92 --- /dev/null +++ b/npm/darwin-arm64/package.json @@ -0,0 +1,40 @@ +{ + "name": "@img/sharp-darwin-arm64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with macOS 64-bit ARM", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/darwin-arm64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-darwin-arm64": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-darwin-arm64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "os": [ + "darwin" + ], + "cpu": [ + "arm64" + ] +} diff --git a/npm/darwin-x64/package.json b/npm/darwin-x64/package.json new file mode 100644 index 000000000..147d109dd --- /dev/null +++ b/npm/darwin-x64/package.json @@ -0,0 +1,40 @@ +{ + "name": "@img/sharp-darwin-x64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with macOS x64", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/darwin-x64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-darwin-x64": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": 
"public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-darwin-x64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "os": [ + "darwin" + ], + "cpu": [ + "x64" + ] +} diff --git a/npm/from-local-build.js b/npm/from-local-build.js new file mode 100644 index 000000000..fc35bd2ed --- /dev/null +++ b/npm/from-local-build.js @@ -0,0 +1,64 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +// Populate the npm package for the current platform with the local build + +const { copyFileSync, cpSync, readFileSync, writeFileSync, appendFileSync } = require('node:fs'); +const { basename, join } = require('node:path'); + +const { buildPlatformArch } = require('../lib/libvips'); + +const licensing = ` +## Licensing + +Copyright 2013 Lovell Fuller and others. + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at +[https://www.apache.org/licenses/LICENSE-2.0](https://www.apache.org/licenses/LICENSE-2.0) + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. 
+`; + +const platform = buildPlatformArch(); +const destDir = join(__dirname, platform); +console.log(`Populating npm package for platform: ${platform}`); + +// Copy binaries +const releaseDir = join(__dirname, '..', 'src', 'build', 'Release'); +const libDir = join(destDir, 'lib'); +cpSync(releaseDir, libDir, { + recursive: true, + filter: (file) => { + const name = basename(file); + return name === 'Release' || + (name.startsWith('sharp-') && name.includes('.node')) || + (name.startsWith('libvips-') && name.endsWith('.dll')); + } +}); + +// Generate README +const { name, description } = require(`./${platform}/package.json`); +writeFileSync(join(destDir, 'README.md'), `# \`${name}\`\n\n${description}.\n${licensing}`); + +// Copy Apache-2.0 LICENSE +copyFileSync(join(__dirname, '..', 'LICENSE'), join(destDir, 'LICENSE')); + +// Copy files for packages without an explicit sharp-libvips dependency (Windows, wasm) +if (platform.startsWith('win') || platform.startsWith('wasm')) { + const libvipsPlatform = platform === 'wasm32' ? 
'dev-wasm32' : platform; + const sharpLibvipsDir = join(require(`@img/sharp-libvips-${libvipsPlatform}/lib`), '..'); + // Copy versions.json + copyFileSync(join(sharpLibvipsDir, 'versions.json'), join(destDir, 'versions.json')); + // Append third party licensing to README + const readme = readFileSync(join(sharpLibvipsDir, 'README.md'), { encoding: 'utf-8' }); + const thirdParty = readme.substring(readme.indexOf('\nThis software contains')); + appendFileSync(join(destDir, 'README.md'), thirdParty); +} diff --git a/npm/linux-arm/package.json b/npm/linux-arm/package.json new file mode 100644 index 000000000..cd4d5a2b6 --- /dev/null +++ b/npm/linux-arm/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linux-arm", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (glibc) ARM (32-bit)", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linux-arm" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linux-arm": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linux-arm.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "config": { + "glibc": ">=2.31" + }, + "os": [ + "linux" + ], + "libc": [ + "glibc" + ], + "cpu": [ + "arm" + ] +} diff --git a/npm/linux-arm64/package.json b/npm/linux-arm64/package.json new file mode 100644 index 000000000..b373bb920 --- /dev/null +++ b/npm/linux-arm64/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linux-arm64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (glibc) 64-bit ARM", + "author": "Lovell Fuller ", + "homepage": 
"https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linux-arm64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linux-arm64": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linux-arm64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "config": { + "glibc": ">=2.26" + }, + "os": [ + "linux" + ], + "libc": [ + "glibc" + ], + "cpu": [ + "arm64" + ] +} diff --git a/npm/linux-ppc64/package.json b/npm/linux-ppc64/package.json new file mode 100644 index 000000000..db2b62c01 --- /dev/null +++ b/npm/linux-ppc64/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linux-ppc64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (glibc) ppc64", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linux-ppc64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linux-ppc64": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linux-ppc64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "config": { + "glibc": ">=2.36" + }, + "os": [ + "linux" + ], + "libc": [ + "glibc" + ], + "cpu": [ + "ppc64" + ] +} diff --git a/npm/linux-riscv64/package.json b/npm/linux-riscv64/package.json new file mode 100644 index 000000000..9f0e95204 --- 
/dev/null +++ b/npm/linux-riscv64/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linux-riscv64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (glibc) RISC-V 64-bit", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linux-riscv64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linux-riscv64": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linux-riscv64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "config": { + "glibc": ">=2.41" + }, + "os": [ + "linux" + ], + "libc": [ + "glibc" + ], + "cpu": [ + "riscv64" + ] +} diff --git a/npm/linux-s390x/package.json b/npm/linux-s390x/package.json new file mode 100644 index 000000000..423692edd --- /dev/null +++ b/npm/linux-s390x/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linux-s390x", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (glibc) s390x", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linux-s390x" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linux-s390x": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linux-s390x.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, 
+ "config": { + "glibc": ">=2.36" + }, + "os": [ + "linux" + ], + "libc": [ + "glibc" + ], + "cpu": [ + "s390x" + ] +} diff --git a/npm/linux-x64/package.json b/npm/linux-x64/package.json new file mode 100644 index 000000000..95a8a035f --- /dev/null +++ b/npm/linux-x64/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linux-x64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (glibc) x64", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linux-x64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linux-x64": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linux-x64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "config": { + "glibc": ">=2.26" + }, + "os": [ + "linux" + ], + "libc": [ + "glibc" + ], + "cpu": [ + "x64" + ] +} diff --git a/npm/linuxmusl-arm64/package.json b/npm/linuxmusl-arm64/package.json new file mode 100644 index 000000000..5e214bf18 --- /dev/null +++ b/npm/linuxmusl-arm64/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linuxmusl-arm64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (musl) 64-bit ARM", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linuxmusl-arm64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4" + }, + "files": [ + "lib" + ], 
+ "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linuxmusl-arm64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "config": { + "musl": ">=1.2.2" + }, + "os": [ + "linux" + ], + "libc": [ + "musl" + ], + "cpu": [ + "arm64" + ] +} diff --git a/npm/linuxmusl-x64/package.json b/npm/linuxmusl-x64/package.json new file mode 100644 index 000000000..8be92db0d --- /dev/null +++ b/npm/linuxmusl-x64/package.json @@ -0,0 +1,46 @@ +{ + "name": "@img/sharp-linuxmusl-x64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Linux (musl) x64", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/linuxmusl-x64" + }, + "license": "Apache-2.0", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "optionalDependencies": { + "@img/sharp-libvips-linuxmusl-x64": "1.2.4" + }, + "files": [ + "lib" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-linuxmusl-x64.node", + "./package": "./package.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "config": { + "musl": ">=1.2.2" + }, + "os": [ + "linux" + ], + "libc": [ + "musl" + ], + "cpu": [ + "x64" + ] +} diff --git a/npm/package.json b/npm/package.json new file mode 100644 index 000000000..773daa94a --- /dev/null +++ b/npm/package.json @@ -0,0 +1,21 @@ +{ + "name": "@img/sharp", + "version": "0.34.5", + "private": "true", + "workspaces": [ + "darwin-arm64", + "darwin-x64", + "linux-arm", + "linux-arm64", + "linux-ppc64", + "linux-riscv64", + "linux-s390x", + "linux-x64", + "linuxmusl-arm64", + "linuxmusl-x64", + "wasm32", + "win32-arm64", + "win32-ia32", + "win32-x64" + ] +} diff --git a/npm/release-notes.js 
b/npm/release-notes.js new file mode 100644 index 000000000..13cc18772 --- /dev/null +++ b/npm/release-notes.js @@ -0,0 +1,9 @@ +const { readFileSync, writeFileSync } = require('node:fs'); + +const { version } = require('./package.json'); +const versionWithoutPreRelease = version.replace(/-rc\.\d+$/, ''); + +const markdown = readFileSync(`./docs/src/content/docs/changelog/v${versionWithoutPreRelease}.md`, 'utf8'); +const markdownWithoutFrontmatter = markdown.replace(/---\n.*?\n---\n+/s, ''); + +writeFileSync('./release-notes.md', markdownWithoutFrontmatter); diff --git a/npm/wasm32/package.json b/npm/wasm32/package.json new file mode 100644 index 000000000..d5775bec3 --- /dev/null +++ b/npm/wasm32/package.json @@ -0,0 +1,39 @@ +{ + "name": "@img/sharp-wasm32", + "version": "0.34.5", + "description": "Prebuilt sharp for use with wasm32", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/wasm32" + }, + "license": "Apache-2.0 AND LGPL-3.0-or-later AND MIT", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "files": [ + "lib", + "versions.json" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-wasm32.node.js", + "./package": "./package.json", + "./versions": "./versions.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "dependencies": { + "@emnapi/runtime": "^1.7.0" + }, + "cpu": [ + "wasm32" + ] +} diff --git a/npm/win32-arm64/package.json b/npm/win32-arm64/package.json new file mode 100644 index 000000000..651f420ea --- /dev/null +++ b/npm/win32-arm64/package.json @@ -0,0 +1,39 @@ +{ + "name": "@img/sharp-win32-arm64", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Windows 64-bit ARM", + "author": "Lovell Fuller ", + "homepage": 
"https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/win32-arm64" + }, + "license": "Apache-2.0 AND LGPL-3.0-or-later", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "files": [ + "lib", + "versions.json" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-win32-arm64.node", + "./package": "./package.json", + "./versions": "./versions.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "os": [ + "win32" + ], + "cpu": [ + "arm64" + ] +} diff --git a/npm/win32-ia32/package.json b/npm/win32-ia32/package.json new file mode 100644 index 000000000..21c6dbba2 --- /dev/null +++ b/npm/win32-ia32/package.json @@ -0,0 +1,39 @@ +{ + "name": "@img/sharp-win32-ia32", + "version": "0.34.5", + "description": "Prebuilt sharp for use with Windows x86 (32-bit)", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/win32-ia32" + }, + "license": "Apache-2.0 AND LGPL-3.0-or-later", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "files": [ + "lib", + "versions.json" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-win32-ia32.node", + "./package": "./package.json", + "./versions": "./versions.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "os": [ + "win32" + ], + "cpu": [ + "ia32" + ] +} diff --git a/npm/win32-x64/package.json b/npm/win32-x64/package.json new file mode 100644 index 000000000..8b867b5e2 --- /dev/null +++ b/npm/win32-x64/package.json @@ -0,0 +1,39 @@ +{ + "name": "@img/sharp-win32-x64", + "version": "0.34.5", + "description": "Prebuilt sharp 
for use with Windows x64", + "author": "Lovell Fuller ", + "homepage": "https://sharp.pixelplumbing.com", + "repository": { + "type": "git", + "url": "git+https://github.com/lovell/sharp.git", + "directory": "npm/win32-x64" + }, + "license": "Apache-2.0 AND LGPL-3.0-or-later", + "funding": { + "url": "https://opencollective.com/libvips" + }, + "preferUnplugged": true, + "files": [ + "lib", + "versions.json" + ], + "publishConfig": { + "access": "public" + }, + "type": "commonjs", + "exports": { + "./sharp.node": "./lib/sharp-win32-x64.node", + "./package": "./package.json", + "./versions": "./versions.json" + }, + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "os": [ + "win32" + ], + "cpu": [ + "x64" + ] +} diff --git a/package.json b/package.json index 05cae3bb9..0c0d00988 100644 --- a/package.json +++ b/package.json @@ -1,9 +1,9 @@ { "name": "sharp", - "description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP and TIFF images", - "version": "0.18.2", + "description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images", + "version": "0.34.5", "author": "Lovell Fuller ", - "homepage": "https://github.com/lovell/sharp", + "homepage": "https://sharp.pixelplumbing.com", "contributors": [ "Pierre Inglebert ", "Jonathan Ong ", @@ -37,74 +37,166 @@ "Kristo Jorgenson ", "YvesBos ", "Guy Maliar ", - "Nicolas Coden " + "Nicolas Coden ", + "Matt Parrish ", + "Marcel Bretschneider ", + "Matthew McEachen ", + "Jarda Kotěšovec ", + "Kenric D'Souza ", + "Oleh Aleinyk ", + "Marcel Bretschneider ", + "Andrea Bianco ", + "Rik Heywood ", + "Thomas Parisot ", + "Nathan Graves ", + "Tom Lokhorst ", + "Espen Hovlandsdal ", + "Sylvain Dumont ", + "Alun Davies ", + "Aidan Hoolachan ", + "Axel Eirola ", + "Freezy ", + "Daiz ", + "Julian Aubourg ", + "Keith Belovay ", + "Michael B. 
Klein ", + "Jordan Prudhomme ", + "Ilya Ovdin ", + "Andargor ", + "Paul Neave ", + "Brendan Kennedy ", + "Brychan Bennett-Odlum ", + "Edward Silverton ", + "Roman Malieiev ", + "Tomas Szabo ", + "Robert O'Rourke ", + "Guillermo Alfonso Varela Chouciño ", + "Christian Flintrup ", + "Manan Jadhav ", + "Leon Radley ", + "alza54 ", + "Jacob Smith ", + "Michael Nutt ", + "Brad Parham ", + "Taneli Vatanen ", + "Joris Dugué ", + "Chris Banks ", + "Ompal Singh ", + "Brodan ", + "Ankur Parihar ", + "Brahim Ait elhaj ", + "Mart Jansink ", + "Lachlan Newman ", + "Dennis Beatty ", + "Ingvar Stepanyan ", + "Don Denton " ], "scripts": { - "clean": "rm -rf node_modules/ build/ vendor/ coverage/ test/fixtures/output.*", - "test": "semistandard && cc && nyc --reporter=lcov --branches=99 mocha --slow=5000 --timeout=60000 ./test/unit/*.js", + "build": "node install/build.js", + "install": "node install/check.js || npm run build", + "clean": "rm -rf src/build/ .nyc_output/ coverage/ test/fixtures/output.*", + "test": "npm run lint && npm run test-unit", + "lint": "npm run lint-cpp && npm run lint-js && npm run lint-types", + "lint-cpp": "cpplint --quiet src/*.h src/*.cc", + "lint-js": "biome lint", + "lint-types": "tsd --files ./test/types/sharp.test-d.ts", "test-leak": "./test/leak/leak.sh", - "test-packaging": "./packaging/test-linux-x64.sh", - "docs": "for m in constructor input resize composite operation colour channel output utility; do documentation build --shallow --format=md lib/$m.js >docs/api-$m.md; done" + "test-unit": "node --experimental-test-coverage test/unit.mjs", + "package-from-local-build": "node npm/from-local-build.js", + "package-release-notes": "node npm/release-notes.js", + "docs-build": "node docs/build.mjs", + "docs-serve": "cd docs && npm start", + "docs-publish": "cd docs && npm run build && npx firebase-tools deploy --project pixelplumbing --only hosting:pixelplumbing-sharp" }, + "type": "commonjs", "main": "lib/index.js", + "types": "lib/index.d.ts", + 
"files": [ + "install", + "lib", + "src/*.{cc,h,gyp}" + ], "repository": { "type": "git", - "url": "git://github.com/lovell/sharp" + "url": "git://github.com/lovell/sharp.git" }, "keywords": [ "jpeg", "png", "webp", + "avif", "tiff", "gif", "svg", + "jp2", "dzi", "image", "resize", "thumbnail", "crop", + "embed", "libvips", "vips" ], "dependencies": { - "caw": "^2.0.0", - "color": "^2.0.0", - "detect-libc": "^0.2.0", - "got": "^7.1.0", - "nan": "^2.6.2", - "semver": "^5.3.0", - "tar": "^3.1.5" + "@img/colour": "^1.0.0", + "detect-libc": "^2.1.2", + "semver": "^7.7.3" + }, + "optionalDependencies": { + "@img/sharp-darwin-arm64": "0.34.5", + "@img/sharp-darwin-x64": "0.34.5", + "@img/sharp-libvips-darwin-arm64": "1.2.4", + "@img/sharp-libvips-darwin-x64": "1.2.4", + "@img/sharp-libvips-linux-arm": "1.2.4", + "@img/sharp-libvips-linux-arm64": "1.2.4", + "@img/sharp-libvips-linux-ppc64": "1.2.4", + "@img/sharp-libvips-linux-riscv64": "1.2.4", + "@img/sharp-libvips-linux-s390x": "1.2.4", + "@img/sharp-libvips-linux-x64": "1.2.4", + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4", + "@img/sharp-libvips-linuxmusl-x64": "1.2.4", + "@img/sharp-linux-arm": "0.34.5", + "@img/sharp-linux-arm64": "0.34.5", + "@img/sharp-linux-ppc64": "0.34.5", + "@img/sharp-linux-riscv64": "0.34.5", + "@img/sharp-linux-s390x": "0.34.5", + "@img/sharp-linux-x64": "0.34.5", + "@img/sharp-linuxmusl-arm64": "0.34.5", + "@img/sharp-linuxmusl-x64": "0.34.5", + "@img/sharp-wasm32": "0.34.5", + "@img/sharp-win32-arm64": "0.34.5", + "@img/sharp-win32-ia32": "0.34.5", + "@img/sharp-win32-x64": "0.34.5" }, "devDependencies": { - "async": "^2.5.0", - "cc": "^1.0.1", - "documentation": "^4.0.0-rc.1", - "exif-reader": "^1.0.2", - "icc": "^1.0.0", - "mocha": "^3.4.2", - "nyc": "^11.0.3", - "rimraf": "^2.6.1", - "semistandard": "^11.0.0", - "unzip": "^0.1.11" + "@biomejs/biome": "^2.3.4", + "@cpplint/cli": "^0.1.0", + "@emnapi/runtime": "^1.7.0", + "@img/sharp-libvips-dev": "1.2.4", + 
"@img/sharp-libvips-dev-wasm32": "1.2.4", + "@img/sharp-libvips-win32-arm64": "1.2.4", + "@img/sharp-libvips-win32-ia32": "1.2.4", + "@img/sharp-libvips-win32-x64": "1.2.4", + "@types/node": "*", + "emnapi": "^1.7.0", + "exif-reader": "^2.0.2", + "extract-zip": "^2.0.1", + "icc": "^3.0.0", + "jsdoc-to-markdown": "^9.1.3", + "node-addon-api": "^8.5.0", + "node-gyp": "^11.5.0", + "tar-fs": "^3.1.1", + "tsd": "^0.33.0" }, "license": "Apache-2.0", - "config": { - "libvips": "8.5.5" - }, "engines": { - "node": ">=4.5.0" + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" }, - "semistandard": { - "env": [ - "mocha" - ] + "config": { + "libvips": ">=8.17.3" }, - "cc": { - "linelength": "120", - "filter": [ - "build/c++11", - "build/include", - "runtime/indentation_namespace" - ] + "funding": { + "url": "https://opencollective.com/libvips" } } diff --git a/packaging/README.md b/packaging/README.md deleted file mode 100644 index dbc4e3b63..000000000 --- a/packaging/README.md +++ /dev/null @@ -1,57 +0,0 @@ -# Packaging scripts - -libvips and its dependencies are provided as pre-compiled shared libraries -for the most common operating systems and CPU architectures. - -During `npm install`, these binaries are fetched as tarballs from -[Bintray](https://dl.bintray.com/lovell/sharp/) via HTTPS -and stored locally within `node_modules/sharp`. - -## Using a custom tarball - -A custom tarball stored on the local filesystem can be used instead. -Place it in the following location, where `x.y.z` is the libvips version, -`platform` is the value of `process.platform` and -`arch` is the value of `process.arch` (plus the version number for ARM). - -`node_modules/sharp/packaging/libvips-x.y.z-platform-arch.tar.gz` - -For example, for libvips v8.3.3 on an ARMv6 Linux machine, use: - -`node_modules/sharp/packaging/libvips-8.3.3-linux-armv6.tar.gz` - -Remove any `sharp/lib` and `sharp/include` directories -before running `npm install` again. 
- -## Creating a tarball - -Most people will not need to do this; proceed with caution. - -The `packaging` directory contains the top-level [build script](build.sh). - -### Linux - -One [build script](build/lin.sh) is used to (cross-)compile -the same shared libraries within multiple containers. - -* [x64](linux-x64/Dockerfile) -* [ARMv6](linux-armv6/Dockerfile) -* [ARMv7-A](linux-armv7/Dockerfile) -* [ARMv8-A](linux-armv8/Dockerfile) - -The QEMU user mode emulation binaries are required to build for -the ARMv6 platform as the Debian armhf cross-compiler erroneously -generates unsupported Thumb 2 instructions. - -```sh -sudo apt-get install qemu-user-static -``` - -### Windows - -The output of libvips' [build-win64](https://github.com/jcupitt/build-win64) -"web" target is [post-processed](build/win.sh) within a [container](win32-x64/Dockerfile). - -### OS X - -See [package-libvips-darwin](https://github.com/lovell/package-libvips-darwin). diff --git a/packaging/build.sh b/packaging/build.sh deleted file mode 100755 index 7c8364cb2..000000000 --- a/packaging/build.sh +++ /dev/null @@ -1,48 +0,0 @@ -#!/bin/sh -set -e - -if [ $# -lt 1 ]; then - echo - echo "Usage: $0 VERSION [PLATFORM]" - echo "Build shared libraries for libvips and its dependencies via containers" - echo - echo "Please specify the libvips VERSION, e.g. 8.3.3" - echo - echo "Optionally build for only one PLATFORM, defaults to building for all" - echo "Possible values for PLATFORM are: win32-x64, linux-x64, linux-armv6," - echo "linux-armv7, linux-armv8" - echo - exit 1 -fi -VERSION_VIPS="$1" -PLATFORM="${2:-all}" - -# Is docker available? -if ! 
type docker >/dev/null; then - echo "Please install docker" - exit 1 -fi - -# Update base images -for baseimage in debian:wheezy debian:jessie debian:stretch socialdefect/raspbian-jessie-core; do - docker pull $baseimage -done - -# Windows (x64) -if [ $PLATFORM = "all" ] || [ $PLATFORM = "win32-x64" ]; then - echo "Building win32-x64..." - docker build -t vips-dev-win32-x64 win32-x64 - docker run --rm -e "VERSION_VIPS=${VERSION_VIPS}" -v $PWD:/packaging vips-dev-win32-x64 sh -c "/packaging/build/win.sh" -fi - -# Linux (x64, ARMv6, ARMv7, ARMv8) -for flavour in linux-x64 linux-armv6 linux-armv7 linux-armv8; do - if [ $PLATFORM = "all" ] || [ $PLATFORM = $flavour ]; then - echo "Building $flavour..." - docker build -t vips-dev-$flavour $flavour - docker run --rm -e "VERSION_VIPS=${VERSION_VIPS}" -v $PWD:/packaging vips-dev-$flavour sh -c "/packaging/build/lin.sh" - fi -done - -# Display checksums -sha256sum *.tar.gz diff --git a/packaging/build/lin.sh b/packaging/build/lin.sh deleted file mode 100755 index 2f559f7ee..000000000 --- a/packaging/build/lin.sh +++ /dev/null @@ -1,256 +0,0 @@ -#!/bin/sh -set -e - -# Working directories -DEPS=/deps -TARGET=/target -mkdir ${DEPS} -mkdir ${TARGET} - -# Common build paths and flags -export PKG_CONFIG_PATH="${PKG_CONFIG_PATH}:${TARGET}/lib/pkgconfig" -export PATH="${PATH}:${TARGET}/bin" -export CPPFLAGS="-I${TARGET}/include" -export LDFLAGS="-L${TARGET}/lib" -export CFLAGS="${FLAGS}" -export CXXFLAGS="${FLAGS}" - -# Dependency version numbers -VERSION_ZLIB=1.2.11 -VERSION_FFI=3.2.1 -VERSION_GLIB=2.53.1 -VERSION_XML2=2.9.4 -VERSION_GSF=1.14.41 -VERSION_EXIF=0.6.21 -VERSION_LCMS2=2.8 -VERSION_JPEG=1.5.1 -VERSION_PNG16=1.6.29 -VERSION_WEBP=0.6.0 -VERSION_TIFF=4.0.7 -VERSION_ORC=0.4.26 -VERSION_GDKPIXBUF=2.36.6 -VERSION_FREETYPE=2.8 -VERSION_EXPAT=2.2.0 -VERSION_FONTCONFIG=2.12.1 -VERSION_HARFBUZZ=1.4.6 -VERSION_PIXMAN=0.34.0 -VERSION_CAIRO=1.14.8 -VERSION_PANGO=1.40.5 -VERSION_CROCO=0.6.12 -VERSION_SVG=2.40.17 -VERSION_GIF=5.1.4 - 
-# Least out-of-sync Sourceforge mirror -SOURCEFORGE_MIRROR=netix - -mkdir ${DEPS}/zlib -curl -Ls http://zlib.net/zlib-${VERSION_ZLIB}.tar.xz | tar xJC ${DEPS}/zlib --strip-components=1 -cd ${DEPS}/zlib -./configure --prefix=${TARGET} --uname=linux -make install -rm ${TARGET}/lib/libz.a - -mkdir ${DEPS}/ffi -curl -Ls ftp://sourceware.org/pub/libffi/libffi-${VERSION_FFI}.tar.gz | tar xzC ${DEPS}/ffi --strip-components=1 -cd ${DEPS}/ffi -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking --disable-builddir -make install-strip - -mkdir ${DEPS}/glib -curl -Ls https://download.gnome.org/sources/glib/2.53/glib-${VERSION_GLIB}.tar.xz | tar xJC ${DEPS}/glib --strip-components=1 -cd ${DEPS}/glib -echo glib_cv_stack_grows=no >>glib.cache -echo glib_cv_uscore=no >>glib.cache -./configure --cache-file=glib.cache --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking \ - --with-pcre=internal --disable-libmount -make install-strip - -mkdir ${DEPS}/xml2 -curl -Ls http://xmlsoft.org/sources/libxml2-${VERSION_XML2}.tar.gz | tar xzC ${DEPS}/xml2 --strip-components=1 -cd ${DEPS}/xml2 -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking \ - --without-python --without-debug --without-docbook --without-ftp --without-html --without-legacy \ - --without-pattern --without-push --without-regexps --without-schemas --without-schematron --with-zlib=${TARGET} -make install-strip - -mkdir ${DEPS}/gsf -curl -Ls https://download.gnome.org/sources/libgsf/1.14/libgsf-${VERSION_GSF}.tar.xz | tar xJC ${DEPS}/gsf --strip-components=1 -cd ${DEPS}/gsf -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/exif -curl -Ls http://${SOURCEFORGE_MIRROR}.dl.sourceforge.net/project/libexif/libexif/${VERSION_EXIF}/libexif-${VERSION_EXIF}.tar.bz2 | tar xjC 
${DEPS}/exif --strip-components=1 -cd ${DEPS}/exif -autoreconf -fiv -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/lcms2 -curl -Ls http://${SOURCEFORGE_MIRROR}.dl.sourceforge.net/project/lcms/lcms/${VERSION_LCMS2}/lcms2-${VERSION_LCMS2}.tar.gz | tar xzC ${DEPS}/lcms2 --strip-components=1 -cd ${DEPS}/lcms2 -# Apply patches for lcms2 vulnerabilities reported since v2.8 -VERSION_LCMS2_GIT_MASTER_SHA=$(curl -Ls https://api.github.com/repos/mm2/Little-CMS/git/refs/heads/master | jq -r '.object.sha' | head -c7) -curl -Ls https://github.com/mm2/Little-CMS/compare/lcms2.8...master.patch | patch -p1 -t || true -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/jpeg -curl -Ls https://github.com/libjpeg-turbo/libjpeg-turbo/archive/${VERSION_JPEG}.tar.gz | tar xzC ${DEPS}/jpeg --strip-components=1 -cd ${DEPS}/jpeg -autoreconf -fiv -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking --with-jpeg8 --without-turbojpeg -make install-strip - -mkdir ${DEPS}/png16 -curl -Ls http://${SOURCEFORGE_MIRROR}.dl.sourceforge.net/project/libpng/libpng16/${VERSION_PNG16}/libpng-${VERSION_PNG16}.tar.xz | tar xJC ${DEPS}/png16 --strip-components=1 -cd ${DEPS}/png16 -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/webp -curl -Ls http://downloads.webmproject.org/releases/webp/libwebp-${VERSION_WEBP}.tar.gz | tar xzC ${DEPS}/webp --strip-components=1 -cd ${DEPS}/webp -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking \ - --disable-neon --enable-libwebpmux -make install-strip - -mkdir ${DEPS}/tiff -curl -Ls 
http://download.osgeo.org/libtiff/tiff-${VERSION_TIFF}.tar.gz | tar xzC ${DEPS}/tiff --strip-components=1 -cd ${DEPS}/tiff -# Apply patches for libtiff vulnerabilities reported since v4.0.7 -VERSION_TIFF_GIT_MASTER_SHA=$(curl -Ls https://api.github.com/repos/vadz/libtiff/git/refs/heads/master | jq -r '.object.sha' | head -c7) -curl -Ls https://github.com/vadz/libtiff/compare/Release-v4-0-7...master.patch | patch -p1 -t || true -if [ -n "${CHOST}" ]; then autoreconf -fiv; fi -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking --disable-mdi --disable-pixarlog --disable-cxx -make install-strip - -mkdir ${DEPS}/orc -curl -Ls http://gstreamer.freedesktop.org/data/src/orc/orc-${VERSION_ORC}.tar.xz | tar xJC ${DEPS}/orc --strip-components=1 -cd ${DEPS}/orc -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip -cd ${TARGET}/lib -rm -rf liborc-test-* - -mkdir ${DEPS}/gdkpixbuf -curl -Ls https://download.gnome.org/sources/gdk-pixbuf/2.36/gdk-pixbuf-${VERSION_GDKPIXBUF}.tar.xz | tar xJC ${DEPS}/gdkpixbuf --strip-components=1 -cd ${DEPS}/gdkpixbuf -touch gdk-pixbuf/loaders.cache -LD_LIBRARY_PATH=${TARGET}/lib \ -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking \ - --disable-introspection --disable-modules --disable-gio-sniffing \ - --without-libtiff --without-gdiplus --with-included-loaders=png,jpeg -make install-strip - -mkdir ${DEPS}/freetype -curl -Ls http://${SOURCEFORGE_MIRROR}.dl.sourceforge.net/project/freetype/freetype2/${VERSION_FREETYPE}/freetype-${VERSION_FREETYPE}.tar.gz | tar xzC ${DEPS}/freetype --strip-components=1 -cd ${DEPS}/freetype -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static -make install - -mkdir ${DEPS}/expat -curl -Ls 
http://${SOURCEFORGE_MIRROR}.dl.sourceforge.net/project/expat/expat/${VERSION_EXPAT}/expat-${VERSION_EXPAT}.tar.bz2 | tar xjC ${DEPS}/expat --strip-components=1 -cd ${DEPS}/expat -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static -make install - -mkdir ${DEPS}/fontconfig -curl -Ls https://www.freedesktop.org/software/fontconfig/release/fontconfig-${VERSION_FONTCONFIG}.tar.bz2 | tar xjC ${DEPS}/fontconfig --strip-components=1 -cd ${DEPS}/fontconfig -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking \ - --with-expat-includes=${TARGET}/include --with-expat-lib=${TARGET}/lib --sysconfdir=/etc -make install-strip - -mkdir ${DEPS}/harfbuzz -curl -Ls https://www.freedesktop.org/software/harfbuzz/release/harfbuzz-${VERSION_HARFBUZZ}.tar.bz2 | tar xjC ${DEPS}/harfbuzz --strip-components=1 -cd ${DEPS}/harfbuzz -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/pixman -curl -Ls http://cairographics.org/releases/pixman-${VERSION_PIXMAN}.tar.gz | tar xzC ${DEPS}/pixman --strip-components=1 -cd ${DEPS}/pixman -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking --disable-libpng --disable-arm-iwmmxt -make install-strip - -mkdir ${DEPS}/cairo -curl -Ls http://cairographics.org/releases/cairo-${VERSION_CAIRO}.tar.xz | tar xJC ${DEPS}/cairo --strip-components=1 -cd ${DEPS}/cairo -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking \ - --disable-xlib --disable-xcb --disable-quartz --disable-win32 --disable-egl --disable-glx --disable-wgl \ - --disable-script --disable-ps --disable-gobject --disable-trace --disable-interpreter -make install-strip - -mkdir ${DEPS}/pango -curl -Ls https://download.gnome.org/sources/pango/1.40/pango-${VERSION_PANGO}.tar.xz | tar xJC ${DEPS}/pango 
--strip-components=1 -cd ${DEPS}/pango -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/croco -curl -Ls https://download.gnome.org/sources/libcroco/0.6/libcroco-${VERSION_CROCO}.tar.xz | tar xJC ${DEPS}/croco --strip-components=1 -cd ${DEPS}/croco -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/svg -curl -Ls https://download.gnome.org/sources/librsvg/2.40/librsvg-${VERSION_SVG}.tar.xz | tar xJC ${DEPS}/svg --strip-components=1 -cd ${DEPS}/svg -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking --disable-introspection --disable-tools --disable-pixbuf-loader -make install-strip - -mkdir ${DEPS}/gif -curl -Ls http://${SOURCEFORGE_MIRROR}.dl.sourceforge.net/project/giflib/giflib-${VERSION_GIF}.tar.gz | tar xzC ${DEPS}/gif --strip-components=1 -cd ${DEPS}/gif -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking -make install-strip - -mkdir ${DEPS}/vips -curl -Ls https://github.com/jcupitt/libvips/releases/download/v${VERSION_VIPS}/vips-${VERSION_VIPS}.tar.gz | tar xzC ${DEPS}/vips --strip-components=1 -cd ${DEPS}/vips -./configure --host=${CHOST} --prefix=${TARGET} --enable-shared --disable-static --disable-dependency-tracking \ - --disable-debug --disable-introspection --without-python --without-fftw \ - --without-magick --without-pangoft2 --without-ppm --without-analyze --without-radiance \ - --with-zip-includes=${TARGET}/include --with-zip-libraries=${TARGET}/lib \ - --with-jpeg-includes=${TARGET}/include --with-jpeg-libraries=${TARGET}/lib -make install-strip - -# Remove the old C++ bindings -cd ${TARGET}/include -rm -rf vips/vipsc++.h vips/vipscpp.h -cd ${TARGET}/lib -rm -rf pkgconfig .libs *.la libvipsCC* - -# Create JSON file of version 
numbers -cd ${TARGET} -echo "{\n\ - \"cairo\": \"${VERSION_CAIRO}\",\n\ - \"croco\": \"${VERSION_CROCO}\",\n\ - \"exif\": \"${VERSION_EXIF}\",\n\ - \"expat\": \"${VERSION_EXPAT}\",\n\ - \"ffi\": \"${VERSION_FFI}\",\n\ - \"fontconfig\": \"${VERSION_FONTCONFIG}\",\n\ - \"freetype\": \"${VERSION_FREETYPE}\",\n\ - \"gdkpixbuf\": \"${VERSION_GDKPIXBUF}\",\n\ - \"gif\": \"${VERSION_GIF}\",\n\ - \"glib\": \"${VERSION_GLIB}\",\n\ - \"gsf\": \"${VERSION_GSF}\",\n\ - \"harfbuzz\": \"${VERSION_HARFBUZZ}\",\n\ - \"jpeg\": \"${VERSION_JPEG}\",\n\ - \"lcms\": \"${VERSION_LCMS2}-${VERSION_LCMS2_GIT_MASTER_SHA}\",\n\ - \"orc\": \"${VERSION_ORC}\",\n\ - \"pango\": \"${VERSION_PANGO}\",\n\ - \"pixman\": \"${VERSION_PIXMAN}\",\n\ - \"png\": \"${VERSION_PNG16}\",\n\ - \"svg\": \"${VERSION_SVG}\",\n\ - \"tiff\": \"${VERSION_TIFF}-${VERSION_TIFF_GIT_MASTER_SHA}\",\n\ - \"vips\": \"${VERSION_VIPS}\",\n\ - \"webp\": \"${VERSION_WEBP}\",\n\ - \"xml\": \"${VERSION_XML2}\",\n\ - \"zlib\": \"${VERSION_ZLIB}\"\n\ -}" >lib/versions.json - -# Create .tar.gz -tar czf /packaging/libvips-${VERSION_VIPS}-${PLATFORM}.tar.gz include lib -advdef --recompress --shrink-insane /packaging/libvips-${VERSION_VIPS}-${PLATFORM}.tar.gz diff --git a/packaging/build/win.sh b/packaging/build/win.sh deleted file mode 100755 index 31c33e485..000000000 --- a/packaging/build/win.sh +++ /dev/null @@ -1,19 +0,0 @@ -#!/bin/sh -set -e - -# Fetch and unzip -mkdir /vips -cd /vips -curl -L -O https://github.com/lovell/build-win64/releases/download/v${VERSION_VIPS}/vips-dev-w64-web-${VERSION_VIPS}.zip -unzip vips-dev-w64-web-${VERSION_VIPS}.zip - -# Clean and zip -cd /vips/vips-dev-8.5 -rm bin/libvipsCC-42.dll bin/libvips-cpp-42.dll bin/libgsf-win32-1-114.dll -cp bin/*.dll lib/ -cp -r lib64/* lib/ - -echo "Creating tarball" -tar czf /packaging/libvips-${VERSION_VIPS}-${PLATFORM}.tar.gz include lib/glib-2.0 lib/libvips.lib lib/libglib-2.0.lib lib/libgobject-2.0.lib lib/*.dll -echo "Shrinking tarball" -advdef 
--recompress --shrink-insane /packaging/libvips-${VERSION_VIPS}-${PLATFORM}.tar.gz diff --git a/packaging/linux-armv6/Dockerfile b/packaging/linux-armv6/Dockerfile deleted file mode 100644 index af79a935c..000000000 --- a/packaging/linux-armv6/Dockerfile +++ /dev/null @@ -1,15 +0,0 @@ -FROM socialdefect/raspbian-jessie-core -MAINTAINER Lovell Fuller - -# Create Rasbian-based container suitable for compiling Linux ARMv6 binaries -# Requires the QEMU user mode emulation binaries on the host machine - -# Build dependencies -RUN \ - apt-get update && \ - apt-get install -y build-essential curl autoconf libtool nasm gtk-doc-tools texinfo advancecomp libglib2.0-dev jq - -# Compiler settings -ENV \ - PLATFORM=linux-armv6 \ - FLAGS="-Os" diff --git a/packaging/linux-armv7/Dockerfile b/packaging/linux-armv7/Dockerfile deleted file mode 100644 index 338c1fd72..000000000 --- a/packaging/linux-armv7/Dockerfile +++ /dev/null @@ -1,20 +0,0 @@ -FROM debian:jessie -MAINTAINER Lovell Fuller - -# Create Debian-based container suitable for cross-compiling Linux ARMv7-A binaries - -# Build dependencies -RUN \ - apt-get update && \ - apt-get install -y curl && \ - echo "deb http://emdebian.org/tools/debian/ jessie main" | tee /etc/apt/sources.list.d/crosstools.list && \ - curl http://emdebian.org/tools/debian/emdebian-toolchain-archive.key | apt-key add - && \ - dpkg --add-architecture armhf && \ - apt-get update && \ - apt-get install -y crossbuild-essential-armhf autoconf libtool nasm gtk-doc-tools texinfo advancecomp libglib2.0-dev jq - -# Compiler settings -ENV \ - PLATFORM=linux-armv7 \ - CHOST=arm-linux-gnueabihf \ - FLAGS="-marm -march=armv7-a -mfpu=neon-vfpv4 -mfloat-abi=hard -Os" diff --git a/packaging/linux-armv8/Dockerfile b/packaging/linux-armv8/Dockerfile deleted file mode 100644 index 12948d965..000000000 --- a/packaging/linux-armv8/Dockerfile +++ /dev/null @@ -1,18 +0,0 @@ -FROM debian:stretch -MAINTAINER Lovell Fuller - -# Create Debian-based container suitable for 
cross-compiling Linux ARMv8-A binaries - -# Build dependencies -RUN \ - apt-get update && \ - apt-get install -y curl && \ - dpkg --add-architecture arm64 && \ - apt-get update && \ - apt-get install -y crossbuild-essential-arm64 autoconf libtool nasm gtk-doc-tools texinfo advancecomp libglib2.0-dev jq gettext intltool autopoint - -# Compiler settings -ENV \ - PLATFORM=linux-armv8 \ - CHOST=aarch64-linux-gnu \ - FLAGS="-march=armv8-a -Os -D_GLIBCXX_USE_CXX11_ABI=0" diff --git a/packaging/linux-x64/Dockerfile b/packaging/linux-x64/Dockerfile deleted file mode 100644 index 3bf32719d..000000000 --- a/packaging/linux-x64/Dockerfile +++ /dev/null @@ -1,16 +0,0 @@ -FROM debian:wheezy -MAINTAINER Lovell Fuller - -# Create Debian-based container suitable for building Linux x64 binaries - -# Build dependencies -RUN \ - echo "deb http://ftp.debian.org/debian wheezy-backports main" | tee /etc/apt/sources.list.d/wheezy-backports.list && \ - apt-get update && \ - apt-get install -y build-essential autoconf libtool nasm gtk-doc-tools texinfo advancecomp && \ - apt-get -t wheezy-backports install -y jq - -# Compiler settings -ENV \ - PLATFORM="linux-x64" \ - FLAGS="-O3" diff --git a/packaging/test-linux-arm.sh b/packaging/test-linux-arm.sh deleted file mode 100755 index c887ac87a..000000000 --- a/packaging/test-linux-arm.sh +++ /dev/null @@ -1,28 +0,0 @@ -#!/bin/sh - -if [ $# -lt 1 ]; then - echo "Usage: $0 IP" - echo "Test sharp on ARM using Docker, where IP is" - echo "the address of a Raspberry Pi running HypriotOS" - exit 1 -fi -IP="$1" - -echo "Verifying connectivity to $IP" -if ! ping -c 1 $IP; then - echo "Could not connect to $IP" - exit 1 -fi - -if ! 
type sshpass >/dev/null; then - echo "Please install sshpass" - exit 1 -fi - -export SSHPASS=hypriot - -echo "Copying sharp source to device" -sshpass -e scp -o PreferredAuthentications=password -r ../../sharp pirate@${IP}:/home/pirate/sharp - -echo "Compile and test within container" -sshpass -e ssh -o PreferredAuthentications=password -t pirate@${IP} "docker run --rm -v \${PWD}/sharp:/s hypriot/rpi-node:6 sh -c 'cd /s && npm install --unsafe-perm && npm test'" diff --git a/packaging/test-linux-x64.sh b/packaging/test-linux-x64.sh deleted file mode 100755 index 3d073695b..000000000 --- a/packaging/test-linux-x64.sh +++ /dev/null @@ -1,36 +0,0 @@ -#!/bin/sh - -# Verify docker is available -if ! type docker >/dev/null; then - echo "Please install docker" - exit 1 -fi - -test="npm run clean; npm install --unsafe-perm; npm test" - -# Debian 7, 8 -# Ubuntu 14.04, 16.04 -for dist in debian:jessie debian:stretch ubuntu:trusty ubuntu:xenial; do - echo "Testing $dist..." - docker pull $dist - if docker run -i -t --rm -v $PWD:/v $dist >packaging/$dist.log 2>&1 sh -c "cd /v; ./packaging/test/debian.sh; $test"; - then echo "$dist OK" - else echo "$dist fail" && cat packaging/$dist.log - fi -done - -# Centos 7 -echo "Testing centos7..." -docker pull centos:7 -if docker run -i -t --rm -v $PWD:/v centos:7 >packaging/centos7.log 2>&1 sh -c "cd /v; ./packaging/test/centos.sh; $test"; -then echo "centos7 OK" -else echo "centos7 fail" && cat packaging/centos7.log -fi - -# Archlinux latest -echo "Testing archlinux..." 
-docker pull pritunl/archlinux:latest -if docker run -i -t --rm -v $PWD:/v pritunl/archlinux:latest >packaging/archlinux.log 2>&1 sh -c "cd /v; ./packaging/test/archlinux.sh; $test"; -then echo "archlinux OK" -else echo "archlinux fail" && cat packaging/archlinux.log -fi diff --git a/packaging/test/archlinux.sh b/packaging/test/archlinux.sh deleted file mode 100755 index 73afcb8dc..000000000 --- a/packaging/test/archlinux.sh +++ /dev/null @@ -1,5 +0,0 @@ -#!/bin/sh - -# Install Node.js on Archlinux -pacman -Sy --noconfirm gcc make python2 nodejs npm | cat -ln -s /usr/bin/python2 /usr/bin/python diff --git a/packaging/test/centos.sh b/packaging/test/centos.sh deleted file mode 100755 index 5a3d63fa6..000000000 --- a/packaging/test/centos.sh +++ /dev/null @@ -1,4 +0,0 @@ -#!/bin/sh - -curl -sL https://rpm.nodesource.com/setup_6.x | bash - -yum install -y gcc-c++ make nodejs diff --git a/packaging/test/debian.sh b/packaging/test/debian.sh deleted file mode 100755 index 5c6f9ef9e..000000000 --- a/packaging/test/debian.sh +++ /dev/null @@ -1,6 +0,0 @@ -#!/bin/sh - -apt-get update -apt-get install -y build-essential python pkg-config curl -curl -sL https://deb.nodesource.com/setup_6.x | bash - -apt-get install -y nodejs diff --git a/packaging/win32-x64/Dockerfile b/packaging/win32-x64/Dockerfile deleted file mode 100644 index 4c4abade9..000000000 --- a/packaging/win32-x64/Dockerfile +++ /dev/null @@ -1,8 +0,0 @@ -FROM debian:stretch -MAINTAINER Lovell Fuller - -# Create Debian-based container suitable for post-processing Windows x64 binaries - -RUN apt-get update && apt-get install -y curl zip advancecomp - -ENV PLATFORM=win32-x64 diff --git a/src/CPPLINT.cfg b/src/CPPLINT.cfg new file mode 100644 index 000000000..7a643b075 --- /dev/null +++ b/src/CPPLINT.cfg @@ -0,0 +1,10 @@ +set noparent + +linelength=120 + +filter=-build/include +filter=+build/include_alpha +filter=+build/include_subdir +filter=+build/include_what_you_use + +filter=-whitespace/indent_namespace diff 
--git a/src/binding.gyp b/src/binding.gyp new file mode 100644 index 000000000..2040cde5b --- /dev/null +++ b/src/binding.gyp @@ -0,0 +1,298 @@ +# Copyright 2013 Lovell Fuller and others. +# SPDX-License-Identifier: Apache-2.0 + +{ + 'variables': { + 'vips_version': ' #include +#include +#include +#include #include -#include +#include +#include #include -#include -#include -#include -#include -#include +#include #include -#include "common.h" +#include "./common.h" using vips::VImage; namespace sharp { - // Convenience methods to access the attributes of a v8::Object - bool HasAttr(v8::Handle obj, std::string attr) { - return Nan::Has(obj, Nan::New(attr).ToLocalChecked()).FromJust(); + // Convenience methods to access the attributes of a Napi::Object + bool HasAttr(Napi::Object obj, std::string attr) { + return obj.Has(attr); + } + std::string AttrAsStr(Napi::Object obj, std::string attr) { + return obj.Get(attr).As(); + } + std::string AttrAsStr(Napi::Object obj, unsigned int const attr) { + return obj.Get(attr).As(); + } + uint32_t AttrAsUint32(Napi::Object obj, std::string attr) { + return obj.Get(attr).As().Uint32Value(); + } + int32_t AttrAsInt32(Napi::Object obj, std::string attr) { + return obj.Get(attr).As().Int32Value(); } - std::string AttrAsStr(v8::Handle obj, std::string attr) { - return *Nan::Utf8String(Nan::Get(obj, Nan::New(attr).ToLocalChecked()).ToLocalChecked()); + int32_t AttrAsInt32(Napi::Object obj, unsigned int const attr) { + return obj.Get(attr).As().Int32Value(); + } + int64_t AttrAsInt64(Napi::Object obj, std::string attr) { + return obj.Get(attr).As().Int64Value(); + } + double AttrAsDouble(Napi::Object obj, std::string attr) { + return obj.Get(attr).As().DoubleValue(); + } + double AttrAsDouble(Napi::Object obj, unsigned int const attr) { + return obj.Get(attr).As().DoubleValue(); + } + bool AttrAsBool(Napi::Object obj, std::string attr) { + return obj.Get(attr).As().Value(); + } + std::vector AttrAsVectorOfDouble(Napi::Object obj, 
std::string attr) { + Napi::Array napiArray = obj.Get(attr).As<Napi::Array>(); + std::vector<double> vectorOfDouble(napiArray.Length()); + for (unsigned int i = 0; i < napiArray.Length(); i++) { + vectorOfDouble[i] = AttrAsDouble(napiArray, i); + } + return vectorOfDouble; + } + std::vector<int32_t> AttrAsInt32Vector(Napi::Object obj, std::string attr) { + Napi::Array array = obj.Get(attr).As<Napi::Array>(); + std::vector<int32_t> vector(array.Length()); + for (unsigned int i = 0; i < array.Length(); i++) { + vector[i] = AttrAsInt32(array, i); + } + return vector; } - // Create an InputDescriptor instance from a v8::Object describing an input image - InputDescriptor* CreateInputDescriptor( - v8::Handle<v8::Object> input, std::vector<v8::Local<v8::Object>> buffersToPersist - ) { - Nan::HandleScope(); + // Create an InputDescriptor instance from a Napi::Object describing an input image + InputDescriptor* CreateInputDescriptor(Napi::Object input) { InputDescriptor *descriptor = new InputDescriptor; if (HasAttr(input, "file")) { descriptor->file = AttrAsStr(input, "file"); } else if (HasAttr(input, "buffer")) { - v8::Local<v8::Object> buffer = AttrAs<v8::Object>(input, "buffer"); - descriptor->bufferLength = node::Buffer::Length(buffer); - descriptor->buffer = node::Buffer::Data(buffer); - buffersToPersist.push_back(buffer); + Napi::Buffer<char> buffer = input.Get("buffer").As<Napi::Buffer<char>>(); + descriptor->bufferLength = buffer.Length(); + descriptor->buffer = buffer.Data(); + descriptor->isBuffer = true; } + descriptor->failOn = AttrAsEnum<VipsFailOn>(input, "failOn", VIPS_TYPE_FAIL_ON); // Density for vector-based input if (HasAttr(input, "density")) { - descriptor->density = AttrTo<double>(input, "density"); + descriptor->density = AttrAsDouble(input, "density"); + } + // Should we ignore any embedded ICC profile + if (HasAttr(input, "ignoreIcc")) { + descriptor->ignoreIcc = AttrAsBool(input, "ignoreIcc"); } // Raw pixel input if (HasAttr(input, "rawChannels")) { - descriptor->rawChannels = AttrTo<uint32_t>(input, "rawChannels"); - descriptor->rawWidth = AttrTo<uint32_t>(input, "rawWidth"); - descriptor->rawHeight = AttrTo<uint32_t>(input, 
"rawHeight"); + descriptor->rawDepth = AttrAsEnum<VipsBandFormat>(input, "rawDepth", VIPS_TYPE_BAND_FORMAT); + descriptor->rawChannels = AttrAsUint32(input, "rawChannels"); + descriptor->rawWidth = AttrAsUint32(input, "rawWidth"); + descriptor->rawHeight = AttrAsUint32(input, "rawHeight"); + descriptor->rawPremultiplied = AttrAsBool(input, "rawPremultiplied"); + descriptor->rawPageHeight = AttrAsUint32(input, "rawPageHeight"); + } + // Multi-page input (GIF, TIFF, PDF) + if (HasAttr(input, "pages")) { + descriptor->pages = AttrAsInt32(input, "pages"); + } + if (HasAttr(input, "page")) { + descriptor->page = AttrAsUint32(input, "page"); + } + // SVG + if (HasAttr(input, "svgStylesheet")) { + descriptor->svgStylesheet = AttrAsStr(input, "svgStylesheet"); + } + if (HasAttr(input, "svgHighBitdepth")) { + descriptor->svgHighBitdepth = AttrAsBool(input, "svgHighBitdepth"); + } + // Multi-level input (OpenSlide) + if (HasAttr(input, "openSlideLevel")) { + descriptor->openSlideLevel = AttrAsUint32(input, "openSlideLevel"); + } + // subIFD (OME-TIFF) + if (HasAttr(input, "subifd")) { + descriptor->tiffSubifd = AttrAsInt32(input, "subifd"); + } + // PDF background color + if (HasAttr(input, "pdfBackground")) { + descriptor->pdfBackground = AttrAsVectorOfDouble(input, "pdfBackground"); + } + // Use JPEG 2000 oneshot mode? 
+ if (HasAttr(input, "jp2Oneshot")) { + descriptor->jp2Oneshot = AttrAsBool(input, "jp2Oneshot"); } // Create new image if (HasAttr(input, "createChannels")) { - descriptor->createChannels = AttrTo<uint32_t>(input, "createChannels"); - descriptor->createWidth = AttrTo<uint32_t>(input, "createWidth"); - descriptor->createHeight = AttrTo<uint32_t>(input, "createHeight"); - v8::Local<v8::Object> createBackground = AttrAs<v8::Object>(input, "createBackground"); - for (unsigned int i = 0; i < 4; i++) { - descriptor->createBackground[i] = AttrTo<double>(createBackground, i); + descriptor->createChannels = AttrAsUint32(input, "createChannels"); + descriptor->createWidth = AttrAsUint32(input, "createWidth"); + descriptor->createHeight = AttrAsUint32(input, "createHeight"); + descriptor->createPageHeight = AttrAsUint32(input, "createPageHeight"); + if (HasAttr(input, "createNoiseType")) { + descriptor->createNoiseType = AttrAsStr(input, "createNoiseType"); + descriptor->createNoiseMean = AttrAsDouble(input, "createNoiseMean"); + descriptor->createNoiseSigma = AttrAsDouble(input, "createNoiseSigma"); + } else { + descriptor->createBackground = AttrAsVectorOfDouble(input, "createBackground"); + } + } + // Create new image with text + if (HasAttr(input, "textValue")) { + descriptor->textValue = AttrAsStr(input, "textValue"); + if (HasAttr(input, "textFont")) { + descriptor->textFont = AttrAsStr(input, "textFont"); + } + if (HasAttr(input, "textFontfile")) { + descriptor->textFontfile = AttrAsStr(input, "textFontfile"); } + if (HasAttr(input, "textWidth")) { + descriptor->textWidth = AttrAsUint32(input, "textWidth"); + } + if (HasAttr(input, "textHeight")) { + descriptor->textHeight = AttrAsUint32(input, "textHeight"); + } + if (HasAttr(input, "textAlign")) { + descriptor->textAlign = AttrAsEnum<VipsAlign>(input, "textAlign", VIPS_TYPE_ALIGN); + } + if (HasAttr(input, "textJustify")) { + descriptor->textJustify = AttrAsBool(input, "textJustify"); + } + if (HasAttr(input, "textDpi")) { + descriptor->textDpi = AttrAsUint32(input, "textDpi"); + } + if 
(HasAttr(input, "textRgba")) { + descriptor->textRgba = AttrAsBool(input, "textRgba"); + } + if (HasAttr(input, "textSpacing")) { + descriptor->textSpacing = AttrAsUint32(input, "textSpacing"); + } + if (HasAttr(input, "textWrap")) { + descriptor->textWrap = AttrAsEnum<VipsTextWrap>(input, "textWrap", VIPS_TYPE_TEXT_WRAP); + } + } + // Join images together + if (HasAttr(input, "joinAnimated")) { + descriptor->joinAnimated = AttrAsBool(input, "joinAnimated"); + } + if (HasAttr(input, "joinAcross")) { + descriptor->joinAcross = AttrAsUint32(input, "joinAcross"); + } + if (HasAttr(input, "joinShim")) { + descriptor->joinShim = AttrAsUint32(input, "joinShim"); + } + if (HasAttr(input, "joinBackground")) { + descriptor->joinBackground = AttrAsVectorOfDouble(input, "joinBackground"); + } + if (HasAttr(input, "joinHalign")) { + descriptor->joinHalign = AttrAsEnum<VipsAlign>(input, "joinHalign", VIPS_TYPE_ALIGN); + } + if (HasAttr(input, "joinValign")) { + descriptor->joinValign = AttrAsEnum<VipsAlign>(input, "joinValign", VIPS_TYPE_ALIGN); + } + // Limit input images to a given number of pixels, where pixels = width * height + descriptor->limitInputPixels = static_cast<uint64_t>(AttrAsInt64(input, "limitInputPixels")); + if (HasAttr(input, "access")) { + descriptor->access = AttrAsBool(input, "sequentialRead") ? VIPS_ACCESS_SEQUENTIAL : VIPS_ACCESS_RANDOM; } + // Remove safety features and allow unlimited input + descriptor->unlimited = AttrAsBool(input, "unlimited"); + // Use the EXIF orientation to auto orient the image + descriptor->autoOrient = AttrAsBool(input, "autoOrient"); return descriptor; } // How many tasks are in the queue? - volatile int counterQueue = 0; + std::atomic<int> counterQueue{0}; // How many tasks are being processed? 
- volatile int counterProcess = 0; + std::atomic<int> counterProcess{0}; // Filename extension checkers static bool EndsWith(std::string const &str, std::string const &end) { @@ -94,9 +227,28 @@ namespace sharp { bool IsWebp(std::string const &str) { return EndsWith(str, ".webp") || EndsWith(str, ".WEBP"); } + bool IsGif(std::string const &str) { + return EndsWith(str, ".gif") || EndsWith(str, ".GIF"); + } + bool IsJp2(std::string const &str) { + return EndsWith(str, ".jp2") || EndsWith(str, ".jpx") || EndsWith(str, ".j2k") || EndsWith(str, ".j2c") + || EndsWith(str, ".JP2") || EndsWith(str, ".JPX") || EndsWith(str, ".J2K") || EndsWith(str, ".J2C"); + } bool IsTiff(std::string const &str) { return EndsWith(str, ".tif") || EndsWith(str, ".tiff") || EndsWith(str, ".TIF") || EndsWith(str, ".TIFF"); } + bool IsHeic(std::string const &str) { + return EndsWith(str, ".heic") || EndsWith(str, ".HEIC"); + } + bool IsHeif(std::string const &str) { + return EndsWith(str, ".heif") || EndsWith(str, ".HEIF") || IsHeic(str) || IsAvif(str); + } + bool IsAvif(std::string const &str) { + return EndsWith(str, ".avif") || EndsWith(str, ".AVIF"); + } + bool IsJxl(std::string const &str) { + return EndsWith(str, ".jxl") || EndsWith(str, ".JXL"); + } bool IsDz(std::string const &str) { return EndsWith(str, ".dzi") || EndsWith(str, ".DZI"); } @@ -107,6 +259,13 @@ namespace sharp { return EndsWith(str, ".v") || EndsWith(str, ".V") || EndsWith(str, ".vips") || EndsWith(str, ".VIPS"); } + /* + Trim space from end of string. + */ + std::string TrimEnd(std::string const &str) { + return str.substr(0, str.find_last_not_of(" \n\r\f") + 1); + } + /* Provide a string identifier for the given image type. 
*/ @@ -118,43 +277,83 @@ namespace sharp { case ImageType::WEBP: id = "webp"; break; case ImageType::TIFF: id = "tiff"; break; case ImageType::GIF: id = "gif"; break; + case ImageType::JP2: id = "jp2"; break; case ImageType::SVG: id = "svg"; break; + case ImageType::HEIF: id = "heif"; break; case ImageType::PDF: id = "pdf"; break; case ImageType::MAGICK: id = "magick"; break; case ImageType::OPENSLIDE: id = "openslide"; break; case ImageType::PPM: id = "ppm"; break; case ImageType::FITS: id = "fits"; break; - case ImageType::VIPS: id = "v"; break; + case ImageType::EXR: id = "exr"; break; + case ImageType::JXL: id = "jxl"; break; + case ImageType::RAD: id = "rad"; break; + case ImageType::DCRAW: id = "dcraw"; break; + case ImageType::VIPS: id = "vips"; break; case ImageType::RAW: id = "raw"; break; case ImageType::UNKNOWN: id = "unknown"; break; + case ImageType::MISSING: id = "missing"; break; } return id; } + /** + * Regenerate this table with something like: + * + * $ vips -l foreign | grep -i load | awk '{ print $2, $1; }' + * + * Plus a bit of editing. 
+ */ + std::map<std::string, ImageType> loaderToType = { + { "VipsForeignLoadJpegFile", ImageType::JPEG }, + { "VipsForeignLoadJpegBuffer", ImageType::JPEG }, + { "VipsForeignLoadPngFile", ImageType::PNG }, + { "VipsForeignLoadPngBuffer", ImageType::PNG }, + { "VipsForeignLoadWebpFile", ImageType::WEBP }, + { "VipsForeignLoadWebpBuffer", ImageType::WEBP }, + { "VipsForeignLoadTiffFile", ImageType::TIFF }, + { "VipsForeignLoadTiffBuffer", ImageType::TIFF }, + { "VipsForeignLoadGifFile", ImageType::GIF }, + { "VipsForeignLoadGifBuffer", ImageType::GIF }, + { "VipsForeignLoadNsgifFile", ImageType::GIF }, + { "VipsForeignLoadNsgifBuffer", ImageType::GIF }, + { "VipsForeignLoadJp2kBuffer", ImageType::JP2 }, + { "VipsForeignLoadJp2kFile", ImageType::JP2 }, + { "VipsForeignLoadSvgFile", ImageType::SVG }, + { "VipsForeignLoadSvgBuffer", ImageType::SVG }, + { "VipsForeignLoadHeifFile", ImageType::HEIF }, + { "VipsForeignLoadHeifBuffer", ImageType::HEIF }, + { "VipsForeignLoadPdfFile", ImageType::PDF }, + { "VipsForeignLoadPdfBuffer", ImageType::PDF }, + { "VipsForeignLoadMagickFile", ImageType::MAGICK }, + { "VipsForeignLoadMagickBuffer", ImageType::MAGICK }, + { "VipsForeignLoadMagick7File", ImageType::MAGICK }, + { "VipsForeignLoadMagick7Buffer", ImageType::MAGICK }, + { "VipsForeignLoadOpenslideFile", ImageType::OPENSLIDE }, + { "VipsForeignLoadPpmFile", ImageType::PPM }, + { "VipsForeignLoadFitsFile", ImageType::FITS }, + { "VipsForeignLoadOpenexr", ImageType::EXR }, + { "VipsForeignLoadJxlFile", ImageType::JXL }, + { "VipsForeignLoadJxlBuffer", ImageType::JXL }, + { "VipsForeignLoadRadFile", ImageType::RAD }, + { "VipsForeignLoadRadBuffer", ImageType::RAD }, + { "VipsForeignLoadDcRawFile", ImageType::DCRAW }, + { "VipsForeignLoadDcRawBuffer", ImageType::DCRAW }, + { "VipsForeignLoadVips", ImageType::VIPS }, + { "VipsForeignLoadVipsFile", ImageType::VIPS }, + { "VipsForeignLoadRaw", ImageType::RAW } + }; + /* Determine image format of a buffer.
*/ ImageType DetermineImageType(void *buffer, size_t const length) { ImageType imageType = ImageType::UNKNOWN; char const *load = vips_foreign_find_load_buffer(buffer, length); - if (load != NULL) { - std::string const loader = load; - if (EndsWith(loader, "JpegBuffer")) { - imageType = ImageType::JPEG; - } else if (EndsWith(loader, "PngBuffer")) { - imageType = ImageType::PNG; - } else if (EndsWith(loader, "WebpBuffer")) { - imageType = ImageType::WEBP; - } else if (EndsWith(loader, "TiffBuffer")) { - imageType = ImageType::TIFF; - } else if (EndsWith(loader, "GifBuffer")) { - imageType = ImageType::GIF; - } else if (EndsWith(loader, "SvgBuffer")) { - imageType = ImageType::SVG; - } else if (EndsWith(loader, "PdfBuffer")) { - imageType = ImageType::PDF; - } else if (EndsWith(loader, "MagickBuffer")) { - imageType = ImageType::MAGICK; + if (load != nullptr) { + auto it = loaderToType.find(load); + if (it != loaderToType.end()) { + imageType = it->second; } } return imageType; @@ -167,51 +366,109 @@ namespace sharp { ImageType imageType = ImageType::UNKNOWN; char const *load = vips_foreign_find_load(file); if (load != nullptr) { - std::string const loader = load; - if (EndsWith(loader, "JpegFile")) { - imageType = ImageType::JPEG; - } else if (EndsWith(loader, "Png")) { - imageType = ImageType::PNG; - } else if (EndsWith(loader, "WebpFile")) { - imageType = ImageType::WEBP; - } else if (EndsWith(loader, "Openslide")) { - imageType = ImageType::OPENSLIDE; - } else if (EndsWith(loader, "TiffFile")) { - imageType = ImageType::TIFF; - } else if (EndsWith(loader, "GifFile")) { - imageType = ImageType::GIF; - } else if (EndsWith(loader, "SvgFile")) { - imageType = ImageType::SVG; - } else if (EndsWith(loader, "PdfFile")) { - imageType = ImageType::PDF; - } else if (EndsWith(loader, "Ppm")) { - imageType = ImageType::PPM; - } else if (EndsWith(loader, "Fits")) { - imageType = ImageType::FITS; - } else if (EndsWith(loader, "Vips")) { - imageType = ImageType::VIPS; - } else 
if (EndsWith(loader, "Magick") || EndsWith(loader, "MagickFile")) { - imageType = ImageType::MAGICK; + auto it = loaderToType.find(load); + if (it != loaderToType.end()) { + imageType = it->second; + } + } else { + if (EndsWith(vips::VError().what(), " does not exist\n")) { + imageType = ImageType::MISSING; } } return imageType; } + /* + Does this image type support multiple pages? + */ + bool ImageTypeSupportsPage(ImageType imageType) { + return + imageType == ImageType::WEBP || + imageType == ImageType::MAGICK || + imageType == ImageType::GIF || + imageType == ImageType::JP2 || + imageType == ImageType::TIFF || + imageType == ImageType::HEIF || + imageType == ImageType::PDF; + } + + /* + Does this image type support removal of safety limits? + */ + bool ImageTypeSupportsUnlimited(ImageType imageType) { + return + imageType == ImageType::JPEG || + imageType == ImageType::PNG || + imageType == ImageType::SVG || + imageType == ImageType::TIFF || + imageType == ImageType::HEIF; + } + + /* + Format-specific options builder + */ + vips::VOption* GetOptionsForImageType(ImageType imageType, InputDescriptor *descriptor) { + vips::VOption *option = VImage::option() + ->set("access", descriptor->access) + ->set("fail_on", descriptor->failOn); + if (descriptor->unlimited && ImageTypeSupportsUnlimited(imageType)) { + option->set("unlimited", true); + } + if (ImageTypeSupportsPage(imageType)) { + option->set("n", descriptor->pages); + option->set("page", descriptor->page); + } + switch (imageType) { + case ImageType::SVG: + option->set("dpi", descriptor->density) + ->set("stylesheet", descriptor->svgStylesheet.data()) + ->set("high_bitdepth", descriptor->svgHighBitdepth); + break; + case ImageType::TIFF: + option->set("subifd", descriptor->tiffSubifd); + break; + case ImageType::PDF: + option->set("dpi", descriptor->density) + ->set("background", descriptor->pdfBackground); + break; + case ImageType::OPENSLIDE: + option->set("level", descriptor->openSlideLevel); + break; + 
case ImageType::JP2: + option->set("oneshot", descriptor->jp2Oneshot); + break; + case ImageType::MAGICK: + option->set("density", std::to_string(descriptor->density).data()); + break; + default: + break; + } + return option; + } + /* Open an image from the given InputDescriptor (filesystem, compressed buffer, raw pixel data) */ - std::tuple<VImage, ImageType> OpenInput(InputDescriptor *descriptor, VipsAccess accessMethod) { + std::tuple<VImage, ImageType> OpenInput(InputDescriptor *descriptor) { VImage image; ImageType imageType; - if (descriptor->buffer != nullptr) { + if (descriptor->isBuffer) { if (descriptor->rawChannels > 0) { // Raw, uncompressed pixel data + bool const is8bit = vips_band_format_is8bit(descriptor->rawDepth); image = VImage::new_from_memory(descriptor->buffer, descriptor->bufferLength, - descriptor->rawWidth, descriptor->rawHeight, descriptor->rawChannels, VIPS_FORMAT_UCHAR); + descriptor->rawWidth, descriptor->rawHeight, descriptor->rawChannels, descriptor->rawDepth); if (descriptor->rawChannels < 3) { - image.get_image()->Type = VIPS_INTERPRETATION_B_W; + image.get_image()->Type = is8bit ? VIPS_INTERPRETATION_B_W : VIPS_INTERPRETATION_GREY16; } else { - image.get_image()->Type = VIPS_INTERPRETATION_sRGB; + image.get_image()->Type = is8bit ?
VIPS_INTERPRETATION_sRGB : VIPS_INTERPRETATION_RGB16; + } + if (descriptor->rawPageHeight > 0) { + image.set(VIPS_META_PAGE_HEIGHT, descriptor->rawPageHeight); + image.set(VIPS_META_N_PAGES, static_cast<int>(descriptor->rawHeight / descriptor->rawPageHeight)); + } + if (descriptor->rawPremultiplied) { + image = image.unpremultiply(); } imageType = ImageType::RAW; } else { @@ -219,62 +476,112 @@ imageType = DetermineImageType(descriptor->buffer, descriptor->bufferLength); if (imageType != ImageType::UNKNOWN) { try { - vips::VOption *option = VImage::option()->set("access", accessMethod); - if (imageType == ImageType::SVG || imageType == ImageType::PDF) { - option->set("dpi", static_cast<double>(descriptor->density)); - } - if (imageType == ImageType::MAGICK) { - option->set("density", std::to_string(descriptor->density).data()); - } + vips::VOption *option = GetOptionsForImageType(imageType, descriptor); image = VImage::new_from_buffer(descriptor->buffer, descriptor->bufferLength, nullptr, option); if (imageType == ImageType::SVG || imageType == ImageType::PDF || imageType == ImageType::MAGICK) { - SetDensity(image, descriptor->density); + image = SetDensity(image, descriptor->density); - } catch (...)
{ - throw vips::VError("Input buffer has corrupt header"); + } catch (vips::VError const &err) { + throw vips::VError(std::string("Input buffer has corrupt header: ") + err.what()); } } else { throw vips::VError("Input buffer contains unsupported image format"); } } } else { - if (descriptor->createChannels > 0) { + int const channels = descriptor->createChannels; + if (channels > 0) { // Create new image - std::vector<double> background = { - descriptor->createBackground[0], - descriptor->createBackground[1], - descriptor->createBackground[2] - }; - if (descriptor->createChannels == 4) { - background.push_back(descriptor->createBackground[3]); + if (descriptor->createNoiseType == "gaussian") { + std::vector<VImage> bands = {}; + bands.reserve(channels); + for (int _band = 0; _band < channels; _band++) { + bands.push_back(VImage::gaussnoise(descriptor->createWidth, descriptor->createHeight, VImage::option() + ->set("mean", descriptor->createNoiseMean) + ->set("sigma", descriptor->createNoiseSigma))); + } + image = VImage::bandjoin(bands).copy(VImage::option()->set("interpretation", + channels < 3 ? VIPS_INTERPRETATION_B_W : VIPS_INTERPRETATION_sRGB)); + } else { + std::vector<double> background = { + descriptor->createBackground[0], + descriptor->createBackground[1], + descriptor->createBackground[2] + }; + if (channels == 4) { + background.push_back(descriptor->createBackground[3]); + } + image = VImage::new_matrix(descriptor->createWidth, descriptor->createHeight) + .copy(VImage::option()->set("interpretation", + channels < 3 ?
VIPS_INTERPRETATION_B_W : VIPS_INTERPRETATION_sRGB)) + .new_from_image(background); + } + if (descriptor->createPageHeight > 0) { + image.set(VIPS_META_PAGE_HEIGHT, descriptor->createPageHeight); + image.set(VIPS_META_N_PAGES, static_cast<int>(descriptor->createHeight / descriptor->createPageHeight)); + } + image = image.cast(VIPS_FORMAT_UCHAR); + imageType = ImageType::RAW; + } else if (descriptor->textValue.length() > 0) { + // Create a new image with text + vips::VOption *textOptions = VImage::option() + ->set("align", descriptor->textAlign) + ->set("justify", descriptor->textJustify) + ->set("rgba", descriptor->textRgba) + ->set("spacing", descriptor->textSpacing) + ->set("wrap", descriptor->textWrap) + ->set("autofit_dpi", &descriptor->textAutofitDpi); + if (descriptor->textWidth > 0) { + textOptions->set("width", descriptor->textWidth); + } + // Ignore dpi if height is set + if (descriptor->textWidth > 0 && descriptor->textHeight > 0) { + textOptions->set("height", descriptor->textHeight); + } else if (descriptor->textDpi > 0) { + textOptions->set("dpi", descriptor->textDpi); + } + if (descriptor->textFont.length() > 0) { + textOptions->set("font", const_cast<char *>(descriptor->textFont.data())); + } + if (descriptor->textFontfile.length() > 0) { + textOptions->set("fontfile", const_cast<char *>(descriptor->textFontfile.data())); + } + image = VImage::text(const_cast<char *>(descriptor->textValue.data()), textOptions); + if (!descriptor->textRgba) { + image = image.copy(VImage::option()->set("interpretation", VIPS_INTERPRETATION_B_W)); } - image = VImage::new_matrix(descriptor->createWidth, descriptor->createHeight).new_from_image(background); - image.get_image()->Type = VIPS_INTERPRETATION_sRGB; imageType = ImageType::RAW; } else { // From filesystem imageType = DetermineImageType(descriptor->file.data()); + if (imageType == ImageType::MISSING) { + if (descriptor->file.find("<") != std::string::npos) { + throw vips::VError("Input file is missing, did you mean " + "sharp(Buffer.from('" + descriptor->file.substr(0, 8) + "...')?"); + } + throw vips::VError("Input file is missing: " + descriptor->file); + } if
(imageType != ImageType::UNKNOWN) { try { - vips::VOption *option = VImage::option()->set("access", accessMethod); - if (imageType == ImageType::SVG || imageType == ImageType::PDF) { - option->set("dpi", static_cast<double>(descriptor->density)); - } - if (imageType == ImageType::MAGICK) { - option->set("density", std::to_string(descriptor->density).data()); - } + vips::VOption *option = GetOptionsForImageType(imageType, descriptor); image = VImage::new_from_file(descriptor->file.data(), option); if (imageType == ImageType::SVG || imageType == ImageType::PDF || imageType == ImageType::MAGICK) { - SetDensity(image, descriptor->density); + image = SetDensity(image, descriptor->density); } - } catch (...) { - throw vips::VError("Input file has corrupt header"); + } catch (vips::VError const &err) { + throw vips::VError(std::string("Input file has corrupt header: ") + err.what()); } } else { - throw vips::VError("Input file is missing or of an unsupported image format"); + throw vips::VError("Input file contains unsupported image format"); } } } + + // Limit input images to a given number of pixels, where pixels = width * height + if (descriptor->limitInputPixels > 0 && + static_cast<uint64_t>(image.width()) * image.height() > descriptor->limitInputPixels) { + throw vips::VError("Input image exceeds pixel limit"); + } return std::make_tuple(image, imageType); } @@ -282,20 +589,54 @@ Does this image have an embedded profile? */ bool HasProfile(VImage image) { - return (image.get_typeof(VIPS_META_ICC_NAME) != 0) ? TRUE : FALSE; + return image.get_typeof(VIPS_META_ICC_NAME) == VIPS_TYPE_BLOB; } /* - Does this image have an alpha channel? - Uses colour space interpretation with number of channels to guess this. + Get copy of embedded profile.
*/ - bool HasAlpha(VImage image) { - int const bands = image.bands(); - VipsInterpretation const interpretation = image.interpretation(); - return ( - (bands == 2 && interpretation == VIPS_INTERPRETATION_B_W) || - (bands == 4 && interpretation != VIPS_INTERPRETATION_CMYK) || - (bands == 5 && interpretation == VIPS_INTERPRETATION_CMYK)); + std::pair<char*, size_t> GetProfile(VImage image) { + std::pair<char*, size_t> icc(nullptr, 0); + if (HasProfile(image)) { + size_t length; + const void *data = image.get_blob(VIPS_META_ICC_NAME, &length); + icc.first = static_cast<char*>(g_malloc(length)); + icc.second = length; + memcpy(icc.first, data, length); + } + return icc; + } + + /* + Set embedded profile. + */ + VImage SetProfile(VImage image, std::pair<char*, size_t> icc) { + if (icc.first != nullptr) { + image = image.copy(); + image.set(VIPS_META_ICC_NAME, reinterpret_cast<VipsCallbackFn>(vips_area_free_cb), icc.first, icc.second); + } + return image; + } + + static void* RemoveExifCallback(VipsImage *image, char const *field, GValue *value, void *data) { + std::vector<std::string> *fieldNames = static_cast<std::vector<std::string> *>(data); + std::string fieldName(field); + if (fieldName.substr(0, 8) == ("exif-ifd")) { + fieldNames->push_back(fieldName); + } + return nullptr; + } + + /* + Remove all EXIF-related image fields. + */ + VImage RemoveExif(VImage image) { + std::vector<std::string> fieldNames; + vips_image_map(image.get_image(), static_cast<VipsImageMapFn>(RemoveExifCallback), &fieldNames); + for (const auto& f : fieldNames) { + image.remove(f.data()); + } + return image; } /* @@ -312,15 +653,65 @@ /* Set EXIF Orientation of image. */ - void SetExifOrientation(VImage image, int const orientation) { - image.set(VIPS_META_ORIENTATION, orientation); + VImage SetExifOrientation(VImage image, int const orientation) { + VImage copy = image.copy(); + copy.set(VIPS_META_ORIENTATION, orientation); + return copy; } /* Remove EXIF Orientation from image.
*/ - void RemoveExifOrientation(VImage image) { - vips_image_remove(image.get_image(), VIPS_META_ORIENTATION); + VImage RemoveExifOrientation(VImage image) { + VImage copy = image.copy(); + copy.remove(VIPS_META_ORIENTATION); + copy.remove("exif-ifd0-Orientation"); + return copy; + } + + /* + Set animation properties if necessary. + */ + VImage SetAnimationProperties(VImage image, int nPages, int pageHeight, std::vector<int> delay, int loop) { + bool hasDelay = !delay.empty(); + VImage copy = image.copy(); + + // Only set page-height if we have more than one page, or this could + // accidentally turn into an animated image later. + if (nPages > 1) copy.set(VIPS_META_PAGE_HEIGHT, pageHeight); + if (hasDelay) { + if (delay.size() == 1) { + // We have just one delay, repeat that value for all frames. + delay.insert(delay.end(), nPages - 1, delay[0]); + } + copy.set("delay", delay); + } + if (nPages == 1 && !hasDelay && loop == -1) { + loop = 1; + } + if (loop != -1) copy.set("loop", loop); + + return copy; + } + + /* + Remove animation properties from image. + */ + VImage RemoveAnimationProperties(VImage image) { + VImage copy = image.copy(); + copy.remove(VIPS_META_PAGE_HEIGHT); + copy.remove("delay"); + copy.remove("loop"); + return copy; + } + + /* + Remove GIF palette from image. + */ + VImage RemoveGifPalette(VImage image) { + VImage copy = image.copy(); + copy.remove("gif-palette"); + return copy; } /* @@ -340,40 +731,54 @@ /* Set pixels/mm resolution based on a pixels/inch density.
*/ - void SetDensity(VImage image, const int density) { - const double pixelsPerMm = static_cast<double>(density) / 25.4; - image.set("Xres", pixelsPerMm); - image.set("Yres", pixelsPerMm); - image.set(VIPS_META_RESOLUTION_UNIT, "in"); + VImage SetDensity(VImage image, const double density) { + const double pixelsPerMm = density / 25.4; + VImage copy = image.copy(); + copy.get_image()->Xres = pixelsPerMm; + copy.get_image()->Yres = pixelsPerMm; + return copy; + } + + /* + Multi-page images can have a page height. Fetch it, and sanity check it. + If page-height is not set, it defaults to the image height + */ + int GetPageHeight(VImage image) { + return vips_image_get_page_height(image.get_image()); } /* Check the proposed format supports the current dimensions. */ void AssertImageTypeDimensions(VImage image, ImageType const imageType) { + const int height = image.get_typeof(VIPS_META_PAGE_HEIGHT) == G_TYPE_INT + ? image.get_int(VIPS_META_PAGE_HEIGHT) + : image.height(); if (imageType == ImageType::JPEG) { - if (image.width() > 65535 || image.height() > 65535) { + if (image.width() > 65535 || height > 65535) { throw vips::VError("Processed image is too large for the JPEG format"); } - } else if (imageType == ImageType::PNG) { - if (image.width() > 2147483647 || image.height() > 2147483647) { - throw vips::VError("Processed image is too large for the PNG format"); - } } else if (imageType == ImageType::WEBP) { - if (image.width() > 16383 || image.height() > 16383) { + if (image.width() > 16383 || height > 16383) { throw vips::VError("Processed image is too large for the WebP format"); } + } else if (imageType == ImageType::GIF) { + if (image.width() > 65535 || height > 65535) { + throw vips::VError("Processed image is too large for the GIF format"); + } + } else if (imageType == ImageType::HEIF) { + if (image.width() > 16384 || height > 16384) { + throw vips::VError("Processed image is too large for the HEIF format"); + } } } /* Called when a Buffer undergoes GC, required to
support mixed runtime libraries in Windows */ - void FreeCallback(char* data, void* hint) { - if (data != nullptr) { - g_free(data); - } - } + std::function<void(void*, char*)> FreeCallback = [](void*, char* data) { + g_free(data); + }; /* Temporary buffer of warnings @@ -402,9 +807,90 @@ return warning; } + /* + Attach an event listener for progress updates, used to detect timeout + */ + void SetTimeout(VImage image, int const seconds) { + if (seconds > 0) { + VipsImage *im = image.get_image(); + if (im->progress_signal == NULL) { + int *timeout = VIPS_NEW(im, int); + *timeout = seconds; + g_signal_connect(im, "eval", G_CALLBACK(VipsProgressCallBack), timeout); + vips_image_set_progress(im, true); + } + } + } + + /* + Event listener for progress updates, used to detect timeout + */ + void VipsProgressCallBack(VipsImage *im, VipsProgress *progress, int *timeout) { + if (*timeout > 0 && progress->run >= *timeout) { + vips_image_set_kill(im, true); + vips_error("timeout", "%d%% complete", progress->percent); + *timeout = 0; + } + } + + /* + Calculate the (left, top) coordinates of the output image + within the input image, applying the given gravity during an embed. + + @Azurebyte: We are basically swapping the inWidth and outWidth, inHeight and outHeight from the CalculateCrop function.
+ */ + std::tuple<int, int> CalculateEmbedPosition(int const inWidth, int const inHeight, + int const outWidth, int const outHeight, int const gravity) { + + int left = 0; + int top = 0; + switch (gravity) { + case 1: + // North + left = (outWidth - inWidth) / 2; + break; + case 2: + // East + left = outWidth - inWidth; + top = (outHeight - inHeight) / 2; + break; + case 3: + // South + left = (outWidth - inWidth) / 2; + top = outHeight - inHeight; + break; + case 4: + // West + top = (outHeight - inHeight) / 2; + break; + case 5: + // Northeast + left = outWidth - inWidth; + break; + case 6: + // Southeast + left = outWidth - inWidth; + top = outHeight - inHeight; + break; + case 7: + // Southwest + top = outHeight - inHeight; + break; + case 8: + // Northwest + // Which is the default is 0,0 so we do not assign anything here. + break; + default: + // Centre + left = (outWidth - inWidth) / 2; + top = (outHeight - inHeight) / 2; + } + return std::make_tuple(left, top); + } + /* Calculate the (left, top) coordinates of the output image - within the input image, applying the given gravity. + within the input image, applying the given gravity during a crop. */ std::tuple<int, int> CalculateCrop(int const inWidth, int const inHeight, int const outWidth, int const outHeight, int const gravity) { @@ -466,26 +952,18 @@ int top = 0; // assign only if valid - if (x >= 0 && x < (inWidth - outWidth)) { + if (x < (inWidth - outWidth)) { left = x; } else if (x >= (inWidth - outWidth)) { left = inWidth - outWidth; } - if (y >= 0 && y < (inHeight - outHeight)) { + if (y < (inHeight - outHeight)) { top = y; } else if (y >= (inHeight - outHeight)) { top = inHeight - outHeight; } - // the resulting left and top could have been outside the image after calculation from bottom/right edges - if (left < 0) { - left = 0; - } - if (top < 0) { - top = 0; - } - return std::make_tuple(left, top); } @@ -497,43 +975,156 @@ } /* - Return the image alpha maximum.
Useful for combining alpha bands. scRGB - images are 0 - 1 for image data, but the alpha is 0 - 255. + Convert RGBA value to another colourspace */ - double MaximumImageAlpha(VipsInterpretation const interpretation) { - return Is16Bit(interpretation) ? 65535.0 : 255.0; + std::vector<double> GetRgbaAsColourspace(std::vector<double> const rgba, + VipsInterpretation const interpretation, bool premultiply) { + int const bands = static_cast<int>(rgba.size()); + if (bands < 3) { + return rgba; + } + VImage pixel = VImage::new_matrix(1, 1); + pixel.set("bands", bands); + pixel = pixel + .new_from_image(rgba) + .colourspace(interpretation, VImage::option()->set("source_space", VIPS_INTERPRETATION_sRGB)); + if (premultiply) { + pixel = pixel.premultiply(); + } + return pixel(0, 0); } /* - Get boolean operation type from string + Apply the alpha channel to a given colour */ - VipsOperationBoolean GetBooleanOperation(std::string const opStr) { - return static_cast<VipsOperationBoolean>( - vips_enum_from_nick(nullptr, VIPS_TYPE_OPERATION_BOOLEAN, opStr.data())); + std::tuple<VImage, std::vector<double>> ApplyAlpha(VImage image, std::vector<double> colour, bool premultiply) { + // Scale up 8-bit values to match 16-bit input image + double const multiplier = sharp::Is16Bit(image.interpretation()) ? 256.0 : 1.0; + // Create alphaColour colour + std::vector<double> alphaColour; + if (image.bands() > 2) { + alphaColour = { + multiplier * colour[0], + multiplier * colour[1], + multiplier * colour[2] + }; + } else { + // Convert sRGB to greyscale + alphaColour = { multiplier * ( + 0.2126 * colour[0] + + 0.7152 * colour[1] + + 0.0722 * colour[2]) + }; + } + // Add alpha channel(s) to alphaColour colour + if (colour[3] < 255.0 || image.has_alpha()) { + int extraBands = image.bands() > 4 ?
image.bands() - 3 : 1; + alphaColour.insert(alphaColour.end(), extraBands, colour[3] * multiplier); + } + // Ensure alphaColour colour uses correct colourspace + alphaColour = sharp::GetRgbaAsColourspace(alphaColour, image.interpretation(), premultiply); + // Add non-transparent alpha channel, if required + if (colour[3] < 255.0 && !image.has_alpha()) { + image = image.bandjoin_const({ 255 * multiplier }); + } + return std::make_tuple(image, alphaColour); } /* - Get interpretation type from string + Removes alpha channels, if any. */ - VipsInterpretation GetInterpretation(std::string const typeStr) { - return static_cast<VipsInterpretation>( - vips_enum_from_nick(nullptr, VIPS_TYPE_INTERPRETATION, typeStr.data())); + VImage RemoveAlpha(VImage image) { + while (image.bands() > 1 && image.has_alpha()) { + image = image.extract_band(0, VImage::option()->set("n", image.bands() - 1)); + } + return image; } /* - Convert RGBA value to another colourspace + Ensures alpha channel, if missing. */ - std::vector<double> GetRgbaAsColourspace(std::vector<double> const rgba, VipsInterpretation const interpretation) { - int const bands = static_cast<int>(rgba.size()); - if (bands < 3 || interpretation == VIPS_INTERPRETATION_sRGB || interpretation == VIPS_INTERPRETATION_RGB) { - return rgba; - } else { - VImage pixel = VImage::new_matrix(1, 1); - pixel.set("bands", bands); - pixel = pixel.new_from_image(rgba); - pixel = pixel.colourspace(interpretation, VImage::option()->set("source_space", VIPS_INTERPRETATION_sRGB)); - return pixel(0, 0); + VImage EnsureAlpha(VImage image, double const value) { + if (!image.has_alpha()) { + image = image.bandjoin_const({ value * vips_interpretation_max_alpha(image.interpretation()) }); + } + return image; + } + + std::pair<double, double> ResolveShrink(int width, int height, int targetWidth, int targetHeight, + Canvas canvas, bool withoutEnlargement, bool withoutReduction) { + double hshrink = 1.0; + double vshrink = 1.0; + + if (targetWidth > 0 && targetHeight > 0) { + // Fixed width and height +
hshrink = static_cast<double>(width) / targetWidth; + vshrink = static_cast<double>(height) / targetHeight; + + switch (canvas) { + case Canvas::CROP: + case Canvas::MIN: + if (hshrink < vshrink) { + vshrink = hshrink; + } else { + hshrink = vshrink; + } + break; + case Canvas::EMBED: + case Canvas::MAX: + if (hshrink > vshrink) { + vshrink = hshrink; + } else { + hshrink = vshrink; + } + break; + case Canvas::IGNORE_ASPECT: + break; + } + } else if (targetWidth > 0) { + // Fixed width + hshrink = static_cast<double>(width) / targetWidth; + + if (canvas != Canvas::IGNORE_ASPECT) { + // Auto height + vshrink = hshrink; + } + } else if (targetHeight > 0) { + // Fixed height + vshrink = static_cast<double>(height) / targetHeight; + + if (canvas != Canvas::IGNORE_ASPECT) { + // Auto width + hshrink = vshrink; + } + } + + // We should not reduce or enlarge the output image, if + // withoutReduction or withoutEnlargement is specified. + if (withoutReduction) { + // Equivalent of VIPS_SIZE_UP + hshrink = std::min(1.0, hshrink); + vshrink = std::min(1.0, vshrink); + } else if (withoutEnlargement) { + // Equivalent of VIPS_SIZE_DOWN + hshrink = std::max(1.0, hshrink); + vshrink = std::max(1.0, vshrink); } + + // We don't want to shrink so much that we send an axis to 0 + hshrink = std::min(hshrink, static_cast<double>(width)); + vshrink = std::min(vshrink, static_cast<double>(height)); + + return std::make_pair(hshrink, vshrink); } + /* + Ensure decoding remains sequential. + */ + VImage StaySequential(VImage image, bool condition) { + if (vips_image_is_sequential(image.get_image()) && condition) { + image = image.copy_memory().copy(); + image.remove(VIPS_META_SEQUENTIAL); + } + return image; + } } // namespace sharp diff --git a/src/common.h b/src/common.h index 1bf3b41cc..c15755bb0 100644 --- a/src/common.h +++ b/src/common.h @@ -1,41 +1,31 @@ -// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors.
-// -// Licensed under the Apache License, Version 2.0 (the "License"); -// you may not use this file except in compliance with the License. -// You may obtain a copy of the License at -// -// http://www.apache.org/licenses/LICENSE-2.0 -// -// Unless required by applicable law or agreed to in writing, software -// distributed under the License is distributed on an "AS IS" BASIS, -// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -// See the License for the specific language governing permissions and -// limitations under the License. +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ #ifndef SRC_COMMON_H_ #define SRC_COMMON_H_ +#include <atomic> #include <string> #include <tuple> +#include <functional> #include <vector> -#include <node.h> -#include <nan.h> +#include <napi.h> #include <vips/vips8> // Verify platform and compiler compatibility -#if (VIPS_MAJOR_VERSION < 8 || (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 5)) -#error libvips version 8.5.x required - see sharp.dimens.io/page/install +#if (VIPS_MAJOR_VERSION < 8) || \ + (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 17) || \ + (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 17 && VIPS_MICRO_VERSION < 3) +#error "libvips version 8.17.3+ is required - please see https://sharp.pixelplumbing.com/install" #endif -#if ((!defined(__clang__)) && defined(__GNUC__) && (__GNUC__ < 4 || (__GNUC__ == 4 && __GNUC_MINOR__ < 6))) -#error GCC version 4.6+ is required for C++11 features - see sharp.dimens.io/page/install#prerequisites -#endif - -#if (defined(__clang__) && defined(__has_feature)) -#if (!__has_feature(cxx_range_for)) -#error clang version 3.0+ is required for C++11 features - see sharp.dimens.io/page/install#prerequisites +#if defined(__has_include) +#if !__has_include(<filesystem>) +#error "C++17 compiler required - please see https://sharp.pixelplumbing.com/install" #endif #endif @@ -46,83 +36,183 @@ namespace sharp { struct InputDescriptor { std::string name; std::string file; + bool autoOrient; char *buffer; + VipsFailOn failOn;
+ uint64_t limitInputPixels; + bool unlimited; + VipsAccess access; size_t bufferLength; - int density; + bool isBuffer; + double density; + bool ignoreIcc; + VipsBandFormat rawDepth; int rawChannels; int rawWidth; int rawHeight; + bool rawPremultiplied; + int rawPageHeight; + int pages; + int page; int createChannels; int createWidth; int createHeight; - double createBackground[4]; + int createPageHeight; + std::vector<double> createBackground; + std::string createNoiseType; + double createNoiseMean; + double createNoiseSigma; + std::string textValue; + std::string textFont; + std::string textFontfile; + int textWidth; + int textHeight; + VipsAlign textAlign; + bool textJustify; + int textDpi; + bool textRgba; + int textSpacing; + VipsTextWrap textWrap; + int textAutofitDpi; + bool joinAnimated; + int joinAcross; + int joinShim; + std::vector<double> joinBackground; + VipsAlign joinHalign; + VipsAlign joinValign; + std::string svgStylesheet; + bool svgHighBitdepth; + int tiffSubifd; + int openSlideLevel; + std::vector<double> pdfBackground; + bool jp2Oneshot; InputDescriptor(): + autoOrient(false), buffer(nullptr), + failOn(VIPS_FAIL_ON_WARNING), + limitInputPixels(0x3FFF * 0x3FFF), + unlimited(false), + access(VIPS_ACCESS_SEQUENTIAL), bufferLength(0), - density(72), + isBuffer(false), + density(72.0), + ignoreIcc(false), + rawDepth(VIPS_FORMAT_UCHAR), rawChannels(0), rawWidth(0), rawHeight(0), + rawPremultiplied(false), + rawPageHeight(0), + pages(1), + page(0), createChannels(0), createWidth(0), - createHeight(0) { - createBackground[0] = 0.0; - createBackground[1] = 0.0; - createBackground[2] = 0.0; - createBackground[3] = 255.0; - } + createHeight(0), + createPageHeight(0), + createBackground{ 0.0, 0.0, 0.0, 255.0 }, + createNoiseMean(0.0), + createNoiseSigma(0.0), + textWidth(0), + textHeight(0), + textAlign(VIPS_ALIGN_LOW), + textJustify(false), + textDpi(72), + textRgba(false), + textSpacing(0), + textWrap(VIPS_TEXT_WRAP_WORD), + textAutofitDpi(0), + joinAnimated(false), +
joinAcross(1), + joinShim(0), + joinBackground{ 0.0, 0.0, 0.0, 255.0 }, + joinHalign(VIPS_ALIGN_LOW), + joinValign(VIPS_ALIGN_LOW), + svgHighBitdepth(false), + tiffSubifd(-1), + openSlideLevel(0), + pdfBackground{ 255.0, 255.0, 255.0, 255.0 }, + jp2Oneshot(false) {} }; - // Convenience methods to access the attributes of a v8::Object - bool HasAttr(v8::Handle obj, std::string attr); - std::string AttrAsStr(v8::Handle obj, std::string attr); - template v8::Local AttrAs(v8::Handle obj, std::string attr) { - return Nan::Get(obj, Nan::New(attr).ToLocalChecked()).ToLocalChecked().As(); - } - template T AttrTo(v8::Handle obj, std::string attr) { - return Nan::To(Nan::Get(obj, Nan::New(attr).ToLocalChecked()).ToLocalChecked()).FromJust(); - } - template T AttrTo(v8::Handle obj, int attr) { - return Nan::To(Nan::Get(obj, attr).ToLocalChecked()).FromJust(); + // Convenience methods to access the attributes of a Napi::Object + bool HasAttr(Napi::Object obj, std::string attr); + std::string AttrAsStr(Napi::Object obj, std::string attr); + std::string AttrAsStr(Napi::Object obj, unsigned int const attr); + uint32_t AttrAsUint32(Napi::Object obj, std::string attr); + int32_t AttrAsInt32(Napi::Object obj, std::string attr); + int32_t AttrAsInt32(Napi::Object obj, unsigned int const attr); + double AttrAsDouble(Napi::Object obj, std::string attr); + double AttrAsDouble(Napi::Object obj, unsigned int const attr); + bool AttrAsBool(Napi::Object obj, std::string attr); + std::vector AttrAsVectorOfDouble(Napi::Object obj, std::string attr); + std::vector AttrAsInt32Vector(Napi::Object obj, std::string attr); + template T AttrAsEnum(Napi::Object obj, std::string attr, GType type) { + return static_cast( + vips_enum_from_nick(nullptr, type, AttrAsStr(obj, attr).data())); } - // Create an InputDescriptor instance from a v8::Object describing an input image - InputDescriptor* CreateInputDescriptor( - v8::Handle input, std::vector> buffersToPersist); + // Create an InputDescriptor 
instance from a Napi::Object describing an input image + InputDescriptor* CreateInputDescriptor(Napi::Object input); enum class ImageType { JPEG, PNG, WEBP, + JP2, TIFF, GIF, SVG, + HEIF, PDF, MAGICK, OPENSLIDE, PPM, FITS, + EXR, + JXL, + RAD, + DCRAW, VIPS, RAW, - UNKNOWN + UNKNOWN, + MISSING + }; + + enum class Canvas { + CROP, + EMBED, + MAX, + MIN, + IGNORE_ASPECT }; // How many tasks are in the queue? - extern volatile int counterQueue; + extern std::atomic counterQueue; // How many tasks are being processed? - extern volatile int counterProcess; + extern std::atomic counterProcess; // Filename extension checkers bool IsJpeg(std::string const &str); bool IsPng(std::string const &str); bool IsWebp(std::string const &str); + bool IsJp2(std::string const &str); + bool IsGif(std::string const &str); bool IsTiff(std::string const &str); + bool IsHeic(std::string const &str); + bool IsHeif(std::string const &str); + bool IsAvif(std::string const &str); + bool IsJxl(std::string const &str); bool IsDz(std::string const &str); bool IsDzZip(std::string const &str); bool IsV(std::string const &str); + /* + Trim space from end of string. + */ + std::string TrimEnd(std::string const &str); + /* Provide a string identifier for the given image type. */ @@ -138,10 +228,15 @@ namespace sharp { */ ImageType DetermineImageType(char const *file); + /* + Format-specific options builder + */ + vips::VOption* GetOptionsForImageType(ImageType imageType, InputDescriptor *descriptor); + /* Open an image from the given InputDescriptor (filesystem, compressed buffer, raw pixel data) */ - std::tuple OpenInput(InputDescriptor *descriptor, VipsAccess accessMethod); + std::tuple OpenInput(InputDescriptor *descriptor); /* Does this image have an embedded profile? @@ -149,10 +244,19 @@ namespace sharp { bool HasProfile(VImage image); /* - Does this image have an alpha channel? - Uses colour space interpretation with number of channels to guess this. + Get copy of embedded profile. 
  */
-  bool HasAlpha(VImage image);
+  std::pair<char*, size_t> GetProfile(VImage image);
+
+  /*
+    Set embedded profile.
+  */
+  VImage SetProfile(VImage image, std::pair<char*, size_t> icc);
+
+  /*
+    Remove all EXIF-related image fields.
+  */
+  VImage RemoveExif(VImage image);

   /*
     Get EXIF Orientation of image, if any.
@@ -162,12 +266,27 @@ namespace sharp {
   /*
     Set EXIF Orientation of image.
   */
-  void SetExifOrientation(VImage image, int const orientation);
+  VImage SetExifOrientation(VImage image, int const orientation);

   /*
     Remove EXIF Orientation from image.
   */
-  void RemoveExifOrientation(VImage image);
+  VImage RemoveExifOrientation(VImage image);
+
+  /*
+    Set animation properties if necessary.
+  */
+  VImage SetAnimationProperties(VImage image, int nPages, int pageHeight, std::vector<int> delay, int loop);
+
+  /*
+    Remove animation properties from image.
+  */
+  VImage RemoveAnimationProperties(VImage image);
+
+  /*
+    Remove GIF palette from image.
+  */
+  VImage RemoveGifPalette(VImage image);

   /*
     Does this image have a non-default density?
@@ -182,7 +301,13 @@ namespace sharp {
   /*
     Set pixels/mm resolution based on a pixels/inch density.
   */
-  void SetDensity(VImage image, const int density);
+  VImage SetDensity(VImage image, const double density);
+
+  /*
+    Multi-page images can have a page height. Fetch it, and sanity check it.
+    If page-height is not set, it defaults to the image height
+  */
+  int GetPageHeight(VImage image);

   /*
     Check the proposed format supports the current dimensions.
@@ -192,7 +317,7 @@ namespace sharp {
   /*
     Called when a Buffer undergoes GC, required to support mixed runtime libraries in Windows
   */
-  void FreeCallback(char* data, void* hint);
+  extern std::function<void(void*, char*)> FreeCallback;

   /*
     Called with warnings from the glib-registered "VIPS" domain
@@ -204,6 +329,23 @@ namespace sharp {
   */
   std::string VipsWarningPop();

+  /*
+    Attach an event listener for progress updates, used to detect timeout
+  */
+  void SetTimeout(VImage image, int const timeoutSeconds);
+
+  /*
+    Event listener for progress updates, used to detect timeout
+  */
+  void VipsProgressCallBack(VipsImage *image, VipsProgress *progress, int *timeoutSeconds);
+
+  /*
+    Calculate the (left, top) coordinates of the output image
+    within the input image, applying the given gravity during an embed.
+  */
+  std::tuple<int, int> CalculateEmbedPosition(int const inWidth, int const inHeight,
+    int const outWidth, int const outHeight, int const gravity);
+
   /*
     Calculate the (left, top) coordinates of the output image
     within the input image, applying the given gravity.
@@ -224,25 +366,36 @@ namespace sharp {
   bool Is16Bit(VipsInterpretation const interpretation);

   /*
-    Return the image alpha maximum. Useful for combining alpha bands. scRGB
-    images are 0 - 1 for image data, but the alpha is 0 - 255.
+    Convert RGBA value to another colourspace
+  */
+  std::vector<double> GetRgbaAsColourspace(std::vector<double> const rgba,
+    VipsInterpretation const interpretation, bool premultiply);
+
+  /*
+    Apply the alpha channel to a given colour
+  */
+  std::tuple<VImage, std::vector<double>> ApplyAlpha(VImage image, std::vector<double> colour, bool premultiply);
+
+  /*
+    Removes alpha channels, if any.
   */
-  double MaximumImageAlpha(VipsInterpretation const interpretation);
+  VImage RemoveAlpha(VImage image);

   /*
-    Get boolean operation type from string
+    Ensures alpha channel, if missing.
  */
-  VipsOperationBoolean GetBooleanOperation(std::string const opStr);
+  VImage EnsureAlpha(VImage image, double const value);

   /*
-    Get interpretation type from string
+    Calculate the horizontal and vertical shrink factors, taking the canvas mode into account.
   */
-  VipsInterpretation GetInterpretation(std::string const typeStr);
+  std::pair<double, double> ResolveShrink(int width, int height, int targetWidth, int targetHeight,
+    Canvas canvas, bool withoutEnlargement, bool withoutReduction);

   /*
-    Convert RGBA value to another colourspace
+    Ensure decoding remains sequential.
   */
-  std::vector<double> GetRgbaAsColourspace(std::vector<double> const rgba, VipsInterpretation const interpretation);
+  VImage StaySequential(VImage image, bool condition = true);

 }  // namespace sharp
diff --git a/src/emscripten/common.gypi b/src/emscripten/common.gypi
new file mode 100644
index 000000000..714affe59
--- /dev/null
+++ b/src/emscripten/common.gypi
@@ -0,0 +1,39 @@
+# Copyright 2013 Lovell Fuller and others.
+# SPDX-License-Identifier: Apache-2.0
+
+{
+  'variables': {
+    'OS': 'emscripten'
+  },
+  'target_defaults': {
+    'default_configuration': 'Release',
+    'type': 'executable',
+    'cflags': [
+      '-pthread'
+    ],
+    'cflags_cc': [
+      '-pthread'
+    ],
+    'ldflags': [
+      '--js-library= JSON.stringify(path.relative(process.cwd(), x))).join(\' \')")'
+    ],
+    'configurations': {
+      'Release': {}
+    }
+  }
+}
diff --git a/src/emscripten/pre.js b/src/emscripten/pre.js
new file mode 100644
index 000000000..1163b9e77
--- /dev/null
+++ b/src/emscripten/pre.js
@@ -0,0 +1,21 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+ SPDX-License-Identifier: Apache-2.0 +*/ + +/* global Module, ENV, _vips_shutdown, _uv_library_shutdown */ + +Module.preRun = () => { + ENV.VIPS_CONCURRENCY = Number(process.env.VIPS_CONCURRENCY) || 1; +}; + +Module.onRuntimeInitialized = () => { + module.exports = Module.emnapiInit({ + context: require('@emnapi/runtime').getDefaultContext() + }); + + process.once('exit', () => { + _vips_shutdown(); + _uv_library_shutdown(); + }); +}; diff --git a/src/libvips/cplusplus/VError.cpp b/src/libvips/cplusplus/VError.cpp deleted file mode 100644 index 67e67348e..000000000 --- a/src/libvips/cplusplus/VError.cpp +++ /dev/null @@ -1,52 +0,0 @@ -// Code for error type - -/* - - Copyright (C) 1991-2001 The National Gallery - - This program is free software; you can redistribute it and/or modify - it under the terms of the GNU Lesser General Public License as published by - the Free Software Foundation; either version 2 of the License, or - (at your option) any later version. - - This program is distributed in the hope that it will be useful, - but WITHOUT ANY WARRANTY; without even the implied warranty of - MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - GNU Lesser General Public License for more details. 
- - You should have received a copy of the GNU Lesser General Public License - along with this program; if not, write to the Free Software - Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA - 02110-1301 USA - - */ - -/* - - These files are distributed with VIPS - http://www.vips.ecs.soton.ac.uk - - */ - -#ifdef HAVE_CONFIG_H -#include -#endif /*HAVE_CONFIG_H*/ -#include - -#include - -#include - -VIPS_NAMESPACE_START - -std::ostream &operator<<( std::ostream &file, const VError &err ) -{ - err.ostream_print( file ); - return( file ); -} - -void VError::ostream_print( std::ostream &file ) const -{ - file << _what; -} - -VIPS_NAMESPACE_END diff --git a/src/libvips/cplusplus/VImage.cpp b/src/libvips/cplusplus/VImage.cpp deleted file mode 100644 index 27b29fdff..000000000 --- a/src/libvips/cplusplus/VImage.cpp +++ /dev/null @@ -1,1410 +0,0 @@ -/* Object part of VImage class - * - * 30/12/14 - * - allow set enum value from string - * 10/6/16 - * - missing implementation of VImage::write() - * 11/6/16 - * - added arithmetic assignment overloads, += etc. - */ - -/* - - Copyright (C) 1991-2001 The National Gallery - - This program is free software; you can redistribute it and/or modify - it under the terms of the GNU Lesser General Public License as published by - the Free Software Foundation; either version 2 of the License, or - (at your option) any later version. - - This program is distributed in the hope that it will be useful, - but WITHOUT ANY WARRANTY; without even the implied warranty of - MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - GNU Lesser General Public License for more details. 
- - You should have received a copy of the GNU Lesser General Public License - along with this program; if not, write to the Free Software - Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA - 02110-1301 USA - - */ - -/* - - These files are distributed with VIPS - http://www.vips.ecs.soton.ac.uk - - */ - -#ifdef HAVE_CONFIG_H -#include -#endif /*HAVE_CONFIG_H*/ -#include - -#include - -#include - -/* -#define VIPS_DEBUG -#define VIPS_DEBUG_VERBOSE - */ - -VIPS_NAMESPACE_START - -std::vector -to_vectorv( int n, ... ) -{ - std::vector vector( n ); - va_list ap; - - va_start( ap, n ); - for( int i = 0; i < n; i++ ) - vector[i] = va_arg( ap, double ); - va_end( ap ); - - return( vector ); -} - -std::vector -to_vector( double value ) -{ - return( to_vectorv( 1, value ) ); -} - -std::vector -to_vector( int n, double array[] ) -{ - std::vector vector( n ); - - for( int i = 0; i < n; i++ ) - vector[i] = array[i]; - - return( vector ); -} - -std::vector -negate( std::vector vector ) -{ - std::vector new_vector( vector.size() ); - - for( unsigned int i = 0; i < vector.size(); i++ ) - new_vector[i] = vector[i] * -1; - - return( new_vector ); -} - -std::vector -invert( std::vector vector ) -{ - std::vector new_vector( vector.size() ); - - for( unsigned int i = 0; i < vector.size(); i++ ) - new_vector[i] = 1.0 / vector[i]; - - return( new_vector ); -} - -VOption::~VOption() -{ - std::list::iterator i; - - for( i = options.begin(); i != options.end(); ++i ) - delete *i; -} - -// input bool -VOption * -VOption::set( const char *name, bool value ) -{ - Pair *pair = new Pair( name ); - - pair->input = true; - g_value_init( &pair->value, G_TYPE_BOOLEAN ); - g_value_set_boolean( &pair->value, value ); - options.push_back( pair ); - - return( this ); -} - -// input int ... 
this path is used for enums as well -VOption * -VOption::set( const char *name, int value ) -{ - Pair *pair = new Pair( name ); - - pair->input = true; - g_value_init( &pair->value, G_TYPE_INT ); - g_value_set_int( &pair->value, value ); - options.push_back( pair ); - - return( this ); -} - -// input double -VOption * -VOption::set( const char *name, double value ) -{ - Pair *pair = new Pair( name ); - - pair->input = true; - g_value_init( &pair->value, G_TYPE_DOUBLE ); - g_value_set_double( &pair->value, value ); - options.push_back( pair ); - - return( this ); -} - -VOption * -VOption::set( const char *name, const char *value ) -{ - Pair *pair = new Pair( name ); - - pair->input = true; - g_value_init( &pair->value, G_TYPE_STRING ); - g_value_set_string( &pair->value, value ); - options.push_back( pair ); - - return( this ); -} - -// input image -VOption * -VOption::set( const char *name, VImage value ) -{ - Pair *pair = new Pair( name ); - - pair->input = true; - g_value_init( &pair->value, VIPS_TYPE_IMAGE ); - g_value_set_object( &pair->value, value.get_image() ); - options.push_back( pair ); - - return( this ); -} - -// input double array -VOption * -VOption::set( const char *name, std::vector value ) -{ - Pair *pair = new Pair( name ); - - double *array; - unsigned int i; - - pair->input = true; - - g_value_init( &pair->value, VIPS_TYPE_ARRAY_DOUBLE ); - vips_value_set_array_double( &pair->value, NULL, - static_cast< int >( value.size() ) ); - array = vips_value_get_array_double( &pair->value, NULL ); - - for( i = 0; i < value.size(); i++ ) - array[i] = value[i]; - - options.push_back( pair ); - - return( this ); -} - -// input image array -VOption * -VOption::set( const char *name, std::vector value ) -{ - Pair *pair = new Pair( name ); - - VipsImage **array; - unsigned int i; - - pair->input = true; - - g_value_init( &pair->value, VIPS_TYPE_ARRAY_IMAGE ); - vips_value_set_array_image( &pair->value, - static_cast< int >( value.size() ) ); - array = 
vips_value_get_array_image( &pair->value, NULL ); - - for( i = 0; i < value.size(); i++ ) { - VipsImage *vips_image = value[i].get_image(); - - array[i] = vips_image; - g_object_ref( vips_image ); - } - - options.push_back( pair ); - - return( this ); -} - -// input blob -VOption * -VOption::set( const char *name, VipsBlob *value ) -{ - Pair *pair = new Pair( name ); - - pair->input = true; - g_value_init( &pair->value, VIPS_TYPE_BLOB ); - g_value_set_boxed( &pair->value, value ); - options.push_back( pair ); - - return( this ); -} - -// output bool -VOption * -VOption::set( const char *name, bool *value ) -{ - Pair *pair = new Pair( name ); - - pair->input = false; - pair->vbool = value; - g_value_init( &pair->value, G_TYPE_BOOLEAN ); - - options.push_back( pair ); - - return( this ); -} - -// output int -VOption * -VOption::set( const char *name, int *value ) -{ - Pair *pair = new Pair( name ); - - pair->input = false; - pair->vint = value; - g_value_init( &pair->value, G_TYPE_INT ); - - options.push_back( pair ); - - return( this ); -} - -// output double -VOption * -VOption::set( const char *name, double *value ) -{ - Pair *pair = new Pair( name ); - - pair->input = false; - pair->vdouble = value; - g_value_init( &pair->value, G_TYPE_DOUBLE ); - - options.push_back( pair ); - - return( this ); -} - -// output image -VOption * -VOption::set( const char *name, VImage *value ) -{ - Pair *pair = new Pair( name ); - - pair->input = false; - pair->vimage = value; - g_value_init( &pair->value, VIPS_TYPE_IMAGE ); - - options.push_back( pair ); - - return( this ); -} - -// output doublearray -VOption * -VOption::set( const char *name, std::vector *value ) -{ - Pair *pair = new Pair( name ); - - pair->input = false; - pair->vvector = value; - g_value_init( &pair->value, VIPS_TYPE_ARRAY_DOUBLE ); - - options.push_back( pair ); - - return( this ); -} - -// output blob -VOption * -VOption::set( const char *name, VipsBlob **value ) -{ - Pair *pair = new Pair( name ); - - 
pair->input = false; - pair->vblob = value; - g_value_init( &pair->value, VIPS_TYPE_BLOB ); - - options.push_back( pair ); - - return( this ); -} - -// just g_object_set_property(), except we allow set enum from string -static void -set_property( VipsObject *object, const char *name, const GValue *value ) -{ - VipsObjectClass *object_class = VIPS_OBJECT_GET_CLASS( object ); - GType type = G_VALUE_TYPE( value ); - - GParamSpec *pspec; - VipsArgumentClass *argument_class; - VipsArgumentInstance *argument_instance; - - if( vips_object_get_argument( object, name, - &pspec, &argument_class, &argument_instance ) ) { - g_warning( "%s", vips_error_buffer() ); - vips_error_clear(); - return; - } - - if( G_IS_PARAM_SPEC_ENUM( pspec ) && - type == G_TYPE_STRING ) { - GType pspec_type = G_PARAM_SPEC_VALUE_TYPE( pspec ); - - int enum_value; - GValue value2 = { 0 }; - - if( (enum_value = vips_enum_from_nick( object_class->nickname, - pspec_type, g_value_get_string( value ) )) < 0 ) { - g_warning( "%s", vips_error_buffer() ); - vips_error_clear(); - return; - } - - g_value_init( &value2, pspec_type ); - g_value_set_enum( &value2, enum_value ); - g_object_set_property( G_OBJECT( object ), name, &value2 ); - g_value_unset( &value2 ); - } - else - g_object_set_property( G_OBJECT( object ), name, value ); -} - -// walk the options and set props on the operation -void -VOption::set_operation( VipsOperation *operation ) -{ - std::list::iterator i; - - for( i = options.begin(); i != options.end(); ++i ) - if( (*i)->input ) { -#ifdef VIPS_DEBUG_VERBOSE - printf( "set_operation: " ); - vips_object_print_name( VIPS_OBJECT( operation ) ); - char *str_value = g_strdup_value_contents( &(*i)->value ); - printf( ".%s = %s\n", (*i)->name, str_value ); - g_free( str_value ); -#endif /*VIPS_DEBUG_VERBOSE*/ - - set_property( VIPS_OBJECT( operation ), - (*i)->name, &(*i)->value ); - } -} - -// walk the options and fetch any requested outputs -void -VOption::get_operation( VipsOperation *operation ) 
-{ - std::list::iterator i; - - for( i = options.begin(); i != options.end(); ++i ) - if( ! (*i)->input ) { - const char *name = (*i)->name; - - g_object_get_property( G_OBJECT( operation ), - name, &(*i)->value ); - -#ifdef VIPS_DEBUG_VERBOSE - printf( "get_operation: " ); - vips_object_print_name( VIPS_OBJECT( operation ) ); - char *str_value = g_strdup_value_contents( - &(*i)->value ); - printf( ".%s = %s\n", name, str_value ); - g_free( str_value ); -#endif /*VIPS_DEBUG_VERBOSE*/ - - GValue *value = &(*i)->value; - GType type = G_VALUE_TYPE( value ); - - if( type == VIPS_TYPE_IMAGE ) { - // rebox object - VipsImage *image = VIPS_IMAGE( - g_value_get_object( value ) ); - *((*i)->vimage) = VImage( image ); - } - else if( type == G_TYPE_INT ) - *((*i)->vint) = g_value_get_int( value ); - else if( type == G_TYPE_BOOLEAN ) - *((*i)->vbool) = g_value_get_boolean( value ); - else if( type == G_TYPE_DOUBLE ) - *((*i)->vdouble) = g_value_get_double( value ); - else if( type == VIPS_TYPE_ARRAY_DOUBLE ) { - int length; - double *array = - vips_value_get_array_double( value, - &length ); - int j; - - ((*i)->vvector)->resize( length ); - for( j = 0; j < length; j++ ) - (*((*i)->vvector))[j] = array[j]; - } - else if( type == VIPS_TYPE_BLOB ) { - // our caller gets a reference - *((*i)->vblob) = - (VipsBlob *) g_value_dup_boxed( value ); - } - } -} - -void -VImage::call_option_string( const char *operation_name, - const char *option_string, VOption *options ) -{ - VipsOperation *operation; - - VIPS_DEBUG_MSG( "call_option_string: starting for %s ...\n", - operation_name ); - - if( !(operation = vips_operation_new( operation_name )) ) { - if( options ) - delete options; - throw( VError() ); - } - - /* Set str options before vargs options, so the user can't - * override things we set deliberately. 
- */ - if( option_string && - vips_object_set_from_string( VIPS_OBJECT( operation ), - option_string ) ) { - vips_object_unref_outputs( VIPS_OBJECT( operation ) ); - g_object_unref( operation ); - delete options; - throw( VError() ); - } - - if( options ) - options->set_operation( operation ); - - /* Build from cache. - */ - if( vips_cache_operation_buildp( &operation ) ) { - vips_object_unref_outputs( VIPS_OBJECT( operation ) ); - g_object_unref( operation ); - delete options; - throw( VError() ); - } - - /* Walk args again, writing output. - */ - if( options ) - options->get_operation( operation ); - - /* We're done with options! - */ - delete options; - - /* The operation we have built should now have been reffed by - * one of its arguments or have finished its work. Either - * way, we can unref. - */ - g_object_unref( operation ); -} - -void -VImage::call( const char *operation_name, VOption *options ) -{ - call_option_string( operation_name, NULL, options ); -} - -VImage -VImage::new_from_file( const char *name, VOption *options ) -{ - char filename[VIPS_PATH_MAX]; - char option_string[VIPS_PATH_MAX]; - const char *operation_name; - - VImage out; - - vips__filename_split8( name, filename, option_string ); - if( !(operation_name = vips_foreign_find_load( filename )) ) { - delete options; - throw VError(); - } - - call_option_string( operation_name, option_string, - (options ? options : VImage::option())-> - set( "filename", filename )-> - set( "out", &out ) ); - - return( out ); -} - -VImage -VImage::new_from_buffer( void *buf, size_t len, const char *option_string, - VOption *options ) -{ - const char *operation_name; - VipsBlob *blob; - VImage out; - - if( !(operation_name = vips_foreign_find_load_buffer( buf, len )) ) { - delete options; - throw( VError() ); - } - - /* We don't take a copy of the data or free it. - */ - blob = vips_blob_new( NULL, buf, len ); - options = (options ? 
options : VImage::option())-> - set( "buffer", blob )-> - set( "out", &out ); - vips_area_unref( VIPS_AREA( blob ) ); - - call_option_string( operation_name, option_string, options ); - - return( out ); -} - -VImage -VImage::new_from_image( std::vector pixel ) -{ - VImage onepx = VImage::black( 1, 1, - VImage::option()->set( "bands", bands() ) ); - - onepx = (onepx + pixel).cast( format() ); - - VImage big = onepx.embed( 0, 0, width(), height(), - VImage::option()->set( "extend", VIPS_EXTEND_COPY ) ); - - big = big.copy( - VImage::option()-> - set( "interpretation", interpretation() )-> - set( "xres", xres() )-> - set( "yres", yres() )-> - set( "xoffset", xres() )-> - set( "yoffset", yres() ) ); - - return( big ); -} - -VImage -VImage::new_from_image( double pixel ) -{ - return( new_from_image( to_vectorv( 1, pixel ) ) ); -} - -VImage -VImage::new_matrix( int width, int height ) -{ - return( VImage( vips_image_new_matrix( width, height ) ) ); -} - -VImage -VImage::new_matrixv( int width, int height, ... ) -{ - VImage matrix = new_matrix( width, height ); - VipsImage *vips_matrix = matrix.get_image(); - - va_list ap; - - va_start( ap, height ); - for( int y = 0; y < height; y++ ) - for( int x = 0; x < width; x++ ) - *VIPS_MATRIX( vips_matrix, x, y ) = - va_arg( ap, double ); - va_end( ap ); - - return( matrix ); -} - -VImage -VImage::write( VImage out ) -{ - if( vips_image_write( this->get_image(), out.get_image() ) ) - throw VError(); - - return( out ); -} - -void -VImage::write_to_file( const char *name, VOption *options ) -{ - char filename[VIPS_PATH_MAX]; - char option_string[VIPS_PATH_MAX]; - const char *operation_name; - - vips__filename_split8( name, filename, option_string ); - if( !(operation_name = vips_foreign_find_save( filename )) ) { - delete options; - throw VError(); - } - - call_option_string( operation_name, option_string, - (options ? 
options : VImage::option())-> - set( "in", *this )-> - set( "filename", filename ) ); -} - -void -VImage::write_to_buffer( const char *suffix, void **buf, size_t *size, - VOption *options ) -{ - char filename[VIPS_PATH_MAX]; - char option_string[VIPS_PATH_MAX]; - const char *operation_name; - VipsBlob *blob; - - vips__filename_split8( suffix, filename, option_string ); - if( !(operation_name = vips_foreign_find_save_buffer( filename )) ) { - delete options; - throw VError(); - } - - call_option_string( operation_name, option_string, - (options ? options : VImage::option())-> - set( "in", *this )-> - set( "buffer", &blob ) ); - - if( blob ) { - if( buf ) { - *buf = VIPS_AREA( blob )->data; - VIPS_AREA( blob )->free_fn = NULL; - } - if( size ) - *size = VIPS_AREA( blob )->length; - - vips_area_unref( VIPS_AREA( blob ) ); - } -} - -#include "vips-operators.cpp" - -std::vector -VImage::bandsplit( VOption *options ) -{ - std::vector b; - - for( int i = 0; i < bands(); i++ ) - b.push_back( extract_band( i ) ); - - return( b ); -} - -VImage -VImage::bandjoin( VImage other, VOption *options ) -{ - VImage v[2] = { *this, other }; - std::vector vec( v, v + VIPS_NUMBER( v ) ); - - return( bandjoin( vec, options ) ); -} - -std::complex -VImage::minpos( VOption *options ) -{ - double x, y; - - (void) min( - (options ? options : VImage::option()) -> - set( "x", &x ) -> - set( "y", &y ) ); - - return( std::complex( x, y ) ); -} - -std::complex -VImage::maxpos( VOption *options ) -{ - double x, y; - - (void) max( - (options ? 
options : VImage::option()) -> - set( "x", &x ) -> - set( "y", &y ) ); - - return( std::complex( x, y ) ); -} - -// Operator overloads - -VImage -VImage::operator[]( int index ) -{ - return( this->extract_band( index ) ); -} - -std::vector -VImage::operator()( int x, int y ) -{ - return( this->getpoint( x, y ) ); -} - -VImage -operator+( VImage a, VImage b ) -{ - return( a.add( b ) ); -} - -VImage -operator+( double a, VImage b ) -{ - return( b.linear( 1.0, a ) ); -} - -VImage -operator+( VImage a, double b ) -{ - return( a.linear( 1.0, b ) ); -} - -VImage -operator+( std::vector a, VImage b ) -{ - return( b.linear( 1.0, a ) ); -} - -VImage -operator+( VImage a, std::vector b ) -{ - return( a.linear( 1.0, b ) ); -} - -VImage & -operator+=( VImage &a, const VImage b ) -{ - return( a = a + b ); -} - -VImage & -operator+=( VImage &a, const double b ) -{ - return( a = a + b ); -} - -VImage & -operator+=( VImage &a, std::vector b ) -{ - return( a = a + b ); -} - -VImage -operator-( VImage a, VImage b ) -{ - return( a.subtract( b ) ); -} - -VImage -operator-( double a, VImage b ) -{ - return( b.linear( -1.0, a ) ); -} - -VImage -operator-( VImage a, double b ) -{ - return( a.linear( 1.0, -b ) ); -} - -VImage -operator-( std::vector a, VImage b ) -{ - return( b.linear( -1.0, a ) ); -} - -VImage -operator-( VImage a, std::vector b ) -{ - return( a.linear( 1.0, vips::negate( b ) ) ); -} - -VImage & -operator-=( VImage &a, const VImage b ) -{ - return( a = a - b ); -} - -VImage & -operator-=( VImage &a, const double b ) -{ - return( a = a - b ); -} - -VImage & -operator-=( VImage &a, std::vector b ) -{ - return( a = a - b ); -} - -VImage -operator-( VImage a ) -{ - return( a * -1 ); -} - -VImage -operator*( VImage a, VImage b ) -{ - return( a.multiply( b ) ); -} - -VImage -operator*( double a, VImage b ) -{ - return( b.linear( a, 0.0 ) ); -} - -VImage -operator*( VImage a, double b ) -{ - return( a.linear( b, 0.0 ) ); -} - -VImage -operator*( std::vector a, VImage b ) -{ - 
return( b.linear( a, 0.0 ) ); -} - -VImage -operator*( VImage a, std::vector b ) -{ - return( a.linear( b, 0.0 ) ); -} - -VImage & -operator*=( VImage &a, const VImage b ) -{ - return( a = a * b ); -} - -VImage & -operator*=( VImage &a, const double b ) -{ - return( a = a * b ); -} - -VImage & -operator*=( VImage &a, std::vector b ) -{ - return( a = a * b ); -} - -VImage -operator/( VImage a, VImage b ) -{ - return( a.divide( b ) ); -} - -VImage -operator/( double a, VImage b ) -{ - return( b.pow( -1.0 ).linear( a, 0.0 ) ); -} - -VImage -operator/( VImage a, double b ) -{ - return( a.linear( 1.0 / b, 0.0 ) ); -} - -VImage -operator/( std::vector a, VImage b ) -{ - return( b.pow( -1.0 ).linear( a, 0.0 ) ); -} - -VImage -operator/( VImage a, std::vector b ) -{ - return( a.linear( vips::invert( b ), 0.0 ) ); -} - -VImage & -operator/=( VImage &a, const VImage b ) -{ - return( a = a / b ); -} - -VImage & -operator/=( VImage &a, const double b ) -{ - return( a = a / b ); -} - -VImage & -operator/=( VImage &a, std::vector b ) -{ - return( a = a / b ); -} - -VImage -operator%( VImage a, VImage b ) -{ - return( a.remainder( b ) ); -} - -VImage -operator%( VImage a, double b ) -{ - return( a.remainder_const( to_vector( b ) ) ); -} - -VImage -operator%( VImage a, std::vector b ) -{ - return( a.remainder_const( b ) ); -} - -VImage & -operator%=( VImage &a, const VImage b ) -{ - return( a = a % b ); -} - -VImage & -operator%=( VImage &a, const double b ) -{ - return( a = a % b ); -} - -VImage & -operator%=( VImage &a, std::vector b ) -{ - return( a = a % b ); -} - -VImage -operator<( VImage a, VImage b ) -{ - return( a.relational( b, VIPS_OPERATION_RELATIONAL_LESS ) ); -} - -VImage -operator<( double a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_MORE, - to_vector( a ) ) ); -} - -VImage -operator<( VImage a, double b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_LESS, - to_vector( b ) ) ); -} - -VImage -operator<( std::vector a, VImage 
b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_MORE, - a ) ); -} - -VImage -operator<( VImage a, std::vector b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_LESS, - b ) ); -} - -VImage -operator<=( VImage a, VImage b ) -{ - return( a.relational( b, VIPS_OPERATION_RELATIONAL_LESSEQ ) ); -} - -VImage -operator<=( double a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_MOREEQ, - to_vector( a ) ) ); -} - -VImage -operator<=( VImage a, double b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_LESSEQ, - to_vector( b ) ) ); -} - -VImage -operator<=( std::vector a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_MOREEQ, - a ) ); -} - -VImage -operator<=( VImage a, std::vector b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_LESSEQ, - b ) ); -} - -VImage -operator>( VImage a, VImage b ) -{ - return( a.relational( b, VIPS_OPERATION_RELATIONAL_MORE ) ); -} - -VImage -operator>( double a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_LESS, - to_vector( a ) ) ); -} - -VImage -operator>( VImage a, double b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_MORE, - to_vector( b ) ) ); -} - -VImage -operator>( std::vector a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_LESS, - a ) ); -} - -VImage -operator>( VImage a, std::vector b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_MORE, - b ) ); -} - -VImage -operator>=( VImage a, VImage b ) -{ - return( a.relational( b, VIPS_OPERATION_RELATIONAL_MOREEQ ) ); -} - -VImage -operator>=( double a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_LESSEQ, - to_vector( a ) ) ); -} - -VImage -operator>=( VImage a, double b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_MOREEQ, - to_vector( b ) ) ); -} - -VImage -operator>=( std::vector a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_LESSEQ, - a ) ); -} - 
-VImage -operator>=( VImage a, std::vector b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_MOREEQ, - b ) ); -} - -VImage -operator==( VImage a, VImage b ) -{ - return( a.relational( b, VIPS_OPERATION_RELATIONAL_EQUAL ) ); -} - -VImage -operator==( double a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_EQUAL, - to_vector( a ) ) ); -} - -VImage -operator==( VImage a, double b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_EQUAL, - to_vector( b ) ) ); -} - -VImage -operator==( std::vector a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_EQUAL, - a ) ); -} - -VImage -operator==( VImage a, std::vector b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_EQUAL, - b ) ); -} - -VImage -operator!=( VImage a, VImage b ) -{ - return( a.relational( b, VIPS_OPERATION_RELATIONAL_NOTEQ ) ); -} - -VImage -operator!=( double a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_NOTEQ, - to_vector( a ) ) ); -} - -VImage -operator!=( VImage a, double b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_NOTEQ, - to_vector( b ) ) ); -} - -VImage -operator!=( std::vector a, VImage b ) -{ - return( b.relational_const( VIPS_OPERATION_RELATIONAL_NOTEQ, - a ) ); -} - -VImage -operator!=( VImage a, std::vector b ) -{ - return( a.relational_const( VIPS_OPERATION_RELATIONAL_NOTEQ, - b ) ); -} - -VImage -operator&( VImage a, VImage b ) -{ - return( a.boolean( b, VIPS_OPERATION_BOOLEAN_AND ) ); -} - -VImage -operator&( double a, VImage b ) -{ - return( b.boolean_const( VIPS_OPERATION_BOOLEAN_AND, - to_vector( a ) ) ); -} - -VImage -operator&( VImage a, double b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_AND, - to_vector( b ) ) ); -} - -VImage -operator&( std::vector a, VImage b ) -{ - return( b.boolean_const( VIPS_OPERATION_BOOLEAN_AND, a ) ); -} - -VImage -operator&( VImage a, std::vector b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_AND, b ) ); -} - 
-VImage & -operator&=( VImage &a, const VImage b ) -{ - return( a = a & b ); -} - -VImage & -operator&=( VImage &a, const double b ) -{ - return( a = a & b ); -} - -VImage & -operator&=( VImage &a, std::vector b ) -{ - return( a = a & b ); -} - -VImage -operator|( VImage a, VImage b ) -{ - return( a.boolean( b, VIPS_OPERATION_BOOLEAN_OR ) ); -} - -VImage -operator|( double a, VImage b ) -{ - return( b.boolean_const( VIPS_OPERATION_BOOLEAN_OR, - to_vector( a ) ) ); -} - -VImage -operator|( VImage a, double b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_OR, - to_vector( b ) ) ); -} - -VImage -operator|( std::vector a, VImage b ) -{ - return( b.boolean_const( VIPS_OPERATION_BOOLEAN_OR, - a ) ); -} - -VImage -operator|( VImage a, std::vector b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_OR, - b ) ); -} - -VImage & -operator|=( VImage &a, const VImage b ) -{ - return( a = a | b ); -} - -VImage & -operator|=( VImage &a, const double b ) -{ - return( a = a | b ); -} - -VImage & -operator|=( VImage &a, std::vector b ) -{ - return( a = a | b ); -} - -VImage -operator^( VImage a, VImage b ) -{ - return( a.boolean( b, VIPS_OPERATION_BOOLEAN_EOR ) ); -} - -VImage -operator^( double a, VImage b ) -{ - return( b.boolean_const( VIPS_OPERATION_BOOLEAN_EOR, - to_vector( a ) ) ); -} - -VImage -operator^( VImage a, double b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_EOR, - to_vector( b ) ) ); -} - -VImage -operator^( std::vector a, VImage b ) -{ - return( b.boolean_const( VIPS_OPERATION_BOOLEAN_EOR, - a ) ); -} - -VImage -operator^( VImage a, std::vector b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_EOR, - b ) ); -} - -VImage & -operator^=( VImage &a, const VImage b ) -{ - return( a = a ^ b ); -} - -VImage & -operator^=( VImage &a, const double b ) -{ - return( a = a ^ b ); -} - -VImage & -operator^=( VImage &a, std::vector b ) -{ - return( a = a ^ b ); -} - -VImage -operator<<( VImage a, VImage b ) -{ - return( a.boolean( b, 
VIPS_OPERATION_BOOLEAN_LSHIFT ) ); -} - -VImage -operator<<( VImage a, double b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_LSHIFT, - to_vector( b ) ) ); -} - -VImage -operator<<( VImage a, std::vector b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_LSHIFT, - b ) ); -} - -VImage & -operator<<=( VImage &a, const VImage b ) -{ - return( a = a << b ); -} - -VImage & -operator<<=( VImage &a, const double b ) -{ - return( a = a << b ); -} - -VImage & -operator<<=( VImage &a, std::vector b ) -{ - return( a = a << b ); -} - -VImage -operator>>( VImage a, VImage b ) -{ - return( a.boolean( b, VIPS_OPERATION_BOOLEAN_RSHIFT ) ); -} - -VImage -operator>>( VImage a, double b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_RSHIFT, - to_vector( b ) ) ); -} - -VImage -operator>>( VImage a, std::vector b ) -{ - return( a.boolean_const( VIPS_OPERATION_BOOLEAN_RSHIFT, - b ) ); -} - -VImage & -operator>>=( VImage &a, const VImage b ) -{ - return( a = a << b ); -} - -VImage & -operator>>=( VImage &a, const double b ) -{ - return( a = a << b ); -} - -VImage & -operator>>=( VImage &a, std::vector b ) -{ - return( a = a << b ); -} - -VIPS_NAMESPACE_END diff --git a/src/libvips/cplusplus/VInterpolate.cpp b/src/libvips/cplusplus/VInterpolate.cpp deleted file mode 100644 index 265bf66e2..000000000 --- a/src/libvips/cplusplus/VInterpolate.cpp +++ /dev/null @@ -1,76 +0,0 @@ -/* Object part of VInterpolate class - */ - -/* - - Copyright (C) 1991-2001 The National Gallery - - This program is free software; you can redistribute it and/or modify - it under the terms of the GNU Lesser General Public License as published by - the Free Software Foundation; either version 2 of the License, or - (at your option) any later version. - - This program is distributed in the hope that it will be useful, - but WITHOUT ANY WARRANTY; without even the implied warranty of - MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the - GNU Lesser General Public License for more details. - - You should have received a copy of the GNU Lesser General Public License - along with this program; if not, write to the Free Software - Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA - 02110-1301 USA - - */ - -/* - - These files are distributed with VIPS - http://www.vips.ecs.soton.ac.uk - - */ - -#ifdef HAVE_CONFIG_H -#include <config.h> -#endif /*HAVE_CONFIG_H*/ -#include <vips/intl.h> - -#include <vips/vips8> - -#include <vips/debug.h> - -/* -#define VIPS_DEBUG -#define VIPS_DEBUG_VERBOSE - */ - -VIPS_NAMESPACE_START - -VInterpolate -VInterpolate::new_from_name( const char *name, VOption *options ) -{ - VipsInterpolate *interp; - - if( !(interp = vips_interpolate_new( name )) ) { - delete options; - throw VError(); - } - delete options; - - VInterpolate out( interp ); - - return( out ); -} - -VOption * -VOption::set( const char *name, VInterpolate value ) -{ - Pair *pair = new Pair( name ); - - pair->input = true; - g_value_init( &pair->value, VIPS_TYPE_INTERPOLATE ); - g_value_set_object( &pair->value, value.get_interpolate() ); - options.push_back( pair ); - - return( this ); -} - -VIPS_NAMESPACE_END diff --git a/src/libvips/cplusplus/vips-operators.cpp b/src/libvips/cplusplus/vips-operators.cpp deleted file mode 100644 index bff2a7453..000000000 --- a/src/libvips/cplusplus/vips-operators.cpp +++ /dev/null @@ -1,2990 +0,0 @@ -// bodies for vips operations -// Mon 13 Mar 13:22:17 GMT 2017 -// this file is generated automatically, do not edit! - -void VImage::system( char * cmd_format , VOption *options ) -{ - call( "system" , - (options ? options : VImage::option()) -> - set( "cmd-format", cmd_format ) ); -} - -VImage VImage::add( VImage right , VOption *options ) -{ - VImage out; - - call( "add" , - (options ? 
options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::subtract( VImage right , VOption *options ) -{ - VImage out; - - call( "subtract" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::multiply( VImage right , VOption *options ) -{ - VImage out; - - call( "multiply" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::divide( VImage right , VOption *options ) -{ - VImage out; - - call( "divide" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::relational( VImage right , VipsOperationRelational relational , VOption *options ) -{ - VImage out; - - call( "relational" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) -> - set( "relational", relational ) ); - - return( out ); -} - -VImage VImage::remainder( VImage right , VOption *options ) -{ - VImage out; - - call( "remainder" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::boolean( VImage right , VipsOperationBoolean boolean , VOption *options ) -{ - VImage out; - - call( "boolean" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) -> - set( "boolean", boolean ) ); - - return( out ); -} - -VImage VImage::math2( VImage right , VipsOperationMath2 math2 , VOption *options ) -{ - VImage out; - - call( "math2" , - (options ? 
options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) -> - set( "math2", math2 ) ); - - return( out ); -} - -VImage VImage::complex2( VImage right , VipsOperationComplex2 cmplx , VOption *options ) -{ - VImage out; - - call( "complex2" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) -> - set( "cmplx", cmplx ) ); - - return( out ); -} - -VImage VImage::complexform( VImage right , VOption *options ) -{ - VImage out; - - call( "complexform" , - (options ? options : VImage::option()) -> - set( "left", *this ) -> - set( "right", right ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::sum( std::vector in , VOption *options ) -{ - VImage out; - - call( "sum" , - (options ? options : VImage::option()) -> - set( "in", in ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::invert( VOption *options ) -{ - VImage out; - - call( "invert" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::linear( std::vector a , std::vector b , VOption *options ) -{ - VImage out; - - call( "linear" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "a", a ) -> - set( "b", b ) ); - - return( out ); -} - -VImage VImage::math( VipsOperationMath math , VOption *options ) -{ - VImage out; - - call( "math" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "math", math ) ); - - return( out ); -} - -VImage VImage::abs( VOption *options ) -{ - VImage out; - - call( "abs" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::sign( VOption *options ) -{ - VImage out; - - call( "sign" , - (options ? 
options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::round( VipsOperationRound round , VOption *options ) -{ - VImage out; - - call( "round" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "round", round ) ); - - return( out ); -} - -VImage VImage::relational_const( VipsOperationRelational relational , std::vector c , VOption *options ) -{ - VImage out; - - call( "relational_const" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "relational", relational ) -> - set( "c", c ) ); - - return( out ); -} - -VImage VImage::remainder_const( std::vector c , VOption *options ) -{ - VImage out; - - call( "remainder_const" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "c", c ) ); - - return( out ); -} - -VImage VImage::boolean_const( VipsOperationBoolean boolean , std::vector c , VOption *options ) -{ - VImage out; - - call( "boolean_const" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "boolean", boolean ) -> - set( "c", c ) ); - - return( out ); -} - -VImage VImage::math2_const( VipsOperationMath2 math2 , std::vector c , VOption *options ) -{ - VImage out; - - call( "math2_const" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "math2", math2 ) -> - set( "c", c ) ); - - return( out ); -} - -VImage VImage::complex( VipsOperationComplex cmplx , VOption *options ) -{ - VImage out; - - call( "complex" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "cmplx", cmplx ) ); - - return( out ); -} - -VImage VImage::complexget( VipsOperationComplexget get , VOption *options ) -{ - VImage out; - - call( "complexget" , - (options ? 
options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "get", get ) ); - - return( out ); -} - -double VImage::avg( VOption *options ) -{ - double out; - - call( "avg" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -double VImage::min( VOption *options ) -{ - double out; - - call( "min" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -double VImage::max( VOption *options ) -{ - double out; - - call( "max" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -double VImage::deviate( VOption *options ) -{ - double out; - - call( "deviate" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::stats( VOption *options ) -{ - VImage out; - - call( "stats" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::hist_find( VOption *options ) -{ - VImage out; - - call( "hist_find" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::hist_find_ndim( VOption *options ) -{ - VImage out; - - call( "hist_find_ndim" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::hist_find_indexed( VImage index , VOption *options ) -{ - VImage out; - - call( "hist_find_indexed" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "index", index ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::hough_line( VOption *options ) -{ - VImage out; - - call( "hough_line" , - (options ? 
options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::hough_circle( VOption *options ) -{ - VImage out; - - call( "hough_circle" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::project( VImage * rows , VOption *options ) -{ - VImage columns; - - call( "project" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "columns", &columns ) -> - set( "rows", rows ) ); - - return( columns ); -} - -VImage VImage::profile( VImage * rows , VOption *options ) -{ - VImage columns; - - call( "profile" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "columns", &columns ) -> - set( "rows", rows ) ); - - return( columns ); -} - -VImage VImage::measure( int h , int v , VOption *options ) -{ - VImage out; - - call( "measure" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "h", h ) -> - set( "v", v ) ); - - return( out ); -} - -std::vector VImage::getpoint( int x , int y , VOption *options ) -{ - std::vector out_array; - - call( "getpoint" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out-array", &out_array ) -> - set( "x", x ) -> - set( "y", y ) ); - - return( out_array ); -} - -VImage VImage::copy( VOption *options ) -{ - VImage out; - - call( "copy" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::tilecache( VOption *options ) -{ - VImage out; - - call( "tilecache" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::linecache( VOption *options ) -{ - VImage out; - - call( "linecache" , - (options ? 
options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::sequential( VOption *options ) -{ - VImage out; - - call( "sequential" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::cache( VOption *options ) -{ - VImage out; - - call( "cache" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::embed( int x , int y , int width , int height , VOption *options ) -{ - VImage out; - - call( "embed" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "x", x ) -> - set( "y", y ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::flip( VipsDirection direction , VOption *options ) -{ - VImage out; - - call( "flip" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "direction", direction ) ); - - return( out ); -} - -VImage VImage::insert( VImage sub , int x , int y , VOption *options ) -{ - VImage out; - - call( "insert" , - (options ? options : VImage::option()) -> - set( "main", *this ) -> - set( "sub", sub ) -> - set( "out", &out ) -> - set( "x", x ) -> - set( "y", y ) ); - - return( out ); -} - -VImage VImage::join( VImage in2 , VipsDirection direction , VOption *options ) -{ - VImage out; - - call( "join" , - (options ? options : VImage::option()) -> - set( "in1", *this ) -> - set( "in2", in2 ) -> - set( "out", &out ) -> - set( "direction", direction ) ); - - return( out ); -} - -VImage VImage::arrayjoin( std::vector in , VOption *options ) -{ - VImage out; - - call( "arrayjoin" , - (options ? 
options : VImage::option()) -> - set( "in", in ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::extract_area( int left , int top , int width , int height , VOption *options ) -{ - VImage out; - - call( "extract_area" , - (options ? options : VImage::option()) -> - set( "input", *this ) -> - set( "out", &out ) -> - set( "left", left ) -> - set( "top", top ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::smartcrop( int width , int height , VOption *options ) -{ - VImage out; - - call( "smartcrop" , - (options ? options : VImage::option()) -> - set( "input", *this ) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::extract_band( int band , VOption *options ) -{ - VImage out; - - call( "extract_band" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "band", band ) ); - - return( out ); -} - -VImage VImage::bandjoin( std::vector in , VOption *options ) -{ - VImage out; - - call( "bandjoin" , - (options ? options : VImage::option()) -> - set( "in", in ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::bandjoin_const( std::vector c , VOption *options ) -{ - VImage out; - - call( "bandjoin_const" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "c", c ) ); - - return( out ); -} - -VImage VImage::bandrank( std::vector in , VOption *options ) -{ - VImage out; - - call( "bandrank" , - (options ? options : VImage::option()) -> - set( "in", in ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::bandmean( VOption *options ) -{ - VImage out; - - call( "bandmean" , - (options ? 
options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::bandbool( VipsOperationBoolean boolean , VOption *options ) -{ - VImage out; - - call( "bandbool" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "boolean", boolean ) ); - - return( out ); -} - -VImage VImage::replicate( int across , int down , VOption *options ) -{ - VImage out; - - call( "replicate" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "across", across ) -> - set( "down", down ) ); - - return( out ); -} - -VImage VImage::cast( VipsBandFormat format , VOption *options ) -{ - VImage out; - - call( "cast" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "format", format ) ); - - return( out ); -} - -VImage VImage::rot( VipsAngle angle , VOption *options ) -{ - VImage out; - - call( "rot" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "angle", angle ) ); - - return( out ); -} - -VImage VImage::rot45( VOption *options ) -{ - VImage out; - - call( "rot45" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::autorot( VOption *options ) -{ - VImage out; - - call( "autorot" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::ifthenelse( VImage in1 , VImage in2 , VOption *options ) -{ - VImage out; - - call( "ifthenelse" , - (options ? options : VImage::option()) -> - set( "cond", *this ) -> - set( "in1", in1 ) -> - set( "in2", in2 ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::recomb( VImage m , VOption *options ) -{ - VImage out; - - call( "recomb" , - (options ? 
options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "m", m ) ); - - return( out ); -} - -VImage VImage::bandfold( VOption *options ) -{ - VImage out; - - call( "bandfold" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::bandunfold( VOption *options ) -{ - VImage out; - - call( "bandunfold" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::flatten( VOption *options ) -{ - VImage out; - - call( "flatten" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::premultiply( VOption *options ) -{ - VImage out; - - call( "premultiply" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::unpremultiply( VOption *options ) -{ - VImage out; - - call( "unpremultiply" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::grid( int tile_height , int across , int down , VOption *options ) -{ - VImage out; - - call( "grid" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) -> - set( "tile-height", tile_height ) -> - set( "across", across ) -> - set( "down", down ) ); - - return( out ); -} - -VImage VImage::scale( VOption *options ) -{ - VImage out; - - call( "scale" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::wrap( VOption *options ) -{ - VImage out; - - call( "wrap" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::zoom( int xfac , int yfac , VOption *options ) -{ - VImage out; - - call( "zoom" , - (options ? 
options : VImage::option()) -> - set( "input", *this ) -> - set( "out", &out ) -> - set( "xfac", xfac ) -> - set( "yfac", yfac ) ); - - return( out ); -} - -VImage VImage::subsample( int xfac , int yfac , VOption *options ) -{ - VImage out; - - call( "subsample" , - (options ? options : VImage::option()) -> - set( "input", *this ) -> - set( "out", &out ) -> - set( "xfac", xfac ) -> - set( "yfac", yfac ) ); - - return( out ); -} - -VImage VImage::msb( VOption *options ) -{ - VImage out; - - call( "msb" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::byteswap( VOption *options ) -{ - VImage out; - - call( "byteswap" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::falsecolour( VOption *options ) -{ - VImage out; - - call( "falsecolour" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::gamma( VOption *options ) -{ - VImage out; - - call( "gamma" , - (options ? options : VImage::option()) -> - set( "in", *this ) -> - set( "out", &out ) ); - - return( out ); -} - -VImage VImage::black( int width , int height , VOption *options ) -{ - VImage out; - - call( "black" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::gaussnoise( int width , int height , VOption *options ) -{ - VImage out; - - call( "gaussnoise" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::text( char * text , VOption *options ) -{ - VImage out; - - call( "text" , - (options ? 
options : VImage::option()) -> - set( "out", &out ) -> - set( "text", text ) ); - - return( out ); -} - -VImage VImage::xyz( int width , int height , VOption *options ) -{ - VImage out; - - call( "xyz" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::gaussmat( double sigma , double min_ampl , VOption *options ) -{ - VImage out; - - call( "gaussmat" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "sigma", sigma ) -> - set( "min-ampl", min_ampl ) ); - - return( out ); -} - -VImage VImage::logmat( double sigma , double min_ampl , VOption *options ) -{ - VImage out; - - call( "logmat" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "sigma", sigma ) -> - set( "min-ampl", min_ampl ) ); - - return( out ); -} - -VImage VImage::eye( int width , int height , VOption *options ) -{ - VImage out; - - call( "eye" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::grey( int width , int height , VOption *options ) -{ - VImage out; - - call( "grey" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::zone( int width , int height , VOption *options ) -{ - VImage out; - - call( "zone" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::sines( int width , int height , VOption *options ) -{ - VImage out; - - call( "sines" , - (options ? 
options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) ); - - return( out ); -} - -VImage VImage::mask_ideal( int width , int height , double frequency_cutoff , VOption *options ) -{ - VImage out; - - call( "mask_ideal" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) -> - set( "frequency-cutoff", frequency_cutoff ) ); - - return( out ); -} - -VImage VImage::mask_ideal_ring( int width , int height , double frequency_cutoff , double ringwidth , VOption *options ) -{ - VImage out; - - call( "mask_ideal_ring" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) -> - set( "frequency-cutoff", frequency_cutoff ) -> - set( "ringwidth", ringwidth ) ); - - return( out ); -} - -VImage VImage::mask_ideal_band( int width , int height , double frequency_cutoff_x , double frequency_cutoff_y , double radius , VOption *options ) -{ - VImage out; - - call( "mask_ideal_band" , - (options ? options : VImage::option()) -> - set( "out", &out ) -> - set( "width", width ) -> - set( "height", height ) -> - set( "frequency-cutoff-x", frequency_cutoff_x ) -> - set( "frequency-cutoff-y", frequency_cutoff_y ) -> - set( "radius", radius ) ); - - return( out ); -} - -VImage VImage::mask_butterworth( int width , int height , double order , double frequency_cutoff , double amplitude_cutoff , VOption *options ) -{ - VImage out; - - call( "mask_butterworth" , - (options ? 
options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "order", order ) ->
-			set( "frequency-cutoff", frequency_cutoff ) ->
-			set( "amplitude-cutoff", amplitude_cutoff ) );
-
-	return( out );
-}
-
-VImage VImage::mask_butterworth_ring( int width , int height , double order , double frequency_cutoff , double amplitude_cutoff , double ringwidth , VOption *options )
-{
-	VImage out;
-
-	call( "mask_butterworth_ring" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "order", order ) ->
-			set( "frequency-cutoff", frequency_cutoff ) ->
-			set( "amplitude-cutoff", amplitude_cutoff ) ->
-			set( "ringwidth", ringwidth ) );
-
-	return( out );
-}
-
-VImage VImage::mask_butterworth_band( int width , int height , double order , double frequency_cutoff_x , double frequency_cutoff_y , double radius , double amplitude_cutoff , VOption *options )
-{
-	VImage out;
-
-	call( "mask_butterworth_band" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "order", order ) ->
-			set( "frequency-cutoff-x", frequency_cutoff_x ) ->
-			set( "frequency-cutoff-y", frequency_cutoff_y ) ->
-			set( "radius", radius ) ->
-			set( "amplitude-cutoff", amplitude_cutoff ) );
-
-	return( out );
-}
-
-VImage VImage::mask_gaussian( int width , int height , double frequency_cutoff , double amplitude_cutoff , VOption *options )
-{
-	VImage out;
-
-	call( "mask_gaussian" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "frequency-cutoff", frequency_cutoff ) ->
-			set( "amplitude-cutoff", amplitude_cutoff ) );
-
-	return( out );
-}
-
-VImage VImage::mask_gaussian_ring( int width , int height , double frequency_cutoff , double amplitude_cutoff , double ringwidth , VOption *options )
-{
-	VImage out;
-
-	call( "mask_gaussian_ring" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "frequency-cutoff", frequency_cutoff ) ->
-			set( "amplitude-cutoff", amplitude_cutoff ) ->
-			set( "ringwidth", ringwidth ) );
-
-	return( out );
-}
-
-VImage VImage::mask_gaussian_band( int width , int height , double frequency_cutoff_x , double frequency_cutoff_y , double radius , double amplitude_cutoff , VOption *options )
-{
-	VImage out;
-
-	call( "mask_gaussian_band" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "frequency-cutoff-x", frequency_cutoff_x ) ->
-			set( "frequency-cutoff-y", frequency_cutoff_y ) ->
-			set( "radius", radius ) ->
-			set( "amplitude-cutoff", amplitude_cutoff ) );
-
-	return( out );
-}
-
-VImage VImage::mask_fractal( int width , int height , double fractal_dimension , VOption *options )
-{
-	VImage out;
-
-	call( "mask_fractal" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "fractal-dimension", fractal_dimension ) );
-
-	return( out );
-}
-
-VImage VImage::buildlut( VOption *options )
-{
-	VImage out;
-
-	call( "buildlut" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::invertlut( VOption *options )
-{
-	VImage out;
-
-	call( "invertlut" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::tonelut( VOption *options )
-{
-	VImage out;
-
-	call( "tonelut" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::identity( VOption *options )
-{
-	VImage out;
-
-	call( "identity" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::fractsurf( int width , int height , double fractal_dimension , VOption *options )
-{
-	VImage out;
-
-	call( "fractsurf" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "fractal-dimension", fractal_dimension ) );
-
-	return( out );
-}
-
-VImage VImage::worley( int width , int height , VOption *options )
-{
-	VImage out;
-
-	call( "worley" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) );
-
-	return( out );
-}
-
-VImage VImage::perlin( int width , int height , VOption *options )
-{
-	VImage out;
-
-	call( "perlin" ,
-		(options ? options : VImage::option()) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) );
-
-	return( out );
-}
-
-VImage VImage::csvload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "csvload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::matrixload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "matrixload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::rawload( char * filename , int width , int height , int bands , VOption *options )
-{
-	VImage out;
-
-	call( "rawload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "bands", bands ) );
-
-	return( out );
-}
-
-VImage VImage::vipsload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "vipsload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::analyzeload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "analyzeload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::ppmload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "ppmload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::radload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "radload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::pdfload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "pdfload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::pdfload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "pdfload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::svgload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "svgload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::svgload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "svgload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::gifload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "gifload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::gifload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "gifload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::pngload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "pngload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::pngload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "pngload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::matload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "matload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::jpegload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "jpegload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::jpegload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "jpegload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::webpload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "webpload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::webpload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "webpload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::tiffload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "tiffload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::tiffload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "tiffload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::openslideload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "openslideload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::magickload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "magickload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::magickload_buffer( VipsBlob * buffer , VOption *options )
-{
-	VImage out;
-
-	call( "magickload_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::fitsload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "fitsload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::openexrload( char * filename , VOption *options )
-{
-	VImage out;
-
-	call( "openexrload" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-void VImage::csvsave( char * filename , VOption *options )
-{
-	call( "csvsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-void VImage::matrixsave( char * filename , VOption *options )
-{
-	call( "matrixsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-void VImage::matrixprint( VOption *options )
-{
-	call( "matrixprint" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) );
-}
-
-void VImage::rawsave( char * filename , VOption *options )
-{
-	call( "rawsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-void VImage::rawsave_fd( int fd , VOption *options )
-{
-	call( "rawsave_fd" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "fd", fd ) );
-}
-
-void VImage::vipssave( char * filename , VOption *options )
-{
-	call( "vipssave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-void VImage::ppmsave( char * filename , VOption *options )
-{
-	call( "ppmsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-void VImage::radsave( char * filename , VOption *options )
-{
-	call( "radsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-VipsBlob * VImage::radsave_buffer( VOption *options )
-{
-	VipsBlob * buffer;
-
-	call( "radsave_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "buffer", &buffer ) );
-
-	return( buffer );
-}
-
-void VImage::dzsave( char * filename , VOption *options )
-{
-	call( "dzsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-VipsBlob * VImage::dzsave_buffer( VOption *options )
-{
-	VipsBlob * buffer;
-
-	call( "dzsave_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "buffer", &buffer ) );
-
-	return( buffer );
-}
-
-void VImage::pngsave( char * filename , VOption *options )
-{
-	call( "pngsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-VipsBlob * VImage::pngsave_buffer( VOption *options )
-{
-	VipsBlob * buffer;
-
-	call( "pngsave_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "buffer", &buffer ) );
-
-	return( buffer );
-}
-
-void VImage::jpegsave( char * filename , VOption *options )
-{
-	call( "jpegsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-VipsBlob * VImage::jpegsave_buffer( VOption *options )
-{
-	VipsBlob * buffer;
-
-	call( "jpegsave_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "buffer", &buffer ) );
-
-	return( buffer );
-}
-
-void VImage::jpegsave_mime( VOption *options )
-{
-	call( "jpegsave_mime" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) );
-}
-
-void VImage::webpsave( char * filename , VOption *options )
-{
-	call( "webpsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-VipsBlob * VImage::webpsave_buffer( VOption *options )
-{
-	VipsBlob * buffer;
-
-	call( "webpsave_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "buffer", &buffer ) );
-
-	return( buffer );
-}
-
-void VImage::tiffsave( char * filename , VOption *options )
-{
-	call( "tiffsave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-VipsBlob * VImage::tiffsave_buffer( VOption *options )
-{
-	VipsBlob * buffer;
-
-	call( "tiffsave_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "buffer", &buffer ) );
-
-	return( buffer );
-}
-
-void VImage::fitssave( char * filename , VOption *options )
-{
-	call( "fitssave" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "filename", filename ) );
-}
-
-VImage VImage::thumbnail( char * filename , int width , VOption *options )
-{
-	VImage out;
-
-	call( "thumbnail" ,
-		(options ? options : VImage::option()) ->
-			set( "filename", filename ) ->
-			set( "out", &out ) ->
-			set( "width", width ) );
-
-	return( out );
-}
-
-VImage VImage::thumbnail_buffer( VipsBlob * buffer , int width , VOption *options )
-{
-	VImage out;
-
-	call( "thumbnail_buffer" ,
-		(options ? options : VImage::option()) ->
-			set( "buffer", buffer ) ->
-			set( "out", &out ) ->
-			set( "width", width ) );
-
-	return( out );
-}
-
-VImage VImage::mapim( VImage index , VOption *options )
-{
-	VImage out;
-
-	call( "mapim" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "index", index ) );
-
-	return( out );
-}
-
-VImage VImage::shrink( double hshrink , double vshrink , VOption *options )
-{
-	VImage out;
-
-	call( "shrink" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "hshrink", hshrink ) ->
-			set( "vshrink", vshrink ) );
-
-	return( out );
-}
-
-VImage VImage::shrinkh( int hshrink , VOption *options )
-{
-	VImage out;
-
-	call( "shrinkh" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "hshrink", hshrink ) );
-
-	return( out );
-}
-
-VImage VImage::shrinkv( int vshrink , VOption *options )
-{
-	VImage out;
-
-	call( "shrinkv" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "vshrink", vshrink ) );
-
-	return( out );
-}
-
-VImage VImage::reduceh( double hshrink , VOption *options )
-{
-	VImage out;
-
-	call( "reduceh" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "hshrink", hshrink ) );
-
-	return( out );
-}
-
-VImage VImage::reducev( double vshrink , VOption *options )
-{
-	VImage out;
-
-	call( "reducev" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "vshrink", vshrink ) );
-
-	return( out );
-}
-
-VImage VImage::reduce( double hshrink , double vshrink , VOption *options )
-{
-	VImage out;
-
-	call( "reduce" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "hshrink", hshrink ) ->
-			set( "vshrink", vshrink ) );
-
-	return( out );
-}
-
-VImage VImage::quadratic( VImage coeff , VOption *options )
-{
-	VImage out;
-
-	call( "quadratic" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "coeff", coeff ) );
-
-	return( out );
-}
-
-VImage VImage::affine( std::vector matrix , VOption *options )
-{
-	VImage out;
-
-	call( "affine" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "matrix", matrix ) );
-
-	return( out );
-}
-
-VImage VImage::similarity( VOption *options )
-{
-	VImage out;
-
-	call( "similarity" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::resize( double scale , VOption *options )
-{
-	VImage out;
-
-	call( "resize" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "scale", scale ) );
-
-	return( out );
-}
-
-VImage VImage::colourspace( VipsInterpretation space , VOption *options )
-{
-	VImage out;
-
-	call( "colourspace" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "space", space ) );
-
-	return( out );
-}
-
-VImage VImage::Lab2XYZ( VOption *options )
-{
-	VImage out;
-
-	call( "Lab2XYZ" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::XYZ2Lab( VOption *options )
-{
-	VImage out;
-
-	call( "XYZ2Lab" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::Lab2LCh( VOption *options )
-{
-	VImage out;
-
-	call( "Lab2LCh" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::LCh2Lab( VOption *options )
-{
-	VImage out;
-
-	call( "LCh2Lab" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::LCh2CMC( VOption *options )
-{
-	VImage out;
-
-	call( "LCh2CMC" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::CMC2LCh( VOption *options )
-{
-	VImage out;
-
-	call( "CMC2LCh" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::XYZ2Yxy( VOption *options )
-{
-	VImage out;
-
-	call( "XYZ2Yxy" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::Yxy2XYZ( VOption *options )
-{
-	VImage out;
-
-	call( "Yxy2XYZ" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::scRGB2XYZ( VOption *options )
-{
-	VImage out;
-
-	call( "scRGB2XYZ" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::XYZ2scRGB( VOption *options )
-{
-	VImage out;
-
-	call( "XYZ2scRGB" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::LabQ2Lab( VOption *options )
-{
-	VImage out;
-
-	call( "LabQ2Lab" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::Lab2LabQ( VOption *options )
-{
-	VImage out;
-
-	call( "Lab2LabQ" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::LabQ2LabS( VOption *options )
-{
-	VImage out;
-
-	call( "LabQ2LabS" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::LabS2LabQ( VOption *options )
-{
-	VImage out;
-
-	call( "LabS2LabQ" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::LabS2Lab( VOption *options )
-{
-	VImage out;
-
-	call( "LabS2Lab" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::Lab2LabS( VOption *options )
-{
-	VImage out;
-
-	call( "Lab2LabS" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::rad2float( VOption *options )
-{
-	VImage out;
-
-	call( "rad2float" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::float2rad( VOption *options )
-{
-	VImage out;
-
-	call( "float2rad" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::LabQ2sRGB( VOption *options )
-{
-	VImage out;
-
-	call( "LabQ2sRGB" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::sRGB2HSV( VOption *options )
-{
-	VImage out;
-
-	call( "sRGB2HSV" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::HSV2sRGB( VOption *options )
-{
-	VImage out;
-
-	call( "HSV2sRGB" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::icc_import( VOption *options )
-{
-	VImage out;
-
-	call( "icc_import" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::icc_export( VOption *options )
-{
-	VImage out;
-
-	call( "icc_export" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::icc_transform( char * output_profile , VOption *options )
-{
-	VImage out;
-
-	call( "icc_transform" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "output-profile", output_profile ) );
-
-	return( out );
-}
-
-VImage VImage::dE76( VImage right , VOption *options )
-{
-	VImage out;
-
-	call( "dE76" ,
-		(options ? options : VImage::option()) ->
-			set( "left", *this ) ->
-			set( "right", right ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::dE00( VImage right , VOption *options )
-{
-	VImage out;
-
-	call( "dE00" ,
-		(options ? options : VImage::option()) ->
-			set( "left", *this ) ->
-			set( "right", right ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::dECMC( VImage right , VOption *options )
-{
-	VImage out;
-
-	call( "dECMC" ,
-		(options ? options : VImage::option()) ->
-			set( "left", *this ) ->
-			set( "right", right ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::sRGB2scRGB( VOption *options )
-{
-	VImage out;
-
-	call( "sRGB2scRGB" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::scRGB2BW( VOption *options )
-{
-	VImage out;
-
-	call( "scRGB2BW" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::scRGB2sRGB( VOption *options )
-{
-	VImage out;
-
-	call( "scRGB2sRGB" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::maplut( VImage lut , VOption *options )
-{
-	VImage out;
-
-	call( "maplut" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "lut", lut ) );
-
-	return( out );
-}
-
-int VImage::percent( double percent , VOption *options )
-{
-	int threshold;
-
-	call( "percent" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "percent", percent ) ->
-			set( "threshold", &threshold ) );
-
-	return( threshold );
-}
-
-VImage VImage::stdif( int width , int height , VOption *options )
-{
-	VImage out;
-
-	call( "stdif" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) );
-
-	return( out );
-}
-
-VImage VImage::hist_cum( VOption *options )
-{
-	VImage out;
-
-	call( "hist_cum" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::hist_match( VImage ref , VOption *options )
-{
-	VImage out;
-
-	call( "hist_match" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "ref", ref ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::hist_norm( VOption *options )
-{
-	VImage out;
-
-	call( "hist_norm" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::hist_equal( VOption *options )
-{
-	VImage out;
-
-	call( "hist_equal" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::hist_plot( VOption *options )
-{
-	VImage out;
-
-	call( "hist_plot" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::hist_local( int width , int height , VOption *options )
-{
-	VImage out;
-
-	call( "hist_local" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) );
-
-	return( out );
-}
-
-bool VImage::hist_ismonotonic( VOption *options )
-{
-	bool monotonic;
-
-	call( "hist_ismonotonic" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "monotonic", &monotonic ) );
-
-	return( monotonic );
-}
-
-double VImage::hist_entropy( VOption *options )
-{
-	double out;
-
-	call( "hist_entropy" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::conv( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "conv" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) );
-
-	return( out );
-}
-
-VImage VImage::conva( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "conva" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) );
-
-	return( out );
-}
-
-VImage VImage::convf( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "convf" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) );
-
-	return( out );
-}
-
-VImage VImage::convi( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "convi" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) );
-
-	return( out );
-}
-
-VImage VImage::compass( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "compass" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) );
-
-	return( out );
-}
-
-VImage VImage::convsep( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "convsep" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) );
-
-	return( out );
-}
-
-VImage VImage::convasep( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "convasep" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) );
-
-	return( out );
-}
-
-VImage VImage::fastcor( VImage ref , VOption *options )
-{
-	VImage out;
-
-	call( "fastcor" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "ref", ref ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::spcor( VImage ref , VOption *options )
-{
-	VImage out;
-
-	call( "spcor" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "ref", ref ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::sharpen( VOption *options )
-{
-	VImage out;
-
-	call( "sharpen" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::gaussblur( double sigma , VOption *options )
-{
-	VImage out;
-
-	call( "gaussblur" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "sigma", sigma ) );
-
-	return( out );
-}
-
-VImage VImage::fwfft( VOption *options )
-{
-	VImage out;
-
-	call( "fwfft" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::invfft( VOption *options )
-{
-	VImage out;
-
-	call( "invfft" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::freqmult( VImage mask , VOption *options )
-{
-	VImage out;
-
-	call( "freqmult" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "mask", mask ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::spectrum( VOption *options )
-{
-	VImage out;
-
-	call( "spectrum" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::phasecor( VImage in2 , VOption *options )
-{
-	VImage out;
-
-	call( "phasecor" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "in2", in2 ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
-VImage VImage::morph( VImage mask , VipsOperationMorphology morph , VOption *options )
-{
-	VImage out;
-
-	call( "morph" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "mask", mask ) ->
-			set( "morph", morph ) );
-
-	return( out );
-}
-
-VImage VImage::rank( int width , int height , int index , VOption *options )
-{
-	VImage out;
-
-	call( "rank" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) ->
-			set( "width", width ) ->
-			set( "height", height ) ->
-			set( "index", index ) );
-
-	return( out );
-}
-
-double VImage::countlines( VipsDirection direction , VOption *options )
-{
-	double nolines;
-
-	call( "countlines" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "nolines", &nolines ) ->
-			set( "direction", direction ) );
-
-	return( nolines );
-}
-
-VImage VImage::labelregions( VOption *options )
-{
-	VImage mask;
-
-	call( "labelregions" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "mask", &mask ) );
-
-	return( mask );
-}
-
-void VImage::draw_rect( std::vector ink , int left , int top , int width , int height , VOption *options )
-{
-	call( "draw_rect" ,
-		(options ? options : VImage::option()) ->
-			set( "image", *this ) ->
-			set( "ink", ink ) ->
-			set( "left", left ) ->
-			set( "top", top ) ->
-			set( "width", width ) ->
-			set( "height", height ) );
-}
-
-void VImage::draw_mask( std::vector ink , VImage mask , int x , int y , VOption *options )
-{
-	call( "draw_mask" ,
-		(options ? options : VImage::option()) ->
-			set( "image", *this ) ->
-			set( "ink", ink ) ->
-			set( "mask", mask ) ->
-			set( "x", x ) ->
-			set( "y", y ) );
-}
-
-void VImage::draw_line( std::vector ink , int x1 , int y1 , int x2 , int y2 , VOption *options )
-{
-	call( "draw_line" ,
-		(options ? options : VImage::option()) ->
-			set( "image", *this ) ->
-			set( "ink", ink ) ->
-			set( "x1", x1 ) ->
-			set( "y1", y1 ) ->
-			set( "x2", x2 ) ->
-			set( "y2", y2 ) );
-}
-
-void VImage::draw_circle( std::vector ink , int cx , int cy , int radius , VOption *options )
-{
-	call( "draw_circle" ,
-		(options ? options : VImage::option()) ->
-			set( "image", *this ) ->
-			set( "ink", ink ) ->
-			set( "cx", cx ) ->
-			set( "cy", cy ) ->
-			set( "radius", radius ) );
-}
-
-void VImage::draw_flood( std::vector ink , int x , int y , VOption *options )
-{
-	call( "draw_flood" ,
-		(options ? options : VImage::option()) ->
-			set( "image", *this ) ->
-			set( "ink", ink ) ->
-			set( "x", x ) ->
-			set( "y", y ) );
-}
-
-void VImage::draw_image( VImage sub , int x , int y , VOption *options )
-{
-	call( "draw_image" ,
-		(options ? options : VImage::option()) ->
-			set( "image", *this ) ->
-			set( "sub", sub ) ->
-			set( "x", x ) ->
-			set( "y", y ) );
-}
-
-void VImage::draw_smudge( int left , int top , int width , int height , VOption *options )
-{
-	call( "draw_smudge" ,
-		(options ? options : VImage::option()) ->
-			set( "image", *this ) ->
-			set( "left", left ) ->
-			set( "top", top ) ->
-			set( "width", width ) ->
-			set( "height", height ) );
-}
-
-VImage VImage::merge( VImage sec , VipsDirection direction , int dx , int dy , VOption *options )
-{
-	VImage out;
-
-	call( "merge" ,
-		(options ? options : VImage::option()) ->
-			set( "ref", *this ) ->
-			set( "sec", sec ) ->
-			set( "out", &out ) ->
-			set( "direction", direction ) ->
-			set( "dx", dx ) ->
-			set( "dy", dy ) );
-
-	return( out );
-}
-
-VImage VImage::mosaic( VImage sec , VipsDirection direction , int xref , int yref , int xsec , int ysec , VOption *options )
-{
-	VImage out;
-
-	call( "mosaic" ,
-		(options ? options : VImage::option()) ->
-			set( "ref", *this ) ->
-			set( "sec", sec ) ->
-			set( "out", &out ) ->
-			set( "direction", direction ) ->
-			set( "xref", xref ) ->
-			set( "yref", yref ) ->
-			set( "xsec", xsec ) ->
-			set( "ysec", ysec ) );
-
-	return( out );
-}
-
-VImage VImage::mosaic1( VImage sec , VipsDirection direction , int xr1 , int yr1 , int xs1 , int ys1 , int xr2 , int yr2 , int xs2 , int ys2 , VOption *options )
-{
-	VImage out;
-
-	call( "mosaic1" ,
-		(options ? options : VImage::option()) ->
-			set( "ref", *this ) ->
-			set( "sec", sec ) ->
-			set( "out", &out ) ->
-			set( "direction", direction ) ->
-			set( "xr1", xr1 ) ->
-			set( "yr1", yr1 ) ->
-			set( "xs1", xs1 ) ->
-			set( "ys1", ys1 ) ->
-			set( "xr2", xr2 ) ->
-			set( "yr2", yr2 ) ->
-			set( "xs2", xs2 ) ->
-			set( "ys2", ys2 ) );
-
-	return( out );
-}
-
-VImage VImage::match( VImage sec , int xr1 , int yr1 , int xs1 , int ys1 , int xr2 , int yr2 , int xs2 , int ys2 , VOption *options )
-{
-	VImage out;
-
-	call( "match" ,
-		(options ? options : VImage::option()) ->
-			set( "ref", *this ) ->
-			set( "sec", sec ) ->
-			set( "out", &out ) ->
-			set( "xr1", xr1 ) ->
-			set( "yr1", yr1 ) ->
-			set( "xs1", xs1 ) ->
-			set( "ys1", ys1 ) ->
-			set( "xr2", xr2 ) ->
-			set( "yr2", yr2 ) ->
-			set( "xs2", xs2 ) ->
-			set( "ys2", ys2 ) );
-
-	return( out );
-}
-
-VImage VImage::globalbalance( VOption *options )
-{
-	VImage out;
-
-	call( "globalbalance" ,
-		(options ? options : VImage::option()) ->
-			set( "in", *this ) ->
-			set( "out", &out ) );
-
-	return( out );
-}
-
diff --git a/src/metadata.cc b/src/metadata.cc
index 3e6ed9b86..2fde7bf6a 100644
--- a/src/metadata.cc
+++ b/src/metadata.cc
@@ -1,51 +1,36 @@
-// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+/*!
+ Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+#include <cmath>
 #include <numeric>
+#include <cstring>
+#include <string>
 #include <vector>
-#include <node.h>
-#include <nan.h>
+#include <napi.h>
 #include <vips/vips8>
-#include "common.h"
-#include "metadata.h"
+#include "./common.h"
+#include "./metadata.h"
+
+static void* readPNGComment(VipsImage *image, const char *field, GValue *value, void *p);
 
-class MetadataWorker : public Nan::AsyncWorker {
+class MetadataWorker : public Napi::AsyncWorker {
  public:
-  MetadataWorker(
-    Nan::Callback *callback, MetadataBaton *baton, Nan::Callback *debuglog,
-    std::vector<v8::Local<v8::Object>> const buffersToPersist) :
-    Nan::AsyncWorker(callback), baton(baton), debuglog(debuglog),
-    buffersToPersist(buffersToPersist) {
-    // Protect Buffer objects from GC, keyed on index
-    std::accumulate(buffersToPersist.begin(), buffersToPersist.end(), 0,
-      [this](uint32_t index, v8::Local<v8::Object> const buffer) -> uint32_t {
-        SaveToPersistent(index, buffer);
-        return index + 1;
-      });
-  }
+  MetadataWorker(Napi::Function callback, MetadataBaton *baton, Napi::Function debuglog) :
+    Napi::AsyncWorker(callback), baton(baton), debuglog(Napi::Persistent(debuglog)) {}
   ~MetadataWorker() {}
 
   void Execute() {
     // Decrement queued task counter
-    g_atomic_int_dec_and_test(&sharp::counterQueue);
+    sharp::counterQueue--;
 
     vips::VImage image;
     sharp::ImageType imageType = sharp::ImageType::UNKNOWN;
     try {
-      std::tie(image, imageType) = OpenInput(baton->input, VIPS_ACCESS_SEQUENTIAL);
+      std::tie(image, imageType) = OpenInput(baton->input);
     } catch (vips::VError const &err) {
       (baton->err).append(err.what());
     }
@@ -61,9 +46,60 @@ class MetadataWorker : public Nan::AsyncWorker {
       if (sharp::HasDensity(image)) {
         baton->density = sharp::GetDensity(image);
       }
+      if (image.get_typeof("jpeg-chroma-subsample") == VIPS_TYPE_REF_STRING) {
+        baton->chromaSubsampling = image.get_string("jpeg-chroma-subsample");
+      }
+      if (image.get_typeof("interlaced") == G_TYPE_INT) {
+        baton->isProgressive = image.get_int("interlaced") == 1;
+      }
+      if (image.get_typeof(VIPS_META_PALETTE) == G_TYPE_INT) {
+        baton->isPalette = image.get_int(VIPS_META_PALETTE);
+      }
+      if (image.get_typeof(VIPS_META_BITS_PER_SAMPLE) == G_TYPE_INT) {
+        baton->bitsPerSample = image.get_int(VIPS_META_BITS_PER_SAMPLE);
+      }
+      if (image.get_typeof(VIPS_META_N_PAGES) == G_TYPE_INT) {
+        baton->pages = image.get_int(VIPS_META_N_PAGES);
+      }
+      if (image.get_typeof(VIPS_META_PAGE_HEIGHT) == G_TYPE_INT) {
+        baton->pageHeight = image.get_int(VIPS_META_PAGE_HEIGHT);
+      }
+      if (image.get_typeof("loop") == G_TYPE_INT) {
+        baton->loop = image.get_int("loop");
+      }
+      if (image.get_typeof("delay") == VIPS_TYPE_ARRAY_INT) {
+        baton->delay = image.get_array_int("delay");
+      }
+      if (image.get_typeof("heif-primary") == G_TYPE_INT) {
+        baton->pagePrimary = image.get_int("heif-primary");
+      }
+      if (image.get_typeof("heif-compression") == VIPS_TYPE_REF_STRING) {
+        baton->compression = image.get_string("heif-compression");
+      }
+      if (image.get_typeof(VIPS_META_RESOLUTION_UNIT) == VIPS_TYPE_REF_STRING) {
+        baton->resolutionUnit = image.get_string(VIPS_META_RESOLUTION_UNIT);
+      }
+      if (image.get_typeof("magick-format") == VIPS_TYPE_REF_STRING) {
+        baton->formatMagick = image.get_string("magick-format");
+      }
+      if (image.get_typeof("openslide.level-count") == VIPS_TYPE_REF_STRING) {
+        int const levels = std::stoi(image.get_string("openslide.level-count"));
+        for (int l = 0; l < levels; l++) {
+          std::string prefix = "openslide.level[" + std::to_string(l) + "].";
+          int const width = std::stoi(image.get_string((prefix + "width").data()));
+          int const height = std::stoi(image.get_string((prefix + "height").data()));
+          baton->levels.push_back(std::pair<int, int>(width, height));
+        }
+      }
+      if (image.get_typeof(VIPS_META_N_SUBIFDS) == G_TYPE_INT) {
+        baton->subifds = image.get_int(VIPS_META_N_SUBIFDS);
+      }
       baton->hasProfile = sharp::HasProfile(image);
+      if (image.get_typeof("background") == VIPS_TYPE_ARRAY_DOUBLE) {
+        baton->background = image.get_array_double("background");
+      }
       // Derived attributes
-      baton->hasAlpha = sharp::HasAlpha(image);
+      baton->hasAlpha = image.has_alpha();
       baton->orientation = sharp::ExifOrientation(image);
       // EXIF
       if (image.get_typeof(VIPS_META_EXIF_NAME) == VIPS_TYPE_BLOB) {
@@ -81,6 +117,32 @@ class MetadataWorker : public Nan::AsyncWorker {
         memcpy(baton->icc, icc, iccLength);
         baton->iccLength = iccLength;
       }
+      // IPTC
+      if (image.get_typeof(VIPS_META_IPTC_NAME) == VIPS_TYPE_BLOB) {
+        size_t iptcLength;
+        void const *iptc = image.get_blob(VIPS_META_IPTC_NAME, &iptcLength);
+        baton->iptc = static_cast<char *>(g_malloc(iptcLength));
+        memcpy(baton->iptc, iptc, iptcLength);
+        baton->iptcLength = iptcLength;
+      }
+      // XMP
+      if (image.get_typeof(VIPS_META_XMP_NAME) == VIPS_TYPE_BLOB) {
+        size_t xmpLength;
+        void const *xmp = image.get_blob(VIPS_META_XMP_NAME, &xmpLength);
+        baton->xmp = static_cast<char *>(g_malloc(xmpLength));
+        memcpy(baton->xmp, xmp, xmpLength);
+        baton->xmpLength = xmpLength;
+      }
+      // TIFFTAG_PHOTOSHOP
+      if (image.get_typeof(VIPS_META_PHOTOSHOP_NAME) == VIPS_TYPE_BLOB) {
+        size_t tifftagPhotoshopLength;
+        void const *tifftagPhotoshop = image.get_blob(VIPS_META_PHOTOSHOP_NAME, &tifftagPhotoshopLength);
+        baton->tifftagPhotoshop = static_cast<char *>(g_malloc(tifftagPhotoshopLength));
+        memcpy(baton->tifftagPhotoshop, tifftagPhotoshop, tifftagPhotoshopLength);
+        baton->tifftagPhotoshopLength = tifftagPhotoshopLength;
+      }
+      // PNG comments
+      vips_image_map(image.get_image(), readPNGComment, &baton->comments);
     }
 
     // Clean up
@@ -88,92 +150,197 @@ class MetadataWorker : public Nan::AsyncWorker {
     vips_thread_shutdown();
   }
 
-  void HandleOKCallback() {
-    using Nan::New;
-    using Nan::Set;
-    Nan::HandleScope();
+  void OnOK() {
+    Napi::Env env = Env();
+    Napi::HandleScope scope(env);
 
-    v8::Local<v8::Value> argv[2] = { Nan::Null(), Nan::Null() };
-    if (!baton->err.empty()) {
-      argv[0] = Nan::Error(baton->err.data());
-    } else {
-      // Metadata Object
-      v8::Local<v8::Object> info = New<v8::Object>();
-      Set(info, New("format").ToLocalChecked(), New(baton->format).ToLocalChecked());
-      Set(info, New("width").ToLocalChecked(), New(baton->width));
-      Set(info, New("height").ToLocalChecked(), New(baton->height));
-      Set(info, New("space").ToLocalChecked(), New(baton->space).ToLocalChecked());
-      Set(info, New("channels").ToLocalChecked(), New(baton->channels));
-      Set(info, New("depth").ToLocalChecked(), New(baton->depth).ToLocalChecked());
+    // Handle warnings
+    std::string warning = sharp::VipsWarningPop();
+    while (!warning.empty()) {
+      debuglog.Call(Receiver().Value(), { Napi::String::New(env, warning) });
+      warning = sharp::VipsWarningPop();
+    }
+
+    if (baton->err.empty()) {
+      Napi::Object info = Napi::Object::New(env);
+      info.Set("format", baton->format);
+      if (baton->input->bufferLength > 0) {
+        info.Set("size", baton->input->bufferLength);
+      }
+      info.Set("width", baton->width);
+      info.Set("height", baton->height);
+      info.Set("space", baton->space);
+      info.Set("channels", baton->channels);
+      info.Set("depth", baton->depth);
       if (baton->density > 0) {
-        Set(info, New("density").ToLocalChecked(), New(baton->density));
+        info.Set("density", baton->density);
+      }
+      if (!baton->chromaSubsampling.empty()) {
+        info.Set("chromaSubsampling", baton->chromaSubsampling);
+      }
+      info.Set("isProgressive", baton->isProgressive);
+      info.Set("isPalette", baton->isPalette);
+      if (baton->bitsPerSample > 0) {
+        info.Set("bitsPerSample", baton->bitsPerSample);
+        if (baton->isPalette) {
+          // Deprecated, remove with libvips 8.17.0
+          info.Set("paletteBitDepth", baton->bitsPerSample);
+        }
+      }
+      if (baton->pages > 0) {
+        info.Set("pages", baton->pages);
       }
-      Set(info, New("hasProfile").ToLocalChecked(), New(baton->hasProfile));
-      Set(info, New("hasAlpha").ToLocalChecked(), New(baton->hasAlpha));
+      if (baton->pageHeight > 0) {
+        info.Set("pageHeight", baton->pageHeight);
+      }
+      if (baton->loop >= 0) {
+        info.Set("loop", baton->loop);
+      }
+      if (!baton->delay.empty()) {
+        int i = 0;
+        Napi::Array delay = Napi::Array::New(env, static_cast<size_t>(baton->delay.size()));
+        for (int const d : baton->delay) {
+          delay.Set(i++, d);
+        }
+        info.Set("delay", delay);
+      }
+      if (baton->pagePrimary > -1) {
+        info.Set("pagePrimary", baton->pagePrimary);
+      }
+      if (!baton->compression.empty()) {
+        info.Set("compression", baton->compression);
+      }
+      if (!baton->resolutionUnit.empty()) {
+        info.Set("resolutionUnit", baton->resolutionUnit == "in" ? "inch" : baton->resolutionUnit);
+      }
+      if (!baton->formatMagick.empty()) {
+        info.Set("formatMagick", baton->formatMagick);
+      }
+      if (!baton->levels.empty()) {
+        int i = 0;
+        Napi::Array levels = Napi::Array::New(env, static_cast<size_t>(baton->levels.size()));
+        for (const auto& [width, height] : baton->levels) {
+          Napi::Object level = Napi::Object::New(env);
+          level.Set("width", width);
+          level.Set("height", height);
+          levels.Set(i++, level);
+        }
+        info.Set("levels", levels);
+      }
+      if (baton->subifds > 0) {
+        info.Set("subifds", baton->subifds);
+      }
+      if (!baton->background.empty()) {
+        Napi::Object background = Napi::Object::New(env);
+        if (baton->background.size() == 3) {
+          background.Set("r", baton->background[0]);
+          background.Set("g", baton->background[1]);
+          background.Set("b", baton->background[2]);
+        } else {
+          background.Set("gray", round(baton->background[0] * 100 / 255));
+        }
+        info.Set("background", background);
+      }
+      info.Set("hasProfile", baton->hasProfile);
+      info.Set("hasAlpha", baton->hasAlpha);
       if (baton->orientation > 0) {
-        Set(info, New("orientation").ToLocalChecked(), New(baton->orientation));
+        info.Set("orientation", baton->orientation);
+      }
+      Napi::Object autoOrient = Napi::Object::New(env);
+      info.Set("autoOrient", autoOrient);
+      if (baton->orientation >= 5) {
+        autoOrient.Set("width", baton->height);
+        autoOrient.Set("height", baton->width);
+      } else {
+        autoOrient.Set("width", baton->width);
+        autoOrient.Set("height", baton->height);
       }
       if (baton->exifLength > 0) {
-        Set(info,
-          New("exif").ToLocalChecked(),
-          Nan::NewBuffer(baton->exif, baton->exifLength, sharp::FreeCallback, nullptr).ToLocalChecked());
+        info.Set("exif", Napi::Buffer<char>::NewOrCopy(env, baton->exif, baton->exifLength, sharp::FreeCallback));
       }
       if (baton->iccLength > 0) {
-        Set(info,
-          New("icc").ToLocalChecked(),
-          Nan::NewBuffer(baton->icc, baton->iccLength, sharp::FreeCallback, nullptr).ToLocalChecked());
+        info.Set("icc", Napi::Buffer<char>::NewOrCopy(env, baton->icc, baton->iccLength, sharp::FreeCallback));
+      }
+      if (baton->iptcLength > 0) {
+        info.Set("iptc", Napi::Buffer<char>::NewOrCopy(env, baton->iptc, baton->iptcLength, sharp::FreeCallback));
       }
-      argv[1] = info;
+      if (baton->xmpLength > 0) {
+        if (g_utf8_validate(static_cast<char const *>(baton->xmp), baton->xmpLength, nullptr)) {
+          info.Set("xmpAsString",
+            Napi::String::New(env, static_cast<char const *>(baton->xmp), baton->xmpLength));
+        }
+        info.Set("xmp", Napi::Buffer<char>::NewOrCopy(env, baton->xmp, baton->xmpLength, sharp::FreeCallback));
+      }
+      if (baton->tifftagPhotoshopLength > 0) {
+        info.Set("tifftagPhotoshop",
+          Napi::Buffer<char>::NewOrCopy(env, baton->tifftagPhotoshop,
+            baton->tifftagPhotoshopLength, sharp::FreeCallback));
+      }
+      if (baton->comments.size() > 0) {
+        int i = 0;
+        Napi::Array comments = Napi::Array::New(env, baton->comments.size());
+        for (const auto& [keyword, text] : baton->comments) {
+          Napi::Object comment = Napi::Object::New(env);
+          comment.Set("keyword", keyword);
+          comment.Set("text", text);
+          comments.Set(i++, comment);
+        }
+        info.Set("comments", comments);
+      }
+      Callback().Call(Receiver().Value(), { env.Null(), info });
+    } else {
+      Callback().Call(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() });
     }
 
-    // Dispose of Persistent wrapper around input Buffers so they can be garbage collected
-    std::accumulate(buffersToPersist.begin(), buffersToPersist.end(), 0,
-      [this](uint32_t index, v8::Local<v8::Object> const buffer) -> uint32_t {
-        GetFromPersistent(index);
-        return index + 1;
-      });
     delete baton->input;
     delete baton;
-
-    // Handle warnings
-    std::string warning = sharp::VipsWarningPop();
-    while (!warning.empty()) {
-      v8::Local<v8::Value> message[1] = { New(warning).ToLocalChecked() };
-      debuglog->Call(1, message);
-      warning = sharp::VipsWarningPop();
-    }
-
-    // Return to JavaScript
-    callback->Call(2, argv);
   }
 
  private:
   MetadataBaton* baton;
-  Nan::Callback *debuglog;
-  std::vector<v8::Local<v8::Object>> buffersToPersist;
+  Napi::FunctionReference debuglog;
};
 
 /*
   metadata(options, callback)
 */
-NAN_METHOD(metadata) {
-  // Input Buffers must not undergo GC compaction during processing
-  std::vector<v8::Local<v8::Object>> buffersToPersist;
-
+Napi::Value metadata(const Napi::CallbackInfo& info) {
   // V8 objects are converted to non-V8 types held in the baton struct
   MetadataBaton *baton = new MetadataBaton;
-  v8::Local<v8::Object> options = info[0].As<v8::Object>();
+  Napi::Object options = info[size_t(0)].As<Napi::Object>();
 
   // Input
-  baton->input = sharp::CreateInputDescriptor(sharp::AttrAs<v8::Object>(options, "input"), buffersToPersist);
+  baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
 
   // Function to notify of libvips warnings
-  Nan::Callback *debuglog = new Nan::Callback(sharp::AttrAs<v8::Function>(options, "debuglog"));
+  Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
 
   // Join queue for worker thread
-  Nan::Callback *callback = new Nan::Callback(info[1].As<v8::Function>());
-  Nan::AsyncQueueWorker(new MetadataWorker(callback, baton, debuglog, buffersToPersist));
+  Napi::Function callback = info[size_t(1)].As<Napi::Function>();
+  MetadataWorker *worker = new MetadataWorker(callback, baton, debuglog);
+  worker->Receiver().Set("options", options);
+  worker->Queue();
 
   // Increment queued task counter
-  g_atomic_int_inc(&sharp::counterQueue);
+  sharp::counterQueue++;
+
+  return info.Env().Undefined();
+}
+
+const char *PNG_COMMENT_START = "png-comment-";
+const int PNG_COMMENT_START_LEN = strlen(PNG_COMMENT_START);
+
+static void* readPNGComment(VipsImage *image, const char *field, GValue *value, void *p) {
+  MetadataComments *comments = static_cast<MetadataComments *>(p);
+
+  if (vips_isprefix(PNG_COMMENT_START, field)) {
+    const char *keyword = strchr(field + PNG_COMMENT_START_LEN, '-');
+    const char *str;
+    if (keyword != NULL && !vips_image_get_string(image, field, &str)) {
+      keyword++;  // Skip the hyphen
+      comments->push_back(std::make_pair(keyword, str));
    }
  }

+  return NULL;
 }
diff --git a/src/metadata.h b/src/metadata.h
index 501756396..6f02d452c 100644
--- a/src/metadata.h
+++ b/src/metadata.h
@@ -1,25 +1,19 @@
-// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-//    http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
 #ifndef SRC_METADATA_H_
 #define SRC_METADATA_H_
 
 #include <string>
-#include <nan.h>
+#include <napi.h>
+#include <vips/vips8>
 
 #include "./common.h"
 
+typedef std::vector<std::pair<std::string, std::string>> MetadataComments;
+
 struct MetadataBaton {
   // Input
   sharp::InputDescriptor *input;
@@ -31,6 +25,21 @@ struct MetadataBaton {
   int channels;
   std::string depth;
   int density;
+  std::string chromaSubsampling;
+  bool isProgressive;
+  bool isPalette;
+  int bitsPerSample;
+  int pages;
+  int pageHeight;
+  int loop;
+  std::vector<int> delay;
+  int pagePrimary;
+  std::string compression;
+  std::string resolutionUnit;
+  std::string formatMagick;
+  std::vector<std::pair<int, int>> levels;
+  int subifds;
+  std::vector<double> background;
   bool hasProfile;
   bool hasAlpha;
   int orientation;
@@ -38,6 +47,13 @@ struct MetadataBaton {
   size_t exifLength;
   char *icc;
   size_t iccLength;
+  char *iptc;
+  size_t iptcLength;
+  char *xmp;
+  size_t xmpLength;
+  char *tifftagPhotoshop;
+  size_t tifftagPhotoshopLength;
+  MetadataComments comments;
   std::string err;
 
   MetadataBaton():
@@ -46,15 +62,29 @@ struct MetadataBaton {
     height(0),
     channels(0),
     density(0),
+    isProgressive(false),
+    isPalette(false),
+    bitsPerSample(0),
+    pages(0),
+    pageHeight(0),
+    loop(-1),
+    pagePrimary(-1),
+    subifds(0),
     hasProfile(false),
     hasAlpha(false),
     orientation(0),
     exif(nullptr),
     exifLength(0),
     icc(nullptr),
-    iccLength(0) {}
+    iccLength(0),
+    iptc(nullptr),
+    iptcLength(0),
+    xmp(nullptr),
+    xmpLength(0),
+    tifftagPhotoshop(nullptr),
+    tifftagPhotoshopLength(0) {}
 };
 
-NAN_METHOD(metadata);
+Napi::Value metadata(const Napi::CallbackInfo& info);
 
 #endif  // SRC_METADATA_H_
diff --git a/src/operations.cc b/src/operations.cc
index 274ec911f..daeba5ab4 100644
--- a/src/operations.cc
+++ b/src/operations.cc
@@ -1,161 +1,67 @@
-// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-//    http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
 #include <algorithm>
 #include <functional>
 #include <memory>
 #include <tuple>
 #include <vector>
-
 #include <vips/vips8>
 
-#include "common.h"
-#include "operations.h"
+#include "./common.h"
+#include "./operations.h"
 
 using vips::VImage;
 using vips::VError;
 
 namespace sharp {
 
   /*
-    Composite overlayImage over image at given position
-    Assumes alpha channels are already premultiplied and will be unpremultiplied after
+   * Tint an image using the provided RGB.
   */
-  VImage Composite(VImage image, VImage overlayImage, int const left, int const top) {
-    if (HasAlpha(overlayImage)) {
-      // Alpha composite
-      if (overlayImage.width() < image.width() || overlayImage.height() < image.height()) {
-        // Enlarge overlay
-        std::vector<double> const background { 0.0, 0.0, 0.0, 0.0 };
-        overlayImage = overlayImage.embed(left, top, image.width(), image.height(), VImage::option()
-          ->set("extend", VIPS_EXTEND_BACKGROUND)
-          ->set("background", background));
-      }
-      return AlphaComposite(image, overlayImage);
-    } else {
-      if (HasAlpha(image)) {
-        // Add alpha channel to overlayImage so channels match
-        double const multiplier = sharp::Is16Bit(overlayImage.interpretation()) ? 256.0 : 1.0;
-        overlayImage = overlayImage.bandjoin(
-          VImage::new_matrix(overlayImage.width(), overlayImage.height()).new_from_image(255 * multiplier));
-      }
-      return image.insert(overlayImage, left, top);
-    }
-  }
-
-  VImage AlphaComposite(VImage dst, VImage src) {
-    // Split src into non-alpha and alpha channels
-    VImage srcWithoutAlpha = src.extract_band(0, VImage::option()->set("n", src.bands() - 1));
-    VImage srcAlpha = src[src.bands() - 1] * (1.0 / 255.0);
-
-    // Split dst into non-alpha and alpha channels
-    VImage dstWithoutAlpha = dst.extract_band(0, VImage::option()->set("n", dst.bands() - 1));
-    VImage dstAlpha = dst[dst.bands() - 1] * (1.0 / 255.0);
-
-    //
-    // Compute normalized output alpha channel:
-    //
-    // References:
-    // - http://en.wikipedia.org/wiki/Alpha_compositing#Alpha_blending
-    // - https://github.com/jcupitt/ruby-vips/issues/28#issuecomment-9014826
-    //
-    // out_a = src_a + dst_a * (1 - src_a)
-    //                 ^^^^^^^^^^^
-    //                 t0
-    VImage t0 = srcAlpha.linear(-1.0, 1.0);
-    VImage outAlphaNormalized = srcAlpha + dstAlpha * t0;
-
-    //
-    // Compute output RGB channels:
-    //
-    // Wikipedia:
-    // out_rgb = (src_rgb * src_a + dst_rgb * dst_a * (1 - src_a)) / out_a
-    //                              ^^^^^^^^^^^
-    //                              t0
-    //
-    // Omit division by `out_a` since `Compose` is supposed to output a
-    // premultiplied RGBA image as reversal of premultiplication is handled
-    // externally.
-    //
-    VImage outRGBPremultiplied = srcWithoutAlpha + dstWithoutAlpha * t0;
-
-    // Combine RGB and alpha channel into output image:
-    return outRGBPremultiplied.bandjoin(outAlphaNormalized * 255.0);
-  }
-
-  /*
-    Cutout src over dst with given gravity.
-  */
-  VImage Cutout(VImage mask, VImage dst, const int gravity) {
-    using sharp::CalculateCrop;
-    using sharp::HasAlpha;
-    using sharp::MaximumImageAlpha;
-
-    bool maskHasAlpha = HasAlpha(mask);
-
-    if (!maskHasAlpha && mask.bands() > 1) {
-      throw VError("Overlay image must have an alpha channel or one band");
+  VImage Tint(VImage image, std::vector<double> const tint) {
+    std::vector<double> const tintLab = (VImage::black(1, 1) + tint)
+      .colourspace(VIPS_INTERPRETATION_LAB, VImage::option()->set("source_space", VIPS_INTERPRETATION_sRGB))
+      .getpoint(0, 0);
+    // LAB identity function
+    VImage identityLab = VImage::identity(VImage::option()->set("bands", 3))
+      .colourspace(VIPS_INTERPRETATION_LAB, VImage::option()->set("source_space", VIPS_INTERPRETATION_sRGB));
+    // Scale luminance range, 0.0 to 1.0
+    VImage l = identityLab[0] / 100;
+    // Weighting functions
+    VImage weightL = 1.0 - 4.0 * ((l - 0.5) * (l - 0.5));
+    VImage weightAB = (weightL * tintLab).extract_band(1, VImage::option()->set("n", 2));
+    identityLab = identityLab[0].bandjoin(weightAB);
+    // Convert lookup table to sRGB
+    VImage lut = identityLab.colourspace(VIPS_INTERPRETATION_sRGB,
+      VImage::option()->set("source_space", VIPS_INTERPRETATION_LAB));
+    // Original colourspace
+    VipsInterpretation typeBeforeTint = image.interpretation();
+    if (typeBeforeTint == VIPS_INTERPRETATION_RGB) {
+      typeBeforeTint = VIPS_INTERPRETATION_sRGB;
     }
-    if (!HasAlpha(dst)) {
-      throw VError("Image to be overlaid must have an alpha channel");
-    }
-    if (mask.width() > dst.width() || mask.height() > dst.height()) {
-      throw VError("Overlay image must have same dimensions or smaller");
-    }
-
-    // Enlarge overlay mask, if required
-    if (mask.width() < dst.width() || mask.height() < dst.height()) {
-      // Calculate the (left, top) coordinates of the output image within the input image, applying the given gravity.
-      int left;
-      int top;
-      std::tie(left, top) = CalculateCrop(dst.width(), dst.height(), mask.width(), mask.height(), gravity);
-      // Embed onto transparent background
-      std::vector<double> background { 0.0, 0.0, 0.0, 0.0 };
-      mask = mask.embed(left, top, dst.width(), dst.height(), VImage::option()
-        ->set("extend", VIPS_EXTEND_BACKGROUND)
-        ->set("background", background));
-    }
-
-    // we use the mask alpha if it has alpha
-    if (maskHasAlpha) {
-      mask = mask.extract_band(mask.bands() - 1, VImage::option()->set("n", 1));;
+    // Apply lookup table
+    if (image.has_alpha()) {
+      VImage alpha = image[image.bands() - 1];
+      image = RemoveAlpha(image)
+        .colourspace(VIPS_INTERPRETATION_B_W)
+        .maplut(lut)
+        .colourspace(typeBeforeTint)
+        .bandjoin(alpha);
+    } else {
+      image = image
+        .colourspace(VIPS_INTERPRETATION_B_W)
+        .maplut(lut)
+        .colourspace(typeBeforeTint);
     }
-
-    // Split dst into an optional alpha
-    VImage dstAlpha = dst.extract_band(dst.bands() - 1, VImage::option()->set("n", 1));
-
-    // we use the dst non-alpha
-    dst = dst.extract_band(0, VImage::option()->set("n", dst.bands() - 1));
-
-    // the range of the mask and the image need to match .. one could be
-    // 16-bit, one 8-bit
-    double const dstMax = MaximumImageAlpha(dst.interpretation());
-    double const maskMax = MaximumImageAlpha(mask.interpretation());
-
-    // combine the new mask and the existing alpha ... there are
-    // many ways of doing this, mult is the simplest
-    mask = dstMax * ((mask / maskMax) * (dstAlpha / dstMax));
-
-    // append the mask to the image data ... the mask might be float now,
-    // we must cast the format down to match the image data
-    return dst.bandjoin(mask.cast(dst.format()));
+    return image;
   }
 
   /*
    * Stretch luminance to cover full dynamic range.
   */
-  VImage Normalise(VImage image) {
+  VImage Normalise(VImage image, int const lower, int const upper) {
     // Get original colourspace
     VipsInterpretation typeBeforeNormalize = image.interpretation();
     if (typeBeforeNormalize == VIPS_INTERPRETATION_RGB) {
@@ -165,11 +71,12 @@ namespace sharp {
     VImage lab = image.colourspace(VIPS_INTERPRETATION_LAB);
     // Extract luminance
     VImage luminance = lab[0];
+
     // Find luminance range
-    VImage stats = luminance.stats();
-    double min = stats(0, 0)[0];
-    double max = stats(1, 0)[0];
-    if (min != max) {
+    int const min = lower == 0 ? luminance.min() : luminance.percent(lower);
+    int const max = upper == 100 ? luminance.max() : luminance.percent(upper);
+
+    if (std::abs(max - min) > 1) {
       // Extract chroma
       VImage chroma = lab.extract_band(1, VImage::option()->set("n", 2));
       // Calculate multiplication factor and addition
@@ -178,7 +85,7 @@ namespace sharp {
       // Scale luminance, join to chroma, convert back to original colourspace
       VImage normalized = luminance.linear(f, a).bandjoin(chroma).colourspace(typeBeforeNormalize);
       // Attach original alpha channel, if any
-      if (HasAlpha(image)) {
+      if (image.has_alpha()) {
        // Extract original alpha channel
         VImage alpha = image[image.bands() - 1];
         // Join alpha channel to normalised image
@@ -190,25 +97,56 @@ namespace sharp {
     return image;
   }
 
+  /*
+   * Contrast limiting adaptive histogram equalization (CLAHE)
+   */
+  VImage Clahe(VImage image, int const width, int const height, int const maxSlope) {
+    return image.hist_local(width, height, VImage::option()->set("max_slope", maxSlope));
+  }
+
   /*
    * Gamma encoding/decoding
    */
   VImage Gamma(VImage image, double const exponent) {
-    if (HasAlpha(image)) {
+    if (image.has_alpha()) {
       // Separate alpha channel
-      VImage imageWithoutAlpha = image.extract_band(0,
-        VImage::option()->set("n", image.bands() - 1));
       VImage alpha = image[image.bands() - 1];
-      return imageWithoutAlpha.gamma(VImage::option()->set("exponent", exponent)).bandjoin(alpha);
+      return RemoveAlpha(image).gamma(VImage::option()->set("exponent", exponent)).bandjoin(alpha);
     } else {
       return image.gamma(VImage::option()->set("exponent", exponent));
     }
   }
 
+  /*
+   * Flatten image to remove alpha channel
+   */
+  VImage Flatten(VImage image, std::vector<double> flattenBackground) {
+    double const multiplier = sharp::Is16Bit(image.interpretation()) ? 256.0 : 1.0;
+    std::vector<double> background {
+      flattenBackground[0] * multiplier,
+      flattenBackground[1] * multiplier,
+      flattenBackground[2] * multiplier
+    };
+    return image.flatten(VImage::option()->set("background", background));
+  }
+
+  /**
+   * Produce the "negative" of the image.
+   */
+  VImage Negate(VImage image, bool const negateAlpha) {
+    if (image.has_alpha() && !negateAlpha) {
+      // Separate alpha channel
+      VImage alpha = image[image.bands() - 1];
+      return RemoveAlpha(image).invert().bandjoin(alpha);
+    } else {
+      return image.invert();
+    }
+  }
+
   /*
    * Gaussian blur. Use sigma of -1.0 for fast blur.
    */
-  VImage Blur(VImage image, double const sigma) {
+  VImage Blur(VImage image, double const sigma, VipsPrecision precision, double const minAmpl) {
     if (sigma == -1.0) {
       // Fast, mild blur - averages neighbouring pixels
       VImage blur = VImage::new_matrixv(3, 3,
@@ -219,7 +157,9 @@ namespace sharp {
       return image.conv(blur);
     } else {
       // Slower, accurate Gaussian blur
-      return image.gaussblur(sigma);
+      return StaySequential(image).gaussblur(sigma, VImage::option()
+        ->set("precision", precision)
+        ->set("min_ampl", minAmpl));
     }
   }
 
@@ -228,10 +168,10 @@ namespace sharp {
   */
   VImage Convolve(VImage image, int const width, int const height,
     double const scale, double const offset,
-    std::unique_ptr<double[]> const &kernel_v
+    std::vector<double> const &kernel_v
   ) {
     VImage kernel = VImage::new_from_memory(
-      kernel_v.get(),
+      static_cast<void *>(const_cast<double *>(kernel_v.data())),
       width * height * sizeof(double),
       width,
       height,
@@ -243,10 +183,57 @@ namespace sharp {
     return image.conv(kernel);
   }
 
+  /*
+   * Recomb with a Matrix of the given bands/channel size.
+   * Eg. RGB will be a 3x3 matrix.
+   */
+  VImage Recomb(VImage image, std::vector<double> const& matrix) {
+    double* m = const_cast<double *>(matrix.data());
+    image = image.colourspace(VIPS_INTERPRETATION_sRGB);
+    if (matrix.size() == 9) {
+      return image
+        .recomb(image.bands() == 3
+          ? VImage::new_matrix(3, 3, m, 9)
+          : VImage::new_matrixv(4, 4,
+            m[0], m[1], m[2], 0.0,
+            m[3], m[4], m[5], 0.0,
+            m[6], m[7], m[8], 0.0,
+            0.0, 0.0, 0.0, 1.0));
+    } else {
+      return image.recomb(VImage::new_matrix(4, 4, m, 16));
+    }
+  }
+
+  VImage Modulate(VImage image, double const brightness, double const saturation,
+    int const hue, double const lightness) {
+    VipsInterpretation colourspaceBeforeModulate = image.interpretation();
+    if (image.has_alpha()) {
+      // Separate alpha channel
+      VImage alpha = image[image.bands() - 1];
+      return RemoveAlpha(image)
+        .colourspace(VIPS_INTERPRETATION_LCH)
+        .linear(
+          { brightness, saturation, 1 },
+          { lightness, 0.0, static_cast<double>(hue) }
+        )
+        .colourspace(colourspaceBeforeModulate)
+        .bandjoin(alpha);
+    } else {
+      return image
+        .colourspace(VIPS_INTERPRETATION_LCH)
+        .linear(
+          { brightness, saturation, 1 },
+          { lightness, 0.0, static_cast<double>(hue) }
+        )
+        .colourspace(colourspaceBeforeModulate);
    }
  }

   /*
    * Sharpen flat and jagged areas. Use sigma of -1.0 for fast sharpen.
   */
-  VImage Sharpen(VImage image, double const sigma, double const flat, double const jagged) {
+  VImage Sharpen(VImage image, double const sigma, double const m1, double const m2,
+    double const x1, double const y2, double const y3) {
     if (sigma == -1.0) {
       // Fast, mild sharpen
       VImage sharpen = VImage::new_matrixv(3, 3,
@@ -261,29 +248,18 @@ namespace sharp {
       if (colourspaceBeforeSharpen == VIPS_INTERPRETATION_RGB) {
         colourspaceBeforeSharpen = VIPS_INTERPRETATION_sRGB;
       }
-      return image.sharpen(
-        VImage::option()->set("sigma", sigma)->set("m1", flat)->set("m2", jagged))
+      return image
+        .sharpen(VImage::option()
+          ->set("sigma", sigma)
+          ->set("m1", m1)
+          ->set("m2", m2)
+          ->set("x1", x1)
+          ->set("y2", y2)
+          ->set("y3", y3))
         .colourspace(colourspaceBeforeSharpen);
     }
   }
 
-  /*
-    Insert a tile cache to prevent over-computation of any previous operations in the pipeline
-  */
-  VImage TileCache(VImage image, double const factor) {
-    int tile_width;
-    int tile_height;
-    int scanline_count;
-    vips_get_tile_size(image.get_image(), &tile_width, &tile_height, &scanline_count);
-    double const need_lines = 1.2 * scanline_count / factor;
-    return image.tilecache(VImage::option()
-      ->set("tile_width", image.width())
-      ->set("tile_height", 10)
-      ->set("max_tiles", static_cast<int>(round(1.0 + need_lines / 10.0)))
-      ->set("access", VIPS_ACCESS_SEQUENTIAL)
-      ->set("threaded", TRUE));
-  }
-
   VImage Threshold(VImage image, double const threshold, bool const thresholdGrayscale) {
     if (!thresholdGrayscale) {
       return image >= threshold;
@@ -306,56 +282,218 @@ namespace sharp {
     return image.boolean(imageR, boolean);
   }
 
-  VImage Trim(VImage image, int const tolerance) {
-    using sharp::MaximumImageAlpha;
-    // An equivalent of ImageMagick's -trim in C++ ... automatically remove
-    // "boring" image edges.
+  /*
+    Trim an image
+  */
+  VImage Trim(VImage image, std::vector<double> background, double threshold, bool const lineArt) {
+    if (image.width() < 3 && image.height() < 3) {
+      throw VError("Image to trim must be at least 3x3 pixels");
+    }
+    if (background.size() == 0) {
+      // Top-left pixel provides the default background colour if none is given
+      background = image.extract_area(0, 0, 1, 1)(0, 0);
+    } else if (sharp::Is16Bit(image.interpretation())) {
+      for (size_t i = 0; i < background.size(); i++) {
+        background[i] *= 256.0;
+      }
+      threshold *= 256.0;
+    }
+    std::vector<double> backgroundAlpha({ background.back() });
+    if (image.has_alpha()) {
+      background.pop_back();
+    } else {
+      background.resize(image.bands());
+    }
+    int left, top, width, height;
+    left = image.find_trim(&top, &width, &height, VImage::option()
+      ->set("background", background)
+      ->set("line_art", lineArt)
+      ->set("threshold", threshold));
+    if (image.has_alpha()) {
+      // Search alpha channel (A)
+      int leftA, topA, widthA, heightA;
+      VImage alpha = image[image.bands() - 1];
+      leftA = alpha.find_trim(&topA, &widthA, &heightA, VImage::option()
+        ->set("background", backgroundAlpha)
+        ->set("line_art", lineArt)
+        ->set("threshold", threshold));
+      if (widthA > 0 && heightA > 0) {
+        if (width > 0 && height > 0) {
+          // Combined bounding box (B)
+          int const leftB = std::min(left, leftA);
+          int const topB = std::min(top, topA);
+          int const widthB = std::max(left + width, leftA + widthA) - leftB;
+          int const heightB = std::max(top + height, topA + heightA) - topB;
+          return image.extract_area(leftB, topB, widthB, heightB);
+        } else {
+          // Use alpha only
+          return image.extract_area(leftA, topA, widthA, heightA);
+        }
+      }
+    }
+    if (width > 0 && height > 0) {
+      return image.extract_area(left, top, width, height);
+    }
+    return image;
+  }
+
+  /*
+   * Calculate (a * in + b)
+   */
+  VImage Linear(VImage image, std::vector<double> const a, std::vector<double> const b) {
+    size_t const bands = static_cast<size_t>(image.bands());
+    if (a.size() > bands) {
+      throw VError("Band expansion using linear is unsupported");
+    }
+    bool const uchar = !Is16Bit(image.interpretation());
+    if (image.has_alpha() && a.size() != bands && (a.size() == 1 || a.size() == bands - 1 || bands - 1 == 1)) {
+      // Separate alpha channel
+      VImage alpha = image[bands - 1];
+      return RemoveAlpha(image).linear(a, b, VImage::option()->set("uchar", uchar)).bandjoin(alpha);
+    } else {
+      return image.linear(a, b, VImage::option()->set("uchar", uchar));
+    }
+  }
+
+  /*
+   * Unflatten
+   */
+  VImage Unflatten(VImage image) {
+    if (image.has_alpha()) {
+      VImage alpha = image[image.bands() - 1];
+      VImage noAlpha = RemoveAlpha(image);
+      return noAlpha.bandjoin(alpha & (noAlpha.colourspace(VIPS_INTERPRETATION_B_W) < 255));
+    } else {
+      return image.bandjoin(image.colourspace(VIPS_INTERPRETATION_B_W) < 255);
+    }
+  }
+
+  /*
+   * Ensure the image is in a given colourspace
+   */
+  VImage EnsureColourspace(VImage image, VipsInterpretation colourspace) {
+    if (colourspace != VIPS_INTERPRETATION_LAST && image.interpretation() != colourspace) {
+      image = image.colourspace(colourspace,
+        VImage::option()->set("source_space", image.interpretation()));
+    }
+    return image;
+  }
 
-    // We use .project to sum the rows and columns of a 0/255 mask image, the first
-    // non-zero row or column is the object edge. We make the mask image with an
-    // amount-different-from-background image plus a threshold.
+  /*
+   * Split and crop each frame, reassemble, and update pageHeight.
+ */ + VImage CropMultiPage(VImage image, int left, int top, int width, int height, + int nPages, int *pageHeight) { + if (top == 0 && height == *pageHeight) { + // Fast path; no need to adjust the height of the multi-page image + return image.extract_area(left, 0, width, image.height()); + } else { + std::vector<VImage> pages; + pages.reserve(nPages); + + // Split the image into cropped frames + image = StaySequential(image); + for (int i = 0; i < nPages; i++) { + pages.push_back( + image.extract_area(left, *pageHeight * i + top, width, height)); + } - // find the value of the pixel at (0, 0) ... we will search for all pixels - // significantly different from this - std::vector<double> background = image(0, 0); + // Reassemble the frames into a tall, thin image + VImage assembled = VImage::arrayjoin(pages, + VImage::option()->set("across", 1)); - double const max = MaximumImageAlpha(image.interpretation()); + // Update the page height + *pageHeight = height; - // we need to smooth the image, subtract the background from every pixel, take - // the absolute value of the difference, then threshold - VImage mask = (image.median(3) - background).abs() > (max * tolerance / 100); + return assembled; + } + } - // sum mask rows and columns, then search for the first non-zero sum in each - // direction - VImage rows; - VImage columns = mask.project(&rows); + /* + * Split into frames, embed each frame, reassemble, and update pageHeight. 
+ */ + VImage EmbedMultiPage(VImage image, int left, int top, int width, int height, + VipsExtend extendWith, std::vector<double> background, int nPages, int *pageHeight) { + if (top == 0 && height == *pageHeight) { + // Fast path; no need to adjust the height of the multi-page image + return image.embed(left, 0, width, image.height(), VImage::option() + ->set("extend", extendWith) + ->set("background", background)); + } else if (left == 0 && width == image.width()) { + // Fast path; no need to adjust the width of the multi-page image + std::vector<VImage> pages; + pages.reserve(nPages); + + // Rearrange the tall image into a vertical grid + image = image.grid(*pageHeight, nPages, 1); + + // Do the embed on the wide image + image = image.embed(0, top, image.width(), height, VImage::option() + ->set("extend", extendWith) + ->set("background", background)); + + // Split the wide image into frames + for (int i = 0; i < nPages; i++) { + pages.push_back( + image.extract_area(width * i, 0, width, height)); + } - VImage profileLeftV; - VImage profileLeftH = columns.profile(&profileLeftV); + // Reassemble the frames into a tall, thin image + VImage assembled = VImage::arrayjoin(pages, + VImage::option()->set("across", 1)); - VImage profileRightV; - VImage profileRightH = columns.fliphor().profile(&profileRightV); + // Update the page height + *pageHeight = height; - VImage profileTopV; - VImage profileTopH = rows.profile(&profileTopV); + return assembled; + } else { + std::vector<VImage> pages; + pages.reserve(nPages); - VImage profileBottomV; - VImage profileBottomH = rows.flipver().profile(&profileBottomV); + // Split the image into frames + for (int i = 0; i < nPages; i++) { + pages.push_back( + image.extract_area(0, *pageHeight * i, image.width(), *pageHeight)); + } + + // Embed each frame in the target size + for (int i = 0; i < nPages; i++) { + pages[i] = pages[i].embed(left, top, width, height, VImage::option() + ->set("extend", extendWith) + ->set("background", background)); + } - int 
left = static_cast<int>(floor(profileLeftV.min())); - int right = columns.width() - static_cast<int>(floor(profileRightV.min())); - int top = static_cast<int>(floor(profileTopH.min())); - int bottom = rows.height() - static_cast<int>(floor(profileBottomH.min())); + // Reassemble the frames into a tall, thin image + VImage assembled = VImage::arrayjoin(pages, + VImage::option()->set("across", 1)); - int width = right - left; - int height = bottom - top; + // Update the page height + *pageHeight = height; - if (width <= 0 || height <= 0) { - throw VError("Unexpected error while trimming. Try to lower the tolerance"); + return assembled; } + } - // and now crop the original image - return image.extract_area(left, top, width, height); + /* + * Dilate an image + */ + VImage Dilate(VImage image, int const width) { + int const maskWidth = 2 * width + 1; + VImage mask = VImage::new_matrix(maskWidth, maskWidth); + return image.morph( + mask, + VIPS_OPERATION_MORPHOLOGY_DILATE).invert(); + } + + /* + * Erode an image + */ + VImage Erode(VImage image, int const width) { + int const maskWidth = 2 * width + 1; + VImage mask = VImage::new_matrix(maskWidth, maskWidth); + return image.morph( + mask, + VIPS_OPERATION_MORPHOLOGY_ERODE).invert(); } } // namespace sharp diff --git a/src/operations.h b/src/operations.h index 529bca445..c281c02cd 100644 --- a/src/operations.h +++ b/src/operations.h @@ -1,16 +1,7 @@ -// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors. -// -// Licensed under the Apache License, Version 2.0 (the "License"); -// you may not use this file except in compliance with the License. -// You may obtain a copy of the License at -// -// http://www.apache.org/licenses/LICENSE-2.0 -// -// Unless required by applicable law or agreed to in writing, software -// distributed under the License is distributed on an "AS IS" BASIS, -// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
-// See the License for the specific language governing permissions and -// limitations under the License. +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ #ifndef SRC_OPERATIONS_H_ #define SRC_OPERATIONS_H_ @@ -19,6 +10,7 @@ #include #include #include +#include <vector> #include <vips/vips8> using vips::VImage; @@ -26,56 +18,51 @@ using vips::VImage; namespace sharp { /* - Alpha composite src over dst with given gravity. - Assumes alpha channels are already premultiplied and will be unpremultiplied after. + * Tint an image using the provided RGB. */ - VImage Composite(VImage src, VImage dst, const int gravity); + VImage Tint(VImage image, std::vector<double> const tint); /* - Composite overlayImage over image at given position + * Stretch luminance to cover full dynamic range. */ - VImage Composite(VImage image, VImage overlayImage, int const x, int const y); + VImage Normalise(VImage image, int const lower, int const upper); /* - Alpha composite overlayImage over image, assumes matching dimensions - */ - VImage AlphaComposite(VImage image, VImage overlayImage); + * Contrast limiting adapative histogram equalization (CLAHE) + */ + VImage Clahe(VImage image, int const width, int const height, int const maxSlope); /* - Cutout src over dst with given gravity. - */ - VImage Cutout(VImage src, VImage dst, const int gravity); + * Gamma encoding/decoding + */ + VImage Gamma(VImage image, double const exponent); /* - * Stretch luminance to cover full dynamic range. */ - VImage Normalise(VImage image); + * Flatten image to remove alpha channel */ - VImage Flatten(VImage image, std::vector<double> flattenBackground); /* - * Gamma encoding/decoding + * Produce the "negative" of the image. */ - VImage Gamma(VImage image, double const exponent); + VImage Negate(VImage image, bool const negateAlpha); /* * Gaussian blur. Use sigma of -1.0 for fast blur. 
*/ - VImage Blur(VImage image, double const sigma); + VImage Blur(VImage image, double const sigma, VipsPrecision precision, double const minAmpl); /* * Convolution with a kernel. */ VImage Convolve(VImage image, int const width, int const height, - double const scale, double const offset, std::unique_ptr<double[]> const &kernel_v); + double const scale, double const offset, std::vector<double> const &kernel_v); /* * Sharpen flat and jagged areas. Use sigma of -1.0 for fast sharpen. */ - VImage Sharpen(VImage image, double const sigma, double const flat, double const jagged); - - /* - Insert a tile cache to prevent over-computation of any previous operations in the pipeline - */ - VImage TileCache(VImage image, double const factor); + VImage Sharpen(VImage image, double const sigma, double const m1, double const m2, + double const x1, double const y2, double const y3); /* Threshold an image @@ -95,8 +82,56 @@ namespace sharp { /* Trim an image */ - VImage Trim(VImage image, int const tolerance); + VImage Trim(VImage image, std::vector<double> background, double threshold, bool const lineArt); + + /* + * Linear adjustment (a * in + b) + */ + VImage Linear(VImage image, std::vector<double> const a, std::vector<double> const b); + /* + * Unflatten + */ + VImage Unflatten(VImage image); + + /* + * Recomb with a Matrix of the given bands/channel size. + * Eg. RGB will be a 3x3 matrix. + */ + VImage Recomb(VImage image, std::vector<double> const &matrix); + + /* + * Modulate brightness, saturation, hue and lightness + */ + VImage Modulate(VImage image, double const brightness, double const saturation, + int const hue, double const lightness); + + /* + * Ensure the image is in a given colourspace + */ + VImage EnsureColourspace(VImage image, VipsInterpretation colourspace); + + /* + * Split and crop each frame, reassemble, and update pageHeight. 
+ */ + VImage CropMultiPage(VImage image, int left, int top, int width, int height, + int nPages, int *pageHeight); + + /* + * Split into frames, embed each frame, reassemble, and update pageHeight. + */ + VImage EmbedMultiPage(VImage image, int left, int top, int width, int height, + VipsExtend extendWith, std::vector<double> background, int nPages, int *pageHeight); + + /* + * Dilate an image + */ + VImage Dilate(VImage image, int const maskWidth); + + /* + * Erode an image + */ + VImage Erode(VImage image, int const maskWidth); } // namespace sharp #endif // SRC_OPERATIONS_H_ diff --git a/src/pipeline.cc b/src/pipeline.cc index 4396f5634..5f0a3bb0e 100644 --- a/src/pipeline.cc +++ b/src/pipeline.cc @@ -1,19 +1,11 @@ -// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors. -// -// Licensed under the Apache License, Version 2.0 (the "License"); -// you may not use this file except in compliance with the License. -// You may obtain a copy of the License at -// -// http://www.apache.org/licenses/LICENSE-2.0 -// -// Unless required by applicable law or agreed to in writing, software -// distributed under the License is distributed on an "AS IS" BASIS, -// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -// See the License for the specific language governing permissions and -// limitations under the License. +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ #include #include +#include // NOLINT(build/c++17) #include #include #include @@ -21,318 +13,351 @@ #include #include #include +#include +#include #include -#include -#include +#include -#include "common.h" -#include "operations.h" -#include "pipeline.h" +#include "./common.h" +#include "./operations.h" +#include "./pipeline.h" -class PipelineWorker : public Nan::AsyncWorker { +class PipelineWorker : public Napi::AsyncWorker { public: - PipelineWorker( - Nan::Callback *callback, PipelineBaton *baton, Nan::Callback *debuglog, Nan::Callback *queueListener, - std::vector> const buffersToPersist) : - Nan::AsyncWorker(callback), baton(baton), debuglog(debuglog), queueListener(queueListener), - buffersToPersist(buffersToPersist) { - // Protect Buffer objects from GC, keyed on index - std::accumulate(buffersToPersist.begin(), buffersToPersist.end(), 0, - [this](uint32_t index, v8::Local const buffer) -> uint32_t { - SaveToPersistent(index, buffer); - return index + 1; - }); - } + PipelineWorker(Napi::Function callback, PipelineBaton *baton, + Napi::Function debuglog, Napi::Function queueListener) : + Napi::AsyncWorker(callback), + baton(baton), + debuglog(Napi::Persistent(debuglog)), + queueListener(Napi::Persistent(queueListener)) {} ~PipelineWorker() {} // libuv worker void Execute() { - using sharp::HasAlpha; - using sharp::ImageType; - // Decrement queued task counter - g_atomic_int_dec_and_test(&sharp::counterQueue); + sharp::counterQueue--; // Increment processing task counter - g_atomic_int_inc(&sharp::counterProcess); - - std::map profileMap; - // Default sRGB ICC profile from https://packages.debian.org/sid/all/icc-profiles-free/filelist - profileMap.insert( - std::pair(VIPS_INTERPRETATION_sRGB, - baton->iccProfilePath + "sRGB.icc")); - // Convert to sRGB using default CMYK profile from http://www.argyllcms.com/cmyk.icm - profileMap.insert( - std::pair(VIPS_INTERPRETATION_CMYK, - baton->iccProfilePath + "cmyk.icm")); + 
sharp::counterProcess++; try { // Open input vips::VImage image; - ImageType inputImageType; - std::tie(image, inputImageType) = sharp::OpenInput(baton->input, baton->accessMethod); - - // Limit input images to a given number of pixels, where pixels = width * height - // Ignore if 0 - if (baton->limitInputPixels > 0 && image.width() * image.height() > baton->limitInputPixels) { - (baton->err).append("Input image exceeds pixel limit"); - return Error(); + sharp::ImageType inputImageType; + if (baton->join.empty()) { + std::tie(image, inputImageType) = sharp::OpenInput(baton->input); + } else { + std::vector images; + bool hasAlpha = false; + for (auto &join : baton->join) { + std::tie(image, inputImageType) = sharp::OpenInput(join); + image = sharp::EnsureColourspace(image, baton->colourspacePipeline); + images.push_back(image); + hasAlpha |= image.has_alpha(); + } + if (hasAlpha) { + for (auto &image : images) { + if (!image.has_alpha()) { + image = sharp::EnsureAlpha(image, 1); + } + } + } else { + baton->input->joinBackground.pop_back(); + } + inputImageType = sharp::ImageType::PNG; + image = VImage::arrayjoin(images, VImage::option() + ->set("across", baton->input->joinAcross) + ->set("shim", baton->input->joinShim) + ->set("background", baton->input->joinBackground) + ->set("halign", baton->input->joinHalign) + ->set("valign", baton->input->joinValign)); + if (baton->input->joinAnimated) { + image = image.copy(); + image.set(VIPS_META_N_PAGES, static_cast(images.size())); + image.set(VIPS_META_PAGE_HEIGHT, static_cast(image.height() / images.size())); + } + } + VipsAccess access = baton->input->access; + image = sharp::EnsureColourspace(image, baton->colourspacePipeline); + + int nPages = baton->input->pages; + if (nPages == -1) { + // Resolve the number of pages if we need to render until the end of the document + nPages = image.get_typeof(VIPS_META_N_PAGES) != 0 + ? 
image.get_int(VIPS_META_N_PAGES) - baton->input->page + : 1; } + // Get pre-resize page height + int pageHeight = sharp::GetPageHeight(image); + // Calculate angle of rotation - VipsAngle rotation; - if (baton->useExifOrientation) { + VipsAngle rotation = VIPS_ANGLE_D0; + VipsAngle autoRotation = VIPS_ANGLE_D0; + bool autoFlop = false; + + if (baton->input->autoOrient) { // Rotate and flip image according to Exif orientation - bool flip; - bool flop; - std::tie(rotation, flip, flop) = CalculateExifRotationAndFlip(sharp::ExifOrientation(image)); - baton->flip = baton->flip || flip; - baton->flop = baton->flop || flop; - } else { - rotation = CalculateAngleRotation(baton->angle); + std::tie(autoRotation, autoFlop) = CalculateExifRotationAndFlop(sharp::ExifOrientation(image)); } - // Rotate pre-extract - if (baton->rotateBeforePreExtract && rotation != VIPS_ANGLE_D0) { - image = image.rot(rotation); - sharp::RemoveExifOrientation(image); + rotation = CalculateAngleRotation(baton->angle); + + bool const shouldRotateBefore = baton->rotateBefore && + (rotation != VIPS_ANGLE_D0 || baton->flip || baton->flop || baton->rotationAngle != 0.0); + bool const shouldOrientBefore = (shouldRotateBefore || baton->orientBefore) && + (autoRotation != VIPS_ANGLE_D0 || autoFlop); + + if (shouldOrientBefore) { + image = sharp::StaySequential(image, autoRotation != VIPS_ANGLE_D0); + if (autoRotation != VIPS_ANGLE_D0) { + if (autoRotation != VIPS_ANGLE_D180) { + MultiPageUnsupported(nPages, "Rotate"); + } + image = image.rot(autoRotation); + autoRotation = VIPS_ANGLE_D0; + } + if (autoFlop) { + image = image.flip(VIPS_DIRECTION_HORIZONTAL); + autoFlop = false; + } + } + + if (shouldRotateBefore) { + image = sharp::StaySequential(image, rotation != VIPS_ANGLE_D0 || baton->flip || baton->rotationAngle != 0.0); + if (baton->flip) { + image = image.flip(VIPS_DIRECTION_VERTICAL); + baton->flip = false; + } + if (baton->flop) { + image = image.flip(VIPS_DIRECTION_HORIZONTAL); + baton->flop = 
false; + } + if (rotation != VIPS_ANGLE_D0) { + if (rotation != VIPS_ANGLE_D180) { + MultiPageUnsupported(nPages, "Rotate"); + } + image = image.rot(rotation); + rotation = VIPS_ANGLE_D0; + } + if (baton->rotationAngle != 0.0) { + MultiPageUnsupported(nPages, "Rotate"); + std::vector background; + std::tie(image, background) = sharp::ApplyAlpha(image, baton->rotationBackground, false); + image = image.rotate(baton->rotationAngle, VImage::option()->set("background", background)).copy_memory(); + baton->rotationAngle = 0.0; + } } // Trim - if (baton->trimTolerance != 0) { - image = sharp::Trim(image, baton->trimTolerance); + if (baton->trimThreshold >= 0.0) { + MultiPageUnsupported(nPages, "Trim"); + image = sharp::StaySequential(image); + image = sharp::Trim(image, baton->trimBackground, baton->trimThreshold, baton->trimLineArt); + baton->trimOffsetLeft = image.xoffset(); + baton->trimOffsetTop = image.yoffset(); } // Pre extraction if (baton->topOffsetPre != -1) { - image = image.extract_area(baton->leftOffsetPre, baton->topOffsetPre, baton->widthPre, baton->heightPre); + image = nPages > 1 + ? sharp::CropMultiPage(image, + baton->leftOffsetPre, baton->topOffsetPre, baton->widthPre, baton->heightPre, nPages, &pageHeight) + : image.extract_area(baton->leftOffsetPre, baton->topOffsetPre, baton->widthPre, baton->heightPre); } // Get pre-resize image width and height int inputWidth = image.width(); int inputHeight = image.height(); - if (!baton->rotateBeforePreExtract && - (rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270)) { - // Swap input output width and height when rotating by 90 or 270 degrees - std::swap(inputWidth, inputHeight); + + // Is there just one page? 
Shrink to inputHeight instead + if (nPages == 1) { + pageHeight = inputHeight; } // Scaling calculations - double xfactor = 1.0; - double yfactor = 1.0; + double hshrink; + double vshrink; int targetResizeWidth = baton->width; int targetResizeHeight = baton->height; - if (baton->width > 0 && baton->height > 0) { - // Fixed width and height - xfactor = static_cast(inputWidth) / static_cast(baton->width); - yfactor = static_cast(inputHeight) / static_cast(baton->height); - switch (baton->canvas) { - case Canvas::CROP: - if (xfactor < yfactor) { - targetResizeHeight = static_cast(round(static_cast(inputHeight) / xfactor)); - yfactor = xfactor; - } else { - targetResizeWidth = static_cast(round(static_cast(inputWidth) / yfactor)); - xfactor = yfactor; - } - break; - case Canvas::EMBED: - if (xfactor > yfactor) { - targetResizeHeight = static_cast(round(static_cast(inputHeight) / xfactor)); - yfactor = xfactor; - } else { - targetResizeWidth = static_cast(round(static_cast(inputWidth) / yfactor)); - xfactor = yfactor; - } - break; - case Canvas::MAX: - if (xfactor > yfactor) { - targetResizeHeight = baton->height = static_cast(round(static_cast(inputHeight) / xfactor)); - yfactor = xfactor; - } else { - targetResizeWidth = baton->width = static_cast(round(static_cast(inputWidth) / yfactor)); - xfactor = yfactor; - } - break; - case Canvas::MIN: - if (xfactor < yfactor) { - targetResizeHeight = baton->height = static_cast(round(static_cast(inputHeight) / xfactor)); - yfactor = xfactor; - } else { - targetResizeWidth = baton->width = static_cast(round(static_cast(inputWidth) / yfactor)); - xfactor = yfactor; - } - break; - case Canvas::IGNORE_ASPECT: - if (!baton->rotateBeforePreExtract && - (rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270)) { - std::swap(xfactor, yfactor); - } - break; - } - } else if (baton->width > 0) { - // Fixed width - xfactor = static_cast(inputWidth) / static_cast(baton->width); - if (baton->canvas == Canvas::IGNORE_ASPECT) { - 
targetResizeHeight = baton->height = inputHeight; - } else { - // Auto height - yfactor = xfactor; - targetResizeHeight = baton->height = static_cast(round(static_cast(inputHeight) / yfactor)); - } - } else if (baton->height > 0) { - // Fixed height - yfactor = static_cast(inputHeight) / static_cast(baton->height); - if (baton->canvas == Canvas::IGNORE_ASPECT) { - targetResizeWidth = baton->width = inputWidth; - } else { - // Auto width - xfactor = yfactor; - targetResizeWidth = baton->width = static_cast(round(static_cast(inputWidth) / xfactor)); - } - } else { - // Identity transform - baton->width = inputWidth; - baton->height = inputHeight; - } - // Calculate integral box shrink - int xshrink = std::max(1, static_cast(floor(xfactor))); - int yshrink = std::max(1, static_cast(floor(yfactor))); - - // Calculate residual float affine transformation - double xresidual = static_cast(xshrink) / xfactor; - double yresidual = static_cast(yshrink) / yfactor; - - // Do not enlarge the output if the input width *or* height - // are already less than the required dimensions - if (baton->withoutEnlargement) { - if (inputWidth < baton->width || inputHeight < baton->height) { - xfactor = 1.0; - yfactor = 1.0; - xshrink = 1; - yshrink = 1; - xresidual = 1.0; - yresidual = 1.0; - baton->width = inputWidth; - baton->height = inputHeight; - } + // When auto-rotating by 90 or 270 degrees, swap the target width and + // height to ensure the behavior aligns with how it would have been if + // the rotation had taken place *before* resizing. 
+ if (autoRotation == VIPS_ANGLE_D90 || autoRotation == VIPS_ANGLE_D270) { + std::swap(targetResizeWidth, targetResizeHeight); } - // If integral x and y shrink are equal, try to use shrink-on-load for JPEG and WebP, - // but not when applying gamma correction or pre-resize extract - int shrink_on_load = 1; - if ( - xshrink == yshrink && xshrink >= 2 && - (inputImageType == ImageType::JPEG || inputImageType == ImageType::WEBP) && - baton->gamma == 0 && baton->topOffsetPre == -1 - ) { - if (xshrink >= 8) { - xfactor = xfactor / 8; - yfactor = yfactor / 8; - shrink_on_load = 8; - } else if (xshrink >= 4) { - xfactor = xfactor / 4; - yfactor = yfactor / 4; - shrink_on_load = 4; - } else if (xshrink >= 2) { - xfactor = xfactor / 2; - yfactor = yfactor / 2; - shrink_on_load = 2; + // Shrink to pageHeight, so we work for multi-page images + std::tie(hshrink, vshrink) = sharp::ResolveShrink( + inputWidth, pageHeight, targetResizeWidth, targetResizeHeight, + baton->canvas, baton->withoutEnlargement, baton->withoutReduction); + + // The jpeg preload shrink. 
+ int jpegShrinkOnLoad = 1; + + // WebP, PDF, SVG scale + double scale = 1.0; + + // Try to reload input using shrink-on-load for JPEG, WebP, SVG and PDF, when: + // - the width or height parameters are specified; + // - gamma correction doesn't need to be applied; + // - trimming or pre-resize extract isn't required; + // - input colourspace is not specified; + bool const shouldPreShrink = (targetResizeWidth > 0 || targetResizeHeight > 0) && + baton->gamma == 0 && baton->topOffsetPre == -1 && baton->trimThreshold < 0.0 && + baton->colourspacePipeline == VIPS_INTERPRETATION_LAST && !(shouldOrientBefore || shouldRotateBefore); + + if (shouldPreShrink) { + // The common part of the shrink: the bit by which both axes must be shrunk + double shrink = std::min(hshrink, vshrink); + + if (inputImageType == sharp::ImageType::JPEG) { + // Leave at least a factor of two for the final resize step, when fastShrinkOnLoad: false + // for more consistent results and to avoid extra sharpness to the image + int factor = baton->fastShrinkOnLoad ? 
1 : 2; + if (shrink >= 8 * factor) { + jpegShrinkOnLoad = 8; + } else if (shrink >= 4 * factor) { + jpegShrinkOnLoad = 4; + } else if (shrink >= 2 * factor) { + jpegShrinkOnLoad = 2; + } + // Lower shrink-on-load for known libjpeg rounding errors + if (jpegShrinkOnLoad > 1 && static_cast(shrink) == jpegShrinkOnLoad) { + jpegShrinkOnLoad /= 2; + } + } else if (inputImageType == sharp::ImageType::WEBP && baton->fastShrinkOnLoad && shrink > 1.0) { + // Avoid upscaling via webp + scale = 1.0 / shrink; + } else if (inputImageType == sharp::ImageType::SVG || + inputImageType == sharp::ImageType::PDF) { + scale = 1.0 / shrink; } } - // Help ensure a final kernel-based reduction to prevent shrink aliasing - if (shrink_on_load > 1 && (xresidual == 1.0 || yresidual == 1.0)) { - shrink_on_load = shrink_on_load / 2; - xfactor = xfactor * 2; - yfactor = yfactor * 2; - } - if (shrink_on_load > 1) { - // Reload input using shrink-on-load - vips::VOption *option = VImage::option()->set("shrink", shrink_on_load); + + // Reload input using shrink-on-load, it'll be an integer shrink + // factor for jpegload*, a double scale factor for webpload*, + // pdfload* and svgload* + if (jpegShrinkOnLoad > 1) { + vips::VOption *option = GetOptionsForImageType(inputImageType, baton->input)->set("shrink", jpegShrinkOnLoad); if (baton->input->buffer != nullptr) { + // Reload JPEG buffer VipsBlob *blob = vips_blob_new(nullptr, baton->input->buffer, baton->input->bufferLength); - if (inputImageType == ImageType::JPEG) { - // Reload JPEG buffer - image = VImage::jpegload_buffer(blob, option); - } else { - // Reload WebP buffer - image = VImage::webpload_buffer(blob, option); - } + image = VImage::jpegload_buffer(blob, option); vips_area_unref(reinterpret_cast(blob)); } else { - if (inputImageType == ImageType::JPEG) { - // Reload JPEG file - image = VImage::jpegload(const_cast(baton->input->file.data()), option); + // Reload JPEG file + image = VImage::jpegload(const_cast(baton->input->file.data()), 
option); + } + } else if (scale != 1.0) { + vips::VOption *option = GetOptionsForImageType(inputImageType, baton->input)->set("scale", scale); + if (inputImageType == sharp::ImageType::WEBP) { + if (baton->input->buffer != nullptr) { + // Reload WebP buffer + VipsBlob *blob = vips_blob_new(nullptr, baton->input->buffer, baton->input->bufferLength); + image = VImage::webpload_buffer(blob, option); + vips_area_unref(reinterpret_cast(blob)); } else { // Reload WebP file image = VImage::webpload(const_cast(baton->input->file.data()), option); } + } else if (inputImageType == sharp::ImageType::SVG) { + if (baton->input->buffer != nullptr) { + // Reload SVG buffer + VipsBlob *blob = vips_blob_new(nullptr, baton->input->buffer, baton->input->bufferLength); + image = VImage::svgload_buffer(blob, option); + vips_area_unref(reinterpret_cast(blob)); + } else { + // Reload SVG file + image = VImage::svgload(const_cast(baton->input->file.data()), option); + } + sharp::SetDensity(image, baton->input->density); + if (image.width() > 32767 || image.height() > 32767) { + throw vips::VError("Input SVG image will exceed 32767x32767 pixel limit when scaled"); + } + } else if (inputImageType == sharp::ImageType::PDF) { + if (baton->input->buffer != nullptr) { + // Reload PDF buffer + VipsBlob *blob = vips_blob_new(nullptr, baton->input->buffer, baton->input->bufferLength); + image = VImage::pdfload_buffer(blob, option); + vips_area_unref(reinterpret_cast(blob)); + } else { + // Reload PDF file + image = VImage::pdfload(const_cast(baton->input->file.data()), option); + } + sharp::SetDensity(image, baton->input->density); } - // Recalculate integral shrink and double residual - int shrunkOnLoadWidth = image.width(); - int shrunkOnLoadHeight = image.height(); - if (!baton->rotateBeforePreExtract && - (rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270)) { - // Swap input output width and height when rotating by 90 or 270 degrees - std::swap(shrunkOnLoadWidth, shrunkOnLoadHeight); - 
} - xfactor = static_cast<double>(shrunkOnLoadWidth) / static_cast<double>(targetResizeWidth); - yfactor = static_cast<double>(shrunkOnLoadHeight) / static_cast<double>(targetResizeHeight); - xshrink = std::max(1, static_cast<int>(floor(xfactor))); - yshrink = std::max(1, static_cast<int>(floor(yfactor))); - xresidual = static_cast<double>(xshrink) / xfactor; - yresidual = static_cast<double>(yshrink) / yfactor; - if ( - !baton->rotateBeforePreExtract && - (rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270) - ) { - std::swap(xresidual, yresidual); + } else { + if (inputImageType == sharp::ImageType::SVG && (image.width() > 32767 || image.height() > 32767)) { + throw vips::VError("Input SVG image exceeds 32767x32767 pixel limit"); } } - // Help ensure a final kernel-based reduction to prevent shrink aliasing - if (xshrink > 1 && yshrink > 1 && (xresidual == 1.0 || yresidual == 1.0)) { - xshrink = xshrink / 2; - yshrink = yshrink / 2; - xresidual = static_cast<double>(xshrink) / xfactor; - yresidual = static_cast<double>(yshrink) / yfactor; + if (baton->input->autoOrient) { + image = sharp::RemoveExifOrientation(image); + } + + // Any pre-shrinking may already have been done + inputWidth = image.width(); + inputHeight = image.height(); + + // After pre-shrink, but before the main shrink stage + // Reuse the initial pageHeight if we didn't pre-shrink + if (shouldPreShrink) { + pageHeight = sharp::GetPageHeight(image); + } + + // Shrink to pageHeight, so we work for multi-page images + std::tie(hshrink, vshrink) = sharp::ResolveShrink( + inputWidth, pageHeight, targetResizeWidth, targetResizeHeight, + baton->canvas, baton->withoutEnlargement, baton->withoutReduction); + + int targetHeight = static_cast<int>(std::rint(static_cast<double>(pageHeight) / vshrink)); + int targetPageHeight = targetHeight; + + // In toilet-roll mode, we must adjust vshrink so that we exactly hit + // pageHeight or we'll have pixels straddling pixel boundaries + if (inputHeight > pageHeight) { + targetHeight *= nPages; + vshrink = static_cast<double>(inputHeight) / targetHeight; 
} // Ensure we're using a device-independent colour space - if (sharp::HasProfile(image)) { - // Convert to sRGB using embedded profile + std::pair<char*, size_t> inputProfile(nullptr, 0); + if ((baton->keepMetadata & VIPS_FOREIGN_KEEP_ICC) && baton->withIccProfile.empty()) { + // Cache input profile for use with output + inputProfile = sharp::GetProfile(image); + baton->input->ignoreIcc = true; + } + char const *processingProfile = image.interpretation() == VIPS_INTERPRETATION_RGB16 ? "p3" : "srgb"; + if ( + sharp::HasProfile(image) && + image.interpretation() != VIPS_INTERPRETATION_LABS && + image.interpretation() != VIPS_INTERPRETATION_GREY16 && + baton->colourspacePipeline != VIPS_INTERPRETATION_CMYK && + !baton->input->ignoreIcc + ) { + // Convert to sRGB/P3 using embedded profile try { - image = image.icc_transform( - const_cast<char*>(profileMap[VIPS_INTERPRETATION_sRGB].data()), VImage::option() - ->set("embedded", TRUE) + image = image.icc_transform(processingProfile, VImage::option() + ->set("embedded", true) + ->set("depth", sharp::Is16Bit(image.interpretation()) ? 16 : 8) ->set("intent", VIPS_INTENT_PERCEPTUAL)); } catch(...) 
{ - // Ignore failure of embedded profile + sharp::VipsWarningCallback(nullptr, G_LOG_LEVEL_WARNING, "Invalid embedded profile", nullptr); } - } else if (image.interpretation() == VIPS_INTERPRETATION_CMYK) { - image = image.icc_transform( - const_cast<char*>(profileMap[VIPS_INTERPRETATION_sRGB].data()), VImage::option() - ->set("input_profile", profileMap[VIPS_INTERPRETATION_CMYK].data()) + } else if ( + image.interpretation() == VIPS_INTERPRETATION_CMYK && + baton->colourspacePipeline != VIPS_INTERPRETATION_CMYK + ) { + image = image.icc_transform(processingProfile, VImage::option() + ->set("input_profile", "cmyk") ->set("intent", VIPS_INTENT_PERCEPTUAL)); } // Flatten image to remove alpha channel - if (baton->flatten && HasAlpha(image)) { - // Scale up 8-bit values to match 16-bit input image - double const multiplier = sharp::Is16Bit(image.interpretation()) ? 256.0 : 1.0; - // Background colour - std::vector<double> background { - baton->background[0] * multiplier, - baton->background[1] * multiplier, - baton->background[2] * multiplier - }; - image = image.flatten(VImage::option() - ->set("background", background)); - } - - // Negate the colours in the image - if (baton->negate) { - image = image.invert(); + if (baton->flatten && image.has_alpha()) { + image = sharp::Flatten(image, baton->flattenBackground); } // Gamma encoding (darken) @@ -345,252 +370,241 @@ class PipelineWorker : public Nan::AsyncWorker { image = image.colourspace(VIPS_INTERPRETATION_B_W); } - // Ensure image has an alpha channel when there is an overlay with an alpha channel - VImage overlayImage; - ImageType overlayImageType = ImageType::UNKNOWN; - bool shouldOverlayWithAlpha = FALSE; - if (baton->overlay != nullptr) { - std::tie(overlayImage, overlayImageType) = OpenInput(baton->overlay, baton->accessMethod); - if (HasAlpha(overlayImage)) { - shouldOverlayWithAlpha = !baton->overlayCutout; - if (!HasAlpha(image)) { - double const multiplier = sharp::Is16Bit(image.interpretation()) ?
256.0 : 1.0; - image = image.bandjoin( - VImage::new_matrix(image.width(), image.height()).new_from_image(255 * multiplier)); - } - } - } - - bool const shouldShrink = xshrink > 1 || yshrink > 1; - bool const shouldReduce = xresidual != 1.0 || yresidual != 1.0; + bool const shouldResize = hshrink != 1.0 || vshrink != 1.0; bool const shouldBlur = baton->blurSigma != 0.0; bool const shouldConv = baton->convKernelWidth * baton->convKernelHeight > 0; bool const shouldSharpen = baton->sharpenSigma != 0.0; - bool const shouldPremultiplyAlpha = HasAlpha(image) && - (shouldShrink || shouldReduce || shouldBlur || shouldConv || shouldSharpen || shouldOverlayWithAlpha); + bool const shouldComposite = !baton->composite.empty(); - // Premultiply image alpha channel before all transformations to avoid - // dark fringing around bright pixels - // See: http://entropymine.com/imageworsener/resizealpha/ - if (shouldPremultiplyAlpha) { - image = image.premultiply(); + if (shouldComposite && !image.has_alpha()) { + image = sharp::EnsureAlpha(image, 1); } - // Fast, integral box-shrink - if (shouldShrink) { - if (yshrink > 1) { - image = image.shrinkv(yshrink); - } - if (xshrink > 1) { - image = image.shrinkh(xshrink); - } - // Recalculate residual float based on dimensions of required vs shrunk images - int shrunkWidth = image.width(); - int shrunkHeight = image.height(); - if (!baton->rotateBeforePreExtract && - (rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270)) { - // Swap input output width and height when rotating by 90 or 270 degrees - std::swap(shrunkWidth, shrunkHeight); - } - xresidual = static_cast<double>(targetResizeWidth) / static_cast<double>(shrunkWidth); - yresidual = static_cast<double>(targetResizeHeight) / static_cast<double>(shrunkHeight); - if ( - !baton->rotateBeforePreExtract && - (rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270) - ) { - std::swap(xresidual, yresidual); - } - } + VipsBandFormat premultiplyFormat = image.format(); + bool const shouldPremultiplyAlpha =
image.has_alpha() && + (shouldResize || shouldBlur || shouldConv || shouldSharpen); - // Use affine increase or kernel reduce with the remaining float part - if (xresidual != 1.0 || yresidual != 1.0) { - // Insert tile cache to prevent over-computation of previous operations - if (baton->accessMethod == VIPS_ACCESS_SEQUENTIAL) { - image = sharp::TileCache(image, yresidual); - } - // Perform kernel-based reduction - if (yresidual < 1.0 || xresidual < 1.0) { - VipsKernel kernel = static_cast<VipsKernel>( - vips_enum_from_nick(nullptr, VIPS_TYPE_KERNEL, baton->kernel.data())); - if ( - kernel != VIPS_KERNEL_NEAREST && kernel != VIPS_KERNEL_CUBIC && kernel != VIPS_KERNEL_LANCZOS2 && - kernel != VIPS_KERNEL_LANCZOS3 - ) { - throw vips::VError("Unknown kernel"); - } - if (yresidual < 1.0) { - image = image.reducev(1.0 / yresidual, VImage::option() - ->set("kernel", kernel) - ->set("centre", baton->centreSampling)); - } - if (xresidual < 1.0) { - image = image.reduceh(1.0 / xresidual, VImage::option() - ->set("kernel", kernel) - ->set("centre", baton->centreSampling)); - } - } - // Perform enlargement - if (yresidual > 1.0 || xresidual > 1.0) { - if (trunc(xresidual) == xresidual && trunc(yresidual) == yresidual && baton->interpolator == "nearest") { - // Fast, integral nearest neighbour enlargement - image = image.zoom(static_cast<int>(xresidual), static_cast<int>(yresidual)); - } else { - // Floating point affine transformation - vips::VInterpolate interpolator = vips::VInterpolate::new_from_name(baton->interpolator.data()); - if (yresidual > 1.0 && xresidual > 1.0) { - image = image.affine({xresidual, 0.0, 0.0, yresidual}, VImage::option() - ->set("interpolate", interpolator)); - } else if (yresidual > 1.0) { - image = image.affine({1.0, 0.0, 0.0, yresidual}, VImage::option() - ->set("interpolate", interpolator)); - } else if (xresidual > 1.0) { - image = image.affine({xresidual, 0.0, 0.0, 1.0}, VImage::option() - ->set("interpolate", interpolator)); - } - } - } + if
(shouldPremultiplyAlpha) { + image = image.premultiply().cast(premultiplyFormat); } - // Rotate - if (!baton->rotateBeforePreExtract && rotation != VIPS_ANGLE_D0) { - image = image.rot(rotation); - sharp::RemoveExifOrientation(image); + // Resize + if (shouldResize) { + image = image.resize(1.0 / hshrink, VImage::option() + ->set("vscale", 1.0 / vshrink) + ->set("kernel", baton->kernel)); } - // Flip (mirror about Y axis) + image = sharp::StaySequential(image, + autoRotation != VIPS_ANGLE_D0 || + baton->flip || + rotation != VIPS_ANGLE_D0); + // Auto-rotate post-extract + if (autoRotation != VIPS_ANGLE_D0) { + if (autoRotation != VIPS_ANGLE_D180) { + MultiPageUnsupported(nPages, "Rotate"); + } + image = image.rot(autoRotation); + } + // Mirror vertically (up-down) about the x-axis if (baton->flip) { image = image.flip(VIPS_DIRECTION_VERTICAL); - sharp::RemoveExifOrientation(image); } - - // Flop (mirror about X axis) - if (baton->flop) { + // Mirror horizontally (left-right) about the y-axis + if (baton->flop != autoFlop) { image = image.flip(VIPS_DIRECTION_HORIZONTAL); - sharp::RemoveExifOrientation(image); + } + // Rotate post-extract 90-angle + if (rotation != VIPS_ANGLE_D0) { + if (rotation != VIPS_ANGLE_D180) { + MultiPageUnsupported(nPages, "Rotate"); + } + image = image.rot(rotation); } // Join additional color channels to the image - if (baton->joinChannelIn.size() > 0) { + if (!baton->joinChannelIn.empty()) { VImage joinImage; - ImageType joinImageType = ImageType::UNKNOWN; + sharp::ImageType joinImageType = sharp::ImageType::UNKNOWN; for (unsigned int i = 0; i < baton->joinChannelIn.size(); i++) { - std::tie(joinImage, joinImageType) = sharp::OpenInput(baton->joinChannelIn[i], baton->accessMethod); + baton->joinChannelIn[i]->access = access; + std::tie(joinImage, joinImageType) = sharp::OpenInput(baton->joinChannelIn[i]); + joinImage = sharp::EnsureColourspace(joinImage, baton->colourspacePipeline); image = image.bandjoin(joinImage); } image = 
image.copy(VImage::option()->set("interpretation", baton->colourspace)); + image = sharp::RemoveGifPalette(image); + } + + inputWidth = image.width(); + inputHeight = nPages > 1 ? targetPageHeight : image.height(); + + // Resolve dimensions + if (baton->width <= 0) { + baton->width = inputWidth; + } + if (baton->height <= 0) { + baton->height = inputHeight; } // Crop/embed - if (image.width() != baton->width || image.height() != baton->height) { - if (baton->canvas == Canvas::EMBED) { - // Scale up 8-bit values to match 16-bit input image - double const multiplier = sharp::Is16Bit(image.interpretation()) ? 256.0 : 1.0; - // Create background colour + if (inputWidth != baton->width || inputHeight != baton->height) { + if (baton->canvas == sharp::Canvas::EMBED) { std::vector<double> background; - if (image.bands() > 2) { - background = { - multiplier * baton->background[0], - multiplier * baton->background[1], - multiplier * baton->background[2] - }; - } else { - // Convert sRGB to greyscale - background = { multiplier * ( - 0.2126 * baton->background[0] + - 0.7152 * baton->background[1] + - 0.0722 * baton->background[2]) - }; - } - // Add alpha channel to background colour - if (baton->background[3] < 255.0 || HasAlpha(image)) { - background.push_back(baton->background[3] * multiplier); + std::tie(image, background) = sharp::ApplyAlpha(image, baton->resizeBackground, shouldPremultiplyAlpha); + + // Embed + const auto& [left, top] = sharp::CalculateEmbedPosition( + inputWidth, inputHeight, baton->width, baton->height, baton->position); + const int width = std::max(inputWidth, baton->width); + const int height = std::max(inputHeight, baton->height); + + image = nPages > 1 + ?
sharp::EmbedMultiPage(image, + left, top, width, height, VIPS_EXTEND_BACKGROUND, background, nPages, &targetPageHeight) + : image.embed(left, top, width, height, VImage::option() + ->set("extend", VIPS_EXTEND_BACKGROUND) + ->set("background", background)); + } else if (baton->canvas == sharp::Canvas::CROP) { + if (baton->width > inputWidth) { + baton->width = inputWidth; } - // Ensure background colour uses correct colourspace - background = sharp::GetRgbaAsColourspace(background, image.interpretation()); - // Add non-transparent alpha channel, if required - if (baton->background[3] < 255.0 && !HasAlpha(image)) { - image = image.bandjoin( - VImage::new_matrix(image.width(), image.height()).new_from_image(255 * multiplier)); + if (baton->height > inputHeight) { + baton->height = inputHeight; } - // Embed - int left = static_cast<int>(round((baton->width - image.width()) / 2)); - int top = static_cast<int>(round((baton->height - image.height()) / 2)); - image = image.embed(left, top, baton->width, baton->height, VImage::option() - ->set("extend", VIPS_EXTEND_BACKGROUND) - ->set("background", background)); - } else if (baton->canvas != Canvas::IGNORE_ASPECT) { - // Crop/max/min - if (baton->crop < 9) { + + // Crop + if (baton->position < 9) { // Gravity-based crop - int left; - int top; - std::tie(left, top) = sharp::CalculateCrop( - image.width(), image.height(), baton->width, baton->height, baton->crop); - int width = std::min(image.width(), baton->width); - int height = std::min(image.height(), baton->height); - image = image.extract_area(left, top, width, height); + const auto& [left, top] = sharp::CalculateCrop( + inputWidth, inputHeight, baton->width, baton->height, baton->position); + const int width = std::min(inputWidth, baton->width); + const int height = std::min(inputHeight, baton->height); + + image = nPages > 1 + ?
sharp::CropMultiPage(image, + left, top, width, height, nPages, &targetPageHeight) + : image.extract_area(left, top, width, height); } else { + int attention_x; + int attention_y; + // Attention-based or Entropy-based crop + MultiPageUnsupported(nPages, "Resize strategy"); + image = sharp::StaySequential(image); image = image.smartcrop(baton->width, baton->height, VImage::option() - ->set("interesting", baton->crop == 16 ? VIPS_INTERESTING_ENTROPY : VIPS_INTERESTING_ATTENTION)); + ->set("interesting", baton->position == 16 ? VIPS_INTERESTING_ENTROPY : VIPS_INTERESTING_ATTENTION) + ->set("premultiplied", shouldPremultiplyAlpha) + ->set("attention_x", &attention_x) + ->set("attention_y", &attention_y)); + baton->hasCropOffset = true; + baton->cropOffsetLeft = static_cast<int>(image.xoffset()); + baton->cropOffsetTop = static_cast<int>(image.yoffset()); + baton->hasAttentionCenter = true; + baton->attentionX = static_cast<int>(attention_x * jpegShrinkOnLoad / scale); + baton->attentionY = static_cast<int>(attention_y * jpegShrinkOnLoad / scale); } } } + // Rotate post-extract non-90 angle + if (!baton->rotateBefore && baton->rotationAngle != 0.0) { + MultiPageUnsupported(nPages, "Rotate"); + image = sharp::StaySequential(image); + std::vector<double> background; + std::tie(image, background) = sharp::ApplyAlpha(image, baton->rotationBackground, shouldPremultiplyAlpha); + image = image.rotate(baton->rotationAngle, VImage::option()->set("background", background)); + } + // Post extraction if (baton->topOffsetPost != -1) { - image = image.extract_area( - baton->leftOffsetPost, baton->topOffsetPost, baton->widthPost, baton->heightPost); + if (nPages > 1) { + image = sharp::CropMultiPage(image, + baton->leftOffsetPost, baton->topOffsetPost, baton->widthPost, baton->heightPost, + nPages, &targetPageHeight); + + // heightPost is used in the info object, so update to reflect the number of pages + baton->heightPost *= nPages; + } else { + image = image.extract_area( + baton->leftOffsetPost,
baton->topOffsetPost, baton->widthPost, baton->heightPost); + } + } + + // Affine transform + if (!baton->affineMatrix.empty()) { + MultiPageUnsupported(nPages, "Affine"); + image = sharp::StaySequential(image); + std::vector<double> background; + std::tie(image, background) = sharp::ApplyAlpha(image, baton->affineBackground, shouldPremultiplyAlpha); + vips::VInterpolate interp = vips::VInterpolate::new_from_name( + const_cast<char*>(baton->affineInterpolator.data())); + image = image.affine(baton->affineMatrix, VImage::option()->set("background", background) + ->set("idx", baton->affineIdx) + ->set("idy", baton->affineIdy) + ->set("odx", baton->affineOdx) + ->set("ody", baton->affineOdy) + ->set("interpolate", interp)); } // Extend edges if (baton->extendTop > 0 || baton->extendBottom > 0 || baton->extendLeft > 0 || baton->extendRight > 0) { - // Scale up 8-bit values to match 16-bit input image - double const multiplier = sharp::Is16Bit(image.interpretation()) ? 256.0 : 1.0; - // Create background colour - std::vector<double> background; - if (image.bands() > 2) { - background = { - multiplier * baton->background[0], - multiplier * baton->background[1], - multiplier * baton->background[2] - }; - } else { - // Convert sRGB to greyscale - background = { multiplier * ( - 0.2126 * baton->background[0] + - 0.7152 * baton->background[1] + - 0.0722 * baton->background[2]) - }; - } - // Add alpha channel to background colour - if (baton->background[3] < 255.0 || HasAlpha(image)) { - background.push_back(baton->background[3] * multiplier); - } - // Ensure background colour uses correct colourspace - background = sharp::GetRgbaAsColourspace(background, image.interpretation()); - // Add non-transparent alpha channel, if required - if (baton->background[3] < 255.0 && !HasAlpha(image)) { - image = image.bandjoin( - VImage::new_matrix(image.width(), image.height()).new_from_image(255 * multiplier)); - } // Embed baton->width = image.width() + baton->extendLeft + baton->extendRight; - baton->height =
image.height() + baton->extendTop + baton->extendBottom; + baton->height = (nPages > 1 ? targetPageHeight : image.height()) + baton->extendTop + baton->extendBottom; - image = image.embed(baton->extendLeft, baton->extendTop, baton->width, baton->height, - VImage::option()->set("extend", VIPS_EXTEND_BACKGROUND)->set("background", background)); + if (baton->extendWith == VIPS_EXTEND_BACKGROUND) { + std::vector<double> background; + std::tie(image, background) = sharp::ApplyAlpha(image, baton->extendBackground, shouldPremultiplyAlpha); + + image = sharp::StaySequential(image, nPages > 1); + image = nPages > 1 + ? sharp::EmbedMultiPage(image, + baton->extendLeft, baton->extendTop, baton->width, baton->height, + baton->extendWith, background, nPages, &targetPageHeight) + : image.embed(baton->extendLeft, baton->extendTop, baton->width, baton->height, + VImage::option()->set("extend", baton->extendWith)->set("background", background)); + } else { + std::vector<double> ignoredBackground(1); + image = sharp::StaySequential(image); + image = nPages > 1 + ?
sharp::EmbedMultiPage(image, + baton->extendLeft, baton->extendTop, baton->width, baton->height, + baton->extendWith, ignoredBackground, nPages, &targetPageHeight) + : image.embed(baton->extendLeft, baton->extendTop, baton->width, baton->height, + VImage::option()->set("extend", baton->extendWith)); + } + } + // Median - must happen before blurring, due to the utility of blurring after thresholding + if (baton->medianSize > 0) { + image = image.median(baton->medianSize); } // Threshold - must happen before blurring, due to the utility of blurring after thresholding + // Threshold - must happen before unflatten to enable non-white unflattening if (baton->threshold != 0) { image = sharp::Threshold(image, baton->threshold, baton->thresholdGrayscale); } + // Dilate - must happen before blurring, due to the utility of dilating after thresholding + if (baton->dilateWidth != 0) { + image = sharp::Dilate(image, baton->dilateWidth); + } + + // Erode - must happen before blurring, due to the utility of eroding after thresholding + if (baton->erodeWidth != 0) { + image = sharp::Erode(image, baton->erodeWidth); + } + // Blur if (shouldBlur) { - image = sharp::Blur(image, baton->blurSigma); + image = sharp::Blur(image, baton->blurSigma, baton->precision, baton->minAmpl); + } + + // Unflatten the image + if (baton->unflatten) { + image = sharp::Unflatten(image); } // Convolve @@ -601,108 +615,161 @@ class PipelineWorker : public Nan::AsyncWorker { baton->convKernel); } + // Recomb + if (!baton->recombMatrix.empty()) { + image = sharp::Recomb(image, baton->recombMatrix); + } + + // Modulate + if (baton->brightness != 1.0 || baton->saturation != 1.0 || baton->hue != 0.0 || baton->lightness != 0.0) { + image = sharp::Modulate(image, baton->brightness, baton->saturation, baton->hue, baton->lightness); + } + // Sharpen if (shouldSharpen) { - image = sharp::Sharpen(image, baton->sharpenSigma, baton->sharpenFlat, baton->sharpenJagged); + image = sharp::Sharpen(image, 
baton->sharpenSigma, baton->sharpenM1, baton->sharpenM2, + baton->sharpenX1, baton->sharpenY2, baton->sharpenY3); } - // Composite with overlay, if present - if (baton->overlay != nullptr) { - // Verify overlay image is within current dimensions - if (overlayImage.width() > image.width() || overlayImage.height() > image.height()) { - throw vips::VError("Overlay image must have same dimensions or smaller"); - } - // Check if overlay is tiled - if (baton->overlayTile) { - int const overlayImageWidth = overlayImage.width(); - int const overlayImageHeight = overlayImage.height(); - int across = 0; - int down = 0; - // Use gravity in overlay - if (overlayImageWidth <= baton->width) { - across = static_cast<int>(ceil(static_cast<double>(image.width()) / overlayImageWidth)); + // Reverse premultiplication after all transformations + if (shouldPremultiplyAlpha) { + image = image.unpremultiply().cast(premultiplyFormat); + } + baton->premultiplied = shouldPremultiplyAlpha; + + // Composite + if (shouldComposite) { + std::vector<VImage> images = { image }; + std::vector<int> modes, xs, ys; + for (Composite *composite : baton->composite) { + VImage compositeImage; + sharp::ImageType compositeImageType = sharp::ImageType::UNKNOWN; + composite->input->access = access; + std::tie(compositeImage, compositeImageType) = sharp::OpenInput(composite->input); + + if (composite->input->autoOrient) { + // Respect EXIF Orientation + VipsAngle compositeAutoRotation = VIPS_ANGLE_D0; + bool compositeAutoFlop = false; + std::tie(compositeAutoRotation, compositeAutoFlop) = + CalculateExifRotationAndFlop(sharp::ExifOrientation(compositeImage)); + + compositeImage = sharp::RemoveExifOrientation(compositeImage); + compositeImage = sharp::StaySequential(compositeImage, compositeAutoRotation != VIPS_ANGLE_D0); + + if (compositeAutoRotation != VIPS_ANGLE_D0) { + compositeImage = compositeImage.rot(compositeAutoRotation); + } + if (compositeAutoFlop) { + compositeImage = compositeImage.flip(VIPS_DIRECTION_HORIZONTAL); + } } -
if (overlayImageHeight <= baton->height) { - down = static_cast<int>(ceil(static_cast<double>(image.height()) / overlayImageHeight)); + + // Verify within current dimensions + if (compositeImage.width() > image.width() || compositeImage.height() > image.height()) { + throw vips::VError("Image to composite must have same dimensions or smaller"); } - if (across != 0 || down != 0) { - int left; - int top; - overlayImage = overlayImage.replicate(across, down); - if (baton->overlayXOffset >= 0 && baton->overlayYOffset >= 0) { - // the overlayX/YOffsets will now be used to CalculateCrop for extract_area - std::tie(left, top) = sharp::CalculateCrop( - overlayImage.width(), overlayImage.height(), image.width(), image.height(), - baton->overlayXOffset, baton->overlayYOffset); - } else { - // the overlayGravity will now be used to CalculateCrop for extract_area - std::tie(left, top) = sharp::CalculateCrop( - overlayImage.width(), overlayImage.height(), image.width(), image.height(), baton->overlayGravity); + // Check if overlay is tiled + if (composite->tile) { + int across = 0; + int down = 0; + // Use gravity in overlay + if (compositeImage.width() <= image.width()) { + across = static_cast<int>(ceil(static_cast<double>(image.width()) / compositeImage.width())); + // Ensure odd number of tiles across when gravity is centre, north or south + if (composite->gravity == 0 || composite->gravity == 1 || composite->gravity == 3) { + across |= 1; + } } - overlayImage = overlayImage.extract_area(left, top, image.width(), image.height()); - } - // the overlayGravity was used for extract_area, therefore set it back to its default value of 0 - baton->overlayGravity = 0; - } - if (baton->overlayCutout) { - // 'cut out' the image, premultiplication is not required - image = sharp::Cutout(overlayImage, image, baton->overlayGravity); - } else { - // Ensure overlay is sRGB - overlayImage = overlayImage.colourspace(VIPS_INTERPRETATION_sRGB); - // Ensure overlay matches premultiplication state - if
(shouldPremultiplyAlpha) { - // Ensure overlay has alpha channel - if (!HasAlpha(overlayImage)) { - double const multiplier = sharp::Is16Bit(overlayImage.interpretation()) ? 256.0 : 1.0; - overlayImage = overlayImage.bandjoin( - VImage::new_matrix(overlayImage.width(), overlayImage.height()).new_from_image(255 * multiplier)); + if (compositeImage.height() <= image.height()) { + down = static_cast<int>(ceil(static_cast<double>(image.height()) / compositeImage.height())); + // Ensure odd number of tiles down when gravity is centre, east or west + if (composite->gravity == 0 || composite->gravity == 2 || composite->gravity == 4) { + down |= 1; + } } - overlayImage = overlayImage.premultiply(); + if (across != 0 || down != 0) { + int left; + int top; + compositeImage = sharp::StaySequential(compositeImage).replicate(across, down); + if (composite->hasOffset) { + std::tie(left, top) = sharp::CalculateCrop( + compositeImage.width(), compositeImage.height(), image.width(), image.height(), + composite->left, composite->top); + } else { + std::tie(left, top) = sharp::CalculateCrop( + compositeImage.width(), compositeImage.height(), image.width(), image.height(), composite->gravity); + } + compositeImage = compositeImage.extract_area(left, top, image.width(), image.height()); + } + // gravity was used for extract_area, set it back to its default value of 0 + composite->gravity = 0; } + // Ensure image to composite is with unpremultiplied alpha + compositeImage = sharp::EnsureAlpha(compositeImage, 1); + if (composite->premultiplied) compositeImage = compositeImage.unpremultiply(); + // Calculate position int left; int top; - if (baton->overlayXOffset >= 0 && baton->overlayYOffset >= 0) { - // Composite images at given offsets - std::tie(left, top) = sharp::CalculateCrop(image.width(), image.height(), - overlayImage.width(), overlayImage.height(), baton->overlayXOffset, baton->overlayYOffset); + if (composite->hasOffset) { + // Composite image at given offsets + if (composite->tile) { +
std::tie(left, top) = sharp::CalculateCrop(image.width(), image.height(), + compositeImage.width(), compositeImage.height(), composite->left, composite->top); + } else { + left = composite->left; + top = composite->top; + } } else { - // Composite images with given gravity + // Composite image with given gravity std::tie(left, top) = sharp::CalculateCrop(image.width(), image.height(), - overlayImage.width(), overlayImage.height(), baton->overlayGravity); + compositeImage.width(), compositeImage.height(), composite->gravity); } - image = sharp::Composite(image, overlayImage, left, top); + images.push_back(compositeImage); + modes.push_back(composite->mode); + xs.push_back(left); + ys.push_back(top); } + image = VImage::composite(images, modes, VImage::option() + ->set("compositing_space", baton->colourspacePipeline == VIPS_INTERPRETATION_LAST + ? VIPS_INTERPRETATION_sRGB + : baton->colourspacePipeline) + ->set("x", xs) + ->set("y", ys)); + image = sharp::RemoveGifPalette(image); } - // Reverse premultiplication after all transformations: - if (shouldPremultiplyAlpha) { - image = image.unpremultiply(); - // Cast pixel values to integer - if (sharp::Is16Bit(image.interpretation())) { - image = image.cast(VIPS_FORMAT_USHORT); - } else { - image = image.cast(VIPS_FORMAT_UCHAR); - } + // Gamma decoding (brighten) + if (baton->gammaOut >= 1 && baton->gammaOut <= 3) { + image = sharp::Gamma(image, baton->gammaOut); } - baton->premultiplied = shouldPremultiplyAlpha; - // Gamma decoding (brighten) - if (baton->gamma >= 1 && baton->gamma <= 3) { - image = sharp::Gamma(image, baton->gamma); + // Linear adjustment (a * in + b) + if (!baton->linearA.empty()) { + image = sharp::Linear(image, baton->linearA, baton->linearB); } // Apply normalisation - stretch luminance to cover full dynamic range if (baton->normalise) { - image = sharp::Normalise(image); + image = sharp::StaySequential(image); + image = sharp::Normalise(image, baton->normaliseLower, baton->normaliseUpper); + } + + 
// Apply contrast limiting adaptive histogram equalization (CLAHE) + if (baton->claheWidth != 0 && baton->claheHeight != 0) { + image = sharp::StaySequential(image); + image = sharp::Clahe(image, baton->claheWidth, baton->claheHeight, baton->claheMaxSlope); } // Apply bitwise boolean operation between images if (baton->boolean != nullptr) { VImage booleanImage; - ImageType booleanImageType = ImageType::UNKNOWN; - std::tie(booleanImage, booleanImageType) = sharp::OpenInput(baton->boolean, baton->accessMethod); + sharp::ImageType booleanImageType = sharp::ImageType::UNKNOWN; + baton->boolean->access = access; + std::tie(booleanImage, booleanImageType) = sharp::OpenInput(baton->boolean); + booleanImage = sharp::EnsureColourspace(booleanImage, baton->colourspacePipeline); image = sharp::Boolean(image, booleanImage, baton->booleanOp); + image = sharp::RemoveGifPalette(image); } // Apply per-channel Bandbool bitwise operations after all other operations @@ -710,52 +777,126 @@ class PipelineWorker : public Nan::AsyncWorker { image = sharp::Bandbool(image, baton->bandBoolOp); } - // Extract an image channel (aka vips band) - if (baton->extractChannel > -1) { - if (baton->extractChannel >= image.bands()) { - (baton->err).append("Cannot extract channel from image. 
Too few channels in image."); - return Error(); - } - image = image.extract_band(baton->extractChannel); + // Tint the image + if (baton->tint[0] >= 0.0) { + image = sharp::Tint(image, baton->tint); + } + + // Remove alpha channel, if any + if (baton->removeAlpha) { + image = sharp::RemoveAlpha(image); + } + + // Ensure alpha channel, if missing + if (baton->ensureAlpha != -1) { + image = sharp::EnsureAlpha(image, baton->ensureAlpha); } - // Convert image to sRGB, if not already + + // Ensure output colour space if (sharp::Is16Bit(image.interpretation())) { image = image.cast(VIPS_FORMAT_USHORT); } if (image.interpretation() != baton->colourspace) { - // Convert colourspace, pass the current known interpretation so libvips doesn't have to guess image = image.colourspace(baton->colourspace, VImage::option()->set("source_space", image.interpretation())); - // Transform colours from embedded profile to output profile - if (baton->withMetadata && sharp::HasProfile(image) && profileMap[baton->colourspace] != std::string()) { - image = image.icc_transform(const_cast<char*>(profileMap[baton->colourspace].data()), - VImage::option()->set("embedded", TRUE)); + if (inputProfile.first != nullptr && baton->withIccProfile.empty()) { + image = sharp::SetProfile(image, inputProfile); } } - // Override EXIF Orientation tag - if (baton->withMetadata && baton->withMetadataOrientation != -1) { - sharp::SetExifOrientation(image, baton->withMetadataOrientation); + // Extract channel + if (baton->extractChannel > -1) { + if (baton->extractChannel >= image.bands()) { + if (baton->extractChannel == 3 && image.has_alpha()) { + baton->extractChannel = image.bands() - 1; + } else { + (baton->err) + .append("Cannot extract channel ").append(std::to_string(baton->extractChannel)) + .append(" from image with channels 0-").append(std::to_string(image.bands() - 1)); + return Error(); + } + } + VipsInterpretation colourspace = sharp::Is16Bit(image.interpretation()) + ?
VIPS_INTERPRETATION_GREY16 + : VIPS_INTERPRETATION_B_W; + image = image + .extract_band(baton->extractChannel) + .copy(VImage::option()->set("interpretation", colourspace)); + } + + // Apply output ICC profile + if (!baton->withIccProfile.empty()) { + try { + image = image.icc_transform(const_cast<char*>(baton->withIccProfile.data()), VImage::option() + ->set("input_profile", processingProfile) + ->set("embedded", true) + ->set("depth", sharp::Is16Bit(image.interpretation()) ? 16 : 8) + ->set("intent", VIPS_INTENT_PERCEPTUAL)); + } catch(...) { + sharp::VipsWarningCallback(nullptr, G_LOG_LEVEL_WARNING, "Invalid profile", nullptr); + } } + // Negate the colours in the image + if (baton->negate) { + image = sharp::Negate(image, baton->negateAlpha); + } + + // Override EXIF Orientation tag + if (baton->withMetadataOrientation != -1) { + image = sharp::SetExifOrientation(image, baton->withMetadataOrientation); + } + // Override pixel density + if (baton->withMetadataDensity > 0) { + image = sharp::SetDensity(image, baton->withMetadataDensity); + } + // EXIF key/value pairs + if (baton->keepMetadata & VIPS_FOREIGN_KEEP_EXIF) { + image = image.copy(); + if (!baton->withExifMerge) { + image = sharp::RemoveExif(image); + } + for (const auto& [key, value] : baton->withExif) { + image.set(key.c_str(), value.c_str()); + } + } + // XMP buffer + if ((baton->keepMetadata & VIPS_FOREIGN_KEEP_XMP) && !baton->withXmp.empty()) { + image = image.copy(); + image.set(VIPS_META_XMP_NAME, nullptr, + const_cast<void*>(static_cast<void const*>(baton->withXmp.c_str())), baton->withXmp.size()); + } // Number of channels used in output image baton->channels = image.bands(); baton->width = image.width(); baton->height = image.height(); + + image = sharp::SetAnimationProperties( + image, nPages, targetPageHeight, baton->delay, baton->loop); + + if (image.get_typeof(VIPS_META_PAGE_HEIGHT) == G_TYPE_INT) { + baton->pageHeightOut = image.get_int(VIPS_META_PAGE_HEIGHT); + baton->pagesOut = image.get_int(VIPS_META_N_PAGES);
} + // Output + sharp::SetTimeout(image, baton->timeoutSeconds); if (baton->fileOut.empty()) { // Buffer output - if (baton->formatOut == "jpeg" || (baton->formatOut == "input" && inputImageType == ImageType::JPEG)) { + if (baton->formatOut == "jpeg" || (baton->formatOut == "input" && inputImageType == sharp::ImageType::JPEG)) { // Write JPEG to buffer - sharp::AssertImageTypeDimensions(image, ImageType::JPEG); - VipsArea *area = VIPS_AREA(image.jpegsave_buffer(VImage::option() - ->set("strip", !baton->withMetadata) + sharp::AssertImageTypeDimensions(image, sharp::ImageType::JPEG); + VipsArea *area = reinterpret_cast<VipsArea *>(image.jpegsave_buffer(VImage::option() + ->set("keep", baton->keepMetadata) ->set("Q", baton->jpegQuality) ->set("interlace", baton->jpegProgressive) - ->set("no_subsample", baton->jpegChromaSubsampling == "4:4:4") + ->set("subsample_mode", baton->jpegChromaSubsampling == "4:4:4" + ? VIPS_FOREIGN_SUBSAMPLE_OFF + : VIPS_FOREIGN_SUBSAMPLE_ON) ->set("trellis_quant", baton->jpegTrellisQuantisation) + ->set("quant_table", baton->jpegQuantisationTable) ->set("overshoot_deringing", baton->jpegOvershootDeringing) ->set("optimize_scans", baton->jpegOptimiseScans) - ->set("optimize_coding", TRUE))); + ->set("optimize_coding", baton->jpegOptimiseCoding))); baton->bufferOut = static_cast<char*>(area->data); baton->bufferOutLength = area->length; area->free_fn = nullptr; @@ -766,69 +907,171 @@ class PipelineWorker : public Nan::AsyncWorker { } else { baton->channels = std::min(baton->channels, 3); } + } else if (baton->formatOut == "jp2" || (baton->formatOut == "input" + && inputImageType == sharp::ImageType::JP2)) { + // Write JP2 to Buffer + sharp::AssertImageTypeDimensions(image, sharp::ImageType::JP2); + VipsArea *area = reinterpret_cast<VipsArea *>(image.jp2ksave_buffer(VImage::option() + ->set("Q", baton->jp2Quality) + ->set("lossless", baton->jp2Lossless) + ->set("subsample_mode", baton->jp2ChromaSubsampling == "4:4:4" + ?
VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON) + ->set("tile_height", baton->jp2TileHeight) + ->set("tile_width", baton->jp2TileWidth))); + baton->bufferOut = static_cast<char *>(area->data); + baton->bufferOutLength = area->length; + area->free_fn = nullptr; + vips_area_unref(area); + baton->formatOut = "jp2"; } else if (baton->formatOut == "png" || (baton->formatOut == "input" && - (inputImageType == ImageType::PNG || inputImageType == ImageType::GIF || inputImageType == ImageType::SVG))) { + (inputImageType == sharp::ImageType::PNG || inputImageType == sharp::ImageType::SVG))) { // Write PNG to buffer - sharp::AssertImageTypeDimensions(image, ImageType::PNG); - // Strip profile - if (!baton->withMetadata) { - vips_image_remove(image.get_image(), VIPS_META_ICC_NAME); - } - VipsArea *area = VIPS_AREA(image.pngsave_buffer(VImage::option() + sharp::AssertImageTypeDimensions(image, sharp::ImageType::PNG); + VipsArea *area = reinterpret_cast<VipsArea *>(image.pngsave_buffer(VImage::option() + ->set("keep", baton->keepMetadata) ->set("interlace", baton->pngProgressive) ->set("compression", baton->pngCompressionLevel) - ->set("filter", baton->pngAdaptiveFiltering ? VIPS_FOREIGN_PNG_FILTER_ALL : VIPS_FOREIGN_PNG_FILTER_NONE))); + ->set("filter", baton->pngAdaptiveFiltering ? VIPS_FOREIGN_PNG_FILTER_ALL : VIPS_FOREIGN_PNG_FILTER_NONE) + ->set("palette", baton->pngPalette) + ->set("Q", baton->pngQuality) + ->set("effort", baton->pngEffort) + ->set("bitdepth", sharp::Is16Bit(image.interpretation()) ?
16 : baton->pngBitdepth) + ->set("dither", baton->pngDither))); baton->bufferOut = static_cast<char *>(area->data); baton->bufferOutLength = area->length; area->free_fn = nullptr; vips_area_unref(area); baton->formatOut = "png"; - } else if (baton->formatOut == "webp" || (baton->formatOut == "input" && inputImageType == ImageType::WEBP)) { + } else if (baton->formatOut == "webp" || + (baton->formatOut == "input" && inputImageType == sharp::ImageType::WEBP)) { // Write WEBP to buffer - sharp::AssertImageTypeDimensions(image, ImageType::WEBP); - VipsArea *area = VIPS_AREA(image.webpsave_buffer(VImage::option() - ->set("strip", !baton->withMetadata) + sharp::AssertImageTypeDimensions(image, sharp::ImageType::WEBP); + VipsArea *area = reinterpret_cast<VipsArea *>(image.webpsave_buffer(VImage::option() + ->set("keep", baton->keepMetadata) ->set("Q", baton->webpQuality) ->set("lossless", baton->webpLossless) ->set("near_lossless", baton->webpNearLossless) + ->set("smart_subsample", baton->webpSmartSubsample) + ->set("smart_deblock", baton->webpSmartDeblock) + ->set("preset", baton->webpPreset) + ->set("effort", baton->webpEffort) + ->set("min_size", baton->webpMinSize) + ->set("mixed", baton->webpMixed) ->set("alpha_q", baton->webpAlphaQuality))); baton->bufferOut = static_cast<char *>(area->data); baton->bufferOutLength = area->length; area->free_fn = nullptr; vips_area_unref(area); baton->formatOut = "webp"; - } else if (baton->formatOut == "tiff" || (baton->formatOut == "input" && inputImageType == ImageType::TIFF)) { + } else if (baton->formatOut == "gif" || + (baton->formatOut == "input" && inputImageType == sharp::ImageType::GIF)) { + // Write GIF to buffer + sharp::AssertImageTypeDimensions(image, sharp::ImageType::GIF); + VipsArea *area = reinterpret_cast<VipsArea *>(image.gifsave_buffer(VImage::option() + ->set("keep", baton->keepMetadata) + ->set("bitdepth", baton->gifBitdepth) + ->set("effort", baton->gifEffort) + ->set("reuse", baton->gifReuse) + ->set("interlace", baton->gifProgressive) +
->set("interframe_maxerror", baton->gifInterFrameMaxError) + ->set("interpalette_maxerror", baton->gifInterPaletteMaxError) + ->set("keep_duplicate_frames", baton->gifKeepDuplicateFrames) + ->set("dither", baton->gifDither))); + baton->bufferOut = static_cast<char *>(area->data); + baton->bufferOutLength = area->length; + area->free_fn = nullptr; + vips_area_unref(area); + baton->formatOut = "gif"; + } else if (baton->formatOut == "tiff" || + (baton->formatOut == "input" && inputImageType == sharp::ImageType::TIFF)) { // Write TIFF to buffer if (baton->tiffCompression == VIPS_FOREIGN_TIFF_COMPRESSION_JPEG) { - sharp::AssertImageTypeDimensions(image, ImageType::JPEG); + sharp::AssertImageTypeDimensions(image, sharp::ImageType::JPEG); + baton->channels = std::min(baton->channels, 3); } // Cast pixel values to float, if required if (baton->tiffPredictor == VIPS_FOREIGN_TIFF_PREDICTOR_FLOAT) { image = image.cast(VIPS_FORMAT_FLOAT); } - VipsArea *area = VIPS_AREA(image.tiffsave_buffer(VImage::option() - ->set("strip", !baton->withMetadata) + VipsArea *area = reinterpret_cast<VipsArea *>(image.tiffsave_buffer(VImage::option() + ->set("keep", baton->keepMetadata) ->set("Q", baton->tiffQuality) - ->set("squash", baton->tiffSquash) + ->set("bitdepth", baton->tiffBitdepth) ->set("compression", baton->tiffCompression) + ->set("bigtiff", baton->tiffBigtiff) + ->set("miniswhite", baton->tiffMiniswhite) ->set("predictor", baton->tiffPredictor) + ->set("pyramid", baton->tiffPyramid) + ->set("tile", baton->tiffTile) + ->set("tile_height", baton->tiffTileHeight) + ->set("tile_width", baton->tiffTileWidth) ->set("xres", baton->tiffXres) - ->set("yres", baton->tiffYres))); + ->set("yres", baton->tiffYres) + ->set("resunit", baton->tiffResolutionUnit))); baton->bufferOut = static_cast<char *>(area->data); baton->bufferOutLength = area->length; area->free_fn = nullptr; vips_area_unref(area); baton->formatOut = "tiff"; - baton->channels = std::min(baton->channels, 3); - } else if (baton->formatOut == "raw" ||
(baton->formatOut == "input" && inputImageType == ImageType::RAW)) { + } else if (baton->formatOut == "heif" || + (baton->formatOut == "input" && inputImageType == sharp::ImageType::HEIF)) { + // Write HEIF to buffer + sharp::AssertImageTypeDimensions(image, sharp::ImageType::HEIF); + image = sharp::RemoveAnimationProperties(image); + VipsArea *area = reinterpret_cast<VipsArea *>(image.heifsave_buffer(VImage::option() + ->set("keep", baton->keepMetadata) + ->set("Q", baton->heifQuality) + ->set("compression", baton->heifCompression) + ->set("effort", baton->heifEffort) + ->set("bitdepth", baton->heifBitdepth) + ->set("subsample_mode", baton->heifChromaSubsampling == "4:4:4" + ? VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON) + ->set("lossless", baton->heifLossless))); + baton->bufferOut = static_cast<char *>(area->data); + baton->bufferOutLength = area->length; + area->free_fn = nullptr; + vips_area_unref(area); + baton->formatOut = "heif"; + } else if (baton->formatOut == "dz") { + // Write DZ to buffer + baton->tileContainer = VIPS_FOREIGN_DZ_CONTAINER_ZIP; + if (!image.has_alpha()) { + baton->tileBackground.pop_back(); + } + image = sharp::StaySequential(image, baton->tileAngle != 0); + vips::VOption *options = BuildOptionsDZ(baton); + VipsArea *area = reinterpret_cast<VipsArea *>(image.dzsave_buffer(options)); + baton->bufferOut = static_cast<char *>(area->data); + baton->bufferOutLength = area->length; + area->free_fn = nullptr; + vips_area_unref(area); + baton->formatOut = "dz"; + } else if (baton->formatOut == "jxl" || + (baton->formatOut == "input" && inputImageType == sharp::ImageType::JXL)) { + // Write JXL to buffer + image = sharp::RemoveAnimationProperties(image); + VipsArea *area = reinterpret_cast<VipsArea *>(image.jxlsave_buffer(VImage::option() + ->set("keep", baton->keepMetadata) + ->set("distance", baton->jxlDistance) + ->set("tier", baton->jxlDecodingTier) + ->set("effort", baton->jxlEffort) + ->set("lossless", baton->jxlLossless))); + baton->bufferOut = static_cast<char *>(area->data); +
baton->bufferOutLength = area->length; + area->free_fn = nullptr; + vips_area_unref(area); + baton->formatOut = "jxl"; + } else if (baton->formatOut == "raw" || + (baton->formatOut == "input" && inputImageType == sharp::ImageType::RAW)) { // Write raw, uncompressed image data to buffer if (baton->greyscale || image.interpretation() == VIPS_INTERPRETATION_B_W) { // Extract first band for greyscale image image = image[0]; + baton->channels = 1; } - if (image.format() != VIPS_FORMAT_UCHAR) { - // Cast pixels to uint8 (unsigned char) - image = image.cast(VIPS_FORMAT_UCHAR); + if (image.format() != baton->rawDepth) { + // Cast pixels to requested format + image = image.cast(baton->rawDepth); } // Get raw image data baton->bufferOut = static_cast<char *>(image.write_to_memory(&baton->bufferOutLength)); @@ -841,6 +1084,7 @@ class PipelineWorker : public Nan::AsyncWorker { // Unsupported output format (baton->err).append("Unsupported output format "); if (baton->formatOut == "input") { + (baton->err).append("when trying to match input format of "); (baton->err).append(ImageTypeId(inputImageType)); } else { (baton->err).append(baton->formatOut); } @@ -852,117 +1096,165 @@ class PipelineWorker : public Nan::AsyncWorker { bool const isJpeg = sharp::IsJpeg(baton->fileOut); bool const isPng = sharp::IsPng(baton->fileOut); bool const isWebp = sharp::IsWebp(baton->fileOut); + bool const isGif = sharp::IsGif(baton->fileOut); bool const isTiff = sharp::IsTiff(baton->fileOut); + bool const isJp2 = sharp::IsJp2(baton->fileOut); + bool const isHeif = sharp::IsHeif(baton->fileOut); + bool const isJxl = sharp::IsJxl(baton->fileOut); bool const isDz = sharp::IsDz(baton->fileOut); bool const isDzZip = sharp::IsDzZip(baton->fileOut); bool const isV = sharp::IsV(baton->fileOut); - bool const matchInput = baton->formatOut == "input" && - !(isJpeg || isPng || isWebp || isTiff || isDz || isDzZip || isV); - if (baton->formatOut == "jpeg" || isJpeg || (matchInput && inputImageType == ImageType::JPEG)) { +
bool const mightMatchInput = baton->formatOut == "input"; + bool const willMatchInput = mightMatchInput && + !(isJpeg || isPng || isWebp || isGif || isTiff || isJp2 || isHeif || isDz || isDzZip || isV); + + if (baton->formatOut == "jpeg" || (mightMatchInput && isJpeg) || + (willMatchInput && inputImageType == sharp::ImageType::JPEG)) { // Write JPEG to file - sharp::AssertImageTypeDimensions(image, ImageType::JPEG); + sharp::AssertImageTypeDimensions(image, sharp::ImageType::JPEG); image.jpegsave(const_cast<char *>(baton->fileOut.data()), VImage::option() - ->set("strip", !baton->withMetadata) + ->set("keep", baton->keepMetadata) ->set("Q", baton->jpegQuality) ->set("interlace", baton->jpegProgressive) - ->set("no_subsample", baton->jpegChromaSubsampling == "4:4:4") + ->set("subsample_mode", baton->jpegChromaSubsampling == "4:4:4" + ? VIPS_FOREIGN_SUBSAMPLE_OFF + : VIPS_FOREIGN_SUBSAMPLE_ON) ->set("trellis_quant", baton->jpegTrellisQuantisation) + ->set("quant_table", baton->jpegQuantisationTable) ->set("overshoot_deringing", baton->jpegOvershootDeringing) ->set("optimize_scans", baton->jpegOptimiseScans) - ->set("optimize_coding", TRUE)); + ->set("optimize_coding", baton->jpegOptimiseCoding)); baton->formatOut = "jpeg"; baton->channels = std::min(baton->channels, 3); - } else if (baton->formatOut == "png" || isPng || (matchInput && - (inputImageType == ImageType::PNG || inputImageType == ImageType::GIF || inputImageType == ImageType::SVG))) { + } else if (baton->formatOut == "jp2" || (mightMatchInput && isJp2) || + (willMatchInput && (inputImageType == sharp::ImageType::JP2))) { + // Write JP2 to file + sharp::AssertImageTypeDimensions(image, sharp::ImageType::JP2); + image.jp2ksave(const_cast<char *>(baton->fileOut.data()), VImage::option() + ->set("Q", baton->jp2Quality) + ->set("lossless", baton->jp2Lossless) + ->set("subsample_mode", baton->jp2ChromaSubsampling == "4:4:4" + ?
VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON) + ->set("tile_height", baton->jp2TileHeight) + ->set("tile_width", baton->jp2TileWidth)); + baton->formatOut = "jp2"; + } else if (baton->formatOut == "png" || (mightMatchInput && isPng) || (willMatchInput && + (inputImageType == sharp::ImageType::PNG || inputImageType == sharp::ImageType::SVG))) { // Write PNG to file - sharp::AssertImageTypeDimensions(image, ImageType::PNG); - // Strip profile - if (!baton->withMetadata) { - vips_image_remove(image.get_image(), VIPS_META_ICC_NAME); - } + sharp::AssertImageTypeDimensions(image, sharp::ImageType::PNG); image.pngsave(const_cast<char *>(baton->fileOut.data()), VImage::option() + ->set("keep", baton->keepMetadata) ->set("interlace", baton->pngProgressive) ->set("compression", baton->pngCompressionLevel) - ->set("filter", baton->pngAdaptiveFiltering ? VIPS_FOREIGN_PNG_FILTER_ALL : VIPS_FOREIGN_PNG_FILTER_NONE)); + ->set("filter", baton->pngAdaptiveFiltering ? VIPS_FOREIGN_PNG_FILTER_ALL : VIPS_FOREIGN_PNG_FILTER_NONE) + ->set("palette", baton->pngPalette) + ->set("Q", baton->pngQuality) + ->set("bitdepth", sharp::Is16Bit(image.interpretation()) ?
16 : baton->pngBitdepth) + ->set("effort", baton->pngEffort) + ->set("dither", baton->pngDither)); baton->formatOut = "png"; - } else if (baton->formatOut == "webp" || isWebp || (matchInput && inputImageType == ImageType::WEBP)) { + } else if (baton->formatOut == "webp" || (mightMatchInput && isWebp) || + (willMatchInput && inputImageType == sharp::ImageType::WEBP)) { // Write WEBP to file - AssertImageTypeDimensions(image, ImageType::WEBP); + sharp::AssertImageTypeDimensions(image, sharp::ImageType::WEBP); image.webpsave(const_cast<char *>(baton->fileOut.data()), VImage::option() - ->set("strip", !baton->withMetadata) + ->set("keep", baton->keepMetadata) ->set("Q", baton->webpQuality) ->set("lossless", baton->webpLossless) ->set("near_lossless", baton->webpNearLossless) + ->set("smart_subsample", baton->webpSmartSubsample) + ->set("smart_deblock", baton->webpSmartDeblock) + ->set("preset", baton->webpPreset) + ->set("effort", baton->webpEffort) + ->set("min_size", baton->webpMinSize) + ->set("mixed", baton->webpMixed) ->set("alpha_q", baton->webpAlphaQuality)); baton->formatOut = "webp"; - } else if (baton->formatOut == "tiff" || isTiff || (matchInput && inputImageType == ImageType::TIFF)) { + } else if (baton->formatOut == "gif" || (mightMatchInput && isGif) || + (willMatchInput && inputImageType == sharp::ImageType::GIF)) { + // Write GIF to file + sharp::AssertImageTypeDimensions(image, sharp::ImageType::GIF); + image.gifsave(const_cast<char *>(baton->fileOut.data()), VImage::option() + ->set("keep", baton->keepMetadata) + ->set("bitdepth", baton->gifBitdepth) + ->set("effort", baton->gifEffort) + ->set("reuse", baton->gifReuse) + ->set("interlace", baton->gifProgressive) + ->set("interframe_maxerror", baton->gifInterFrameMaxError) + ->set("interpalette_maxerror", baton->gifInterPaletteMaxError) + ->set("keep_duplicate_frames", baton->gifKeepDuplicateFrames) + ->set("dither", baton->gifDither)); + baton->formatOut = "gif"; + } else if (baton->formatOut == "tiff" ||
(mightMatchInput && isTiff) || + (willMatchInput && inputImageType == sharp::ImageType::TIFF)) { // Write TIFF to file if (baton->tiffCompression == VIPS_FOREIGN_TIFF_COMPRESSION_JPEG) { - sharp::AssertImageTypeDimensions(image, ImageType::JPEG); + sharp::AssertImageTypeDimensions(image, sharp::ImageType::JPEG); + baton->channels = std::min(baton->channels, 3); } // Cast pixel values to float, if required if (baton->tiffPredictor == VIPS_FOREIGN_TIFF_PREDICTOR_FLOAT) { image = image.cast(VIPS_FORMAT_FLOAT); } image.tiffsave(const_cast<char *>(baton->fileOut.data()), VImage::option() - ->set("strip", !baton->withMetadata) + ->set("keep", baton->keepMetadata) ->set("Q", baton->tiffQuality) - ->set("squash", baton->tiffSquash) + ->set("bitdepth", baton->tiffBitdepth) ->set("compression", baton->tiffCompression) + ->set("bigtiff", baton->tiffBigtiff) + ->set("miniswhite", baton->tiffMiniswhite) ->set("predictor", baton->tiffPredictor) + ->set("pyramid", baton->tiffPyramid) + ->set("tile", baton->tiffTile) + ->set("tile_height", baton->tiffTileHeight) + ->set("tile_width", baton->tiffTileWidth) ->set("xres", baton->tiffXres) - ->set("yres", baton->tiffYres)); + ->set("yres", baton->tiffYres) + ->set("resunit", baton->tiffResolutionUnit)); baton->formatOut = "tiff"; - baton->channels = std::min(baton->channels, 3); + } else if (baton->formatOut == "heif" || (mightMatchInput && isHeif) || + (willMatchInput && inputImageType == sharp::ImageType::HEIF)) { + // Write HEIF to file + sharp::AssertImageTypeDimensions(image, sharp::ImageType::HEIF); + image = sharp::RemoveAnimationProperties(image); + image.heifsave(const_cast<char *>(baton->fileOut.data()), VImage::option() + ->set("keep", baton->keepMetadata) + ->set("Q", baton->heifQuality) + ->set("compression", baton->heifCompression) + ->set("effort", baton->heifEffort) + ->set("bitdepth", baton->heifBitdepth) + ->set("subsample_mode", baton->heifChromaSubsampling == "4:4:4" + ?
VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON) + ->set("lossless", baton->heifLossless)); + baton->formatOut = "heif"; + } else if (baton->formatOut == "jxl" || (mightMatchInput && isJxl) || + (willMatchInput && inputImageType == sharp::ImageType::JXL)) { + // Write JXL to file + image = sharp::RemoveAnimationProperties(image); + image.jxlsave(const_cast<char *>(baton->fileOut.data()), VImage::option() + ->set("keep", baton->keepMetadata) + ->set("distance", baton->jxlDistance) + ->set("tier", baton->jxlDecodingTier) + ->set("effort", baton->jxlEffort) + ->set("lossless", baton->jxlLossless)); + baton->formatOut = "jxl"; } else if (baton->formatOut == "dz" || isDz || isDzZip) { + // Write DZ to file if (isDzZip) { baton->tileContainer = VIPS_FOREIGN_DZ_CONTAINER_ZIP; } - // Forward format options through suffix - std::string suffix; - if (baton->tileFormat == "png") { - std::vector<std::pair<std::string, std::string>> options { - {"interlace", baton->pngProgressive ? "TRUE" : "FALSE"}, - {"compression", std::to_string(baton->pngCompressionLevel)}, - {"filter", baton->pngAdaptiveFiltering ? "all" : "none"} - }; - suffix = AssembleSuffixString(".png", options); - } else if (baton->tileFormat == "webp") { - std::vector<std::pair<std::string, std::string>> options { - {"Q", std::to_string(baton->webpQuality)}, - {"alpha_q", std::to_string(baton->webpAlphaQuality)}, - {"lossless", baton->webpLossless ? "TRUE" : "FALSE"}, - {"near_lossless", baton->webpNearLossless ? "TRUE" : "FALSE"} - }; - suffix = AssembleSuffixString(".webp", options); - } else { - std::string extname = baton->tileLayout == VIPS_FOREIGN_DZ_LAYOUT_GOOGLE - || baton->tileLayout == VIPS_FOREIGN_DZ_LAYOUT_ZOOMIFY - ? ".jpg" : ".jpeg"; - std::vector<std::pair<std::string, std::string>> options { - {"Q", std::to_string(baton->jpegQuality)}, - {"interlace", baton->jpegProgressive ? "TRUE" : "FALSE"}, - {"no_subsample", baton->jpegChromaSubsampling == "4:4:4" ? "TRUE": "FALSE"}, - {"trellis_quant", baton->jpegTrellisQuantisation ? "TRUE" : "FALSE"}, - {"overshoot_deringing", baton->jpegOvershootDeringing ?
"TRUE": "FALSE"}, - {"optimize_scans", baton->jpegOptimiseScans ? "TRUE": "FALSE"}, - {"optimize_coding", "TRUE"} - }; - suffix = AssembleSuffixString(extname, options); + if (!image.has_alpha()) { + baton->tileBackground.pop_back(); } - // Write DZ to file - image.dzsave(const_cast<char *>(baton->fileOut.data()), VImage::option() - ->set("strip", !baton->withMetadata) - ->set("tile_size", baton->tileSize) - ->set("overlap", baton->tileOverlap) - ->set("container", baton->tileContainer) - ->set("layout", baton->tileLayout) - ->set("suffix", const_cast<char *>(suffix.data()))); + image = sharp::StaySequential(image, baton->tileAngle != 0); + vips::VOption *options = BuildOptionsDZ(baton); + image.dzsave(const_cast<char *>(baton->fileOut.data()), options); baton->formatOut = "dz"; - } else if (baton->formatOut == "v" || isV || (matchInput && inputImageType == ImageType::VIPS)) { + } else if (baton->formatOut == "v" || (mightMatchInput && isV) || + (willMatchInput && inputImageType == sharp::ImageType::VIPS)) { // Write V to file image.vipssave(const_cast<char *>(baton->fileOut.data()), VImage::option() - ->set("strip", !baton->withMetadata)); + ->set("keep", baton->keepMetadata)); baton->formatOut = "v"; } else { // Unsupported output format @@ -971,23 +1263,39 @@ class PipelineWorker : public Nan::AsyncWorker { } } } catch (vips::VError const &err) { - (baton->err).append(err.what()); + char const *what = err.what(); + if (what && what[0]) { + (baton->err).append(what); + } else { + if (baton->input->failOn == VIPS_FAIL_ON_WARNING) { + (baton->err).append("Warning treated as error due to failOn setting"); + baton->errUseWarning = true; + } else { + (baton->err).append("Unknown error"); + } + } } // Clean up libvips' per-request data and threads vips_error_clear(); vips_thread_shutdown(); } - void HandleOKCallback() { - using Nan::New; - using Nan::Set; - Nan::HandleScope(); + void OnOK() { + Napi::Env env = Env(); + Napi::HandleScope scope(env); - v8::Local<v8::Value> argv[3] = { Nan::Null(), Nan::Null(),
Nan::Null() }; - if (!baton->err.empty()) { - // Error - argv[0] = Nan::Error(baton->err.data()); - } else { + // Handle warnings + std::string warning = sharp::VipsWarningPop(); + while (!warning.empty()) { + if (baton->errUseWarning) { + (baton->err).append("\n").append(warning); + } else { + debuglog.Call(Receiver().Value(), { Napi::String::New(env, warning) }); + } + warning = sharp::VipsWarningPop(); + } + + if (baton->err.empty()) { int width = baton->width; int height = baton->height; if (baton->topOffsetPre != -1 && (baton->width == -1 || baton->height == -1)) { @@ -999,93 +1307,107 @@ class PipelineWorker : public Nan::AsyncWorker { height = baton->heightPost; } // Info Object - v8::Local<v8::Object> info = New<v8::Object>(); - Set(info, New("format").ToLocalChecked(), New(baton->formatOut).ToLocalChecked()); - Set(info, New("width").ToLocalChecked(), New(static_cast<uint32_t>(width))); - Set(info, New("height").ToLocalChecked(), New(static_cast<uint32_t>(height))); - Set(info, New("channels").ToLocalChecked(), New(static_cast<uint32_t>(baton->channels))); - Set(info, New("premultiplied").ToLocalChecked(), New(baton->premultiplied)); - if (baton->cropCalcLeft != -1 && baton->cropCalcTop != -1) { - Set(info, New("cropCalcLeft").ToLocalChecked(), New(static_cast<int32_t>(baton->cropCalcLeft))); - Set(info, New("cropCalcTop").ToLocalChecked(), New(static_cast<int32_t>(baton->cropCalcTop))); + Napi::Object info = Napi::Object::New(env); + info.Set("format", baton->formatOut); + info.Set("width", static_cast<uint32_t>(width)); + info.Set("height", static_cast<uint32_t>(height)); + info.Set("channels", static_cast<uint32_t>(baton->channels)); + if (baton->formatOut == "raw") { + info.Set("depth", vips_enum_nick(VIPS_TYPE_BAND_FORMAT, baton->rawDepth)); + } + info.Set("premultiplied", baton->premultiplied); + if (baton->hasCropOffset) { + info.Set("cropOffsetLeft", static_cast<int32_t>(baton->cropOffsetLeft)); + info.Set("cropOffsetTop", static_cast<int32_t>(baton->cropOffsetTop)); + } + if (baton->hasAttentionCenter) { + info.Set("attentionX", static_cast<int32_t>(baton->attentionX)); +
info.Set("attentionY", static_cast<int32_t>(baton->attentionY)); + } + if (baton->trimThreshold >= 0.0) { + info.Set("trimOffsetLeft", static_cast<int32_t>(baton->trimOffsetLeft)); + info.Set("trimOffsetTop", static_cast<int32_t>(baton->trimOffsetTop)); + } + if (baton->input->textAutofitDpi) { + info.Set("textAutofitDpi", static_cast<uint32_t>(baton->input->textAutofitDpi)); + } + if (baton->pageHeightOut) { + info.Set("pageHeight", static_cast<uint32_t>(baton->pageHeightOut)); + info.Set("pages", static_cast<uint32_t>(baton->pagesOut)); } if (baton->bufferOutLength > 0) { - // Pass ownership of output data to Buffer instance - argv[1] = Nan::NewBuffer( - static_cast<char *>(baton->bufferOut), baton->bufferOutLength, sharp::FreeCallback, nullptr) - .ToLocalChecked(); // Add buffer size to info - Set(info, New("size").ToLocalChecked(), New(static_cast<uint32_t>(baton->bufferOutLength))); - argv[2] = info; + info.Set("size", static_cast<uint32_t>(baton->bufferOutLength)); + // Pass ownership of output data to Buffer instance + Napi::Buffer<char> data = Napi::Buffer<char>::NewOrCopy(env, static_cast<char *>(baton->bufferOut), + baton->bufferOutLength, sharp::FreeCallback); + Callback().Call(Receiver().Value(), { env.Null(), data, info }); } else { // Add file size to info - GStatBuf st; - if (g_stat(baton->fileOut.data(), &st) == 0) { - Set(info, New("size").ToLocalChecked(), New(static_cast<uint32_t>(st.st_size))); + if (baton->formatOut != "dz" || sharp::IsDzZip(baton->fileOut)) { + try { + uint32_t const size = static_cast<uint32_t>( + std::filesystem::file_size(std::filesystem::u8path(baton->fileOut))); + info.Set("size", size); + } catch (...)
{} } - argv[1] = info; + Callback().Call(Receiver().Value(), { env.Null(), info }); } + } else { + Callback().Call(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() }); } - // Dispose of Persistent wrapper around input Buffers so they can be garbage collected - std::accumulate(buffersToPersist.begin(), buffersToPersist.end(), 0, - [this](uint32_t index, v8::Local<v8::Object> const buffer) -> uint32_t { - GetFromPersistent(index); - return index + 1; - }); + // Delete baton delete baton->input; - delete baton->overlay; delete baton->boolean; - for_each(baton->joinChannelIn.begin(), baton->joinChannelIn.end(), - [this](sharp::InputDescriptor *joinChannelIn) { - delete joinChannelIn; - }); - delete baton; - - // Handle warnings - std::string warning = sharp::VipsWarningPop(); - while (!warning.empty()) { - v8::Local<v8::Value> message[1] = { New(warning).ToLocalChecked() }; - debuglog->Call(1, message); - warning = sharp::VipsWarningPop(); + for (Composite *composite : baton->composite) { + delete composite->input; + delete composite; + } + for (sharp::InputDescriptor *input : baton->joinChannelIn) { + delete input; + } + for (sharp::InputDescriptor *input : baton->join) { + delete input; + } + delete baton; // Decrement processing task counter - g_atomic_int_dec_and_test(&sharp::counterProcess); - v8::Local<v8::Value> queueLength[1] = { New(sharp::counterQueue) }; - queueListener->Call(1, queueLength); - delete queueListener; - - // Return to JavaScript - callback->Call(3, argv); + sharp::counterProcess--; + Napi::Number queueLength = Napi::Number::New(env, static_cast<double>(sharp::counterQueue)); + queueListener.Call(Receiver().Value(), { queueLength }); } private: PipelineBaton *baton; - Nan::Callback *debuglog; - Nan::Callback *queueListener; - std::vector<v8::Local<v8::Object>> buffersToPersist; + Napi::FunctionReference debuglog; + Napi::FunctionReference queueListener; + + void MultiPageUnsupported(int const pages, std::string op) { + if (pages > 1) { + throw vips::VError(op + " is not
supported for multi-page images"); + } + } /* Calculate the angle of rotation and need-to-flip for the given Exif orientation By default, returns zero, i.e. no rotation. */ - std::tuple<VipsAngle, bool, bool> - CalculateExifRotationAndFlip(int const exifOrientation) { + std::tuple<VipsAngle, bool> + CalculateExifRotationAndFlop(int const exifOrientation) { VipsAngle rotate = VIPS_ANGLE_D0; - bool flip = FALSE; - bool flop = FALSE; + bool flop = false; switch (exifOrientation) { case 6: rotate = VIPS_ANGLE_D90; break; case 3: rotate = VIPS_ANGLE_D180; break; case 8: rotate = VIPS_ANGLE_D270; break; - case 2: flop = TRUE; break; // flop 1 - case 7: flip = TRUE; rotate = VIPS_ANGLE_D90; break; // flip 6 - case 4: flop = TRUE; rotate = VIPS_ANGLE_D180; break; // flop 3 - case 5: flip = TRUE; rotate = VIPS_ANGLE_D270; break; // flip 8 + case 2: flop = true; break; + case 7: flop = true; rotate = VIPS_ANGLE_D270; break; + case 4: flop = true; rotate = VIPS_ANGLE_D180; break; + case 5: flop = true; rotate = VIPS_ANGLE_D90; break; } - return std::make_tuple(rotate, flip, flop); + return std::make_tuple(rotate, flop); } /* @@ -1107,21 +1429,84 @@ class PipelineWorker : public Nan::AsyncWorker { /* Assemble the suffix argument to dzsave, which is the format (by extname) - alongisde comma-separated arguments to the corresponding `formatsave` vips + alongside comma-separated arguments to the corresponding `formatsave` vips action.
*/ std::string AssembleSuffixString(std::string extname, std::vector<std::pair<std::string, std::string>> options) { std::string argument; - for (auto const &option : options) { + for (const auto& [key, value] : options) { if (!argument.empty()) { argument += ","; } - argument += option.first + "=" + option.second; + argument += key + "=" + value; } return extname + "[" + argument + "]"; } + /* + Build VOption for dzsave + */ + vips::VOption* + BuildOptionsDZ(PipelineBaton *baton) { + // Forward format options through suffix + std::string suffix; + if (baton->tileFormat == "png") { + std::vector<std::pair<std::string, std::string>> options { + {"interlace", baton->pngProgressive ? "true" : "false"}, + {"compression", std::to_string(baton->pngCompressionLevel)}, + {"filter", baton->pngAdaptiveFiltering ? "all" : "none"} + }; + suffix = AssembleSuffixString(".png", options); + } else if (baton->tileFormat == "webp") { + std::vector<std::pair<std::string, std::string>> options { + {"Q", std::to_string(baton->webpQuality)}, + {"alpha_q", std::to_string(baton->webpAlphaQuality)}, + {"lossless", baton->webpLossless ? "true" : "false"}, + {"near_lossless", baton->webpNearLossless ? "true" : "false"}, + {"smart_subsample", baton->webpSmartSubsample ? "true" : "false"}, + {"smart_deblock", baton->webpSmartDeblock ? "true" : "false"}, + {"preset", vips_enum_nick(VIPS_TYPE_FOREIGN_WEBP_PRESET, baton->webpPreset)}, + {"min_size", baton->webpMinSize ? "true" : "false"}, + {"mixed", baton->webpMixed ? "true" : "false"}, + {"effort", std::to_string(baton->webpEffort)} + }; + suffix = AssembleSuffixString(".webp", options); + } else { + std::vector<std::pair<std::string, std::string>> options { + {"Q", std::to_string(baton->jpegQuality)}, + {"interlace", baton->jpegProgressive ? "true" : "false"}, + {"subsample_mode", baton->jpegChromaSubsampling == "4:4:4" ? "off" : "on"}, + {"trellis_quant", baton->jpegTrellisQuantisation ? "true" : "false"}, + {"quant_table", std::to_string(baton->jpegQuantisationTable)}, + {"overshoot_deringing", baton->jpegOvershootDeringing ?
"true": "false"}, + {"optimize_scans", baton->jpegOptimiseScans ? "true": "false"}, + {"optimize_coding", baton->jpegOptimiseCoding ? "true": "false"} + }; + std::string extname = baton->tileLayout == VIPS_FOREIGN_DZ_LAYOUT_DZ ? ".jpeg" : ".jpg"; + suffix = AssembleSuffixString(extname, options); + } + vips::VOption *options = VImage::option() + ->set("keep", baton->keepMetadata) + ->set("tile_size", baton->tileSize) + ->set("overlap", baton->tileOverlap) + ->set("container", baton->tileContainer) + ->set("layout", baton->tileLayout) + ->set("suffix", const_cast<char *>(suffix.data())) + ->set("angle", CalculateAngleRotation(baton->tileAngle)) + ->set("background", baton->tileBackground) + ->set("centre", baton->tileCentre) + ->set("id", const_cast<char *>(baton->tileId.data())) + ->set("skip_blanks", baton->tileSkipBlanks); + if (baton->tileDepth < VIPS_FOREIGN_DZ_DEPTH_LAST) { + options->set("depth", baton->tileDepth); + } + if (!baton->tileBasename.empty()) { + options->set("basename", const_cast<char *>(baton->tileBasename.data())); + } + return options; + } + /* Clear all thread-local data.
*/ @@ -1135,202 +1520,295 @@ class PipelineWorker : public Nan::AsyncWorker { /* pipeline(options, output, callback) */ -NAN_METHOD(pipeline) { - using sharp::HasAttr; - using sharp::AttrTo; - using sharp::AttrAs; - using sharp::AttrAsStr; - using sharp::CreateInputDescriptor; - - // Input Buffers must not undergo GC compaction during processing - std::vector<v8::Local<v8::Object>> buffersToPersist; - +Napi::Value pipeline(const Napi::CallbackInfo& info) { // V8 objects are converted to non-V8 types held in the baton struct PipelineBaton *baton = new PipelineBaton; - v8::Local<v8::Object> options = info[0].As<v8::Object>(); + Napi::Object options = info[size_t(0)].As<Napi::Object>(); // Input - baton->input = CreateInputDescriptor(AttrAs<v8::Object>(options, "input"), buffersToPersist); - - // ICC profile to use when input CMYK image has no embedded profile - baton->iccProfilePath = AttrAsStr(options, "iccProfilePath"); - baton->accessMethod = AttrTo<bool>(options, "sequentialRead") ? - VIPS_ACCESS_SEQUENTIAL : VIPS_ACCESS_RANDOM; - // Limit input images to a given number of pixels, where pixels = width * height - baton->limitInputPixels = AttrTo<int32_t>(options, "limitInputPixels"); + baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>()); + // Join images together + if (sharp::HasAttr(options, "join")) { + Napi::Array join = options.Get("join").As<Napi::Array>(); + for (unsigned int i = 0; i < join.Length(); i++) { + baton->join.push_back( + sharp::CreateInputDescriptor(join.Get(i).As<Napi::Object>())); + } + } // Extract image options - baton->topOffsetPre = AttrTo<int32_t>(options, "topOffsetPre"); - baton->leftOffsetPre = AttrTo<int32_t>(options, "leftOffsetPre"); - baton->widthPre = AttrTo<int32_t>(options, "widthPre"); - baton->heightPre = AttrTo<int32_t>(options, "heightPre"); - baton->topOffsetPost = AttrTo<int32_t>(options, "topOffsetPost"); - baton->leftOffsetPost = AttrTo<int32_t>(options, "leftOffsetPost"); - baton->widthPost = AttrTo<int32_t>(options, "widthPost"); - baton->heightPost = AttrTo<int32_t>(options, "heightPost"); + baton->topOffsetPre = sharp::AttrAsInt32(options, "topOffsetPre"); + baton->leftOffsetPre
= sharp::AttrAsInt32(options, "leftOffsetPre"); + baton->widthPre = sharp::AttrAsInt32(options, "widthPre"); + baton->heightPre = sharp::AttrAsInt32(options, "heightPre"); + baton->topOffsetPost = sharp::AttrAsInt32(options, "topOffsetPost"); + baton->leftOffsetPost = sharp::AttrAsInt32(options, "leftOffsetPost"); + baton->widthPost = sharp::AttrAsInt32(options, "widthPost"); + baton->heightPost = sharp::AttrAsInt32(options, "heightPost"); // Output image dimensions - baton->width = AttrTo(options, "width"); - baton->height = AttrTo(options, "height"); + baton->width = sharp::AttrAsInt32(options, "width"); + baton->height = sharp::AttrAsInt32(options, "height"); // Canvas option - std::string canvas = AttrAsStr(options, "canvas"); + std::string canvas = sharp::AttrAsStr(options, "canvas"); if (canvas == "crop") { - baton->canvas = Canvas::CROP; + baton->canvas = sharp::Canvas::CROP; } else if (canvas == "embed") { - baton->canvas = Canvas::EMBED; + baton->canvas = sharp::Canvas::EMBED; } else if (canvas == "max") { - baton->canvas = Canvas::MAX; + baton->canvas = sharp::Canvas::MAX; } else if (canvas == "min") { - baton->canvas = Canvas::MIN; + baton->canvas = sharp::Canvas::MIN; } else if (canvas == "ignore_aspect") { - baton->canvas = Canvas::IGNORE_ASPECT; + baton->canvas = sharp::Canvas::IGNORE_ASPECT; } - // Background colour - v8::Local background = AttrAs(options, "background"); - for (unsigned int i = 0; i < 4; i++) { - baton->background[i] = AttrTo(background, i); - } - // Overlay options - if (HasAttr(options, "overlay")) { - baton->overlay = CreateInputDescriptor(AttrAs(options, "overlay"), buffersToPersist); - baton->overlayGravity = AttrTo(options, "overlayGravity"); - baton->overlayXOffset = AttrTo(options, "overlayXOffset"); - baton->overlayYOffset = AttrTo(options, "overlayYOffset"); - baton->overlayTile = AttrTo(options, "overlayTile"); - baton->overlayCutout = AttrTo(options, "overlayCutout"); + // Composite + Napi::Array compositeArray = 
options.Get("composite").As(); + for (unsigned int i = 0; i < compositeArray.Length(); i++) { + Napi::Object compositeObject = compositeArray.Get(i).As(); + Composite *composite = new Composite; + composite->input = sharp::CreateInputDescriptor(compositeObject.Get("input").As()); + composite->mode = sharp::AttrAsEnum(compositeObject, "blend", VIPS_TYPE_BLEND_MODE); + composite->gravity = sharp::AttrAsUint32(compositeObject, "gravity"); + composite->left = sharp::AttrAsInt32(compositeObject, "left"); + composite->top = sharp::AttrAsInt32(compositeObject, "top"); + composite->hasOffset = sharp::AttrAsBool(compositeObject, "hasOffset"); + composite->tile = sharp::AttrAsBool(compositeObject, "tile"); + composite->premultiplied = sharp::AttrAsBool(compositeObject, "premultiplied"); + baton->composite.push_back(composite); } // Resize options - baton->withoutEnlargement = AttrTo(options, "withoutEnlargement"); - baton->crop = AttrTo(options, "crop"); - baton->kernel = AttrAsStr(options, "kernel"); - baton->interpolator = AttrAsStr(options, "interpolator"); - baton->centreSampling = AttrTo(options, "centreSampling"); + baton->withoutEnlargement = sharp::AttrAsBool(options, "withoutEnlargement"); + baton->withoutReduction = sharp::AttrAsBool(options, "withoutReduction"); + baton->position = sharp::AttrAsInt32(options, "position"); + baton->resizeBackground = sharp::AttrAsVectorOfDouble(options, "resizeBackground"); + baton->kernel = sharp::AttrAsEnum(options, "kernel", VIPS_TYPE_KERNEL); + baton->fastShrinkOnLoad = sharp::AttrAsBool(options, "fastShrinkOnLoad"); // Join Channel Options - if (HasAttr(options, "joinChannelIn")) { - v8::Local joinChannelObject = Nan::Get(options, Nan::New("joinChannelIn").ToLocalChecked()) - .ToLocalChecked().As(); - v8::Local joinChannelArray = joinChannelObject.As(); - int joinChannelArrayLength = AttrTo(joinChannelObject, "length"); - for (int i = 0; i < joinChannelArrayLength; i++) { + if (options.Has("joinChannelIn")) { + Napi::Array 
joinChannelArray = options.Get("joinChannelIn").As(); + for (unsigned int i = 0; i < joinChannelArray.Length(); i++) { baton->joinChannelIn.push_back( - CreateInputDescriptor( - Nan::Get(joinChannelArray, i).ToLocalChecked().As(), - buffersToPersist)); + sharp::CreateInputDescriptor(joinChannelArray.Get(i).As())); } } // Operators - baton->flatten = AttrTo(options, "flatten"); - baton->negate = AttrTo(options, "negate"); - baton->blurSigma = AttrTo(options, "blurSigma"); - baton->sharpenSigma = AttrTo(options, "sharpenSigma"); - baton->sharpenFlat = AttrTo(options, "sharpenFlat"); - baton->sharpenJagged = AttrTo(options, "sharpenJagged"); - baton->threshold = AttrTo(options, "threshold"); - baton->thresholdGrayscale = AttrTo(options, "thresholdGrayscale"); - baton->trimTolerance = AttrTo(options, "trimTolerance"); - baton->gamma = AttrTo(options, "gamma"); - baton->greyscale = AttrTo(options, "greyscale"); - baton->normalise = AttrTo(options, "normalise"); - baton->useExifOrientation = AttrTo(options, "useExifOrientation"); - baton->angle = AttrTo(options, "angle"); - baton->rotateBeforePreExtract = AttrTo(options, "rotateBeforePreExtract"); - baton->flip = AttrTo(options, "flip"); - baton->flop = AttrTo(options, "flop"); - baton->extendTop = AttrTo(options, "extendTop"); - baton->extendBottom = AttrTo(options, "extendBottom"); - baton->extendLeft = AttrTo(options, "extendLeft"); - baton->extendRight = AttrTo(options, "extendRight"); - baton->extractChannel = AttrTo(options, "extractChannel"); - if (HasAttr(options, "boolean")) { - baton->boolean = CreateInputDescriptor(AttrAs(options, "boolean"), buffersToPersist); - baton->booleanOp = sharp::GetBooleanOperation(AttrAsStr(options, "booleanOp")); + baton->flatten = sharp::AttrAsBool(options, "flatten"); + baton->flattenBackground = sharp::AttrAsVectorOfDouble(options, "flattenBackground"); + baton->unflatten = sharp::AttrAsBool(options, "unflatten"); + baton->negate = sharp::AttrAsBool(options, "negate"); + 
baton->negateAlpha = sharp::AttrAsBool(options, "negateAlpha"); + baton->blurSigma = sharp::AttrAsDouble(options, "blurSigma"); + baton->precision = sharp::AttrAsEnum(options, "precision", VIPS_TYPE_PRECISION); + baton->minAmpl = sharp::AttrAsDouble(options, "minAmpl"); + baton->brightness = sharp::AttrAsDouble(options, "brightness"); + baton->saturation = sharp::AttrAsDouble(options, "saturation"); + baton->hue = sharp::AttrAsInt32(options, "hue"); + baton->lightness = sharp::AttrAsDouble(options, "lightness"); + baton->medianSize = sharp::AttrAsUint32(options, "medianSize"); + baton->sharpenSigma = sharp::AttrAsDouble(options, "sharpenSigma"); + baton->sharpenM1 = sharp::AttrAsDouble(options, "sharpenM1"); + baton->sharpenM2 = sharp::AttrAsDouble(options, "sharpenM2"); + baton->sharpenX1 = sharp::AttrAsDouble(options, "sharpenX1"); + baton->sharpenY2 = sharp::AttrAsDouble(options, "sharpenY2"); + baton->sharpenY3 = sharp::AttrAsDouble(options, "sharpenY3"); + baton->threshold = sharp::AttrAsInt32(options, "threshold"); + baton->thresholdGrayscale = sharp::AttrAsBool(options, "thresholdGrayscale"); + baton->trimBackground = sharp::AttrAsVectorOfDouble(options, "trimBackground"); + baton->trimThreshold = sharp::AttrAsDouble(options, "trimThreshold"); + baton->trimLineArt = sharp::AttrAsBool(options, "trimLineArt"); + baton->gamma = sharp::AttrAsDouble(options, "gamma"); + baton->gammaOut = sharp::AttrAsDouble(options, "gammaOut"); + baton->linearA = sharp::AttrAsVectorOfDouble(options, "linearA"); + baton->linearB = sharp::AttrAsVectorOfDouble(options, "linearB"); + baton->dilateWidth = sharp::AttrAsUint32(options, "dilateWidth"); + baton->erodeWidth = sharp::AttrAsUint32(options, "erodeWidth"); + baton->greyscale = sharp::AttrAsBool(options, "greyscale"); + baton->normalise = sharp::AttrAsBool(options, "normalise"); + baton->normaliseLower = sharp::AttrAsUint32(options, "normaliseLower"); + baton->normaliseUpper = sharp::AttrAsUint32(options, "normaliseUpper"); + 
baton->tint = sharp::AttrAsVectorOfDouble(options, "tint"); + baton->claheWidth = sharp::AttrAsUint32(options, "claheWidth"); + baton->claheHeight = sharp::AttrAsUint32(options, "claheHeight"); + baton->claheMaxSlope = sharp::AttrAsUint32(options, "claheMaxSlope"); + baton->angle = sharp::AttrAsInt32(options, "angle"); + baton->rotationAngle = sharp::AttrAsDouble(options, "rotationAngle"); + baton->rotationBackground = sharp::AttrAsVectorOfDouble(options, "rotationBackground"); + baton->rotateBefore = sharp::AttrAsBool(options, "rotateBefore"); + baton->orientBefore = sharp::AttrAsBool(options, "orientBefore"); + baton->flip = sharp::AttrAsBool(options, "flip"); + baton->flop = sharp::AttrAsBool(options, "flop"); + baton->extendTop = sharp::AttrAsInt32(options, "extendTop"); + baton->extendBottom = sharp::AttrAsInt32(options, "extendBottom"); + baton->extendLeft = sharp::AttrAsInt32(options, "extendLeft"); + baton->extendRight = sharp::AttrAsInt32(options, "extendRight"); + baton->extendBackground = sharp::AttrAsVectorOfDouble(options, "extendBackground"); + baton->extendWith = sharp::AttrAsEnum(options, "extendWith", VIPS_TYPE_EXTEND); + baton->extractChannel = sharp::AttrAsInt32(options, "extractChannel"); + baton->affineMatrix = sharp::AttrAsVectorOfDouble(options, "affineMatrix"); + baton->affineBackground = sharp::AttrAsVectorOfDouble(options, "affineBackground"); + baton->affineIdx = sharp::AttrAsDouble(options, "affineIdx"); + baton->affineIdy = sharp::AttrAsDouble(options, "affineIdy"); + baton->affineOdx = sharp::AttrAsDouble(options, "affineOdx"); + baton->affineOdy = sharp::AttrAsDouble(options, "affineOdy"); + baton->affineInterpolator = sharp::AttrAsStr(options, "affineInterpolator"); + baton->removeAlpha = sharp::AttrAsBool(options, "removeAlpha"); + baton->ensureAlpha = sharp::AttrAsDouble(options, "ensureAlpha"); + if (options.Has("boolean")) { + baton->boolean = sharp::CreateInputDescriptor(options.Get("boolean").As()); + baton->booleanOp = 
sharp::AttrAsEnum(options, "booleanOp", VIPS_TYPE_OPERATION_BOOLEAN); } - if (HasAttr(options, "bandBoolOp")) { - baton->bandBoolOp = sharp::GetBooleanOperation(AttrAsStr(options, "bandBoolOp")); + if (options.Has("bandBoolOp")) { + baton->bandBoolOp = sharp::AttrAsEnum(options, "bandBoolOp", VIPS_TYPE_OPERATION_BOOLEAN); } - if (HasAttr(options, "convKernel")) { - v8::Local kernel = AttrAs(options, "convKernel"); - baton->convKernelWidth = AttrTo(kernel, "width"); - baton->convKernelHeight = AttrTo(kernel, "height"); - baton->convKernelScale = AttrTo(kernel, "scale"); - baton->convKernelOffset = AttrTo(kernel, "offset"); + if (options.Has("convKernel")) { + Napi::Object kernel = options.Get("convKernel").As(); + baton->convKernelWidth = sharp::AttrAsUint32(kernel, "width"); + baton->convKernelHeight = sharp::AttrAsUint32(kernel, "height"); + baton->convKernelScale = sharp::AttrAsDouble(kernel, "scale"); + baton->convKernelOffset = sharp::AttrAsDouble(kernel, "offset"); size_t const kernelSize = static_cast(baton->convKernelWidth * baton->convKernelHeight); - baton->convKernel = std::unique_ptr(new double[kernelSize]); - v8::Local kdata = AttrAs(kernel, "kernel"); + baton->convKernel.resize(kernelSize); + Napi::Array kdata = kernel.Get("kernel").As(); for (unsigned int i = 0; i < kernelSize; i++) { - baton->convKernel[i] = AttrTo(kdata, i); + baton->convKernel[i] = sharp::AttrAsDouble(kdata, i); + } + } + if (options.Has("recombMatrix")) { + Napi::Array recombMatrix = options.Get("recombMatrix").As(); + unsigned int matrixElements = recombMatrix.Length(); + baton->recombMatrix.resize(matrixElements); + for (unsigned int i = 0; i < matrixElements; i++) { + baton->recombMatrix[i] = sharp::AttrAsDouble(recombMatrix, i); } } - baton->colourspace = sharp::GetInterpretation(AttrAsStr(options, "colourspace")); + baton->colourspacePipeline = sharp::AttrAsEnum( + options, "colourspacePipeline", VIPS_TYPE_INTERPRETATION); + if (baton->colourspacePipeline == 
VIPS_INTERPRETATION_ERROR) { + baton->colourspacePipeline = VIPS_INTERPRETATION_LAST; + } + baton->colourspace = sharp::AttrAsEnum(options, "colourspace", VIPS_TYPE_INTERPRETATION); if (baton->colourspace == VIPS_INTERPRETATION_ERROR) { baton->colourspace = VIPS_INTERPRETATION_sRGB; } // Output - baton->formatOut = AttrAsStr(options, "formatOut"); - baton->fileOut = AttrAsStr(options, "fileOut"); - baton->withMetadata = AttrTo(options, "withMetadata"); - baton->withMetadataOrientation = AttrTo(options, "withMetadataOrientation"); - // Format-specific - baton->jpegQuality = AttrTo(options, "jpegQuality"); - baton->jpegProgressive = AttrTo(options, "jpegProgressive"); - baton->jpegChromaSubsampling = AttrAsStr(options, "jpegChromaSubsampling"); - baton->jpegTrellisQuantisation = AttrTo(options, "jpegTrellisQuantisation"); - baton->jpegOvershootDeringing = AttrTo(options, "jpegOvershootDeringing"); - baton->jpegOptimiseScans = AttrTo(options, "jpegOptimiseScans"); - baton->pngProgressive = AttrTo(options, "pngProgressive"); - baton->pngCompressionLevel = AttrTo(options, "pngCompressionLevel"); - baton->pngAdaptiveFiltering = AttrTo(options, "pngAdaptiveFiltering"); - baton->webpQuality = AttrTo(options, "webpQuality"); - baton->webpAlphaQuality = AttrTo(options, "webpAlphaQuality"); - baton->webpLossless = AttrTo(options, "webpLossless"); - baton->webpNearLossless = AttrTo(options, "webpNearLossless"); - baton->tiffQuality = AttrTo(options, "tiffQuality"); - baton->tiffSquash = AttrTo(options, "tiffSquash"); - baton->tiffXres = AttrTo(options, "tiffXres"); - baton->tiffYres = AttrTo(options, "tiffYres"); - // tiff compression options - baton->tiffCompression = static_cast( - vips_enum_from_nick(nullptr, VIPS_TYPE_FOREIGN_TIFF_COMPRESSION, - AttrAsStr(options, "tiffCompression").data())); - baton->tiffPredictor = static_cast( - vips_enum_from_nick(nullptr, VIPS_TYPE_FOREIGN_TIFF_PREDICTOR, - AttrAsStr(options, "tiffPredictor").data())); - - // Tile output - 
baton->tileSize = AttrTo(options, "tileSize"); - baton->tileOverlap = AttrTo(options, "tileOverlap"); - std::string tileContainer = AttrAsStr(options, "tileContainer"); - if (tileContainer == "zip") { - baton->tileContainer = VIPS_FOREIGN_DZ_CONTAINER_ZIP; - } else { - baton->tileContainer = VIPS_FOREIGN_DZ_CONTAINER_FS; - } - std::string tileLayout = AttrAsStr(options, "tileLayout"); - if (tileLayout == "google") { - baton->tileLayout = VIPS_FOREIGN_DZ_LAYOUT_GOOGLE; - } else if (tileLayout == "zoomify") { - baton->tileLayout = VIPS_FOREIGN_DZ_LAYOUT_ZOOMIFY; - } else { - baton->tileLayout = VIPS_FOREIGN_DZ_LAYOUT_DZ; + baton->formatOut = sharp::AttrAsStr(options, "formatOut"); + baton->fileOut = sharp::AttrAsStr(options, "fileOut"); + baton->keepMetadata = sharp::AttrAsUint32(options, "keepMetadata"); + baton->withMetadataOrientation = sharp::AttrAsUint32(options, "withMetadataOrientation"); + baton->withMetadataDensity = sharp::AttrAsDouble(options, "withMetadataDensity"); + baton->withIccProfile = sharp::AttrAsStr(options, "withIccProfile"); + Napi::Object withExif = options.Get("withExif").As(); + Napi::Array withExifKeys = withExif.GetPropertyNames(); + for (unsigned int i = 0; i < withExifKeys.Length(); i++) { + std::string k = sharp::AttrAsStr(withExifKeys, i); + if (withExif.HasOwnProperty(k)) { + baton->withExif.insert(std::make_pair(k, sharp::AttrAsStr(withExif, k))); + } } - baton->tileFormat = AttrAsStr(options, "tileFormat"); - // Force random access for certain operations - if (baton->accessMethod == VIPS_ACCESS_SEQUENTIAL && ( - baton->trimTolerance != 0 || baton->normalise || - baton->crop == 16 || baton->crop == 17)) { - baton->accessMethod = VIPS_ACCESS_RANDOM; + baton->withExifMerge = sharp::AttrAsBool(options, "withExifMerge"); + baton->withXmp = sharp::AttrAsStr(options, "withXmp"); + baton->timeoutSeconds = sharp::AttrAsUint32(options, "timeoutSeconds"); + baton->loop = sharp::AttrAsUint32(options, "loop"); + baton->delay = 
sharp::AttrAsInt32Vector(options, "delay"); + // Format-specific + baton->jpegQuality = sharp::AttrAsUint32(options, "jpegQuality"); + baton->jpegProgressive = sharp::AttrAsBool(options, "jpegProgressive"); + baton->jpegChromaSubsampling = sharp::AttrAsStr(options, "jpegChromaSubsampling"); + baton->jpegTrellisQuantisation = sharp::AttrAsBool(options, "jpegTrellisQuantisation"); + baton->jpegQuantisationTable = sharp::AttrAsUint32(options, "jpegQuantisationTable"); + baton->jpegOvershootDeringing = sharp::AttrAsBool(options, "jpegOvershootDeringing"); + baton->jpegOptimiseScans = sharp::AttrAsBool(options, "jpegOptimiseScans"); + baton->jpegOptimiseCoding = sharp::AttrAsBool(options, "jpegOptimiseCoding"); + baton->pngProgressive = sharp::AttrAsBool(options, "pngProgressive"); + baton->pngCompressionLevel = sharp::AttrAsUint32(options, "pngCompressionLevel"); + baton->pngAdaptiveFiltering = sharp::AttrAsBool(options, "pngAdaptiveFiltering"); + baton->pngPalette = sharp::AttrAsBool(options, "pngPalette"); + baton->pngQuality = sharp::AttrAsUint32(options, "pngQuality"); + baton->pngEffort = sharp::AttrAsUint32(options, "pngEffort"); + baton->pngBitdepth = sharp::AttrAsUint32(options, "pngBitdepth"); + baton->pngDither = sharp::AttrAsDouble(options, "pngDither"); + baton->jp2Quality = sharp::AttrAsUint32(options, "jp2Quality"); + baton->jp2Lossless = sharp::AttrAsBool(options, "jp2Lossless"); + baton->jp2TileHeight = sharp::AttrAsUint32(options, "jp2TileHeight"); + baton->jp2TileWidth = sharp::AttrAsUint32(options, "jp2TileWidth"); + baton->jp2ChromaSubsampling = sharp::AttrAsStr(options, "jp2ChromaSubsampling"); + baton->webpQuality = sharp::AttrAsUint32(options, "webpQuality"); + baton->webpAlphaQuality = sharp::AttrAsUint32(options, "webpAlphaQuality"); + baton->webpLossless = sharp::AttrAsBool(options, "webpLossless"); + baton->webpNearLossless = sharp::AttrAsBool(options, "webpNearLossless"); + baton->webpSmartSubsample = sharp::AttrAsBool(options, 
"webpSmartSubsample"); + baton->webpSmartDeblock = sharp::AttrAsBool(options, "webpSmartDeblock"); + baton->webpPreset = sharp::AttrAsEnum(options, "webpPreset", VIPS_TYPE_FOREIGN_WEBP_PRESET); + baton->webpEffort = sharp::AttrAsUint32(options, "webpEffort"); + baton->webpMinSize = sharp::AttrAsBool(options, "webpMinSize"); + baton->webpMixed = sharp::AttrAsBool(options, "webpMixed"); + baton->gifBitdepth = sharp::AttrAsUint32(options, "gifBitdepth"); + baton->gifEffort = sharp::AttrAsUint32(options, "gifEffort"); + baton->gifDither = sharp::AttrAsDouble(options, "gifDither"); + baton->gifInterFrameMaxError = sharp::AttrAsDouble(options, "gifInterFrameMaxError"); + baton->gifInterPaletteMaxError = sharp::AttrAsDouble(options, "gifInterPaletteMaxError"); + baton->gifKeepDuplicateFrames = sharp::AttrAsBool(options, "gifKeepDuplicateFrames"); + baton->gifReuse = sharp::AttrAsBool(options, "gifReuse"); + baton->gifProgressive = sharp::AttrAsBool(options, "gifProgressive"); + baton->tiffQuality = sharp::AttrAsUint32(options, "tiffQuality"); + baton->tiffBigtiff = sharp::AttrAsBool(options, "tiffBigtiff"); + baton->tiffPyramid = sharp::AttrAsBool(options, "tiffPyramid"); + baton->tiffMiniswhite = sharp::AttrAsBool(options, "tiffMiniswhite"); + baton->tiffBitdepth = sharp::AttrAsUint32(options, "tiffBitdepth"); + baton->tiffTile = sharp::AttrAsBool(options, "tiffTile"); + baton->tiffTileWidth = sharp::AttrAsUint32(options, "tiffTileWidth"); + baton->tiffTileHeight = sharp::AttrAsUint32(options, "tiffTileHeight"); + baton->tiffXres = sharp::AttrAsDouble(options, "tiffXres"); + baton->tiffYres = sharp::AttrAsDouble(options, "tiffYres"); + if (baton->tiffXres == 1.0 && baton->tiffYres == 1.0 && baton->withMetadataDensity > 0) { + baton->tiffXres = baton->tiffYres = baton->withMetadataDensity / 25.4; } + baton->tiffCompression = sharp::AttrAsEnum( + options, "tiffCompression", VIPS_TYPE_FOREIGN_TIFF_COMPRESSION); + baton->tiffPredictor = sharp::AttrAsEnum( + options, 
"tiffPredictor", VIPS_TYPE_FOREIGN_TIFF_PREDICTOR); + baton->tiffResolutionUnit = sharp::AttrAsEnum( + options, "tiffResolutionUnit", VIPS_TYPE_FOREIGN_TIFF_RESUNIT); + baton->heifQuality = sharp::AttrAsUint32(options, "heifQuality"); + baton->heifLossless = sharp::AttrAsBool(options, "heifLossless"); + baton->heifCompression = sharp::AttrAsEnum( + options, "heifCompression", VIPS_TYPE_FOREIGN_HEIF_COMPRESSION); + baton->heifEffort = sharp::AttrAsUint32(options, "heifEffort"); + baton->heifChromaSubsampling = sharp::AttrAsStr(options, "heifChromaSubsampling"); + baton->heifBitdepth = sharp::AttrAsUint32(options, "heifBitdepth"); + baton->jxlDistance = sharp::AttrAsDouble(options, "jxlDistance"); + baton->jxlDecodingTier = sharp::AttrAsUint32(options, "jxlDecodingTier"); + baton->jxlEffort = sharp::AttrAsUint32(options, "jxlEffort"); + baton->jxlLossless = sharp::AttrAsBool(options, "jxlLossless"); + baton->rawDepth = sharp::AttrAsEnum(options, "rawDepth", VIPS_TYPE_BAND_FORMAT); + baton->tileSize = sharp::AttrAsUint32(options, "tileSize"); + baton->tileOverlap = sharp::AttrAsUint32(options, "tileOverlap"); + baton->tileAngle = sharp::AttrAsInt32(options, "tileAngle"); + baton->tileBackground = sharp::AttrAsVectorOfDouble(options, "tileBackground"); + baton->tileSkipBlanks = sharp::AttrAsInt32(options, "tileSkipBlanks"); + baton->tileContainer = sharp::AttrAsEnum( + options, "tileContainer", VIPS_TYPE_FOREIGN_DZ_CONTAINER); + baton->tileLayout = sharp::AttrAsEnum(options, "tileLayout", VIPS_TYPE_FOREIGN_DZ_LAYOUT); + baton->tileFormat = sharp::AttrAsStr(options, "tileFormat"); + baton->tileDepth = sharp::AttrAsEnum(options, "tileDepth", VIPS_TYPE_FOREIGN_DZ_DEPTH); + baton->tileCentre = sharp::AttrAsBool(options, "tileCentre"); + baton->tileId = sharp::AttrAsStr(options, "tileId"); + baton->tileBasename = sharp::AttrAsStr(options, "tileBasename"); // Function to notify of libvips warnings - Nan::Callback *debuglog = new Nan::Callback(AttrAs(options, "debuglog")); + 
Napi::Function debuglog = options.Get("debuglog").As(); // Function to notify of queue length changes - Nan::Callback *queueListener = new Nan::Callback(AttrAs(options, "queueListener")); + Napi::Function queueListener = options.Get("queueListener").As(); // Join queue for worker thread - Nan::Callback *callback = new Nan::Callback(info[1].As()); - Nan::AsyncQueueWorker(new PipelineWorker(callback, baton, debuglog, queueListener, buffersToPersist)); + Napi::Function callback = info[size_t(1)].As(); + PipelineWorker *worker = new PipelineWorker(callback, baton, debuglog, queueListener); + worker->Receiver().Set("options", options); + worker->Queue(); // Increment queued task counter - g_atomic_int_inc(&sharp::counterQueue); - v8::Local queueLength[1] = { Nan::New(sharp::counterQueue) }; - queueListener->Call(1, queueLength); + Napi::Number queueLength = Napi::Number::New(info.Env(), static_cast(++sharp::counterQueue)); + queueListener.Call(info.This(), { queueLength }); + + return info.Env().Undefined(); } diff --git a/src/pipeline.h b/src/pipeline.h index f0fdab518..ff9465987 100644 --- a/src/pipeline.h +++ b/src/pipeline.h @@ -1,53 +1,54 @@ -// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors. -// -// Licensed under the Apache License, Version 2.0 (the "License"); -// you may not use this file except in compliance with the License. -// You may obtain a copy of the License at -// -// http://www.apache.org/licenses/LICENSE-2.0 -// -// Unless required by applicable law or agreed to in writing, software -// distributed under the License is distributed on an "AS IS" BASIS, -// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -// See the License for the specific language governing permissions and -// limitations under the License. +/*! + Copyright 2013 Lovell Fuller and others. 
+  SPDX-License-Identifier: Apache-2.0
+*/

 #ifndef SRC_PIPELINE_H_
 #define SRC_PIPELINE_H_

 #include <memory>
 #include <string>
+#include <unordered_map>
 #include <vector>

-#include <nan.h>
+#include <napi.h>
 #include <vips/vips8>

 #include "./common.h"

-NAN_METHOD(pipeline);
+Napi::Value pipeline(const Napi::CallbackInfo& info);

-enum class Canvas {
-  CROP,
-  EMBED,
-  MAX,
-  MIN,
-  IGNORE_ASPECT
+struct Composite {
+  sharp::InputDescriptor *input;
+  VipsBlendMode mode;
+  int gravity;
+  int left;
+  int top;
+  bool hasOffset;
+  bool tile;
+  bool premultiplied;
+
+  Composite():
+    input(nullptr),
+    mode(VIPS_BLEND_MODE_OVER),
+    gravity(0),
+    left(0),
+    top(0),
+    hasOffset(false),
+    tile(false),
+    premultiplied(false) {}
 };

 struct PipelineBaton {
   sharp::InputDescriptor *input;
-  std::string iccProfilePath;
-  int limitInputPixels;
+  std::vector<sharp::InputDescriptor *> join;
   std::string formatOut;
   std::string fileOut;
   void *bufferOut;
   size_t bufferOutLength;
-  sharp::InputDescriptor *overlay;
-  int overlayGravity;
-  int overlayXOffset;
-  int overlayYOffset;
-  bool overlayTile;
-  bool overlayCutout;
+  int pageHeightOut;
+  int pagesOut;
+  std::vector<Composite *> composite;
   std::vector<sharp::InputDescriptor *> joinChannelIn;
   int topOffsetPre;
   int leftOffsetPre;
@@ -60,61 +61,155 @@ struct PipelineBaton {
   int width;
   int height;
   int channels;
-  Canvas canvas;
-  int crop;
-  int cropCalcLeft;
-  int cropCalcTop;
+  VipsKernel kernel;
+  sharp::Canvas canvas;
+  int position;
+  std::vector<double> resizeBackground;
+  bool hasCropOffset;
+  int cropOffsetLeft;
+  int cropOffsetTop;
+  bool hasAttentionCenter;
+  int attentionX;
+  int attentionY;
   bool premultiplied;
-  std::string kernel;
-  std::string interpolator;
-  bool centreSampling;
-  double background[4];
+  bool tileCentre;
+  bool fastShrinkOnLoad;
+  std::vector<double> tint;
   bool flatten;
+  std::vector<double> flattenBackground;
+  bool unflatten;
   bool negate;
+  bool negateAlpha;
   double blurSigma;
+  VipsPrecision precision;
+  double minAmpl;
+  double brightness;
+  double saturation;
+  int hue;
+  double lightness;
+  int medianSize;
   double sharpenSigma;
-  double sharpenFlat;
-  double sharpenJagged;
+  double sharpenM1;
+  double sharpenM2;
+  double sharpenX1;
+  double sharpenY2;
+  double sharpenY3;
   int threshold;
   bool thresholdGrayscale;
-  int trimTolerance;
+  std::vector<double> trimBackground;
+  double trimThreshold;
+  bool trimLineArt;
+  int trimOffsetLeft;
+  int trimOffsetTop;
+  std::vector<double> linearA;
+  std::vector<double> linearB;
+  int dilateWidth;
+  int erodeWidth;
   double gamma;
+  double gammaOut;
   bool greyscale;
   bool normalise;
-  bool useExifOrientation;
+  int normaliseLower;
+  int normaliseUpper;
+  int claheWidth;
+  int claheHeight;
+  int claheMaxSlope;
   int angle;
-  bool rotateBeforePreExtract;
+  double rotationAngle;
+  std::vector<double> rotationBackground;
+  bool rotateBefore;
+  bool orientBefore;
   bool flip;
   bool flop;
   int extendTop;
   int extendBottom;
   int extendLeft;
   int extendRight;
+  std::vector<double> extendBackground;
+  VipsExtend extendWith;
   bool withoutEnlargement;
-  VipsAccess accessMethod;
+  bool withoutReduction;
+  std::vector<double> affineMatrix;
+  std::vector<double> affineBackground;
+  double affineIdx;
+  double affineIdy;
+  double affineOdx;
+  double affineOdy;
+  std::string affineInterpolator;
   int jpegQuality;
   bool jpegProgressive;
   std::string jpegChromaSubsampling;
   bool jpegTrellisQuantisation;
+  int jpegQuantisationTable;
   bool jpegOvershootDeringing;
   bool jpegOptimiseScans;
+  bool jpegOptimiseCoding;
   bool pngProgressive;
   int pngCompressionLevel;
   bool pngAdaptiveFiltering;
+  bool pngPalette;
+  int pngQuality;
+  int pngEffort;
+  int pngBitdepth;
+  double pngDither;
+  int jp2Quality;
+  bool jp2Lossless;
+  int jp2TileHeight;
+  int jp2TileWidth;
+  std::string jp2ChromaSubsampling;
   int webpQuality;
   int webpAlphaQuality;
   bool webpNearLossless;
   bool webpLossless;
+  bool webpSmartSubsample;
+  bool webpSmartDeblock;
+  VipsForeignWebpPreset webpPreset;
+  int webpEffort;
+  bool webpMinSize;
+  bool webpMixed;
+  int gifBitdepth;
+  int gifEffort;
+  double gifDither;
+  double gifInterFrameMaxError;
+  double gifInterPaletteMaxError;
+  bool gifKeepDuplicateFrames;
+  bool gifReuse;
+  bool gifProgressive;
   int tiffQuality;
   VipsForeignTiffCompression tiffCompression;
+  bool tiffBigtiff;
   VipsForeignTiffPredictor tiffPredictor;
-  bool tiffSquash;
+  bool tiffPyramid;
+  int tiffBitdepth;
+  bool tiffMiniswhite;
+  bool tiffTile;
+  int tiffTileHeight;
+  int tiffTileWidth;
   double tiffXres;
   double tiffYres;
+  VipsForeignTiffResunit tiffResolutionUnit;
+  int heifQuality;
+  VipsForeignHeifCompression heifCompression;
+  int heifEffort;
+  std::string heifChromaSubsampling;
+  bool heifLossless;
+  int heifBitdepth;
+  double jxlDistance;
+  int jxlDecodingTier;
+  int jxlEffort;
+  bool jxlLossless;
+  VipsBandFormat rawDepth;
  std::string err;
-  bool withMetadata;
+  bool errUseWarning;
+  int keepMetadata;
   int withMetadataOrientation;
-  std::unique_ptr<double[]> convKernel;
+  double withMetadataDensity;
+  std::string withIccProfile;
+  std::unordered_map<std::string, std::string> withExif;
+  bool withExifMerge;
+  std::string withXmp;
+  int timeoutSeconds;
+  std::vector<double> convKernel;
   int convKernelWidth;
   int convKernelHeight;
   double convKernelScale;
@@ -123,71 +218,170 @@ struct PipelineBaton {
   VipsOperationBoolean booleanOp;
   VipsOperationBoolean bandBoolOp;
   int extractChannel;
+  bool removeAlpha;
+  double ensureAlpha;
+  VipsInterpretation colourspacePipeline;
   VipsInterpretation colourspace;
+  std::vector<int> delay;
+  int loop;
   int tileSize;
   int tileOverlap;
   VipsForeignDzContainer tileContainer;
   VipsForeignDzLayout tileLayout;
   std::string tileFormat;
+  int tileAngle;
+  std::vector<double> tileBackground;
+  int tileSkipBlanks;
+  VipsForeignDzDepth tileDepth;
+  std::string tileId;
+  std::string tileBasename;
+  std::vector<double> recombMatrix;

   PipelineBaton():
     input(nullptr),
-    limitInputPixels(0),
     bufferOutLength(0),
-    overlay(nullptr),
-    overlayGravity(0),
-    overlayXOffset(-1),
-    overlayYOffset(-1),
-    overlayTile(false),
-    overlayCutout(false),
+    pageHeightOut(0),
+    pagesOut(0),
    topOffsetPre(-1),
     topOffsetPost(-1),
     channels(0),
-    canvas(Canvas::CROP),
-    crop(0),
-    cropCalcLeft(-1),
-    cropCalcTop(-1),
+    kernel(VIPS_KERNEL_LANCZOS3),
+    canvas(sharp::Canvas::CROP),
+    position(0),
+    resizeBackground{ 0.0, 0.0, 0.0, 255.0 },
+    hasCropOffset(false),
+    cropOffsetLeft(0),
+    cropOffsetTop(0),
+    hasAttentionCenter(false),
+    attentionX(0),
+    attentionY(0),
     premultiplied(false),
-    centreSampling(false),
+    tint{ -1.0, 0.0, 0.0, 0.0 },
     flatten(false),
+    flattenBackground{ 0.0, 0.0, 0.0 },
+    unflatten(false),
     negate(false),
+    negateAlpha(true),
     blurSigma(0.0),
+    brightness(1.0),
+    saturation(1.0),
+    hue(0),
+    lightness(0),
+    medianSize(0),
     sharpenSigma(0.0),
-    sharpenFlat(1.0),
-    sharpenJagged(2.0),
+    sharpenM1(1.0),
+    sharpenM2(2.0),
+    sharpenX1(2.0),
+    sharpenY2(10.0),
+    sharpenY3(20.0),
     threshold(0),
     thresholdGrayscale(true),
-    trimTolerance(0),
+    trimBackground{},
+    trimThreshold(-1.0),
+    trimLineArt(false),
+    trimOffsetLeft(0),
+    trimOffsetTop(0),
+    linearA{},
+    linearB{},
+    dilateWidth(0),
+    erodeWidth(0),
     gamma(0.0),
     greyscale(false),
     normalise(false),
-    useExifOrientation(false),
+    normaliseLower(1),
+    normaliseUpper(99),
+    claheWidth(0),
+    claheHeight(0),
+    claheMaxSlope(3),
     angle(0),
+    rotationAngle(0.0),
+    rotationBackground{ 0.0, 0.0, 0.0, 255.0 },
     flip(false),
     flop(false),
     extendTop(0),
     extendBottom(0),
     extendLeft(0),
     extendRight(0),
+    extendBackground{ 0.0, 0.0, 0.0, 255.0 },
+    extendWith(VIPS_EXTEND_BACKGROUND),
     withoutEnlargement(false),
+    withoutReduction(false),
+    affineMatrix{ 1.0, 0.0, 0.0, 1.0 },
+    affineBackground{ 0.0, 0.0, 0.0, 255.0 },
+    affineIdx(0),
+    affineIdy(0),
+    affineOdx(0),
+    affineOdy(0),
+    affineInterpolator("bicubic"),
     jpegQuality(80),
     jpegProgressive(false),
     jpegChromaSubsampling("4:2:0"),
     jpegTrellisQuantisation(false),
+    jpegQuantisationTable(0),
     jpegOvershootDeringing(false),
     jpegOptimiseScans(false),
+    jpegOptimiseCoding(true),
     pngProgressive(false),
     pngCompressionLevel(6),
-    pngAdaptiveFiltering(true),
+    pngAdaptiveFiltering(false),
+    pngPalette(false),
+    pngQuality(100),
+    pngEffort(7),
+    pngBitdepth(8),
+    pngDither(1.0),
+    jp2Quality(80),
+    jp2Lossless(false),
+    jp2TileHeight(512),
+    jp2TileWidth(512),
+    jp2ChromaSubsampling("4:4:4"),
     webpQuality(80),
+    webpAlphaQuality(100),
+    webpNearLossless(false),
+    webpLossless(false),
+    webpSmartSubsample(false),
+    webpSmartDeblock(false),
+    webpPreset(VIPS_FOREIGN_WEBP_PRESET_DEFAULT),
+    webpEffort(4),
+    webpMinSize(false),
+    webpMixed(false),
+    gifBitdepth(8),
+    gifEffort(7),
+    gifDither(1.0),
+    gifInterFrameMaxError(0.0),
+    gifInterPaletteMaxError(3.0),
+    gifKeepDuplicateFrames(false),
+    gifReuse(true),
+    gifProgressive(false),
     tiffQuality(80),
     tiffCompression(VIPS_FOREIGN_TIFF_COMPRESSION_JPEG),
-    tiffPredictor(VIPS_FOREIGN_TIFF_PREDICTOR_NONE),
-    tiffSquash(false),
+    tiffBigtiff(false),
+    tiffPredictor(VIPS_FOREIGN_TIFF_PREDICTOR_HORIZONTAL),
+    tiffPyramid(false),
+    tiffBitdepth(8),
+    tiffMiniswhite(false),
+    tiffTile(false),
+    tiffTileHeight(256),
+    tiffTileWidth(256),
     tiffXres(1.0),
     tiffYres(1.0),
-    withMetadata(false),
+    tiffResolutionUnit(VIPS_FOREIGN_TIFF_RESUNIT_INCH),
+    heifQuality(50),
+    heifCompression(VIPS_FOREIGN_HEIF_COMPRESSION_AV1),
+    heifEffort(4),
+    heifChromaSubsampling("4:4:4"),
+    heifLossless(false),
+    heifBitdepth(8),
+    jxlDistance(1.0),
+    jxlDecodingTier(0),
+    jxlEffort(7),
+    jxlLossless(false),
+    rawDepth(VIPS_FORMAT_UCHAR),
+    errUseWarning(false),
+    keepMetadata(0),
     withMetadataOrientation(-1),
+    withMetadataDensity(0.0),
+    withExifMerge(true),
+    timeoutSeconds(0),
     convKernelWidth(0),
     convKernelHeight(0),
     convKernelScale(0.0),
@@ -196,16 +390,19 @@ struct PipelineBaton {
     booleanOp(VIPS_OPERATION_BOOLEAN_LAST),
     bandBoolOp(VIPS_OPERATION_BOOLEAN_LAST),
     extractChannel(-1),
+    removeAlpha(false),
+    ensureAlpha(-1.0),
+    colourspacePipeline(VIPS_INTERPRETATION_LAST),
     colourspace(VIPS_INTERPRETATION_LAST),
+    loop(-1),
     tileSize(256),
     tileOverlap(0),
     tileContainer(VIPS_FOREIGN_DZ_CONTAINER_FS),
-    tileLayout(VIPS_FOREIGN_DZ_LAYOUT_DZ) {
-    background[0] = 0.0;
-    background[1] = 0.0;
-    background[2] = 0.0;
-    background[3] = 255.0;
-  }
+    tileLayout(VIPS_FOREIGN_DZ_LAYOUT_DZ),
+    tileAngle(0),
+    tileBackground{ 255.0, 255.0, 255.0, 255.0 },
+    tileSkipBlanks(-1),
+    tileDepth(VIPS_FOREIGN_DZ_DEPTH_LAST) {}
 };

 #endif  // SRC_PIPELINE_H_
diff --git a/src/sharp.cc b/src/sharp.cc
index c9b0dd180..7678975c9 100644
--- a/src/sharp.cc
+++ b/src/sharp.cc
@@ -1,51 +1,43 @@
-// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+/*!
+  Copyright 2013 Lovell Fuller and others.
+ SPDX-License-Identifier: Apache-2.0 +*/ -#include -#include +#include + +#include #include -#include "common.h" -#include "metadata.h" -#include "pipeline.h" -#include "utilities.h" +#include "./common.h" +#include "./metadata.h" +#include "./pipeline.h" +#include "./stats.h" +#include "./utilities.h" -NAN_MODULE_INIT(init) { - vips_init("sharp"); +Napi::Object init(Napi::Env env, Napi::Object exports) { + static std::once_flag sharp_vips_init_once; + std::call_once(sharp_vips_init_once, []() { + vips_init("sharp"); + }); g_log_set_handler("VIPS", static_cast(G_LOG_LEVEL_WARNING), static_cast(sharp::VipsWarningCallback), nullptr); // Methods available to JavaScript - Nan::Set(target, Nan::New("metadata").ToLocalChecked(), - Nan::GetFunction(Nan::New(metadata)).ToLocalChecked()); - Nan::Set(target, Nan::New("pipeline").ToLocalChecked(), - Nan::GetFunction(Nan::New(pipeline)).ToLocalChecked()); - Nan::Set(target, Nan::New("cache").ToLocalChecked(), - Nan::GetFunction(Nan::New(cache)).ToLocalChecked()); - Nan::Set(target, Nan::New("concurrency").ToLocalChecked(), - Nan::GetFunction(Nan::New(concurrency)).ToLocalChecked()); - Nan::Set(target, Nan::New("counters").ToLocalChecked(), - Nan::GetFunction(Nan::New(counters)).ToLocalChecked()); - Nan::Set(target, Nan::New("simd").ToLocalChecked(), - Nan::GetFunction(Nan::New(simd)).ToLocalChecked()); - Nan::Set(target, Nan::New("libvipsVersion").ToLocalChecked(), - Nan::GetFunction(Nan::New(libvipsVersion)).ToLocalChecked()); - Nan::Set(target, Nan::New("format").ToLocalChecked(), - Nan::GetFunction(Nan::New(format)).ToLocalChecked()); - Nan::Set(target, Nan::New("_maxColourDistance").ToLocalChecked(), - Nan::GetFunction(Nan::New(_maxColourDistance)).ToLocalChecked()); + exports.Set("metadata", Napi::Function::New(env, metadata)); + exports.Set("pipeline", Napi::Function::New(env, pipeline)); + exports.Set("cache", Napi::Function::New(env, cache)); + exports.Set("concurrency", Napi::Function::New(env, concurrency)); + 
+  exports.Set("counters", Napi::Function::New(env, counters));
+  exports.Set("simd", Napi::Function::New(env, simd));
+  exports.Set("libvipsVersion", Napi::Function::New(env, libvipsVersion));
+  exports.Set("format", Napi::Function::New(env, format));
+  exports.Set("block", Napi::Function::New(env, block));
+  exports.Set("_maxColourDistance", Napi::Function::New(env, _maxColourDistance));
+  exports.Set("_isUsingJemalloc", Napi::Function::New(env, _isUsingJemalloc));
+  exports.Set("_isUsingX64V2", Napi::Function::New(env, _isUsingX64V2));
+  exports.Set("stats", Napi::Function::New(env, stats));
+  return exports;
 }
 
-NODE_MODULE(sharp, init)
+NODE_API_MODULE(sharp, init)
diff --git a/src/stats.cc b/src/stats.cc
new file mode 100644
index 000000000..b1fd27a7c
--- /dev/null
+++ b/src/stats.cc
@@ -0,0 +1,186 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+#include <algorithm>
+#include <cmath>
+#include <complex>
+#include <vector>
+
+#include <napi.h>
+#include <vips/vips8>
+
+#include "./common.h"
+#include "./stats.h"
+
+class StatsWorker : public Napi::AsyncWorker {
+ public:
+  StatsWorker(Napi::Function callback, StatsBaton *baton, Napi::Function debuglog) :
+    Napi::AsyncWorker(callback), baton(baton), debuglog(Napi::Persistent(debuglog)) {}
+  ~StatsWorker() {}
+
+  const int STAT_MIN_INDEX = 0;
+  const int STAT_MAX_INDEX = 1;
+  const int STAT_SUM_INDEX = 2;
+  const int STAT_SQ_SUM_INDEX = 3;
+  const int STAT_MEAN_INDEX = 4;
+  const int STAT_STDEV_INDEX = 5;
+  const int STAT_MINX_INDEX = 6;
+  const int STAT_MINY_INDEX = 7;
+  const int STAT_MAXX_INDEX = 8;
+  const int STAT_MAXY_INDEX = 9;
+
+  void Execute() {
+    // Decrement queued task counter
+    sharp::counterQueue--;
+
+    vips::VImage image;
+    sharp::ImageType imageType = sharp::ImageType::UNKNOWN;
+    try {
+      std::tie(image, imageType) = OpenInput(baton->input);
+    } catch (vips::VError const &err) {
+      (baton->err).append(err.what());
+    }
+    if (imageType != sharp::ImageType::UNKNOWN) {
+      try {
+        vips::VImage stats = image.stats();
+        int const bands = image.bands();
+        for (int b = 1; b <= bands; b++) {
+          ChannelStats cStats(
+            static_cast<int>(stats.getpoint(STAT_MIN_INDEX, b).front()),
+            static_cast<int>(stats.getpoint(STAT_MAX_INDEX, b).front()),
+            stats.getpoint(STAT_SUM_INDEX, b).front(),
+            stats.getpoint(STAT_SQ_SUM_INDEX, b).front(),
+            stats.getpoint(STAT_MEAN_INDEX, b).front(),
+            stats.getpoint(STAT_STDEV_INDEX, b).front(),
+            static_cast<int>(stats.getpoint(STAT_MINX_INDEX, b).front()),
+            static_cast<int>(stats.getpoint(STAT_MINY_INDEX, b).front()),
+            static_cast<int>(stats.getpoint(STAT_MAXX_INDEX, b).front()),
+            static_cast<int>(stats.getpoint(STAT_MAXY_INDEX, b).front()));
+          baton->channelStats.push_back(cStats);
+        }
+        // Image is not opaque when alpha layer is present and contains a non-maxima value
+        if (image.has_alpha()) {
+          double const minAlpha = static_cast<double>(stats.getpoint(STAT_MIN_INDEX, bands).front());
+          if (minAlpha != vips_interpretation_max_alpha(image.interpretation())) {
+            baton->isOpaque = false;
+          }
+        }
+        // Convert to greyscale
+        vips::VImage greyscale = image.colourspace(VIPS_INTERPRETATION_B_W)[0];
+        // Estimate entropy via histogram of greyscale value frequency
+        baton->entropy = std::abs(greyscale.hist_find().hist_entropy());
+        // Estimate sharpness via standard deviation of greyscale laplacian
+        if (image.width() > 1 || image.height() > 1) {
+          VImage laplacian = VImage::new_matrixv(3, 3,
+            0.0, 1.0, 0.0,
+            1.0, -4.0, 1.0,
+            0.0, 1.0, 0.0);
+          laplacian.set("scale", 9.0);
+          baton->sharpness = greyscale.conv(laplacian).deviate();
+        }
+        // Most dominant sRGB colour via 4096-bin 3D histogram
+        vips::VImage hist = sharp::RemoveAlpha(image)
+          .colourspace(VIPS_INTERPRETATION_sRGB)
+          .hist_find_ndim(VImage::option()->set("bins", 16));
+        std::complex<double> maxpos = hist.maxpos();
+        int const dx = static_cast<int>(std::real(maxpos));
+        int const dy = static_cast<int>(std::imag(maxpos));
+        std::vector<double> pel = hist(dx, dy);
+        int const dz = std::distance(pel.begin(), std::find(pel.begin(), pel.end(), hist.max()));
+        baton->dominantRed = dx * 16 + 8;
+        baton->dominantGreen = dy * 16 + 8;
+        baton->dominantBlue = dz * 16 + 8;
+      } catch (vips::VError const &err) {
+        (baton->err).append(err.what());
+      }
+    }
+
+    // Clean up
+    vips_error_clear();
+    vips_thread_shutdown();
+  }
+
+  void OnOK() {
+    Napi::Env env = Env();
+    Napi::HandleScope scope(env);
+
+    // Handle warnings
+    std::string warning = sharp::VipsWarningPop();
+    while (!warning.empty()) {
+      debuglog.Call(Receiver().Value(), { Napi::String::New(env, warning) });
+      warning = sharp::VipsWarningPop();
+    }
+
+    if (baton->err.empty()) {
+      // Stats Object
+      Napi::Object info = Napi::Object::New(env);
+      Napi::Array channels = Napi::Array::New(env);
+
+      std::vector<ChannelStats>::iterator it;
+      int i = 0;
+      for (it = baton->channelStats.begin(); it < baton->channelStats.end(); it++, i++) {
+        Napi::Object channelStat = Napi::Object::New(env);
+        channelStat.Set("min", it->min);
+        channelStat.Set("max", it->max);
+        channelStat.Set("sum", it->sum);
+        channelStat.Set("squaresSum", it->squaresSum);
+        channelStat.Set("mean", it->mean);
+        channelStat.Set("stdev", it->stdev);
+        channelStat.Set("minX", it->minX);
+        channelStat.Set("minY", it->minY);
+        channelStat.Set("maxX", it->maxX);
+        channelStat.Set("maxY", it->maxY);
+        channels.Set(i, channelStat);
+      }
+
+      info.Set("channels", channels);
+      info.Set("isOpaque", baton->isOpaque);
+      info.Set("entropy", baton->entropy);
+      info.Set("sharpness", baton->sharpness);
+      Napi::Object dominant = Napi::Object::New(env);
+      dominant.Set("r", baton->dominantRed);
+      dominant.Set("g", baton->dominantGreen);
+      dominant.Set("b", baton->dominantBlue);
+      info.Set("dominant", dominant);
+      Callback().Call(Receiver().Value(), { env.Null(), info });
+    } else {
+      Callback().Call(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() });
+    }
+
+    delete baton->input;
+    delete baton;
+  }
+
+ private:
+  StatsBaton* baton;
+  Napi::FunctionReference debuglog;
+};
+
+/*
+  stats(options, callback)
+*/
+Napi::Value stats(const Napi::CallbackInfo& info) {
+  // V8 objects are converted to non-V8 types held in the baton struct
+  StatsBaton *baton = new StatsBaton;
+  Napi::Object options = info[size_t(0)].As<Napi::Object>();
+
+  // Input
+  baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
+  baton->input->access = VIPS_ACCESS_RANDOM;
+
+  // Function to notify of libvips warnings
+  Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
+
+  // Join queue for worker thread
+  Napi::Function callback = info[size_t(1)].As<Napi::Function>();
+  StatsWorker *worker = new StatsWorker(callback, baton, debuglog);
+  worker->Receiver().Set("options", options);
+  worker->Queue();
+
+  // Increment queued task counter
+  sharp::counterQueue++;
+
+  return info.Env().Undefined();
+}
diff --git a/src/stats.h b/src/stats.h
new file mode 100644
index 000000000..88e13c60c
--- /dev/null
+++ b/src/stats.h
@@ -0,0 +1,62 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+#ifndef SRC_STATS_H_
+#define SRC_STATS_H_
+
+#include <string>
+#include <vector>
+#include <napi.h>
+
+#include "./common.h"
+
+struct ChannelStats {
+  // stats per channel
+  int min;
+  int max;
+  double sum;
+  double squaresSum;
+  double mean;
+  double stdev;
+  int minX;
+  int minY;
+  int maxX;
+  int maxY;
+
+  ChannelStats(int minVal, int maxVal, double sumVal, double squaresSumVal,
+    double meanVal, double stdevVal, int minXVal, int minYVal, int maxXVal, int maxYVal):
+    min(minVal), max(maxVal), sum(sumVal), squaresSum(squaresSumVal),  // NOLINT(build/include_what_you_use)
+    mean(meanVal), stdev(stdevVal), minX(minXVal), minY(minYVal), maxX(maxXVal), maxY(maxYVal) {}
+};
+
+struct StatsBaton {
+  // Input
+  sharp::InputDescriptor *input;
+
+  // Output
+  std::vector<ChannelStats> channelStats;
+  bool isOpaque;
+  double entropy;
+  double sharpness;
+  int dominantRed;
+  int dominantGreen;
+  int dominantBlue;
+
+  std::string err;
+
+  StatsBaton():
+    input(nullptr),
+    isOpaque(true),
+    entropy(0.0),
+    sharpness(0.0),
+    dominantRed(0),
+    dominantGreen(0),
+    dominantBlue(0)
+    {}
+};
+
+Napi::Value stats(const Napi::CallbackInfo& info);
+
+#endif  // SRC_STATS_H_
diff --git a/src/utilities.cc b/src/utilities.cc
index 023474294..4154c08ac 100644
--- a/src/utilities.cc
+++ b/src/utilities.cc
@@ -1,205 +1,195 @@
-// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-//    http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
 #include <cmath>
+#include <cstdio>
 #include <string>
-#include <node.h>
-#include <nan.h>
+#include <napi.h>
 #include <vips/vips8>
 #include <vips/vector.h>
 
-#include "common.h"
-#include "operations.h"
-#include "utilities.h"
-
-using v8::Boolean;
-using v8::Integer;
-using v8::Local;
-using v8::Number;
-using v8::Object;
-using v8::String;
-
-using Nan::HandleScope;
-using Nan::New;
-using Nan::Set;
-using Nan::ThrowError;
-using Nan::To;
-using Nan::Utf8String;
+#include "./common.h"
+#include "./operations.h"
+#include "./utilities.h"
 
 /*
   Get and set cache limits
 */
-NAN_METHOD(cache) {
-  HandleScope();
+Napi::Value cache(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
 
   // Set memory limit
-  if (info[0]->IsInt32()) {
-    vips_cache_set_max_mem(To<int32_t>(info[0]).FromJust() * 1048576);
+  if (info[size_t(0)].IsNumber()) {
+    vips_cache_set_max_mem(info[size_t(0)].As<Napi::Number>().Int32Value() * 1048576);
   }
   // Set file limit
-  if (info[1]->IsInt32()) {
-    vips_cache_set_max_files(To<int32_t>(info[1]).FromJust());
+  if (info[size_t(1)].IsNumber()) {
+    vips_cache_set_max_files(info[size_t(1)].As<Napi::Number>().Int32Value());
  }
   // Set items limit
-  if (info[2]->IsInt32()) {
-    vips_cache_set_max(To<int32_t>(info[2]).FromJust());
+  if (info[size_t(2)].IsNumber()) {
+    vips_cache_set_max(info[size_t(2)].As<Napi::Number>().Int32Value());
   }
 
   // Get memory stats
-  Local<Object> memory = New<Object>();
-  Set(memory, New("current").ToLocalChecked(),
-    New<Integer>(static_cast<int>(round(vips_tracked_get_mem() / 1048576))));
-  Set(memory, New("high").ToLocalChecked(),
-    New<Integer>(static_cast<int>(round(vips_tracked_get_mem_highwater() / 1048576))));
-  Set(memory, New("max").ToLocalChecked(),
-    New<Integer>(static_cast<int>(round(vips_cache_get_max_mem() / 1048576))));
+  Napi::Object memory = Napi::Object::New(env);
+  memory.Set("current", round(vips_tracked_get_mem() / 1048576));
+  memory.Set("high", round(vips_tracked_get_mem_highwater() / 1048576));
+  memory.Set("max", round(vips_cache_get_max_mem() / 1048576));
   // Get file stats
-  Local<Object> files = New<Object>();
-  Set(files, New("current").ToLocalChecked(), New<Integer>(vips_tracked_get_files()));
-  Set(files, New("max").ToLocalChecked(), New<Integer>(vips_cache_get_max_files()));
+  Napi::Object files = Napi::Object::New(env);
+  files.Set("current", vips_tracked_get_files());
+  files.Set("max", vips_cache_get_max_files());
   // Get item stats
-  Local<Object> items = New<Object>();
-  Set(items, New("current").ToLocalChecked(), New<Integer>(vips_cache_get_size()));
-  Set(items, New("max").ToLocalChecked(), New<Integer>(vips_cache_get_max()));
+  Napi::Object items = Napi::Object::New(env);
+  items.Set("current", vips_cache_get_size());
+  items.Set("max", vips_cache_get_max());
 
-  Local<Object> cache = New<Object>();
-  Set(cache, New("memory").ToLocalChecked(), memory);
-  Set(cache, New("files").ToLocalChecked(), files);
-  Set(cache, New("items").ToLocalChecked(), items);
-  info.GetReturnValue().Set(cache);
+  Napi::Object cache = Napi::Object::New(env);
+  cache.Set("memory", memory);
+  cache.Set("files", files);
+  cache.Set("items", items);
+  return cache;
 }
 
 /*
  Get and set size of thread pool
*/
-NAN_METHOD(concurrency) {
-  HandleScope();
-
+Napi::Value concurrency(const Napi::CallbackInfo& info) {
   // Set concurrency
-  if (info[0]->IsInt32()) {
-    vips_concurrency_set(To<int32_t>(info[0]).FromJust());
+  if (info[size_t(0)].IsNumber()) {
+    vips_concurrency_set(info[size_t(0)].As<Napi::Number>().Int32Value());
  }
   // Get concurrency
-  info.GetReturnValue().Set(New<Integer>(vips_concurrency_get()));
+  return Napi::Number::New(info.Env(), vips_concurrency_get());
 }
 
 /*
  Get internal counters (queued tasks, processing tasks)
*/
-NAN_METHOD(counters) {
-  using sharp::counterProcess;
-  using sharp::counterQueue;
-
-  HandleScope();
-  Local<Object> counters = New<Object>();
-  Set(counters, New("queue").ToLocalChecked(), New<Integer>(counterQueue));
-  Set(counters, New("process").ToLocalChecked(), New<Integer>(counterProcess));
-  info.GetReturnValue().Set(counters);
+Napi::Value counters(const Napi::CallbackInfo& info) {
+  Napi::Object counters = Napi::Object::New(info.Env());
+  counters.Set("queue", static_cast<int>(sharp::counterQueue));
+  counters.Set("process", static_cast<int>(sharp::counterProcess));
+  return counters;
 }
 
 /*
  Get and set use of SIMD vector unit instructions
*/
-NAN_METHOD(simd) {
-  HandleScope();
-
+Napi::Value simd(const Napi::CallbackInfo& info) {
   // Set state
-  if (info[0]->IsBoolean()) {
-    vips_vector_set_enabled(To<bool>(info[0]).FromJust());
+  if (info[size_t(0)].IsBoolean()) {
+    vips_vector_set_enabled(info[size_t(0)].As<Napi::Boolean>().Value());
   }
   // Get state
-  info.GetReturnValue().Set(New<Boolean>(vips_vector_isenabled()));
+  return Napi::Boolean::New(info.Env(), vips_vector_isenabled());
 }
 
 /*
  Get libvips version
*/
-NAN_METHOD(libvipsVersion) {
-  HandleScope();
-  char version[9];
-  g_snprintf(version, sizeof(version), "%d.%d.%d", vips_version(0), vips_version(1), vips_version(2));
-  info.GetReturnValue().Set(New(version).ToLocalChecked());
+Napi::Value libvipsVersion(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
+  Napi::Object version = Napi::Object::New(env);
+
+  char semver[9];
+  std::snprintf(semver, sizeof(semver), "%d.%d.%d", vips_version(0), vips_version(1), vips_version(2));
+  version.Set("semver", Napi::String::New(env, semver));
+#ifdef SHARP_USE_GLOBAL_LIBVIPS
+  version.Set("isGlobal", Napi::Boolean::New(env, true));
+#else
+  version.Set("isGlobal", Napi::Boolean::New(env, false));
+#endif
+#ifdef __EMSCRIPTEN__
+  version.Set("isWasm", Napi::Boolean::New(env, true));
+#else
+  version.Set("isWasm", Napi::Boolean::New(env, false));
+#endif
+  return version;
 }
 
 /*
  Get available input/output file/buffer/stream formats
*/
-NAN_METHOD(format) {
-  HandleScope();
-
-  // Attribute names
-  Local<String> attrId = New("id").ToLocalChecked();
-  Local<String> attrInput = New("input").ToLocalChecked();
-  Local<String> attrOutput = New("output").ToLocalChecked();
-  Local<String> attrFile = New("file").ToLocalChecked();
-  Local<String> attrBuffer = New("buffer").ToLocalChecked();
-  Local<String> attrStream = New("stream").ToLocalChecked();
-
-  // Which load/save operations are available for each compressed format?
-  Local<Object> format = New<Object>();
-  for (std::string f : {
-    "jpeg", "png", "webp", "tiff", "magick", "openslide", "dz", "ppm", "fits", "gif", "svg", "pdf", "v"
+Napi::Value format(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
+  Napi::Object format = Napi::Object::New(env);
+  for (std::string const f : {
+    "jpeg", "png", "webp", "tiff", "magick", "openslide", "dz",
+    "ppm", "fits", "gif", "svg", "heif", "pdf", "vips", "jp2k", "jxl", "rad", "dcraw"
   }) {
     // Input
-    Local<Boolean> hasInputFile =
-      New<Boolean>(vips_type_find("VipsOperation", (f + "load").c_str()));
-    Local<Boolean> hasInputBuffer =
-      New<Boolean>(vips_type_find("VipsOperation", (f + "load_buffer").c_str()));
-    Local<Object> input = New<Object>();
-    Set(input, attrFile, hasInputFile);
-    Set(input, attrBuffer, hasInputBuffer);
-    Set(input, attrStream, hasInputBuffer);
+    const VipsObjectClass *oc = vips_class_find("VipsOperation", (f + "load").c_str());
+    Napi::Boolean hasInputFile = Napi::Boolean::New(env, oc);
+    Napi::Boolean hasInputBuffer =
+      Napi::Boolean::New(env, vips_type_find("VipsOperation", (f + "load_buffer").c_str()));
+    Napi::Object input = Napi::Object::New(env);
+    input.Set("file", hasInputFile);
+    input.Set("buffer", hasInputBuffer);
+    input.Set("stream", hasInputBuffer);
+    if (hasInputFile) {
+      const VipsForeignClass *fc = VIPS_FOREIGN_CLASS(oc);
+      if (fc->suffs) {
+        Napi::Array fileSuffix = Napi::Array::New(env);
+        const char **suffix = fc->suffs;
+        for (int i = 0; *suffix; i++, suffix++) {
+          fileSuffix.Set(i, Napi::String::New(env, *suffix));
+        }
+        input.Set("fileSuffix", fileSuffix);
+      }
+    }
     // Output
-    Local<Boolean> hasOutputFile =
-      New<Boolean>(vips_type_find("VipsOperation", (f + "save").c_str()));
-    Local<Boolean> hasOutputBuffer =
-      New<Boolean>(vips_type_find("VipsOperation", (f + "save_buffer").c_str()));
-    Local<Object> output = New<Object>();
-    Set(output, attrFile, hasOutputFile);
-    Set(output, attrBuffer, hasOutputBuffer);
-    Set(output, attrStream, hasOutputBuffer);
+    Napi::Boolean hasOutputFile =
+      Napi::Boolean::New(env, vips_type_find("VipsOperation", (f + "save").c_str()));
+    Napi::Boolean hasOutputBuffer =
+      Napi::Boolean::New(env, vips_type_find("VipsOperation", (f + "save_buffer").c_str()));
+    Napi::Object output = Napi::Object::New(env);
+    output.Set("file", hasOutputFile);
+    output.Set("buffer", hasOutputBuffer);
+    output.Set("stream", hasOutputBuffer);
     // Other attributes
-    Local<Object> container = New<Object>();
-    Local<String> formatId = New(f).ToLocalChecked();
-    Set(container, attrId, formatId);
-    Set(container, attrInput, input);
-    Set(container, attrOutput, output);
+    Napi::Object container = Napi::Object::New(env);
+    container.Set("id", f);
+    container.Set("input", input);
+    container.Set("output", output);
     // Add to set of formats
-    Set(format, formatId, container);
+    format.Set(f, container);
   }
 
   // Raw, uncompressed data
-  Local<Object> raw = New<Object>();
-  Local<String> rawId = New("raw").ToLocalChecked();
-  Set(raw, attrId, rawId);
-  Set(format, rawId, raw);
-  Local<Boolean> supported = New<Boolean>(true);
-  Local<Boolean> unsupported = New<Boolean>(false);
-  Local<Object> rawInput = New<Object>();
-  Set(rawInput, attrFile, unsupported);
-  Set(rawInput, attrBuffer, supported);
-  Set(rawInput, attrStream, supported);
-  Set(raw, attrInput, rawInput);
-  Local<Object> rawOutput = New<Object>();
-  Set(rawOutput, attrFile, unsupported);
-  Set(rawOutput, attrBuffer, supported);
-  Set(rawOutput, attrStream, supported);
-  Set(raw, attrOutput, rawOutput);
+  Napi::Boolean supported = Napi::Boolean::New(env, true);
+  Napi::Boolean unsupported = Napi::Boolean::New(env, false);
+  Napi::Object rawInput = Napi::Object::New(env);
+  rawInput.Set("file", unsupported);
+  rawInput.Set("buffer", supported);
+  rawInput.Set("stream", supported);
+  Napi::Object rawOutput = Napi::Object::New(env);
+  rawOutput.Set("file", unsupported);
+  rawOutput.Set("buffer", supported);
+  rawOutput.Set("stream", supported);
+  Napi::Object raw = Napi::Object::New(env);
+  raw.Set("id", "raw");
+  raw.Set("input", rawInput);
+  raw.Set("output", rawOutput);
+  format.Set("raw", raw);
 
-  info.GetReturnValue().Set(format);
+  return format;
+}
+
+/*
+  (Un)block libvips operations at runtime.
+*/
+void block(const Napi::CallbackInfo& info) {
+  Napi::Array ops = info[size_t(0)].As<Napi::Array>();
+  bool const state = info[size_t(1)].As<Napi::Boolean>().Value();
+  for (unsigned int i = 0; i < ops.Length(); i++) {
+    vips_operation_block_set(ops.Get(i).As<Napi::String>().Utf8Value().c_str(), state);
+  }
 }
 
 /*
@@ -207,65 +197,92 @@ NAN_METHOD(format) {
   Calculates the maximum colour distance using the DE2000 algorithm
   between two images of the same dimensions and number of channels.
 */
-NAN_METHOD(_maxColourDistance) {
-  using vips::VImage;
-  using vips::VError;
-  using sharp::DetermineImageType;
-  using sharp::ImageType;
-  using sharp::HasAlpha;
-
-  HandleScope();
+Napi::Value _maxColourDistance(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
 
   // Open input files
   VImage image1;
-  ImageType imageType1 = DetermineImageType(*Utf8String(info[0]));
-  if (imageType1 != ImageType::UNKNOWN) {
+  sharp::ImageType imageType1 = sharp::DetermineImageType(info[size_t(0)].As<Napi::String>().Utf8Value().data());
+  if (imageType1 != sharp::ImageType::UNKNOWN) {
     try {
-      image1 = VImage::new_from_file(*Utf8String(info[0]));
+      image1 = VImage::new_from_file(info[size_t(0)].As<Napi::String>().Utf8Value().c_str());
     } catch (...) {
-      return ThrowError("Input file 1 has corrupt header");
+      throw Napi::Error::New(env, "Input file 1 has corrupt header");
     }
   } else {
-    return ThrowError("Input file 1 is of an unsupported image format");
+    throw Napi::Error::New(env, "Input file 1 is of an unsupported image format");
  }
   VImage image2;
-  ImageType imageType2 = DetermineImageType(*Utf8String(info[1]));
-  if (imageType2 != ImageType::UNKNOWN) {
+  sharp::ImageType imageType2 = sharp::DetermineImageType(info[size_t(1)].As<Napi::String>().Utf8Value().data());
+  if (imageType2 != sharp::ImageType::UNKNOWN) {
     try {
-      image2 = VImage::new_from_file(*Utf8String(info[1]));
+      image2 = VImage::new_from_file(info[size_t(1)].As<Napi::String>().Utf8Value().c_str());
     } catch (...) {
-      return ThrowError("Input file 2 has corrupt header");
+      throw Napi::Error::New(env, "Input file 2 has corrupt header");
    }
   } else {
-    return ThrowError("Input file 2 is of an unsupported image format");
+    throw Napi::Error::New(env, "Input file 2 is of an unsupported image format");
   }
 
   // Ensure same number of channels
   if (image1.bands() != image2.bands()) {
-    return ThrowError("mismatchedBands");
+    throw Napi::Error::New(env, "mismatchedBands");
  }
   // Ensure same dimensions
   if (image1.width() != image2.width() || image1.height() != image2.height()) {
-    return ThrowError("mismatchedDimensions");
+    throw Napi::Error::New(env, "mismatchedDimensions");
   }
 
   double maxColourDistance;
   try {
     // Premultiply and remove alpha
-    if (HasAlpha(image1)) {
+    if (image1.has_alpha()) {
       image1 = image1.premultiply().extract_band(1, VImage::option()->set("n", image1.bands() - 1));
     }
-    if (HasAlpha(image2)) {
+    if (image2.has_alpha()) {
       image2 = image2.premultiply().extract_band(1, VImage::option()->set("n", image2.bands() - 1));
     }
     // Calculate colour distance
     maxColourDistance = image1.dE00(image2).max();
-  } catch (VError err) {
-    return ThrowError(err.what());
+  } catch (vips::VError const &err) {
+    throw Napi::Error::New(env, err.what());
  }
 
   // Clean up libvips' per-request data and threads
   vips_error_clear();
   vips_thread_shutdown();
 
-  info.GetReturnValue().Set(New<Number>(maxColourDistance));
+  return Napi::Number::New(env, maxColourDistance);
+}
+
+#if defined(__GNUC__)
+// mallctl will be resolved by the runtime linker when jemalloc is being used
+extern "C" {
+  int mallctl(const char *name, void *oldp, size_t *oldlenp, void *newp, size_t newlen) __attribute__((weak));
+}
+Napi::Value _isUsingJemalloc(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
+  return Napi::Boolean::New(env, mallctl != nullptr);
+}
+#else
+Napi::Value _isUsingJemalloc(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
+  return Napi::Boolean::New(env, false);
+}
+#endif
+
+#if defined(__GNUC__) && defined(__x86_64__)
+// Are SSE 4.2 intrinsics available at runtime?
+Napi::Value _isUsingX64V2(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
+  unsigned int eax, ebx, ecx, edx;
+  __asm__ __volatile__("cpuid"
+    : "=a"(eax), "=b"(ebx), "=c"(ecx), "=d"(edx)
+    : "a"(1));
+  return Napi::Boolean::New(env, (ecx & 1U << 20) != 0);
+}
+#else
+Napi::Value _isUsingX64V2(const Napi::CallbackInfo& info) {
+  Napi::Env env = info.Env();
+  return Napi::Boolean::New(env, false);
+}
+#endif
diff --git a/src/utilities.h b/src/utilities.h
index f3cbd0c5b..a1719fa23 100644
--- a/src/utilities.h
+++ b/src/utilities.h
@@ -1,28 +1,22 @@
-// Copyright 2013, 2014, 2015, 2016, 2017 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-//    http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+/*!
+  Copyright 2013 Lovell Fuller and others.
+ SPDX-License-Identifier: Apache-2.0 +*/ #ifndef SRC_UTILITIES_H_ #define SRC_UTILITIES_H_ -#include +#include -NAN_METHOD(cache); -NAN_METHOD(concurrency); -NAN_METHOD(counters); -NAN_METHOD(simd); -NAN_METHOD(libvipsVersion); -NAN_METHOD(format); -NAN_METHOD(_maxColourDistance); +Napi::Value cache(const Napi::CallbackInfo& info); +Napi::Value concurrency(const Napi::CallbackInfo& info); +Napi::Value counters(const Napi::CallbackInfo& info); +Napi::Value simd(const Napi::CallbackInfo& info); +Napi::Value libvipsVersion(const Napi::CallbackInfo& info); +Napi::Value format(const Napi::CallbackInfo& info); +void block(const Napi::CallbackInfo& info); +Napi::Value _maxColourDistance(const Napi::CallbackInfo& info); +Napi::Value _isUsingJemalloc(const Napi::CallbackInfo& info); +Napi::Value _isUsingX64V2(const Napi::CallbackInfo& info); #endif // SRC_UTILITIES_H_ diff --git a/test/bench/Dockerfile b/test/bench/Dockerfile new file mode 100644 index 000000000..290db857a --- /dev/null +++ b/test/bench/Dockerfile @@ -0,0 +1,28 @@ +FROM ubuntu:25.04 +ARG BRANCH=main + +# Install basic dependencies +RUN apt-get -y update && apt-get install -y build-essential curl git ca-certificates gnupg + +# Install latest Node.js LTS +RUN curl -fsSL https://deb.nodesource.com/setup_24.x -o nodesource_setup.sh +RUN bash nodesource_setup.sh +RUN apt-get install -y nodejs + +# Install benchmark dependencies +RUN apt-get install -y imagemagick libmagick++-dev graphicsmagick + +# Install sharp +RUN mkdir /tmp/sharp +RUN cd /tmp && git clone --single-branch --branch $BRANCH https://github.com/lovell/sharp.git +RUN cd /tmp/sharp && npm install && npm run build + +# Install benchmark test +RUN cd /tmp/sharp/test/bench && npm install --omit optional + +RUN cat /etc/os-release | grep VERSION= +RUN node -v + +WORKDIR /tmp/sharp/test/bench + +CMD [ "node", "perf" ] diff --git a/test/bench/package.json b/test/bench/package.json index 5976d75b4..e6a97a4f8 100644 --- 
a/test/bench/package.json +++ b/test/bench/package.json @@ -7,20 +7,16 @@ "scripts": { "test": "node perf && node random && node parallel" }, - "devDependencies": { - "async": "^2.5.0", - "benchmark": "^2.1.4", - "gm": "^1.23.0", - "imagemagick": "^0.1.3", - "imagemagick-native": "^1.9.3", - "images": "^3.0.0", - "jimp": "^0.2.28", - "mapnik": "^3.6.2", - "pajk-lwip": "^0.2.0", - "semver": "^5.3.0" + "dependencies": { + "async": "3.2.6", + "benchmark": "2.1.4", + "gm": "1.25.1", + "imagemagick": "0.1.3", + "jimp": "1.6.0" }, - "license": "Apache-2.0", - "engines": { - "node": ">=4" - } + "optionalDependencies": { + "@tensorflow/tfjs-node": "4.22.0", + "mapnik": "4.5.9" + }, + "license": "Apache-2.0" } diff --git a/test/bench/parallel.js b/test/bench/parallel.js index 8f671252d..88d7d4bac 100644 --- a/test/bench/parallel.js +++ b/test/bench/parallel.js @@ -1,8 +1,11 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ process.env.UV_THREADPOOL_SIZE = 64; -const assert = require('assert'); +const assert = require('node:assert'); const async = require('async'); const sharp = require('../../'); @@ -12,33 +15,30 @@ const width = 720; const height = 480; sharp.concurrency(1); -sharp.simd(true); -const timer = setInterval(function () { +const timer = setInterval(() => { console.dir(sharp.counters()); }, 100); -async.mapSeries([1, 1, 2, 4, 8, 16, 32, 64], function (parallelism, next) { - const start = new Date().getTime(); +async.mapSeries([1, 1, 2, 4, 8, 16, 32, 64], (parallelism, next) => { + const start = Date.now(); async.times(parallelism, - function (id, callback) { - /* jslint unused: false */ - sharp(fixtures.inputJpg).resize(width, height).toBuffer(function (err, buffer) { + (_id, callback) => { + sharp(fixtures.inputJpg).resize(width, height).toBuffer((err, buffer) => { buffer = null; - callback(err, new Date().getTime() - start); + callback(err, Date.now() - start); }); }, - function (err, ids) { + (err, 
ids) => { assert(!err); assert(ids.length === parallelism); - const mean = ids.reduce(function (a, b) { - return a + b; - }) / ids.length; - console.log(parallelism + ' parallel calls: fastest=' + ids[0] + 'ms slowest=' + ids[ids.length - 1] + 'ms mean=' + mean + 'ms'); + ids.sort(); + const mean = ids.reduce((a, b) => a + b) / ids.length; + console.log(`${parallelism} parallel calls: fastest=${ids[0]}ms slowest=${ids[ids.length - 1]}ms mean=${mean}ms`); next(); } ); -}, function () { +}, () => { clearInterval(timer); console.dir(sharp.counters()); }); diff --git a/test/bench/perf.js b/test/bench/perf.js index 0a0739b3a..375006ef0 100644 --- a/test/bench/perf.js +++ b/test/bench/perf.js @@ -1,148 +1,84 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const fs = require('fs'); +const fs = require('node:fs'); +const { execSync } = require('node:child_process'); const async = require('async'); -const assert = require('assert'); const Benchmark = require('benchmark'); +const safeRequire = (name) => { + try { + return require(name); + } catch (_err) {} + return null; +}; + // Contenders const sharp = require('../../'); const gm = require('gm'); const imagemagick = require('imagemagick'); -const mapnik = require('mapnik'); -const jimp = require('jimp'); -let images; -try { - images = require('images'); -} catch (err) { - console.log('Excluding node-images'); -} -let imagemagickNative; -try { - imagemagickNative = require('imagemagick-native'); -} catch (err) { - console.log('Excluding imagemagick-native'); -} -let lwip; -try { - lwip = require('pajk-lwip'); -} catch (err) { - console.log('Excluding lwip'); -} +const mapnik = safeRequire('mapnik'); +const { Jimp, JimpMime } = require('jimp'); + +process.env.TF_CPP_MIN_LOG_LEVEL = 1; +const tfjs = safeRequire('@tensorflow/tfjs-node'); const fixtures = require('../fixtures'); +const outputJpg = fixtures.path('output.jpg'); +const outputPng = 
fixtures.path('output.png'); +const outputWebP = fixtures.path('output.webp'); + const width = 720; const height = 588; +const heightPng = 540; // Disable libvips cache to ensure tests are as fair as they can be sharp.cache(false); -// Enable use of SIMD -sharp.simd(true); + +// Spawn one thread per physical CPU core +const physicalCores = Number(execSync('lscpu -p | egrep -v "^#" | sort -u -t, -k 2,4 | wc -l', { encoding: 'utf-8' }).trim()); +console.log(`Detected ${physicalCores} physical cores`); +sharp.concurrency(physicalCores); async.series({ - 'jpeg': function (callback) { + jpeg: (callback) => { const inputJpgBuffer = fs.readFileSync(fixtures.inputJpg); const jpegSuite = new Benchmark.Suite('jpeg'); // jimp jpegSuite.add('jimp-buffer-buffer', { defer: true, - fn: function (deferred) { - jimp.read(inputJpgBuffer, function (err, image) { - if (err) { - throw err; - } else { - image - .resize(width, height) - .quality(80) - .getBuffer(jimp.MIME_JPEG, function (err) { - if (err) { - throw err; - } else { - deferred.resolve(); - } - }); - } - }); + fn: async (deferred) => { + const image = await Jimp.read(inputJpgBuffer); + await image + .resize({ w: width, h: height, mode: Jimp.RESIZE_BICUBIC }) + .getBuffer(JimpMime.jpeg, { quality: 80 }); + deferred.resolve(); } }).add('jimp-file-file', { defer: true, - fn: function (deferred) { - jimp.read(fixtures.inputJpg, function (err, image) { - if (err) { - throw err; - } else { - image - .resize(width, height) - .quality(80) - .write(fixtures.outputJpg, function (err) { - if (err) { - throw err; - } else { - deferred.resolve(); - } - }); - } - }); + fn: async (deferred) => { + const image = await Jimp.read(fixtures.inputJpg); + await image + .resize({ w: width, h: height, mode: Jimp.RESIZE_BICUBIC }) + .write(outputJpg, { quality: 80 }); + deferred.resolve(); } }); - // lwip - if (typeof lwip !== 'undefined') { - jpegSuite.add('lwip-file-file', { - defer: true, - fn: function (deferred) {
lwip.open(fixtures.inputJpg, function (err, image) { - if (err) { - throw err; - } - image.resize(width, height, 'lanczos', function (err, image) { - if (err) { - throw err; - } - image.writeFile(fixtures.outputJpg, {quality: 80}, function (err) { - if (err) { - throw err; - } - deferred.resolve(); - }); - }); - }); - } - }).add('lwip-buffer-buffer', { - defer: true, - fn: function (deferred) { - lwip.open(inputJpgBuffer, 'jpg', function (err, image) { - if (err) { - throw err; - } - image.resize(width, height, 'lanczos', function (err, image) { - if (err) { - throw err; - } - image.toBuffer('jpg', {quality: 80}, function (err, buffer) { - if (err) { - throw err; - } - assert.notStrictEqual(null, buffer); - deferred.resolve(); - }); - }); - }); - } - }); - } // mapnik - jpegSuite.add('mapnik-file-file', { + mapnik && jpegSuite.add('mapnik-file-file', { defer: true, - fn: function (deferred) { - mapnik.Image.open(fixtures.inputJpg, function (err, img) { + fn: (deferred) => { + mapnik.Image.open(fixtures.inputJpg, (err, img) => { if (err) throw err; img .resize(width, height, { scaling_method: mapnik.imageScaling.lanczos }) - .save(fixtures.outputJpg, 'jpeg:quality=80', function (err) { + .save(outputJpg, 'jpeg:quality=80', (err) => { if (err) throw err; deferred.resolve(); }); @@ -150,14 +86,14 @@ async.series({ } }).add('mapnik-buffer-buffer', { defer: true, - fn: function (deferred) { - mapnik.Image.fromBytes(inputJpgBuffer, { max_size: 3000 }, function (err, img) { + fn: (deferred) => { + mapnik.Image.fromBytes(inputJpgBuffer, { max_size: 3000 }, (err, img) => { if (err) throw err; img .resize(width, height, { scaling_method: mapnik.imageScaling.lanczos }) - .encode('jpeg:quality=80', function (err) { + .encode('jpeg:quality=80', (err) => { if (err) throw err; deferred.resolve(); }); @@ -167,16 +103,16 @@ async.series({ // imagemagick jpegSuite.add('imagemagick-file-file', { defer: true, - fn: function (deferred) { + fn: (deferred) => { imagemagick.resize({ 
srcPath: fixtures.inputJpg, - dstPath: fixtures.outputJpg, + dstPath: outputJpg, quality: 0.8, - width: width, - height: height, + width, + height, format: 'jpg', filter: 'Lanczos' - }, function (err) { + }, (err) => { if (err) { throw err; } else { @@ -185,38 +121,15 @@ async.series({ }); } }); - // imagemagick-native - if (typeof imagemagickNative !== 'undefined') { - jpegSuite.add('imagemagick-native-buffer-buffer', { - defer: true, - fn: function (deferred) { - imagemagickNative.convert({ - srcData: inputJpgBuffer, - quality: 80, - width: width, - height: height, - format: 'JPEG', - filter: 'Lanczos' - }, function (err, buffer) { - if (err) { - throw err; - } else { - assert.notStrictEqual(null, buffer); - deferred.resolve(); - } - }); - } - }); - } // gm jpegSuite.add('gm-buffer-file', { defer: true, - fn: function (deferred) { + fn: (deferred) => { gm(inputJpgBuffer) .filter('Lanczos') .resize(width, height) .quality(80) - .write(fixtures.outputJpg, function (err) { + .write(outputJpg, (err) => { if (err) { throw err; } else { @@ -226,28 +139,27 @@ async.series({ } }).add('gm-buffer-buffer', { defer: true, - fn: function (deferred) { + fn: (deferred) => { gm(inputJpgBuffer) .filter('Lanczos') .resize(width, height) .quality(80) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('gm-file-file', { defer: true, - fn: function (deferred) { + fn: (deferred) => { gm(fixtures.inputJpg) .filter('Lanczos') .resize(width, height) .quality(80) - .write(fixtures.outputJpg, function (err) { + .write(outputJpg, (err) => { if (err) { throw err; } else { @@ -257,36 +169,45 @@ async.series({ } }).add('gm-file-buffer', { defer: true, - fn: function (deferred) { + fn: (deferred) => { gm(fixtures.inputJpg) .filter('Lanczos') .resize(width, height) .quality(80) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - 
assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }); - // images - if (typeof images !== 'undefined') { - jpegSuite.add('images-file-file', function () { - images(fixtures.inputJpg) - .resize(width, height) - .save(fixtures.outputJpg, { quality: 80 }); - }); - } + // tfjs + tfjs && jpegSuite.add('tfjs-node-buffer-buffer', { + defer: true, + fn: (deferred) => { + const decoded = tfjs.node.decodeJpeg(inputJpgBuffer); + const resized = tfjs.image.resizeBilinear(decoded, [height, width]); + tfjs + .node + .encodeJpeg(resized, 'rgb', 80) + .then(() => { + deferred.resolve(); + tfjs.disposeVariables(); + }) + .catch((err) => { + throw err; + }); + } + }); // sharp jpegSuite.add('sharp-buffer-file', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) - .toFile(fixtures.outputJpg, function (err) { + .toFile(outputJpg, (err) => { if (err) { throw err; } else { @@ -296,24 +217,23 @@ async.series({ } }).add('sharp-buffer-buffer', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-file-file', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(fixtures.inputJpg) .resize(width, height) - .toFile(fixtures.outputJpg, function (err) { + .toFile(outputJpg, (err) => { if (err) { throw err; } else { @@ -323,10 +243,10 @@ async.series({ } }).add('sharp-stream-stream', { defer: true, - fn: function (deferred) { + fn: (deferred) => { const readable = fs.createReadStream(fixtures.inputJpg); - const writable = fs.createWriteStream(fixtures.outputJpg); - writable.on('finish', function () { + const writable = fs.createWriteStream(outputJpg); + writable.on('finish', () => { deferred.resolve(); }); const pipeline = sharp() @@ -335,407 +255,375 @@ async.series({ } 
}).add('sharp-file-buffer', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(fixtures.inputJpg) .resize(width, height) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-promise', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .toBuffer() - .then(function (buffer) { - assert.notStrictEqual(null, buffer); + .then(() => { deferred.resolve(); + }) + .catch((err) => { + throw err; }); } - }).on('cycle', function (event) { - console.log('jpeg ' + String(event.target)); + }).on('cycle', (event) => { + console.log(`jpeg ${String(event.target)}`); }).on('complete', function () { callback(null, this.filter('fastest').map('name')); }).run(); }, // Effect of applying operations - operations: function (callback) { + operations: (callback) => { const inputJpgBuffer = fs.readFileSync(fixtures.inputJpg); const operationsSuite = new Benchmark.Suite('operations'); operationsSuite.add('sharp-sharpen-mild', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .sharpen() - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-sharpen-radius', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .sharpen(3, 1, 3) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-blur-mild', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .blur() - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } 
}); } }).add('sharp-blur-radius', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .blur(3) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-gamma', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .gamma() - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-normalise', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .normalise() - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-greyscale', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .greyscale() - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-greyscale-gamma', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .gamma() .greyscale() - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-progressive', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height) .jpeg({ progressive: true }) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-without-chroma-subsampling', { defer: true, - fn: function (deferred) { + fn: (deferred) => { 
sharp(inputJpgBuffer) .resize(width, height) .jpeg({ chromaSubsampling: '4:4:4' }) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-rotate', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .rotate(90) .resize(width, height) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-without-simd', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp.simd(false); sharp(inputJpgBuffer) .resize(width, height) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { sharp.simd(true); if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } - }).add('sharp-sequentialRead', { + }).add('sharp-random-access-read', { defer: true, - fn: function (deferred) { - sharp(inputJpgBuffer) - .sequentialRead() + fn: (deferred) => { + sharp(inputJpgBuffer, { sequentialRead: false }) .resize(width, height) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-crop-entropy', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) - .resize(width, height) - .crop(sharp.strategy.entropy) - .toBuffer(function (err, buffer) { + .resize(width, height, { + fit: 'cover', + position: sharp.strategy.entropy + }) + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-crop-attention', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) - .resize(width, height) - .crop(sharp.strategy.attention) - .toBuffer(function (err, buffer) { + .resize(width, height, { + fit: 'cover', + position: sharp.strategy.attention 
+ }) + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } - }).on('cycle', function (event) { - console.log('operations ' + String(event.target)); + }).on('cycle', (event) => { + console.log(`operations ${String(event.target)}`); }).on('complete', function () { callback(null, this.filter('fastest').map('name')); }).run(); }, - // Comparitive speed of kernels - kernels: function (callback) { + // Comparative speed of kernels + kernels: (callback) => { const inputJpgBuffer = fs.readFileSync(fixtures.inputJpg); (new Benchmark.Suite('kernels')).add('sharp-cubic', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height, { kernel: 'cubic' }) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-lanczos2', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height, { kernel: 'lanczos2' }) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-lanczos3', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputJpgBuffer) .resize(width, height, { kernel: 'lanczos3' }) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { + if (err) { + throw err; + } else { + deferred.resolve(); + } + }); + } + }).add('sharp-mks2013', { + defer: true, + fn: (deferred) => { + sharp(inputJpgBuffer) + .resize(width, height, { kernel: 'mks2013' }) + .toBuffer((err) => { + if (err) { + throw err; + } else { + deferred.resolve(); + } + }); + } + }).add('sharp-mks2021', { + defer: true, + fn: (deferred) => { + sharp(inputJpgBuffer) + .resize(width, height, { kernel: 'mks2021' }) + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); 
deferred.resolve(); } }); } - }).on('cycle', function (event) { - console.log('kernels ' + String(event.target)); + }).on('cycle', (event) => { + console.log(`kernels ${String(event.target)}`); }).on('complete', function () { callback(null, this.filter('fastest').map('name')); }).run(); }, // PNG - png: function (callback) { - const inputPngBuffer = fs.readFileSync(fixtures.inputPng); + png: (callback) => { + const inputPngBuffer = fs.readFileSync(fixtures.inputPngAlphaPremultiplicationLarge); const pngSuite = new Benchmark.Suite('png'); + const minSamples = 64; // jimp pngSuite.add('jimp-buffer-buffer', { defer: true, - fn: function (deferred) { - jimp.read(inputPngBuffer, function (err, image) { - if (err) { - throw err; - } else { - image - .resize(width, height) - .getBuffer(jimp.MIME_PNG, function (err) { - if (err) { - throw err; - } else { - deferred.resolve(); - } - }); - } - }); + fn: async (deferred) => { + const image = await Jimp.read(inputPngBuffer); + await image + .resize({ w: width, h: heightPng, mode: Jimp.RESIZE_BICUBIC }) + .getBuffer(JimpMime.png, { deflateLevel: 6, filterType: 0 }); + deferred.resolve(); } }).add('jimp-file-file', { defer: true, - fn: function (deferred) { - jimp.read(fixtures.inputPng, function (err, image) { - if (err) { - throw err; - } else { - image - .resize(width, height) - .write(fixtures.outputPng, function (err) { - if (err) { - throw err; - } else { - deferred.resolve(); - } - }); - } - }); + fn: async (deferred) => { + const image = await Jimp.read(fixtures.inputPngAlphaPremultiplicationLarge); + await image + .resize({ w: width, h: heightPng, mode: Jimp.RESIZE_BICUBIC }) + .write(outputPng, { deflateLevel: 6, filterType: 0 }); + deferred.resolve(); } }); - // lwip - if (typeof lwip !== 'undefined') { - pngSuite.add('lwip-buffer-buffer', { - defer: true, - fn: function (deferred) { - lwip.open(inputPngBuffer, 'png', function (err, image) { - if (err) { - throw err; - } - image.resize(width, height, 'lanczos', 
function (err, image) { - if (err) { - throw err; - } - image.toBuffer('png', function (err, buffer) { - if (err) { - throw err; - } - assert.notStrictEqual(null, buffer); - deferred.resolve(); - }); - }); - }); - } - }); - } // mapnik - pngSuite.add('mapnik-file-file', { + mapnik && pngSuite.add('mapnik-file-file', { defer: true, - fn: function (deferred) { - mapnik.Image.open(fixtures.inputPng, function (err, img) { + fn: (deferred) => { + mapnik.Image.open(fixtures.inputPngAlphaPremultiplicationLarge, (err, img) => { if (err) throw err; - img.premultiply(function (err, img) { + img.premultiply((err, img) => { if (err) throw err; - img.resize(width, height, { + img.resize(width, heightPng, { scaling_method: mapnik.imageScaling.lanczos - }, function (err, img) { + }, (err, img) => { if (err) throw err; - img.demultiply(function (err, img) { + img.demultiply((err, img) => { if (err) throw err; - img.save(fixtures.outputPng, 'png', function (err) { + img.save(outputPng, 'png32:f=no:z=6', (err) => { if (err) throw err; deferred.resolve(); }); @@ -746,18 +634,18 @@ async.series({ } }).add('mapnik-buffer-buffer', { defer: true, - fn: function (deferred) { - mapnik.Image.fromBytes(inputPngBuffer, { max_size: 3000 }, function (err, img) { + fn: (deferred) => { + mapnik.Image.fromBytes(inputPngBuffer, { max_size: 3000 }, (err, img) => { if (err) throw err; - img.premultiply(function (err, img) { + img.premultiply((err, img) => { if (err) throw err; - img.resize(width, height, { + img.resize(width, heightPng, { scaling_method: mapnik.imageScaling.lanczos - }, function (err, img) { + }, (err, img) => { if (err) throw err; - img.demultiply(function (err, img) { + img.demultiply((err, img) => { if (err) throw err; - img.encode('png', function (err) { + img.encode('png32:f=no:z=6', (err) => { if (err) throw err; deferred.resolve(); }); @@ -770,14 +658,18 @@ async.series({ // imagemagick pngSuite.add('imagemagick-file-file', { defer: true, - fn: function (deferred) { + fn: 
(deferred) => { imagemagick.resize({ - srcPath: fixtures.inputPng, - dstPath: fixtures.outputPng, - width: width, - height: height, - filter: 'Lanczos' - }, function (err) { + srcPath: fixtures.inputPngAlphaPremultiplicationLarge, + dstPath: outputPng, + width, + height: heightPng, + filter: 'Lanczos', + customArgs: [ + '-define', 'PNG:compression-level=6', + '-define', 'PNG:compression-filter=0' + ] + }, (err) => { if (err) { throw err; } else { @@ -786,30 +678,16 @@ async.series({ }); } }); - // imagemagick-native - if (typeof imagemagickNative !== 'undefined') { - pngSuite.add('imagemagick-native-buffer-buffer', { - defer: true, - fn: function (deferred) { - imagemagickNative.convert({ - srcData: inputPngBuffer, - width: width, - height: height, - format: 'PNG', - filter: 'Lanczos' - }); - deferred.resolve(); - } - }); - } // gm pngSuite.add('gm-file-file', { defer: true, - fn: function (deferred) { - gm(fixtures.inputPng) + fn: (deferred) => { + gm(fixtures.inputPngAlphaPremultiplicationLarge) .filter('Lanczos') - .resize(width, height) - .write(fixtures.outputPng, function (err) { + .resize(width, heightPng) + .define('PNG:compression-level=6') + .define('PNG:compression-filter=0') + .write(outputPng, (err) => { if (err) { throw err; } else { @@ -819,35 +697,30 @@ async.series({ } }).add('gm-file-buffer', { defer: true, - fn: function (deferred) { - gm(fixtures.inputPng) + fn: (deferred) => { + gm(fixtures.inputPngAlphaPremultiplicationLarge) .filter('Lanczos') - .resize(width, height) - .toBuffer(function (err, buffer) { + .resize(width, heightPng) + .define('PNG:compression-level=6') + .define('PNG:compression-filter=0') + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }); - // images - if (typeof images !== 'undefined') { - pngSuite.add('images-file-file', function () { - images(fixtures.inputPng) - .resize(width, height) - .save(fixtures.outputPng); - }); - } // sharp 
pngSuite.add('sharp-buffer-file', { defer: true, - fn: function (deferred) { + minSamples, + fn: (deferred) => { sharp(inputPngBuffer) - .resize(width, height) - .toFile(fixtures.outputPng, function (err) { + .resize(width, heightPng) + .png({ compressionLevel: 6 }) + .toFile(outputPng, (err) => { if (err) { throw err; } else { @@ -857,24 +730,27 @@ async.series({ } }).add('sharp-buffer-buffer', { defer: true, - fn: function (deferred) { + minSamples, + fn: (deferred) => { sharp(inputPngBuffer) - .resize(width, height) - .toBuffer(function (err, buffer) { + .resize(width, heightPng) + .png({ compressionLevel: 6 }) + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-file-file', { defer: true, - fn: function (deferred) { - sharp(fixtures.inputPng) - .resize(width, height) - .toFile(fixtures.outputPng, function (err) { + minSamples, + fn: (deferred) => { + sharp(fixtures.inputPngAlphaPremultiplicationLarge) + .resize(width, heightPng) + .png({ compressionLevel: 6 }) + .toFile(outputPng, (err) => { if (err) { throw err; } else { @@ -884,64 +760,80 @@ async.series({ } }).add('sharp-file-buffer', { defer: true, - fn: function (deferred) { - sharp(fixtures.inputPng) - .resize(width, height) - .toBuffer(function (err, buffer) { + minSamples, + fn: (deferred) => { + sharp(fixtures.inputPngAlphaPremultiplicationLarge) + .resize(width, heightPng) + .png({ compressionLevel: 6 }) + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-progressive', { defer: true, - fn: function (deferred) { + minSamples, + fn: (deferred) => { sharp(inputPngBuffer) - .resize(width, height) - .png({ progressive: true }) - .toBuffer(function (err, buffer) { + .resize(width, heightPng) + .png({ compressionLevel: 6, progressive: true }) + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); 
deferred.resolve(); } }); } - }).add('sharp-withoutAdaptiveFiltering', { + }).add('sharp-adaptiveFiltering', { defer: true, - fn: function (deferred) { + minSamples, + fn: (deferred) => { sharp(inputPngBuffer) - .resize(width, height) - .png({ adaptiveFiltering: false }) - .toBuffer(function (err, buffer) { + .resize(width, heightPng) + .png({ adaptiveFiltering: true, compressionLevel: 6 }) + .toBuffer((err) => { + if (err) { + throw err; + } else { + deferred.resolve(); + } + }); + } + }).add('sharp-compressionLevel=9', { + defer: true, + minSamples, + fn: (deferred) => { + sharp(inputPngBuffer) + .resize(width, heightPng) + .png({ compressionLevel: 9 }) + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }); - pngSuite.on('cycle', function (event) { - console.log(' png ' + String(event.target)); + pngSuite.on('cycle', (event) => { + console.log(` png ${String(event.target)}`); }).on('complete', function () { callback(null, this.filter('fastest').map('name')); }).run(); }, // WebP - webp: function (callback) { + webp: (callback) => { const inputWebPBuffer = fs.readFileSync(fixtures.inputWebP); (new Benchmark.Suite('webp')).add('sharp-buffer-file', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputWebPBuffer) .resize(width, height) - .toFile(fixtures.outputWebP, function (err) { + .toFile(outputWebP, (err) => { if (err) { throw err; } else { @@ -951,24 +843,23 @@ async.series({ } }).add('sharp-buffer-buffer', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(inputWebPBuffer) .resize(width, height) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } }).add('sharp-file-file', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(fixtures.inputWebP) .resize(width, height) - .toFile(fixtures.outputWebP, function (err) { + 
.toFile(outputWebP, (err) => { if (err) { throw err; } else { @@ -978,29 +869,30 @@ async.series({ } }).add('sharp-file-buffer', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(fixtures.inputWebP) .resize(width, height) - .toBuffer(function (err, buffer) { + .toBuffer((err) => { if (err) { throw err; } else { - assert.notStrictEqual(null, buffer); deferred.resolve(); } }); } - }).on('cycle', function (event) { - console.log('webp ' + String(event.target)); + }).on('cycle', (event) => { + console.log(`webp ${String(event.target)}`); }).on('complete', function () { callback(null, this.filter('fastest').map('name')); }).run(); } -}, function (err, results) { - assert(!err, err); - Object.keys(results).forEach(function (format) { +}, (err, results) => { + if (err) { + throw err; + } + Object.keys(results).forEach((format) => { if (results[format].toString().substr(0, 5) !== 'sharp') { - console.log('sharp was slower than ' + results[format] + ' for ' + format); + console.log(`sharp was slower than ${results[format]} for ${format}`); } }); console.dir(sharp.cache()); diff --git a/test/bench/random.js b/test/bench/random.js index 426cd0a1e..33a2198da 100644 --- a/test/bench/random.js +++ b/test/bench/random.js @@ -1,35 +1,35 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ const imagemagick = require('imagemagick'); const gm = require('gm'); -const assert = require('assert'); +const assert = require('node:assert'); const Benchmark = require('benchmark'); const sharp = require('../../'); const fixtures = require('../fixtures'); sharp.cache(false); -sharp.simd(true); const min = 320; const max = 960; -const randomDimension = function () { - return Math.ceil((Math.random() * (max - min)) + min); -}; +const randomDimension = () => Math.ceil((Math.random() * (max - min)) + min); new Benchmark.Suite('random').add('imagemagick', { defer: true, - fn: function (deferred) { + fn: (deferred) => { imagemagick.resize({ srcPath: fixtures.inputJpg, - dstPath: fixtures.outputJpg, + dstPath: fixtures.path('output.jpg'), quality: 0.8, width: randomDimension(), height: randomDimension(), format: 'jpg', filter: 'Lanczos' - }, function (err) { + }, (err) => { if (err) { throw err; } else { @@ -39,12 +39,12 @@ new Benchmark.Suite('random').add('imagemagick', { } }).add('gm', { defer: true, - fn: function (deferred) { + fn: (deferred) => { gm(fixtures.inputJpg) .resize(randomDimension(), randomDimension()) .filter('Lanczos') .quality(80) - .toBuffer(function (err, buffer) { + .toBuffer((err, buffer) => { if (err) { throw err; } else { @@ -55,10 +55,10 @@ new Benchmark.Suite('random').add('imagemagick', { } }).add('sharp', { defer: true, - fn: function (deferred) { + fn: (deferred) => { sharp(fixtures.inputJpg) .resize(randomDimension(), randomDimension()) - .toBuffer(function (err, buffer) { + .toBuffer((err, buffer) => { if (err) { throw err; } else { @@ -67,9 +67,9 @@ new Benchmark.Suite('random').add('imagemagick', { } }); } -}).on('cycle', function (event) { +}).on('cycle', (event) => { console.log(String(event.target)); }).on('complete', function () { const winner = this.filter('fastest').map('name'); - assert.strictEqual('sharp', String(winner), 'sharp was slower than ' + winner); + 
assert.strictEqual('sharp', String(winner), `sharp was slower than ${winner}`); }).run(); diff --git a/test/bench/run-with-docker.sh b/test/bench/run-with-docker.sh new file mode 100755 index 000000000..d28975bfb --- /dev/null +++ b/test/bench/run-with-docker.sh @@ -0,0 +1,13 @@ +#!/usr/bin/env bash +set -e + +if ! type docker >/dev/null; then + echo "Please install docker" + exit 1 +fi + +BRANCH=$(git branch --show-current) +echo "Running sharp performance tests using $BRANCH branch" + +docker build --build-arg "BRANCH=$BRANCH" -t sharp-test-bench . +docker run --rm -it sharp-test-bench diff --git a/test/fixtures/16-bit-grey-alpha.png b/test/fixtures/16-bit-grey-alpha.png new file mode 100644 index 000000000..fab9f5036 Binary files /dev/null and b/test/fixtures/16-bit-grey-alpha.png differ diff --git a/test/fixtures/2569067123_aca715a2ee_o.png b/test/fixtures/2569067123_aca715a2ee_o.png new file mode 100644 index 000000000..1262ec6e1 Binary files /dev/null and b/test/fixtures/2569067123_aca715a2ee_o.png differ diff --git a/test/fixtures/65536-uint32-limit.png b/test/fixtures/65536-uint32-limit.png new file mode 100644 index 000000000..e86f608ed Binary files /dev/null and b/test/fixtures/65536-uint32-limit.png differ diff --git a/test/fixtures/Flag_of_the_Netherlands-16bit.png b/test/fixtures/Flag_of_the_Netherlands-16bit.png new file mode 100644 index 000000000..61ee36d53 Binary files /dev/null and b/test/fixtures/Flag_of_the_Netherlands-16bit.png differ diff --git a/test/fixtures/Flag_of_the_Netherlands-alpha.png b/test/fixtures/Flag_of_the_Netherlands-alpha.png new file mode 100644 index 000000000..35a21b4ac Binary files /dev/null and b/test/fixtures/Flag_of_the_Netherlands-alpha.png differ diff --git a/test/fixtures/Flag_of_the_Netherlands.png b/test/fixtures/Flag_of_the_Netherlands.png new file mode 100644 index 000000000..c336a3e29 Binary files /dev/null and b/test/fixtures/Flag_of_the_Netherlands.png differ diff --git a/test/fixtures/G31D_MULTI.TIF 
b/test/fixtures/G31D_MULTI.TIF
new file mode 100644
index 000000000..017195d9d
Binary files /dev/null and b/test/fixtures/G31D_MULTI.TIF differ
diff --git a/test/fixtures/Landscape_9.jpg b/test/fixtures/Landscape_9.jpg
new file mode 100644
index 000000000..07eafa791
Binary files /dev/null and b/test/fixtures/Landscape_9.jpg differ
diff --git a/test/fixtures/XCMYK 2017.icc b/test/fixtures/XCMYK 2017.icc
new file mode 100644
index 000000000..ec9aae7d4
Binary files /dev/null and b/test/fixtures/XCMYK 2017.icc differ
diff --git a/test/fixtures/alpha-layer-1-fill-low-alpha.png b/test/fixtures/alpha-layer-1-fill-low-alpha.png
deleted file mode 100644
index db8fc5f50..000000000
Binary files a/test/fixtures/alpha-layer-1-fill-low-alpha.png and /dev/null differ
diff --git a/test/fixtures/alpha-layer-2-ink-low-alpha.png b/test/fixtures/alpha-layer-2-ink-low-alpha.png
deleted file mode 100644
index 31bf629df..000000000
Binary files a/test/fixtures/alpha-layer-2-ink-low-alpha.png and /dev/null differ
diff --git a/test/fixtures/alpha-layer-2-ink.jpg b/test/fixtures/alpha-layer-2-ink.jpg
new file mode 100644
index 000000000..ca96a1af4
Binary files /dev/null and b/test/fixtures/alpha-layer-2-ink.jpg differ
diff --git a/test/fixtures/alpha-layer-2-ink.png b/test/fixtures/alpha-layer-2-ink.png
deleted file mode 100644
index 6bfb355f6..000000000
Binary files a/test/fixtures/alpha-layer-2-ink.png and /dev/null differ
diff --git a/test/fixtures/animated-loop-3.gif b/test/fixtures/animated-loop-3.gif
new file mode 100644
index 000000000..006d14aa0
Binary files /dev/null and b/test/fixtures/animated-loop-3.gif differ
diff --git a/test/fixtures/animated-loop-3.webp b/test/fixtures/animated-loop-3.webp
new file mode 100644
index 000000000..2ea7f77aa
Binary files /dev/null and b/test/fixtures/animated-loop-3.webp differ
diff --git a/test/fixtures/bgbn4a08.png b/test/fixtures/bgbn4a08.png
new file mode 100644
index 000000000..7cbefc3bf
Binary files /dev/null and b/test/fixtures/bgbn4a08.png differ
diff --git a/test/fixtures/bggn4a16.png b/test/fixtures/bggn4a16.png
new file mode 100644
index 000000000..13fd85ba1
Binary files /dev/null and b/test/fixtures/bggn4a16.png differ
diff --git a/test/fixtures/big-height.webp b/test/fixtures/big-height.webp
new file mode 100644
index 000000000..752a16964
Binary files /dev/null and b/test/fixtures/big-height.webp differ
diff --git a/test/fixtures/bonne.geo.tif b/test/fixtures/bonne.geo.tif
new file mode 100644
index 000000000..1592bf86c
Binary files /dev/null and b/test/fixtures/bonne.geo.tif differ
diff --git a/test/fixtures/centered_image.jpeg b/test/fixtures/centered_image.jpeg
new file mode 100644
index 000000000..4c3afa6bf
Binary files /dev/null and b/test/fixtures/centered_image.jpeg differ
diff --git a/test/fixtures/cielab-dagams.tiff b/test/fixtures/cielab-dagams.tiff
index 3a7680250..452f3a026 100644
Binary files a/test/fixtures/cielab-dagams.tiff and b/test/fixtures/cielab-dagams.tiff differ
diff --git a/test/fixtures/circle.svg b/test/fixtures/circle.svg
new file mode 100644
index 000000000..7807e32fd
--- /dev/null
+++ b/test/fixtures/circle.svg
@@ -0,0 +1,3 @@
+
+
+
diff --git a/test/fixtures/concert.jpg b/test/fixtures/concert.jpg
new file mode 100644
index 000000000..7591c9199
Binary files /dev/null and b/test/fixtures/concert.jpg differ
diff --git a/test/fixtures/d.png b/test/fixtures/d.png
new file mode 100644
index 000000000..765420660
Binary files /dev/null and b/test/fixtures/d.png differ
diff --git a/test/fixtures/dot-and-lines.png b/test/fixtures/dot-and-lines.png
new file mode 100644
index 000000000..5c50d2456
Binary files /dev/null and b/test/fixtures/dot-and-lines.png differ
diff --git a/test/fixtures/embedgravitybird.png b/test/fixtures/embedgravitybird.png
new file mode 100644
index 000000000..cd381c9ee
Binary files /dev/null and b/test/fixtures/embedgravitybird.png differ
diff --git a/test/fixtures/expected/16-bit-grey-alpha-identity.png b/test/fixtures/expected/16-bit-grey-alpha-identity.png
new file mode 100644
index 000000000..766074b3d
Binary files /dev/null and b/test/fixtures/expected/16-bit-grey-alpha-identity.png differ
diff --git a/test/fixtures/expected/Landscape_1-recomb-saturation.jpg b/test/fixtures/expected/Landscape_1-recomb-saturation.jpg
new file mode 100644
index 000000000..888102835
Binary files /dev/null and b/test/fixtures/expected/Landscape_1-recomb-saturation.jpg differ
diff --git a/test/fixtures/expected/Landscape_1-recomb-sepia.jpg b/test/fixtures/expected/Landscape_1-recomb-sepia.jpg
new file mode 100644
index 000000000..9ca75b633
Binary files /dev/null and b/test/fixtures/expected/Landscape_1-recomb-sepia.jpg differ
diff --git a/test/fixtures/expected/Landscape_1-recomb-sepia2.jpg b/test/fixtures/expected/Landscape_1-recomb-sepia2.jpg
new file mode 100644
index 000000000..7d0f1996a
Binary files /dev/null and b/test/fixtures/expected/Landscape_1-recomb-sepia2.jpg differ
diff --git a/test/fixtures/expected/Landscape_1_flip-out.jpg b/test/fixtures/expected/Landscape_1_flip-out.jpg
new file mode 100644
index 000000000..5d9c7b1ce
Binary files /dev/null and b/test/fixtures/expected/Landscape_1_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_1_flip_flop-out.jpg b/test/fixtures/expected/Landscape_1_flip_flop-out.jpg
new file mode 100644
index 000000000..8df75ba73
Binary files /dev/null and b/test/fixtures/expected/Landscape_1_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_1_flop-out.jpg b/test/fixtures/expected/Landscape_1_flop-out.jpg
new file mode 100644
index 000000000..f68ba94cf
Binary files /dev/null and b/test/fixtures/expected/Landscape_1_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_1_rotate180-out.jpg b/test/fixtures/expected/Landscape_1_rotate180-out.jpg
new file mode 100644
index 000000000..8df75ba73
Binary files /dev/null and b/test/fixtures/expected/Landscape_1_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_1_rotate270-out.jpg b/test/fixtures/expected/Landscape_1_rotate270-out.jpg
new file mode 100644
index 000000000..378b021da
Binary files /dev/null and b/test/fixtures/expected/Landscape_1_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_1_rotate45-out.jpg b/test/fixtures/expected/Landscape_1_rotate45-out.jpg
new file mode 100644
index 000000000..e195b9d4f
Binary files /dev/null and b/test/fixtures/expected/Landscape_1_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_1_rotate90-out.jpg b/test/fixtures/expected/Landscape_1_rotate90-out.jpg
new file mode 100644
index 000000000..e01b9088c
Binary files /dev/null and b/test/fixtures/expected/Landscape_1_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_2_flip-out.jpg b/test/fixtures/expected/Landscape_2_flip-out.jpg
new file mode 100644
index 000000000..1a54946ff
Binary files /dev/null and b/test/fixtures/expected/Landscape_2_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_2_flip_flop-out.jpg b/test/fixtures/expected/Landscape_2_flip_flop-out.jpg
new file mode 100644
index 000000000..382b73246
Binary files /dev/null and b/test/fixtures/expected/Landscape_2_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_2_flop-out.jpg b/test/fixtures/expected/Landscape_2_flop-out.jpg
new file mode 100644
index 000000000..5d260769f
Binary files /dev/null and b/test/fixtures/expected/Landscape_2_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_2_rotate180-out.jpg b/test/fixtures/expected/Landscape_2_rotate180-out.jpg
new file mode 100644
index 000000000..382b73246
Binary files /dev/null and b/test/fixtures/expected/Landscape_2_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_2_rotate270-out.jpg b/test/fixtures/expected/Landscape_2_rotate270-out.jpg
new file mode 100644
index 000000000..6627376bf
Binary files /dev/null and b/test/fixtures/expected/Landscape_2_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_2_rotate45-out.jpg b/test/fixtures/expected/Landscape_2_rotate45-out.jpg
new file mode 100644
index 000000000..d46a3540a
Binary files /dev/null and b/test/fixtures/expected/Landscape_2_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_2_rotate90-out.jpg b/test/fixtures/expected/Landscape_2_rotate90-out.jpg
new file mode 100644
index 000000000..0600f6ebe
Binary files /dev/null and b/test/fixtures/expected/Landscape_2_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_3_flip-out.jpg b/test/fixtures/expected/Landscape_3_flip-out.jpg
new file mode 100644
index 000000000..2a9d24045
Binary files /dev/null and b/test/fixtures/expected/Landscape_3_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_3_flip_flop-out.jpg b/test/fixtures/expected/Landscape_3_flip_flop-out.jpg
new file mode 100644
index 000000000..b3eb2e870
Binary files /dev/null and b/test/fixtures/expected/Landscape_3_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_3_flop-out.jpg b/test/fixtures/expected/Landscape_3_flop-out.jpg
new file mode 100644
index 000000000..0a105d958
Binary files /dev/null and b/test/fixtures/expected/Landscape_3_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_3_rotate180-out.jpg b/test/fixtures/expected/Landscape_3_rotate180-out.jpg
new file mode 100644
index 000000000..b3eb2e870
Binary files /dev/null and b/test/fixtures/expected/Landscape_3_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_3_rotate270-out.jpg b/test/fixtures/expected/Landscape_3_rotate270-out.jpg
new file mode 100644
index 000000000..98c3bcc4f
Binary files /dev/null and b/test/fixtures/expected/Landscape_3_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_3_rotate45-out.jpg b/test/fixtures/expected/Landscape_3_rotate45-out.jpg
new file mode 100644
index 000000000..6f74fdb38
Binary files /dev/null and b/test/fixtures/expected/Landscape_3_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_3_rotate90-out.jpg b/test/fixtures/expected/Landscape_3_rotate90-out.jpg
new file mode 100644
index 000000000..3026268ba
Binary files /dev/null and b/test/fixtures/expected/Landscape_3_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_4_flip-out.jpg b/test/fixtures/expected/Landscape_4_flip-out.jpg
new file mode 100644
index 000000000..6cb2e8e59
Binary files /dev/null and b/test/fixtures/expected/Landscape_4_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_4_flip_flop-out.jpg b/test/fixtures/expected/Landscape_4_flip_flop-out.jpg
new file mode 100644
index 000000000..440c23746
Binary files /dev/null and b/test/fixtures/expected/Landscape_4_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_4_flop-out.jpg b/test/fixtures/expected/Landscape_4_flop-out.jpg
new file mode 100644
index 000000000..aa9d3df92
Binary files /dev/null and b/test/fixtures/expected/Landscape_4_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_4_rotate180-out.jpg b/test/fixtures/expected/Landscape_4_rotate180-out.jpg
new file mode 100644
index 000000000..440c23746
Binary files /dev/null and b/test/fixtures/expected/Landscape_4_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_4_rotate270-out.jpg b/test/fixtures/expected/Landscape_4_rotate270-out.jpg
new file mode 100644
index 000000000..55d17a1a7
Binary files /dev/null and b/test/fixtures/expected/Landscape_4_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_4_rotate45-out.jpg b/test/fixtures/expected/Landscape_4_rotate45-out.jpg
new file mode 100644
index 000000000..76237462d
Binary files /dev/null and b/test/fixtures/expected/Landscape_4_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_4_rotate90-out.jpg b/test/fixtures/expected/Landscape_4_rotate90-out.jpg
new file mode 100644
index 000000000..c3543e07c
Binary files /dev/null and b/test/fixtures/expected/Landscape_4_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_5_flip-out.jpg b/test/fixtures/expected/Landscape_5_flip-out.jpg
new file mode 100644
index 000000000..7b0474859
Binary files /dev/null and b/test/fixtures/expected/Landscape_5_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_5_flip_flop-out.jpg b/test/fixtures/expected/Landscape_5_flip_flop-out.jpg
new file mode 100644
index 000000000..06a1fb920
Binary files /dev/null and b/test/fixtures/expected/Landscape_5_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_5_flop-out.jpg b/test/fixtures/expected/Landscape_5_flop-out.jpg
new file mode 100644
index 000000000..bd59dff28
Binary files /dev/null and b/test/fixtures/expected/Landscape_5_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_5_rotate180-out.jpg b/test/fixtures/expected/Landscape_5_rotate180-out.jpg
new file mode 100644
index 000000000..06a1fb920
Binary files /dev/null and b/test/fixtures/expected/Landscape_5_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_5_rotate270-out.jpg b/test/fixtures/expected/Landscape_5_rotate270-out.jpg
new file mode 100644
index 000000000..06d772522
Binary files /dev/null and b/test/fixtures/expected/Landscape_5_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_5_rotate45-out.jpg b/test/fixtures/expected/Landscape_5_rotate45-out.jpg
new file mode 100644
index 000000000..0df79d2b8
Binary files /dev/null and b/test/fixtures/expected/Landscape_5_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_5_rotate90-out.jpg b/test/fixtures/expected/Landscape_5_rotate90-out.jpg
new file mode 100644
index 000000000..ad21a7d85
Binary files /dev/null and b/test/fixtures/expected/Landscape_5_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_6_flip-out.jpg b/test/fixtures/expected/Landscape_6_flip-out.jpg
new file mode 100644
index 000000000..c8012cbea
Binary files /dev/null and b/test/fixtures/expected/Landscape_6_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_6_flip_flop-out.jpg b/test/fixtures/expected/Landscape_6_flip_flop-out.jpg
new file mode 100644
index 000000000..ad093bff0
Binary files /dev/null and b/test/fixtures/expected/Landscape_6_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_6_flop-out.jpg b/test/fixtures/expected/Landscape_6_flop-out.jpg
new file mode 100644
index 000000000..86af2d2e6
Binary files /dev/null and b/test/fixtures/expected/Landscape_6_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_6_rotate180-out.jpg b/test/fixtures/expected/Landscape_6_rotate180-out.jpg
new file mode 100644
index 000000000..ad093bff0
Binary files /dev/null and b/test/fixtures/expected/Landscape_6_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_6_rotate270-out.jpg b/test/fixtures/expected/Landscape_6_rotate270-out.jpg
new file mode 100644
index 000000000..4cbbff54f
Binary files /dev/null and b/test/fixtures/expected/Landscape_6_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_6_rotate45-out.jpg b/test/fixtures/expected/Landscape_6_rotate45-out.jpg
new file mode 100644
index 000000000..07a109560
Binary files /dev/null and b/test/fixtures/expected/Landscape_6_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_6_rotate90-out.jpg b/test/fixtures/expected/Landscape_6_rotate90-out.jpg
new file mode 100644
index 000000000..6fe106499
Binary files /dev/null and b/test/fixtures/expected/Landscape_6_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_7_flip-out.jpg b/test/fixtures/expected/Landscape_7_flip-out.jpg
new file mode 100644
index 000000000..ed14166cb
Binary files /dev/null and b/test/fixtures/expected/Landscape_7_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_7_flip_flop-out.jpg b/test/fixtures/expected/Landscape_7_flip_flop-out.jpg
new file mode 100644
index 000000000..522130cc1
Binary files /dev/null and b/test/fixtures/expected/Landscape_7_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_7_flop-out.jpg b/test/fixtures/expected/Landscape_7_flop-out.jpg
new file mode 100644
index 000000000..a792fab3d
Binary files /dev/null and b/test/fixtures/expected/Landscape_7_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_7_rotate180-out.jpg b/test/fixtures/expected/Landscape_7_rotate180-out.jpg
new file mode 100644
index 000000000..522130cc1
Binary files /dev/null and b/test/fixtures/expected/Landscape_7_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_7_rotate270-out.jpg b/test/fixtures/expected/Landscape_7_rotate270-out.jpg
new file mode 100644
index 000000000..b7bbae5b9
Binary files /dev/null and b/test/fixtures/expected/Landscape_7_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_7_rotate45-out.jpg b/test/fixtures/expected/Landscape_7_rotate45-out.jpg
new file mode 100644
index 000000000..76881c29d
Binary files /dev/null and b/test/fixtures/expected/Landscape_7_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_7_rotate90-out.jpg b/test/fixtures/expected/Landscape_7_rotate90-out.jpg
new file mode 100644
index 000000000..b33cd9cb2
Binary files /dev/null and b/test/fixtures/expected/Landscape_7_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_8_flip-out.jpg b/test/fixtures/expected/Landscape_8_flip-out.jpg
new file mode 100644
index 000000000..7ab9fd406
Binary files /dev/null and b/test/fixtures/expected/Landscape_8_flip-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_8_flip_flop-out.jpg b/test/fixtures/expected/Landscape_8_flip_flop-out.jpg
new file mode 100644
index 000000000..e931394c9
Binary files /dev/null and b/test/fixtures/expected/Landscape_8_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_8_flop-out.jpg b/test/fixtures/expected/Landscape_8_flop-out.jpg
new file mode 100644
index 000000000..ea275beb6
Binary files /dev/null and b/test/fixtures/expected/Landscape_8_flop-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_8_rotate180-out.jpg b/test/fixtures/expected/Landscape_8_rotate180-out.jpg
new file mode 100644
index 000000000..e931394c9
Binary files /dev/null and b/test/fixtures/expected/Landscape_8_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_8_rotate270-out.jpg b/test/fixtures/expected/Landscape_8_rotate270-out.jpg
new file mode 100644
index 000000000..09beada7d
Binary files /dev/null and b/test/fixtures/expected/Landscape_8_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_8_rotate45-out.jpg b/test/fixtures/expected/Landscape_8_rotate45-out.jpg
new file mode 100644
index 000000000..cd0157665
Binary files /dev/null and b/test/fixtures/expected/Landscape_8_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Landscape_8_rotate90-out.jpg b/test/fixtures/expected/Landscape_8_rotate90-out.jpg
new file mode 100644
index 000000000..d806c013b
Binary files /dev/null and b/test/fixtures/expected/Landscape_8_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_1_flip-out.jpg b/test/fixtures/expected/Portrait_1_flip-out.jpg
new file mode 100644
index 000000000..13fbbd666
Binary files /dev/null and b/test/fixtures/expected/Portrait_1_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_1_flip_flop-out.jpg b/test/fixtures/expected/Portrait_1_flip_flop-out.jpg
new file mode 100644
index 000000000..30b748595
Binary files /dev/null and b/test/fixtures/expected/Portrait_1_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_1_flop-out.jpg b/test/fixtures/expected/Portrait_1_flop-out.jpg
new file mode 100644
index 000000000..09453d1d0
Binary files /dev/null and b/test/fixtures/expected/Portrait_1_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_1_rotate180-out.jpg b/test/fixtures/expected/Portrait_1_rotate180-out.jpg
new file mode 100644
index 000000000..30b748595
Binary files /dev/null and b/test/fixtures/expected/Portrait_1_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_1_rotate270-out.jpg b/test/fixtures/expected/Portrait_1_rotate270-out.jpg
new file mode 100644
index 000000000..941ae8edd
Binary files /dev/null and b/test/fixtures/expected/Portrait_1_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_1_rotate45-out.jpg b/test/fixtures/expected/Portrait_1_rotate45-out.jpg
new file mode 100644
index 000000000..d93168893
Binary files /dev/null and b/test/fixtures/expected/Portrait_1_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_1_rotate90-out.jpg b/test/fixtures/expected/Portrait_1_rotate90-out.jpg
new file mode 100644
index 000000000..cd91b58e0
Binary files /dev/null and b/test/fixtures/expected/Portrait_1_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_2_flip-out.jpg b/test/fixtures/expected/Portrait_2_flip-out.jpg
new file mode 100644
index 000000000..af48a72c8
Binary files /dev/null and b/test/fixtures/expected/Portrait_2_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_2_flip_flop-out.jpg b/test/fixtures/expected/Portrait_2_flip_flop-out.jpg
new file mode 100644
index 000000000..7e83d64fc
Binary files /dev/null and b/test/fixtures/expected/Portrait_2_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_2_flop-out.jpg b/test/fixtures/expected/Portrait_2_flop-out.jpg
new file mode 100644
index 000000000..f746afbfd
Binary files /dev/null and b/test/fixtures/expected/Portrait_2_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_2_rotate180-out.jpg b/test/fixtures/expected/Portrait_2_rotate180-out.jpg
new file mode 100644
index 000000000..7e83d64fc
Binary files /dev/null and b/test/fixtures/expected/Portrait_2_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_2_rotate270-out.jpg b/test/fixtures/expected/Portrait_2_rotate270-out.jpg
new file mode 100644
index 000000000..08ab320d0
Binary files /dev/null and b/test/fixtures/expected/Portrait_2_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_2_rotate45-out.jpg b/test/fixtures/expected/Portrait_2_rotate45-out.jpg
new file mode 100644
index 000000000..81251c9bb
Binary files /dev/null and b/test/fixtures/expected/Portrait_2_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_2_rotate90-out.jpg b/test/fixtures/expected/Portrait_2_rotate90-out.jpg
new file mode 100644
index 000000000..db4ed9799
Binary files /dev/null and b/test/fixtures/expected/Portrait_2_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_3_flip-out.jpg b/test/fixtures/expected/Portrait_3_flip-out.jpg
new file mode 100644
index 000000000..43aa137af
Binary files /dev/null and b/test/fixtures/expected/Portrait_3_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_3_flip_flop-out.jpg b/test/fixtures/expected/Portrait_3_flip_flop-out.jpg
new file mode 100644
index 000000000..2b7f34d17
Binary files /dev/null and b/test/fixtures/expected/Portrait_3_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_3_flop-out.jpg b/test/fixtures/expected/Portrait_3_flop-out.jpg
new file mode 100644
index 000000000..654dbb5f5
Binary files /dev/null and b/test/fixtures/expected/Portrait_3_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_3_rotate180-out.jpg b/test/fixtures/expected/Portrait_3_rotate180-out.jpg
new file mode 100644
index 000000000..2b7f34d17
Binary files /dev/null and b/test/fixtures/expected/Portrait_3_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_3_rotate270-out.jpg b/test/fixtures/expected/Portrait_3_rotate270-out.jpg
new file mode 100644
index 000000000..d8cc2d087
Binary files /dev/null and b/test/fixtures/expected/Portrait_3_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_3_rotate45-out.jpg b/test/fixtures/expected/Portrait_3_rotate45-out.jpg
new file mode 100644
index 000000000..d282dd93f
Binary files /dev/null and b/test/fixtures/expected/Portrait_3_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_3_rotate90-out.jpg b/test/fixtures/expected/Portrait_3_rotate90-out.jpg
new file mode 100644
index 000000000..3f668810b
Binary files /dev/null and b/test/fixtures/expected/Portrait_3_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_4_flip-out.jpg b/test/fixtures/expected/Portrait_4_flip-out.jpg
new file mode 100644
index 000000000..99d25b360
Binary files /dev/null and b/test/fixtures/expected/Portrait_4_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_4_flip_flop-out.jpg b/test/fixtures/expected/Portrait_4_flip_flop-out.jpg
new file mode 100644
index 000000000..4ba5e0f7f
Binary files /dev/null and b/test/fixtures/expected/Portrait_4_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_4_flop-out.jpg b/test/fixtures/expected/Portrait_4_flop-out.jpg
new file mode 100644
index 000000000..aecd7d145
Binary files /dev/null and b/test/fixtures/expected/Portrait_4_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_4_rotate180-out.jpg b/test/fixtures/expected/Portrait_4_rotate180-out.jpg
new file mode 100644
index 000000000..4ba5e0f7f
Binary files /dev/null and b/test/fixtures/expected/Portrait_4_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_4_rotate270-out.jpg b/test/fixtures/expected/Portrait_4_rotate270-out.jpg
new file mode 100644
index 000000000..5e3833c30
Binary files /dev/null and b/test/fixtures/expected/Portrait_4_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_4_rotate45-out.jpg b/test/fixtures/expected/Portrait_4_rotate45-out.jpg
new file mode 100644
index 000000000..f110605dd
Binary files /dev/null and b/test/fixtures/expected/Portrait_4_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_4_rotate90-out.jpg b/test/fixtures/expected/Portrait_4_rotate90-out.jpg
new file mode 100644
index 000000000..913626d78
Binary files /dev/null and b/test/fixtures/expected/Portrait_4_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_5_flip-out.jpg b/test/fixtures/expected/Portrait_5_flip-out.jpg
new file mode 100644
index 000000000..e15c97f6c
Binary files /dev/null and b/test/fixtures/expected/Portrait_5_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_5_flip_flop-out.jpg b/test/fixtures/expected/Portrait_5_flip_flop-out.jpg
new file mode 100644
index 000000000..82c999c8d
Binary files /dev/null and b/test/fixtures/expected/Portrait_5_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_5_flop-out.jpg b/test/fixtures/expected/Portrait_5_flop-out.jpg
new file mode 100644
index 000000000..fdea1e992
Binary files /dev/null and b/test/fixtures/expected/Portrait_5_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_5_rotate180-out.jpg b/test/fixtures/expected/Portrait_5_rotate180-out.jpg
new file mode 100644
index 000000000..82c999c8d
Binary files /dev/null and b/test/fixtures/expected/Portrait_5_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_5_rotate270-out.jpg b/test/fixtures/expected/Portrait_5_rotate270-out.jpg
new file mode 100644
index 000000000..ff7044004
Binary files /dev/null and b/test/fixtures/expected/Portrait_5_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_5_rotate45-out.jpg b/test/fixtures/expected/Portrait_5_rotate45-out.jpg
new file mode 100644
index 000000000..843fd8f0a
Binary files /dev/null and b/test/fixtures/expected/Portrait_5_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_5_rotate90-out.jpg b/test/fixtures/expected/Portrait_5_rotate90-out.jpg
new file mode 100644
index 000000000..5547e7df4
Binary files /dev/null and b/test/fixtures/expected/Portrait_5_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_6_flip-out.jpg b/test/fixtures/expected/Portrait_6_flip-out.jpg
new file mode 100644
index 000000000..a7bd24c23
Binary files /dev/null and b/test/fixtures/expected/Portrait_6_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_6_flip_flop-out.jpg b/test/fixtures/expected/Portrait_6_flip_flop-out.jpg
new file mode 100644
index 000000000..c35bbb351
Binary files /dev/null and b/test/fixtures/expected/Portrait_6_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_6_flop-out.jpg b/test/fixtures/expected/Portrait_6_flop-out.jpg
new file mode 100644
index 000000000..35e7d14e6
Binary files /dev/null and b/test/fixtures/expected/Portrait_6_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_6_rotate180-out.jpg b/test/fixtures/expected/Portrait_6_rotate180-out.jpg
new file mode 100644
index 000000000..c35bbb351
Binary files /dev/null and b/test/fixtures/expected/Portrait_6_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_6_rotate270-out.jpg b/test/fixtures/expected/Portrait_6_rotate270-out.jpg
new file mode 100644
index 000000000..d7e093d81
Binary files /dev/null and b/test/fixtures/expected/Portrait_6_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_6_rotate45-out.jpg b/test/fixtures/expected/Portrait_6_rotate45-out.jpg
new file mode 100644
index 000000000..713bb3ee0
Binary files /dev/null and b/test/fixtures/expected/Portrait_6_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_6_rotate90-out.jpg b/test/fixtures/expected/Portrait_6_rotate90-out.jpg
new file mode 100644
index 000000000..cb6c108d2
Binary files /dev/null and b/test/fixtures/expected/Portrait_6_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_7_flip-out.jpg b/test/fixtures/expected/Portrait_7_flip-out.jpg
new file mode 100644
index 000000000..3c89dc8b9
Binary files /dev/null and b/test/fixtures/expected/Portrait_7_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_7_flip_flop-out.jpg b/test/fixtures/expected/Portrait_7_flip_flop-out.jpg
new file mode 100644
index 000000000..ee99439d4
Binary files /dev/null and b/test/fixtures/expected/Portrait_7_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_7_flop-out.jpg b/test/fixtures/expected/Portrait_7_flop-out.jpg
new file mode 100644
index 000000000..2dba39f88
Binary files /dev/null and b/test/fixtures/expected/Portrait_7_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_7_rotate180-out.jpg b/test/fixtures/expected/Portrait_7_rotate180-out.jpg
new file mode 100644
index 000000000..ee99439d4
Binary files /dev/null and b/test/fixtures/expected/Portrait_7_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_7_rotate270-out.jpg b/test/fixtures/expected/Portrait_7_rotate270-out.jpg
new file mode 100644
index 000000000..6756e61be
Binary files /dev/null and b/test/fixtures/expected/Portrait_7_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_7_rotate45-out.jpg b/test/fixtures/expected/Portrait_7_rotate45-out.jpg
new file mode 100644
index 000000000..f8d60b808
Binary files /dev/null and b/test/fixtures/expected/Portrait_7_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_7_rotate90-out.jpg b/test/fixtures/expected/Portrait_7_rotate90-out.jpg
new file mode 100644
index 000000000..879d2c705
Binary files /dev/null and b/test/fixtures/expected/Portrait_7_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_8_flip-out.jpg b/test/fixtures/expected/Portrait_8_flip-out.jpg
new file mode 100644
index 000000000..e46300273
Binary files /dev/null and b/test/fixtures/expected/Portrait_8_flip-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_8_flip_flop-out.jpg b/test/fixtures/expected/Portrait_8_flip_flop-out.jpg
new file mode 100644
index 000000000..2f4f2ce27
Binary files /dev/null and b/test/fixtures/expected/Portrait_8_flip_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_8_flop-out.jpg b/test/fixtures/expected/Portrait_8_flop-out.jpg
new file mode 100644
index 000000000..83e5e9d04
Binary files /dev/null and b/test/fixtures/expected/Portrait_8_flop-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_8_rotate180-out.jpg b/test/fixtures/expected/Portrait_8_rotate180-out.jpg
new file mode 100644
index 000000000..2f4f2ce27
Binary files /dev/null and b/test/fixtures/expected/Portrait_8_rotate180-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_8_rotate270-out.jpg b/test/fixtures/expected/Portrait_8_rotate270-out.jpg
new file mode 100644
index 000000000..b0bb7296a
Binary files /dev/null and b/test/fixtures/expected/Portrait_8_rotate270-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_8_rotate45-out.jpg b/test/fixtures/expected/Portrait_8_rotate45-out.jpg
new file mode 100644
index 000000000..482c70ae2
Binary files /dev/null and b/test/fixtures/expected/Portrait_8_rotate45-out.jpg differ
diff --git a/test/fixtures/expected/Portrait_8_rotate90-out.jpg b/test/fixtures/expected/Portrait_8_rotate90-out.jpg
new file mode 100644
index 000000000..695ff0274
Binary files /dev/null and b/test/fixtures/expected/Portrait_8_rotate90-out.jpg differ
diff --git a/test/fixtures/expected/affine-background-all-offsets-expected.jpg b/test/fixtures/expected/affine-background-all-offsets-expected.jpg
new file mode 100644
index 000000000..6a0efb2ad
Binary files /dev/null and b/test/fixtures/expected/affine-background-all-offsets-expected.jpg differ
diff --git a/test/fixtures/expected/affine-background-expected.jpg b/test/fixtures/expected/affine-background-expected.jpg
new file mode 100644
index 000000000..cbce2cb18
Binary files /dev/null and b/test/fixtures/expected/affine-background-expected.jpg differ
diff --git a/test/fixtures/expected/affine-background-output-offsets-expected.jpg b/test/fixtures/expected/affine-background-output-offsets-expected.jpg
new file mode 100644
index 000000000..89789c9a3
Binary files /dev/null and b/test/fixtures/expected/affine-background-output-offsets-expected.jpg differ
diff --git a/test/fixtures/expected/affine-bicubic-2x-upscale-expected.jpg b/test/fixtures/expected/affine-bicubic-2x-upscale-expected.jpg
new file mode 100644
index 000000000..819e4dd9a
Binary files /dev/null and b/test/fixtures/expected/affine-bicubic-2x-upscale-expected.jpg differ
diff --git a/test/fixtures/expected/affine-bilinear-2x-upscale-expected.jpg b/test/fixtures/expected/affine-bilinear-2x-upscale-expected.jpg
new file mode 100644
index 000000000..b02666670
Binary files /dev/null and b/test/fixtures/expected/affine-bilinear-2x-upscale-expected.jpg differ
diff --git a/test/fixtures/expected/affine-extract-expected.jpg b/test/fixtures/expected/affine-extract-expected.jpg
new file mode 100644
index 000000000..b152fbc12
Binary files /dev/null and b/test/fixtures/expected/affine-extract-expected.jpg differ
diff --git a/test/fixtures/expected/affine-extract-rotate-expected.jpg b/test/fixtures/expected/affine-extract-rotate-expected.jpg
new file mode 100644
index 000000000..ced3189aa
Binary files /dev/null and b/test/fixtures/expected/affine-extract-rotate-expected.jpg differ
diff --git a/test/fixtures/expected/affine-lbb-2x-upscale-expected.jpg b/test/fixtures/expected/affine-lbb-2x-upscale-expected.jpg
new file mode 100644
index 000000000..f6caf6579
Binary files /dev/null and b/test/fixtures/expected/affine-lbb-2x-upscale-expected.jpg differ
diff --git a/test/fixtures/expected/affine-nearest-2x-upscale-expected.jpg b/test/fixtures/expected/affine-nearest-2x-upscale-expected.jpg
new file mode 100644
index 000000000..e1007693e
Binary files /dev/null and b/test/fixtures/expected/affine-nearest-2x-upscale-expected.jpg differ
diff --git a/test/fixtures/expected/affine-nohalo-2x-upscale-expected.jpg b/test/fixtures/expected/affine-nohalo-2x-upscale-expected.jpg
new file mode 100644
index 000000000..b4064042f
Binary files /dev/null and b/test/fixtures/expected/affine-nohalo-2x-upscale-expected.jpg differ
diff --git a/test/fixtures/expected/affine-resize-expected.jpg b/test/fixtures/expected/affine-resize-expected.jpg
new file mode 100644
index 000000000..b7d062657
Binary files /dev/null and b/test/fixtures/expected/affine-resize-expected.jpg differ
diff --git a/test/fixtures/expected/affine-rotate-expected.jpg b/test/fixtures/expected/affine-rotate-expected.jpg
new file mode 100644
index 000000000..8b54a3c34
Binary files /dev/null and b/test/fixtures/expected/affine-rotate-expected.jpg differ
diff --git a/test/fixtures/expected/affine-vsqbs-2x-upscale-expected.jpg b/test/fixtures/expected/affine-vsqbs-2x-upscale-expected.jpg
new file mode 100644
index 000000000..4c76554b9
Binary files /dev/null and b/test/fixtures/expected/affine-vsqbs-2x-upscale-expected.jpg differ
diff --git a/test/fixtures/expected/alpha-layer-01-imagemagick.png b/test/fixtures/expected/alpha-layer-01-imagemagick.png
deleted file mode 100644
index 45cdfc56c..000000000
Binary files a/test/fixtures/expected/alpha-layer-01-imagemagick.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-01-low-alpha-imagemagick.png b/test/fixtures/expected/alpha-layer-01-low-alpha-imagemagick.png
deleted file mode 100644
index e8ff90590..000000000
Binary files a/test/fixtures/expected/alpha-layer-01-low-alpha-imagemagick.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-01-low-alpha.png b/test/fixtures/expected/alpha-layer-01-low-alpha.png
deleted file mode 100644
index c5cc94162..000000000
Binary files a/test/fixtures/expected/alpha-layer-01-low-alpha.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-01.png b/test/fixtures/expected/alpha-layer-01.png
deleted file mode 100644
index a57349d54..000000000
Binary files a/test/fixtures/expected/alpha-layer-01.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-012-imagemagick.png b/test/fixtures/expected/alpha-layer-012-imagemagick.png
deleted file mode 100644
index dcf1390c7..000000000
Binary files a/test/fixtures/expected/alpha-layer-012-imagemagick.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-012-low-alpha-imagemagick.png b/test/fixtures/expected/alpha-layer-012-low-alpha-imagemagick.png
deleted file mode 100644
index a276ae9dc..000000000
Binary files a/test/fixtures/expected/alpha-layer-012-low-alpha-imagemagick.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-012-low-alpha.png b/test/fixtures/expected/alpha-layer-012-low-alpha.png
deleted file mode 100644
index 1eb13cf9a..000000000
Binary files a/test/fixtures/expected/alpha-layer-012-low-alpha.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-012.png b/test/fixtures/expected/alpha-layer-012.png
deleted file mode 100644
index a88ad6c66..000000000
Binary files a/test/fixtures/expected/alpha-layer-012.png and /dev/null differ
diff --git a/test/fixtures/expected/alpha-layer-1-fill-linear.png b/test/fixtures/expected/alpha-layer-1-fill-linear.png
new file mode 100644
index 000000000..abd22a6e6
Binary files /dev/null and b/test/fixtures/expected/alpha-layer-1-fill-linear.png differ
diff --git a/test/fixtures/expected/alpha-layer-1-fill-offset.png b/test/fixtures/expected/alpha-layer-1-fill-offset.png
new file mode 100644
index 000000000..9c8a398bc
Binary files /dev/null and b/test/fixtures/expected/alpha-layer-1-fill-offset.png differ
diff --git a/test/fixtures/expected/alpha-layer-1-fill-slope.png b/test/fixtures/expected/alpha-layer-1-fill-slope.png
new file mode 100644
index 000000000..c9a277dcb
Binary files /dev/null and b/test/fixtures/expected/alpha-layer-1-fill-slope.png differ
diff --git a/test/fixtures/expected/alpha-layer-12-imagemagick.png
b/test/fixtures/expected/alpha-layer-12-imagemagick.png deleted file mode 100644 index 739e1b7bb..000000000 Binary files a/test/fixtures/expected/alpha-layer-12-imagemagick.png and /dev/null differ diff --git a/test/fixtures/expected/alpha-layer-12-low-alpha-imagemagick.png b/test/fixtures/expected/alpha-layer-12-low-alpha-imagemagick.png deleted file mode 100644 index 9135af46d..000000000 Binary files a/test/fixtures/expected/alpha-layer-12-low-alpha-imagemagick.png and /dev/null differ diff --git a/test/fixtures/expected/alpha-layer-12-low-alpha.png b/test/fixtures/expected/alpha-layer-12-low-alpha.png deleted file mode 100644 index 94001e613..000000000 Binary files a/test/fixtures/expected/alpha-layer-12-low-alpha.png and /dev/null differ diff --git a/test/fixtures/expected/alpha-layer-12.png b/test/fixtures/expected/alpha-layer-12.png deleted file mode 100644 index 7573953aa..000000000 Binary files a/test/fixtures/expected/alpha-layer-12.png and /dev/null differ diff --git a/test/fixtures/expected/alpha-layer-2-trim-resize.jpg b/test/fixtures/expected/alpha-layer-2-trim-resize.jpg new file mode 100644 index 000000000..aaa548a74 Binary files /dev/null and b/test/fixtures/expected/alpha-layer-2-trim-resize.jpg differ diff --git a/test/fixtures/expected/alpha-recomb-sepia.png b/test/fixtures/expected/alpha-recomb-sepia.png new file mode 100644 index 000000000..6e3daf9c1 Binary files /dev/null and b/test/fixtures/expected/alpha-recomb-sepia.png differ diff --git a/test/fixtures/expected/blur-0.3.jpg b/test/fixtures/expected/blur-0.3.jpg index e982f561b..db992900f 100644 Binary files a/test/fixtures/expected/blur-0.3.jpg and b/test/fixtures/expected/blur-0.3.jpg differ diff --git a/test/fixtures/expected/circle.png b/test/fixtures/expected/circle.png new file mode 100644 index 000000000..9b4575ac2 Binary files /dev/null and b/test/fixtures/expected/circle.png differ diff --git a/test/fixtures/expected/clahe-100-100-0.jpg b/test/fixtures/expected/clahe-100-100-0.jpg 
new file mode 100644 index 000000000..119bdd9a1 Binary files /dev/null and b/test/fixtures/expected/clahe-100-100-0.jpg differ diff --git a/test/fixtures/expected/clahe-100-50-3.jpg b/test/fixtures/expected/clahe-100-50-3.jpg new file mode 100644 index 000000000..b4c248169 Binary files /dev/null and b/test/fixtures/expected/clahe-100-50-3.jpg differ diff --git a/test/fixtures/expected/clahe-11-25-14.jpg b/test/fixtures/expected/clahe-11-25-14.jpg new file mode 100644 index 000000000..067b79d8b Binary files /dev/null and b/test/fixtures/expected/clahe-11-25-14.jpg differ diff --git a/test/fixtures/expected/clahe-5-5-0.jpg b/test/fixtures/expected/clahe-5-5-0.jpg new file mode 100644 index 000000000..7a70f8bb6 Binary files /dev/null and b/test/fixtures/expected/clahe-5-5-0.jpg differ diff --git a/test/fixtures/expected/clahe-5-5-5.jpg b/test/fixtures/expected/clahe-5-5-5.jpg new file mode 100644 index 000000000..5cea17a15 Binary files /dev/null and b/test/fixtures/expected/clahe-5-5-5.jpg differ diff --git a/test/fixtures/expected/clahe-50-50-0.jpg b/test/fixtures/expected/clahe-50-50-0.jpg new file mode 100644 index 000000000..f56f4b88e Binary files /dev/null and b/test/fixtures/expected/clahe-50-50-0.jpg differ diff --git a/test/fixtures/expected/clahe-50-50-14.jpg b/test/fixtures/expected/clahe-50-50-14.jpg new file mode 100644 index 000000000..b9d4244b7 Binary files /dev/null and b/test/fixtures/expected/clahe-50-50-14.jpg differ diff --git a/test/fixtures/expected/colourspace-gradients-gamma-resize.png b/test/fixtures/expected/colourspace-gradients-gamma-resize.png new file mode 100644 index 000000000..5fc9ca7bf Binary files /dev/null and b/test/fixtures/expected/colourspace-gradients-gamma-resize.png differ diff --git a/test/fixtures/expected/colourspace.cmyk-to-cmyk-negated.tif b/test/fixtures/expected/colourspace.cmyk-to-cmyk-negated.tif new file mode 100644 index 000000000..e1f1b60e9 Binary files /dev/null and 
b/test/fixtures/expected/colourspace.cmyk-to-cmyk-negated.tif differ diff --git a/test/fixtures/expected/composite-autoOrient.jpg b/test/fixtures/expected/composite-autoOrient.jpg new file mode 100644 index 000000000..75cf137f1 Binary files /dev/null and b/test/fixtures/expected/composite-autoOrient.jpg differ diff --git a/test/fixtures/expected/composite-cutout.png b/test/fixtures/expected/composite-cutout.png new file mode 100644 index 000000000..b98f2c35c Binary files /dev/null and b/test/fixtures/expected/composite-cutout.png differ diff --git a/test/fixtures/expected/composite-multiple.png b/test/fixtures/expected/composite-multiple.png new file mode 100644 index 000000000..18671a39b Binary files /dev/null and b/test/fixtures/expected/composite-multiple.png differ diff --git a/test/fixtures/expected/composite-red-scrgb.png b/test/fixtures/expected/composite-red-scrgb.png new file mode 100644 index 000000000..fd2bdf940 Binary files /dev/null and b/test/fixtures/expected/composite-red-scrgb.png differ diff --git a/test/fixtures/expected/composite.blend.dest-over.png b/test/fixtures/expected/composite.blend.dest-over.png new file mode 100644 index 000000000..f7f7b1eb9 Binary files /dev/null and b/test/fixtures/expected/composite.blend.dest-over.png differ diff --git a/test/fixtures/expected/composite.blend.over.png b/test/fixtures/expected/composite.blend.over.png new file mode 100644 index 000000000..8c58c402c Binary files /dev/null and b/test/fixtures/expected/composite.blend.over.png differ diff --git a/test/fixtures/expected/composite.blend.saturate.png b/test/fixtures/expected/composite.blend.saturate.png new file mode 100644 index 000000000..6c77b7fd2 Binary files /dev/null and b/test/fixtures/expected/composite.blend.saturate.png differ diff --git a/test/fixtures/expected/composite.blend.xor.png b/test/fixtures/expected/composite.blend.xor.png new file mode 100644 index 000000000..916eaf776 Binary files /dev/null and 
b/test/fixtures/expected/composite.blend.xor.png differ diff --git a/test/fixtures/expected/conv-sobel-horizontal.jpg b/test/fixtures/expected/conv-sobel-horizontal.jpg index f886a6073..4a9e2423d 100644 Binary files a/test/fixtures/expected/conv-sobel-horizontal.jpg and b/test/fixtures/expected/conv-sobel-horizontal.jpg differ diff --git a/test/fixtures/expected/crop-strategy-attention.jpg b/test/fixtures/expected/crop-strategy-attention.jpg index 16ddaa166..f1ef51e63 100644 Binary files a/test/fixtures/expected/crop-strategy-attention.jpg and b/test/fixtures/expected/crop-strategy-attention.jpg differ diff --git a/test/fixtures/expected/crop-strategy.webp b/test/fixtures/expected/crop-strategy.webp new file mode 100644 index 000000000..857a471af Binary files /dev/null and b/test/fixtures/expected/crop-strategy.webp differ diff --git a/test/fixtures/expected/d-opacity-30.png b/test/fixtures/expected/d-opacity-30.png new file mode 100644 index 000000000..d053cfa2b Binary files /dev/null and b/test/fixtures/expected/d-opacity-30.png differ diff --git a/test/fixtures/expected/dilate-1.png b/test/fixtures/expected/dilate-1.png new file mode 100644 index 000000000..947eb4b0c Binary files /dev/null and b/test/fixtures/expected/dilate-1.png differ diff --git a/test/fixtures/expected/embed-16bit-rgba.png b/test/fixtures/expected/embed-16bit-rgba.png index 29b8f6350..e775088b6 100644 Binary files a/test/fixtures/expected/embed-16bit-rgba.png and b/test/fixtures/expected/embed-16bit-rgba.png differ diff --git a/test/fixtures/expected/embed-2channel.png b/test/fixtures/expected/embed-2channel.png index dbf3bda02..a05eec65a 100644 Binary files a/test/fixtures/expected/embed-2channel.png and b/test/fixtures/expected/embed-2channel.png differ diff --git a/test/fixtures/expected/embed-4-into-4.png b/test/fixtures/expected/embed-4-into-4.png index 79cf1802f..071909cba 100644 Binary files a/test/fixtures/expected/embed-4-into-4.png and b/test/fixtures/expected/embed-4-into-4.png 
differ diff --git a/test/fixtures/expected/embed-animated-height.webp b/test/fixtures/expected/embed-animated-height.webp new file mode 100644 index 000000000..a546307d9 Binary files /dev/null and b/test/fixtures/expected/embed-animated-height.webp differ diff --git a/test/fixtures/expected/embed-animated-width.webp b/test/fixtures/expected/embed-animated-width.webp new file mode 100644 index 000000000..e11efa686 Binary files /dev/null and b/test/fixtures/expected/embed-animated-width.webp differ diff --git a/test/fixtures/expected/embed-enlarge.png b/test/fixtures/expected/embed-enlarge.png index 26e35615f..5bc155f1c 100644 Binary files a/test/fixtures/expected/embed-enlarge.png and b/test/fixtures/expected/embed-enlarge.png differ diff --git a/test/fixtures/expected/embed-lab-into-rgba.png b/test/fixtures/expected/embed-lab-into-rgba.png index 23c5d9ae6..6e48a7213 100644 Binary files a/test/fixtures/expected/embed-lab-into-rgba.png and b/test/fixtures/expected/embed-lab-into-rgba.png differ diff --git a/test/fixtures/expected/embedgravitybird/1-nw.png b/test/fixtures/expected/embedgravitybird/1-nw.png new file mode 100644 index 000000000..7d107748a Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/1-nw.png differ diff --git a/test/fixtures/expected/embedgravitybird/2-n.png b/test/fixtures/expected/embedgravitybird/2-n.png new file mode 100644 index 000000000..7d107748a Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/2-n.png differ diff --git a/test/fixtures/expected/embedgravitybird/3-ne.png b/test/fixtures/expected/embedgravitybird/3-ne.png new file mode 100644 index 000000000..7d107748a Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/3-ne.png differ diff --git a/test/fixtures/expected/embedgravitybird/4-e.png b/test/fixtures/expected/embedgravitybird/4-e.png new file mode 100644 index 000000000..4710a3f5c Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/4-e.png differ diff --git 
a/test/fixtures/expected/embedgravitybird/5-se.png b/test/fixtures/expected/embedgravitybird/5-se.png new file mode 100644 index 000000000..1c5df9f4a Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/5-se.png differ diff --git a/test/fixtures/expected/embedgravitybird/6-s.png b/test/fixtures/expected/embedgravitybird/6-s.png new file mode 100644 index 000000000..1c5df9f4a Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/6-s.png differ diff --git a/test/fixtures/expected/embedgravitybird/7-sw.png b/test/fixtures/expected/embedgravitybird/7-sw.png new file mode 100644 index 000000000..1c5df9f4a Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/7-sw.png differ diff --git a/test/fixtures/expected/embedgravitybird/8-w.png b/test/fixtures/expected/embedgravitybird/8-w.png new file mode 100644 index 000000000..4710a3f5c Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/8-w.png differ diff --git a/test/fixtures/expected/embedgravitybird/9-c.png b/test/fixtures/expected/embedgravitybird/9-c.png new file mode 100644 index 000000000..4710a3f5c Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/9-c.png differ diff --git a/test/fixtures/expected/embedgravitybird/a1-nw.png b/test/fixtures/expected/embedgravitybird/a1-nw.png new file mode 100644 index 000000000..f79957ec1 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a1-nw.png differ diff --git a/test/fixtures/expected/embedgravitybird/a2-n.png b/test/fixtures/expected/embedgravitybird/a2-n.png new file mode 100644 index 000000000..6669bd626 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a2-n.png differ diff --git a/test/fixtures/expected/embedgravitybird/a3-ne.png b/test/fixtures/expected/embedgravitybird/a3-ne.png new file mode 100644 index 000000000..df2fe69c4 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a3-ne.png differ diff --git 
a/test/fixtures/expected/embedgravitybird/a4-e.png b/test/fixtures/expected/embedgravitybird/a4-e.png new file mode 100644 index 000000000..df2fe69c4 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a4-e.png differ diff --git a/test/fixtures/expected/embedgravitybird/a5-se.png b/test/fixtures/expected/embedgravitybird/a5-se.png new file mode 100644 index 000000000..df2fe69c4 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a5-se.png differ diff --git a/test/fixtures/expected/embedgravitybird/a6-s.png b/test/fixtures/expected/embedgravitybird/a6-s.png new file mode 100644 index 000000000..6669bd626 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a6-s.png differ diff --git a/test/fixtures/expected/embedgravitybird/a7-sw.png b/test/fixtures/expected/embedgravitybird/a7-sw.png new file mode 100644 index 000000000..f79957ec1 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a7-sw.png differ diff --git a/test/fixtures/expected/embedgravitybird/a8-w.png b/test/fixtures/expected/embedgravitybird/a8-w.png new file mode 100644 index 000000000..f79957ec1 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a8-w.png differ diff --git a/test/fixtures/expected/embedgravitybird/a9-c.png b/test/fixtures/expected/embedgravitybird/a9-c.png new file mode 100644 index 000000000..6669bd626 Binary files /dev/null and b/test/fixtures/expected/embedgravitybird/a9-c.png differ diff --git a/test/fixtures/expected/erode-1.png b/test/fixtures/expected/erode-1.png new file mode 100644 index 000000000..54ff8035b Binary files /dev/null and b/test/fixtures/expected/erode-1.png differ diff --git a/test/fixtures/expected/expected.absent.composite.premultiplied.png b/test/fixtures/expected/expected.absent.composite.premultiplied.png new file mode 100644 index 000000000..09d90aba1 Binary files /dev/null and b/test/fixtures/expected/expected.absent.composite.premultiplied.png differ diff --git 
a/test/fixtures/expected/expected.false.composite.premultiplied.png b/test/fixtures/expected/expected.false.composite.premultiplied.png new file mode 100644 index 000000000..09d90aba1 Binary files /dev/null and b/test/fixtures/expected/expected.false.composite.premultiplied.png differ diff --git a/test/fixtures/expected/expected.true.composite.premultiplied.png b/test/fixtures/expected/expected.true.composite.premultiplied.png new file mode 100644 index 000000000..85f59a511 Binary files /dev/null and b/test/fixtures/expected/expected.true.composite.premultiplied.png differ diff --git a/test/fixtures/expected/extend-2channel-background.png b/test/fixtures/expected/extend-2channel-background.png new file mode 100644 index 000000000..e29a5ebac Binary files /dev/null and b/test/fixtures/expected/extend-2channel-background.png differ diff --git a/test/fixtures/expected/extend-2channel-copy.png b/test/fixtures/expected/extend-2channel-copy.png new file mode 100644 index 000000000..e29a5ebac Binary files /dev/null and b/test/fixtures/expected/extend-2channel-copy.png differ diff --git a/test/fixtures/expected/extend-2channel-mirror.png b/test/fixtures/expected/extend-2channel-mirror.png new file mode 100644 index 000000000..9a0ea8f98 Binary files /dev/null and b/test/fixtures/expected/extend-2channel-mirror.png differ diff --git a/test/fixtures/expected/extend-2channel-repeat.png b/test/fixtures/expected/extend-2channel-repeat.png new file mode 100644 index 000000000..0c4b2b5f0 Binary files /dev/null and b/test/fixtures/expected/extend-2channel-repeat.png differ diff --git a/test/fixtures/expected/extend-2channel.png b/test/fixtures/expected/extend-2channel.png deleted file mode 100644 index bd57b5a93..000000000 Binary files a/test/fixtures/expected/extend-2channel.png and /dev/null differ diff --git a/test/fixtures/expected/extend-equal.jpg b/test/fixtures/expected/extend-equal-background.jpg similarity index 100% rename from test/fixtures/expected/extend-equal.jpg 
rename to test/fixtures/expected/extend-equal-background.jpg diff --git a/test/fixtures/expected/extend-equal-background.webp b/test/fixtures/expected/extend-equal-background.webp new file mode 100644 index 000000000..c7898aefa Binary files /dev/null and b/test/fixtures/expected/extend-equal-background.webp differ diff --git a/test/fixtures/expected/extend-equal-copy.jpg b/test/fixtures/expected/extend-equal-copy.jpg new file mode 100644 index 000000000..160395eb8 Binary files /dev/null and b/test/fixtures/expected/extend-equal-copy.jpg differ diff --git a/test/fixtures/expected/extend-equal-copy.webp b/test/fixtures/expected/extend-equal-copy.webp new file mode 100644 index 000000000..7123beee9 Binary files /dev/null and b/test/fixtures/expected/extend-equal-copy.webp differ diff --git a/test/fixtures/expected/extend-equal-mirror.jpg b/test/fixtures/expected/extend-equal-mirror.jpg new file mode 100644 index 000000000..9ff2a3cbf Binary files /dev/null and b/test/fixtures/expected/extend-equal-mirror.jpg differ diff --git a/test/fixtures/expected/extend-equal-mirror.webp b/test/fixtures/expected/extend-equal-mirror.webp new file mode 100644 index 000000000..9b3b35187 Binary files /dev/null and b/test/fixtures/expected/extend-equal-mirror.webp differ diff --git a/test/fixtures/expected/extend-equal-repeat.jpg b/test/fixtures/expected/extend-equal-repeat.jpg new file mode 100644 index 000000000..860787da0 Binary files /dev/null and b/test/fixtures/expected/extend-equal-repeat.jpg differ diff --git a/test/fixtures/expected/extend-equal-repeat.webp b/test/fixtures/expected/extend-equal-repeat.webp new file mode 100644 index 000000000..17320d84c Binary files /dev/null and b/test/fixtures/expected/extend-equal-repeat.webp differ diff --git a/test/fixtures/expected/extend-equal-single.jpg b/test/fixtures/expected/extend-equal-single.jpg new file mode 100644 index 000000000..f7da5177f Binary files /dev/null and b/test/fixtures/expected/extend-equal-single.jpg differ diff 
--git a/test/fixtures/expected/extend-equal-single.webp b/test/fixtures/expected/extend-equal-single.webp new file mode 100644 index 000000000..95fab3970 Binary files /dev/null and b/test/fixtures/expected/extend-equal-single.webp differ diff --git a/test/fixtures/expected/extend-unequal-background.png b/test/fixtures/expected/extend-unequal-background.png new file mode 100644 index 000000000..7d29c94e0 Binary files /dev/null and b/test/fixtures/expected/extend-unequal-background.png differ diff --git a/test/fixtures/expected/extend-unequal-copy.png b/test/fixtures/expected/extend-unequal-copy.png new file mode 100644 index 000000000..ab324ef47 Binary files /dev/null and b/test/fixtures/expected/extend-unequal-copy.png differ diff --git a/test/fixtures/expected/extend-unequal-mirror.png b/test/fixtures/expected/extend-unequal-mirror.png new file mode 100644 index 000000000..99e35bf95 Binary files /dev/null and b/test/fixtures/expected/extend-unequal-mirror.png differ diff --git a/test/fixtures/expected/extend-unequal-repeat.png b/test/fixtures/expected/extend-unequal-repeat.png new file mode 100644 index 000000000..6f0afd4b6 Binary files /dev/null and b/test/fixtures/expected/extend-unequal-repeat.png differ diff --git a/test/fixtures/expected/extend-unequal.png b/test/fixtures/expected/extend-unequal.png deleted file mode 100644 index 1454f0164..000000000 Binary files a/test/fixtures/expected/extend-unequal.png and /dev/null differ diff --git a/test/fixtures/expected/extract-alpha-16bit.png b/test/fixtures/expected/extract-alpha-16bit.png new file mode 100644 index 000000000..91537410f Binary files /dev/null and b/test/fixtures/expected/extract-alpha-16bit.png differ diff --git a/test/fixtures/expected/extract-alpha-2-channel.png b/test/fixtures/expected/extract-alpha-2-channel.png new file mode 100644 index 000000000..bfcd30b05 Binary files /dev/null and b/test/fixtures/expected/extract-alpha-2-channel.png differ diff --git 
a/test/fixtures/expected/extract-resize.jpg b/test/fixtures/expected/extract-resize.jpg index d4e6d5cde..5ef6907d6 100644 Binary files a/test/fixtures/expected/extract-resize.jpg and b/test/fixtures/expected/extract-resize.jpg differ diff --git a/test/fixtures/expected/extract-rotate-45.jpg b/test/fixtures/expected/extract-rotate-45.jpg new file mode 100644 index 000000000..38efece21 Binary files /dev/null and b/test/fixtures/expected/extract-rotate-45.jpg differ diff --git a/test/fixtures/expected/extract-rotate-extract.jpg b/test/fixtures/expected/extract-rotate-extract.jpg new file mode 100644 index 000000000..a9f09c641 Binary files /dev/null and b/test/fixtures/expected/extract-rotate-extract.jpg differ diff --git a/test/fixtures/expected/extract.jpg b/test/fixtures/expected/extract.jpg index ee1d37fb2..53a4ef9b8 100644 Binary files a/test/fixtures/expected/extract.jpg and b/test/fixtures/expected/extract.jpg differ diff --git a/test/fixtures/expected/extract.tiff b/test/fixtures/expected/extract.tiff index 74c2cac3f..e02db48b2 100644 Binary files a/test/fixtures/expected/extract.tiff and b/test/fixtures/expected/extract.tiff differ diff --git a/test/fixtures/expected/extract.webp b/test/fixtures/expected/extract.webp index 9165ab5a0..e090ce73f 100644 Binary files a/test/fixtures/expected/extract.webp and b/test/fixtures/expected/extract.webp differ diff --git a/test/fixtures/expected/fast-shrink-on-load.png b/test/fixtures/expected/fast-shrink-on-load.png new file mode 100644 index 000000000..feeb8e8b8 Binary files /dev/null and b/test/fixtures/expected/fast-shrink-on-load.png differ diff --git a/test/fixtures/expected/flatten-orange.jpg b/test/fixtures/expected/flatten-orange.jpg index 18f49bb87..5f00a736d 100644 Binary files a/test/fixtures/expected/flatten-orange.jpg and b/test/fixtures/expected/flatten-orange.jpg differ diff --git a/test/fixtures/expected/flatten-rgb16-orange.jpg b/test/fixtures/expected/flatten-rgb16-orange.jpg index f4567011d..6371a444b 
100644 Binary files a/test/fixtures/expected/flatten-rgb16-orange.jpg and b/test/fixtures/expected/flatten-rgb16-orange.jpg differ diff --git a/test/fixtures/expected/flip-and-flop.jpg b/test/fixtures/expected/flip-and-flop.jpg index ce58eadfb..44f70f15a 100644 Binary files a/test/fixtures/expected/flip-and-flop.jpg and b/test/fixtures/expected/flip-and-flop.jpg differ diff --git a/test/fixtures/expected/gamma-0.0.jpg b/test/fixtures/expected/gamma-0.0.jpg index 4ae2fe070..f3fe1e7e6 100644 Binary files a/test/fixtures/expected/gamma-0.0.jpg and b/test/fixtures/expected/gamma-0.0.jpg differ diff --git a/test/fixtures/expected/gamma-in-2.2-out-3.0.jpg b/test/fixtures/expected/gamma-in-2.2-out-3.0.jpg new file mode 100644 index 000000000..50ddb9d16 Binary files /dev/null and b/test/fixtures/expected/gamma-in-2.2-out-3.0.jpg differ diff --git a/test/fixtures/expected/gravity-center-height.webp b/test/fixtures/expected/gravity-center-height.webp new file mode 100644 index 000000000..70fbd1de0 Binary files /dev/null and b/test/fixtures/expected/gravity-center-height.webp differ diff --git a/test/fixtures/expected/gravity-center-width.webp b/test/fixtures/expected/gravity-center-width.webp new file mode 100644 index 000000000..964ebe61f Binary files /dev/null and b/test/fixtures/expected/gravity-center-width.webp differ diff --git a/test/fixtures/expected/hilutite.jpg b/test/fixtures/expected/hilutite.jpg new file mode 100644 index 000000000..d11facb2f Binary files /dev/null and b/test/fixtures/expected/hilutite.jpg differ diff --git a/lib/icc/cmyk.icm b/test/fixtures/expected/icc-cmyk.jpg similarity index 98% rename from lib/icc/cmyk.icm rename to test/fixtures/expected/icc-cmyk.jpg index 7f070779e..4b7cf7d4f 100644 Binary files a/lib/icc/cmyk.icm and b/test/fixtures/expected/icc-cmyk.jpg differ diff --git a/test/fixtures/expected/join2x2.png b/test/fixtures/expected/join2x2.png new file mode 100644 index 000000000..533df1284 Binary files /dev/null and 
b/test/fixtures/expected/join2x2.png differ diff --git a/test/fixtures/expected/linear-16bit.png b/test/fixtures/expected/linear-16bit.png new file mode 100644 index 000000000..3e7bfc525 Binary files /dev/null and b/test/fixtures/expected/linear-16bit.png differ diff --git a/test/fixtures/expected/linear-per-channel.jpg b/test/fixtures/expected/linear-per-channel.jpg new file mode 100644 index 000000000..13cef2b8a Binary files /dev/null and b/test/fixtures/expected/linear-per-channel.jpg differ diff --git a/test/fixtures/expected/low-contrast-linear.jpg b/test/fixtures/expected/low-contrast-linear.jpg new file mode 100644 index 000000000..78f043cc9 Binary files /dev/null and b/test/fixtures/expected/low-contrast-linear.jpg differ diff --git a/test/fixtures/expected/low-contrast-offset.jpg b/test/fixtures/expected/low-contrast-offset.jpg new file mode 100644 index 000000000..c403fc329 Binary files /dev/null and b/test/fixtures/expected/low-contrast-offset.jpg differ diff --git a/test/fixtures/expected/low-contrast-slope.jpg b/test/fixtures/expected/low-contrast-slope.jpg new file mode 100644 index 000000000..0a3b4967d Binary files /dev/null and b/test/fixtures/expected/low-contrast-slope.jpg differ diff --git a/test/fixtures/expected/modulate-hue-angle-120.png b/test/fixtures/expected/modulate-hue-angle-120.png new file mode 100644 index 000000000..0a6d0e9d2 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-120.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-150.png b/test/fixtures/expected/modulate-hue-angle-150.png new file mode 100644 index 000000000..e776fc4a0 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-150.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-180.png b/test/fixtures/expected/modulate-hue-angle-180.png new file mode 100644 index 000000000..6b3f2e708 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-180.png differ diff --git 
a/test/fixtures/expected/modulate-hue-angle-210.png b/test/fixtures/expected/modulate-hue-angle-210.png new file mode 100644 index 000000000..ca489f0ca Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-210.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-240.png b/test/fixtures/expected/modulate-hue-angle-240.png new file mode 100644 index 000000000..8bedbbfab Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-240.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-270.png b/test/fixtures/expected/modulate-hue-angle-270.png new file mode 100644 index 000000000..f2bf4f9e9 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-270.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-30.png b/test/fixtures/expected/modulate-hue-angle-30.png new file mode 100644 index 000000000..b7dfce148 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-30.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-300.png b/test/fixtures/expected/modulate-hue-angle-300.png new file mode 100644 index 000000000..69238c3b0 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-300.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-330.png b/test/fixtures/expected/modulate-hue-angle-330.png new file mode 100644 index 000000000..c42501309 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-330.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-360.png b/test/fixtures/expected/modulate-hue-angle-360.png new file mode 100644 index 000000000..dd7474135 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-360.png differ diff --git a/test/fixtures/expected/modulate-hue-angle-60.png b/test/fixtures/expected/modulate-hue-angle-60.png new file mode 100644 index 000000000..3305d7870 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-60.png differ diff --git 
a/test/fixtures/expected/modulate-hue-angle-90.png b/test/fixtures/expected/modulate-hue-angle-90.png new file mode 100644 index 000000000..077a92a09 Binary files /dev/null and b/test/fixtures/expected/modulate-hue-angle-90.png differ diff --git a/test/fixtures/expected/negate-preserve-alpha-grey.png b/test/fixtures/expected/negate-preserve-alpha-grey.png new file mode 100644 index 000000000..7e4b85df6 Binary files /dev/null and b/test/fixtures/expected/negate-preserve-alpha-grey.png differ diff --git a/test/fixtures/expected/negate-preserve-alpha-trans.png b/test/fixtures/expected/negate-preserve-alpha-trans.png new file mode 100644 index 000000000..18e98a703 Binary files /dev/null and b/test/fixtures/expected/negate-preserve-alpha-trans.png differ diff --git a/test/fixtures/expected/negate-preserve-alpha-trans.webp b/test/fixtures/expected/negate-preserve-alpha-trans.webp new file mode 100644 index 000000000..eb8ac98a4 Binary files /dev/null and b/test/fixtures/expected/negate-preserve-alpha-trans.webp differ diff --git a/test/fixtures/expected/negate-preserve-alpha.png b/test/fixtures/expected/negate-preserve-alpha.png new file mode 100644 index 000000000..1e5122a21 Binary files /dev/null and b/test/fixtures/expected/negate-preserve-alpha.png differ diff --git a/test/fixtures/expected/negate-preserve-alpha.webp b/test/fixtures/expected/negate-preserve-alpha.webp new file mode 100644 index 000000000..287e38c86 Binary files /dev/null and b/test/fixtures/expected/negate-preserve-alpha.webp differ diff --git a/test/fixtures/expected/negate-trans.png b/test/fixtures/expected/negate-trans.png index 52e79e5f6..d590068cc 100644 Binary files a/test/fixtures/expected/negate-trans.png and b/test/fixtures/expected/negate-trans.png differ diff --git a/test/fixtures/expected/overlay-gravity-center.jpg b/test/fixtures/expected/overlay-gravity-center.jpg index 8b65ebcf7..709403e7d 100644 Binary files a/test/fixtures/expected/overlay-gravity-center.jpg and 
b/test/fixtures/expected/overlay-gravity-center.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-centre.jpg b/test/fixtures/expected/overlay-gravity-centre.jpg index 8b65ebcf7..709403e7d 100644 Binary files a/test/fixtures/expected/overlay-gravity-centre.jpg and b/test/fixtures/expected/overlay-gravity-centre.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-east.jpg b/test/fixtures/expected/overlay-gravity-east.jpg index 756a82c11..81cb9a7fa 100644 Binary files a/test/fixtures/expected/overlay-gravity-east.jpg and b/test/fixtures/expected/overlay-gravity-east.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-north.jpg b/test/fixtures/expected/overlay-gravity-north.jpg index b61fc8375..fa5a9639b 100644 Binary files a/test/fixtures/expected/overlay-gravity-north.jpg and b/test/fixtures/expected/overlay-gravity-north.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-northeast.jpg b/test/fixtures/expected/overlay-gravity-northeast.jpg index 9523792be..1be5a1944 100644 Binary files a/test/fixtures/expected/overlay-gravity-northeast.jpg and b/test/fixtures/expected/overlay-gravity-northeast.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-northwest.jpg b/test/fixtures/expected/overlay-gravity-northwest.jpg index 030e0b124..acf269989 100644 Binary files a/test/fixtures/expected/overlay-gravity-northwest.jpg and b/test/fixtures/expected/overlay-gravity-northwest.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-south.jpg b/test/fixtures/expected/overlay-gravity-south.jpg index 96cd451d8..bd8856413 100644 Binary files a/test/fixtures/expected/overlay-gravity-south.jpg and b/test/fixtures/expected/overlay-gravity-south.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-southeast.jpg b/test/fixtures/expected/overlay-gravity-southeast.jpg index 852292d86..d46da3a0d 100644 Binary files a/test/fixtures/expected/overlay-gravity-southeast.jpg and 
b/test/fixtures/expected/overlay-gravity-southeast.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-southwest.jpg b/test/fixtures/expected/overlay-gravity-southwest.jpg index 8876c9fdc..755c128a5 100644 Binary files a/test/fixtures/expected/overlay-gravity-southwest.jpg and b/test/fixtures/expected/overlay-gravity-southwest.jpg differ diff --git a/test/fixtures/expected/overlay-gravity-west.jpg b/test/fixtures/expected/overlay-gravity-west.jpg index 1495500cb..4c4223616 100644 Binary files a/test/fixtures/expected/overlay-gravity-west.jpg and b/test/fixtures/expected/overlay-gravity-west.jpg differ diff --git a/test/fixtures/expected/overlay-negative-offset-with-gravity.jpg b/test/fixtures/expected/overlay-negative-offset-with-gravity.jpg new file mode 100644 index 000000000..06e585e2b Binary files /dev/null and b/test/fixtures/expected/overlay-negative-offset-with-gravity.jpg differ diff --git a/test/fixtures/expected/overlay-offset-0.jpg b/test/fixtures/expected/overlay-offset-0.jpg index e64b8e509..005dfa012 100644 Binary files a/test/fixtures/expected/overlay-offset-0.jpg and b/test/fixtures/expected/overlay-offset-0.jpg differ diff --git a/test/fixtures/expected/overlay-offset-with-gravity-tile.jpg b/test/fixtures/expected/overlay-offset-with-gravity-tile.jpg index fc0596832..83a700525 100644 Binary files a/test/fixtures/expected/overlay-offset-with-gravity-tile.jpg and b/test/fixtures/expected/overlay-offset-with-gravity-tile.jpg differ diff --git a/test/fixtures/expected/overlay-offset-with-gravity.jpg b/test/fixtures/expected/overlay-offset-with-gravity.jpg index e217a345d..ee7ac1f1d 100644 Binary files a/test/fixtures/expected/overlay-offset-with-gravity.jpg and b/test/fixtures/expected/overlay-offset-with-gravity.jpg differ diff --git a/test/fixtures/expected/overlay-offset-with-tile.jpg b/test/fixtures/expected/overlay-offset-with-tile.jpg index fc0596832..83a700525 100644 Binary files a/test/fixtures/expected/overlay-offset-with-tile.jpg 
and b/test/fixtures/expected/overlay-offset-with-tile.jpg differ diff --git a/test/fixtures/expected/resize-crop-extract.jpg b/test/fixtures/expected/resize-crop-extract.jpg index 166e5e967..e596b9a0a 100644 Binary files a/test/fixtures/expected/resize-crop-extract.jpg and b/test/fixtures/expected/resize-crop-extract.jpg differ diff --git a/test/fixtures/expected/rotate-extract-45.jpg b/test/fixtures/expected/rotate-extract-45.jpg new file mode 100644 index 000000000..5600280fc Binary files /dev/null and b/test/fixtures/expected/rotate-extract-45.jpg differ diff --git a/test/fixtures/expected/rotate-mirror-extract.jpg b/test/fixtures/expected/rotate-mirror-extract.jpg new file mode 100644 index 000000000..46d5a93b6 Binary files /dev/null and b/test/fixtures/expected/rotate-mirror-extract.jpg differ diff --git a/test/fixtures/expected/rotate-solid-bg.jpg b/test/fixtures/expected/rotate-solid-bg.jpg new file mode 100644 index 000000000..2110578fd Binary files /dev/null and b/test/fixtures/expected/rotate-solid-bg.jpg differ diff --git a/test/fixtures/expected/rotate-transparent-bg.png b/test/fixtures/expected/rotate-transparent-bg.png new file mode 100644 index 000000000..62fefc100 Binary files /dev/null and b/test/fixtures/expected/rotate-transparent-bg.png differ diff --git a/test/fixtures/expected/svg-embedded.png b/test/fixtures/expected/svg-embedded.png index 7c8cddc28..c08b1f110 100644 Binary files a/test/fixtures/expected/svg-embedded.png and b/test/fixtures/expected/svg-embedded.png differ diff --git a/test/fixtures/expected/svg1200.png b/test/fixtures/expected/svg1200.png index 78df696dc..4e5d9b89e 100644 Binary files a/test/fixtures/expected/svg1200.png and b/test/fixtures/expected/svg1200.png differ diff --git a/test/fixtures/expected/svg14.4.png b/test/fixtures/expected/svg14.4.png new file mode 100644 index 000000000..32f2fb3da Binary files /dev/null and b/test/fixtures/expected/svg14.4.png differ diff --git a/test/fixtures/expected/svg72.png 
b/test/fixtures/expected/svg72.png index a372da8e1..01d0586a9 100644 Binary files a/test/fixtures/expected/svg72.png and b/test/fixtures/expected/svg72.png differ diff --git a/test/fixtures/expected/tile_centered.jpg b/test/fixtures/expected/tile_centered.jpg new file mode 100644 index 000000000..9ae14ff6d Binary files /dev/null and b/test/fixtures/expected/tile_centered.jpg differ diff --git a/test/fixtures/expected/tint-alpha.png b/test/fixtures/expected/tint-alpha.png new file mode 100644 index 000000000..1375af948 Binary files /dev/null and b/test/fixtures/expected/tint-alpha.png differ diff --git a/test/fixtures/expected/tint-blue.jpg b/test/fixtures/expected/tint-blue.jpg new file mode 100644 index 000000000..780ffef52 Binary files /dev/null and b/test/fixtures/expected/tint-blue.jpg differ diff --git a/test/fixtures/expected/tint-cmyk.jpg b/test/fixtures/expected/tint-cmyk.jpg new file mode 100644 index 000000000..1bd076da5 Binary files /dev/null and b/test/fixtures/expected/tint-cmyk.jpg differ diff --git a/test/fixtures/expected/tint-green.jpg b/test/fixtures/expected/tint-green.jpg new file mode 100644 index 000000000..c7ef22bef Binary files /dev/null and b/test/fixtures/expected/tint-green.jpg differ diff --git a/test/fixtures/expected/tint-red.jpg b/test/fixtures/expected/tint-red.jpg new file mode 100644 index 000000000..b0fdc44c7 Binary files /dev/null and b/test/fixtures/expected/tint-red.jpg differ diff --git a/test/fixtures/expected/tint-sepia.jpg b/test/fixtures/expected/tint-sepia.jpg new file mode 100644 index 000000000..57fae97bb Binary files /dev/null and b/test/fixtures/expected/tint-sepia.jpg differ diff --git a/test/fixtures/expected/trim-16bit-rgba.png b/test/fixtures/expected/trim-16bit-rgba.png index 555f79d30..46788fc25 100644 Binary files a/test/fixtures/expected/trim-16bit-rgba.png and b/test/fixtures/expected/trim-16bit-rgba.png differ diff --git a/test/fixtures/expected/truncated.jpg b/test/fixtures/expected/truncated.jpg new file 
mode 100644 index 000000000..0bd7b8082 Binary files /dev/null and b/test/fixtures/expected/truncated.jpg differ diff --git a/test/fixtures/expected/unflatten-flag-white-transparent.png b/test/fixtures/expected/unflatten-flag-white-transparent.png new file mode 100644 index 000000000..afed72132 Binary files /dev/null and b/test/fixtures/expected/unflatten-flag-white-transparent.png differ diff --git a/test/fixtures/expected/unflatten-swiss.png b/test/fixtures/expected/unflatten-swiss.png new file mode 100644 index 000000000..89c8a8674 Binary files /dev/null and b/test/fixtures/expected/unflatten-swiss.png differ diff --git a/test/fixtures/expected/unflatten-white-transparent.png b/test/fixtures/expected/unflatten-white-transparent.png new file mode 100644 index 000000000..abf3ed48b Binary files /dev/null and b/test/fixtures/expected/unflatten-white-transparent.png differ diff --git a/test/fixtures/expected/webp-alpha-80.webp b/test/fixtures/expected/webp-alpha-80.webp index 739d8c00e..cb4ef6ec4 100644 Binary files a/test/fixtures/expected/webp-alpha-80.webp and b/test/fixtures/expected/webp-alpha-80.webp differ diff --git a/test/fixtures/fogra-0-100-100-0.tif b/test/fixtures/fogra-0-100-100-0.tif new file mode 100644 index 000000000..d6c3a598a Binary files /dev/null and b/test/fixtures/fogra-0-100-100-0.tif differ diff --git a/test/fixtures/full-transparent.png b/test/fixtures/full-transparent.png new file mode 100644 index 000000000..c87eaf4a1 Binary files /dev/null and b/test/fixtures/full-transparent.png differ diff --git a/test/fixtures/gradients-rgb8.png b/test/fixtures/gradients-rgb8.png new file mode 100644 index 000000000..693272bc0 Binary files /dev/null and b/test/fixtures/gradients-rgb8.png differ diff --git a/test/fixtures/hilutite.icm b/test/fixtures/hilutite.icm new file mode 100644 index 000000000..6a7ad4be2 Binary files /dev/null and b/test/fixtures/hilutite.icm differ diff --git a/test/fixtures/image-in-alpha.png b/test/fixtures/image-in-alpha.png 
new file mode 100644 index 000000000..f58ca10c5 Binary files /dev/null and b/test/fixtures/image-in-alpha.png differ diff --git a/test/fixtures/index.js b/test/fixtures/index.js index d45d8a972..2448843e2 100644 --- a/test/fixtures/index.js +++ b/test/fixtures/index.js @@ -1,39 +1,37 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const path = require('path'); +const path = require('node:path'); const sharp = require('../../'); -const maxColourDistance = require('../../build/Release/sharp')._maxColourDistance; +const maxColourDistance = require('../../lib/sharp')._maxColourDistance; // Helpers -const getPath = function (filename) { - return path.join(__dirname, filename); -}; +const getPath = (filename) => path.join(__dirname, filename); // Generates a 64-bit-as-binary-string image fingerprint // Based on the dHash gradient method - see http://www.hackerfactor.com/blog/index.php?/archives/529-Kind-of-Like-That.html -const fingerprint = function (image, callback) { - sharp(image) +async function fingerprint (image) { + return sharp(image) + .flatten('gray') .greyscale() .normalise() - .resize(9, 8) - .ignoreAspectRatio() + .resize(9, 8, { fit: sharp.fit.fill }) .raw() - .toBuffer(function (err, data) { - if (err) { - callback(err); - } else { - let fingerprint = ''; - for (let col = 0; col < 8; col++) { - for (let row = 0; row < 8; row++) { - const left = data[(row * 8) + col]; - const right = data[(row * 8) + col + 1]; - fingerprint = fingerprint + (left < right ? '1' : '0'); - } + .toBuffer() + .then((data) => { + let fingerprint = ''; + for (let col = 0; col < 8; col++) { + for (let row = 0; row < 8; row++) { + const left = data[(row * 8) + col]; + const right = data[(row * 8) + col + 1]; + fingerprint = fingerprint + (left < right ? 
'1' : '0'); } - callback(null, fingerprint); } + return fingerprint; }); -}; +} module.exports = { @@ -57,6 +55,7 @@ module.exports = { inputJpg: getPath('2569067123_aca715a2ee_o.jpg'), // http://www.flickr.com/photos/grizdave/2569067123/ inputJpgWithExif: getPath('Landscape_8.jpg'), // https://github.com/recurser/exif-orientation-examples/blob/master/Landscape_8.jpg + inputJpgWithIptcAndXmp: getPath('Landscape_9.jpg'), // https://unsplash.com/photos/RWAIyGmgHTQ inputJpgWithExifMirroring: getPath('Landscape_5.jpg'), // https://github.com/recurser/exif-orientation-examples/blob/master/Landscape_5.jpg inputJpgWithGammaHoliness: getPath('gamma_dalai_lama_gray.jpg'), // http://www.4p8.com/eric.brasseur/gamma.html inputJpgWithCmykProfile: getPath('Channel_digital_image_CMYK_color.jpg'), // http://en.wikipedia.org/wiki/File:Channel_digital_image_CMYK_color.jpg @@ -65,35 +64,72 @@ module.exports = { inputJpgWithLowContrast: getPath('low-contrast.jpg'), // http://www.flickr.com/photos/grizdave/2569067123/ inputJpgLarge: getPath('giant-image.jpg'), inputJpg320x240: getPath('320x240.jpg'), // http://www.andrewault.net/2010/01/26/create-a-test-pattern-video-with-perl/ + inputJpgOverlayLayer2: getPath('alpha-layer-2-ink.jpg'), + inputJpgTruncated: getPath('truncated.jpg'), // head -c 10000 2569067123_aca715a2ee_o.jpg > truncated.jpg + inputJpgCenteredImage: getPath('centered_image.jpeg'), + inputJpgRandom: getPath('random.jpg'), // convert -size 200x200 xc: +noise Random random.jpg + inputJpgThRandom: getPath('thRandom.jpg'), // convert random.jpg -channel G -threshold 5% -separate +channel -negate thRandom.jpg + inputJpgLossless: getPath('testimgl.jpg'), // Lossless JPEG from ftp://ftp.fu-berlin.de/unix/X11/graphics/ImageMagick/delegates/ljpeg-6b.tar.gz inputPng: getPath('50020484-00001.png'), // http://c.searspartsdirect.com/lis_png/PLDM/50020484-00001.png + inputPngGradients: getPath('gradients-rgb8.png'), inputPngWithTransparency: 
getPath('blackbug.png'), // public domain + inputPngCompleteTransparency: getPath('full-transparent.png'), inputPngWithGreyAlpha: getPath('grey-8bit-alpha.png'), inputPngWithOneColor: getPath('2x2_fdcce6.png'), inputPngWithTransparency16bit: getPath('tbgn2c16.png'), // http://www.schaik.com/pngsuite/tbgn2c16.png + inputPng8BitGreyBackground: getPath('bgbn4a08.png'), // http://www.schaik.com/pngsuite/bgbn4a08.png + inputPng16BitGreyBackground: getPath('bggn4a16.png'), // http://www.schaik.com/pngsuite/bggn4a16.png + inputPng16BitGreyAlpha: getPath('16-bit-grey-alpha.png'), // CC-BY-NC-SA florc http://www.colourlovers.com/pattern/50713/pat inputPngOverlayLayer0: getPath('alpha-layer-0-background.png'), inputPngOverlayLayer1: getPath('alpha-layer-1-fill.png'), - inputPngOverlayLayer2: getPath('alpha-layer-2-ink.png'), - inputPngOverlayLayer1LowAlpha: getPath('alpha-layer-1-fill-low-alpha.png'), - inputPngOverlayLayer2LowAlpha: getPath('alpha-layer-2-ink-low-alpha.png'), inputPngAlphaPremultiplicationSmall: getPath('alpha-premultiply-1024x768-paper.png'), inputPngAlphaPremultiplicationLarge: getPath('alpha-premultiply-2048x1536-paper.png'), inputPngBooleanNoAlpha: getPath('bandbool.png'), inputPngTestJoinChannel: getPath('testJoinChannel.png'), + inputPngTruncated: getPath('truncated.png'), // gm convert 2569067123_aca715a2ee_o.jpg -resize 320x240 saw.png ; head -c 10000 saw.png > truncated.png + inputPngEmbed: getPath('embedgravitybird.png'), // Released to sharp under a CC BY 4.0 + inputPngRGBWithAlpha: getPath('2569067123_aca715a2ee_o.png'), // http://www.flickr.com/photos/grizdave/2569067123/ (same as inputJpg) + inputPngImageInAlpha: getPath('image-in-alpha.png'), // https://github.com/lovell/sharp/issues/1597 + inputPngSolidAlpha: getPath('with-alpha.png'), // https://github.com/lovell/sharp/issues/1599 + inputPngP3: getPath('p3.png'), // https://github.com/lovell/sharp/issues/2862 + inputPngPalette: getPath('swiss.png'), 
// https://github.com/randy408/libspng/issues/188 + inputPngTrimIncludeAlpha: getPath('trim-mc.png'), // https://github.com/lovell/sharp/issues/2166 + inputPngTrimSpecificColour: getPath('Flag_of_the_Netherlands.png'), // https://commons.wikimedia.org/wiki/File:Flag_of_the_Netherlands.svg + inputPngTrimSpecificColour16bit: getPath('Flag_of_the_Netherlands-16bit.png'), // convert Flag_of_the_Netherlands.png -depth 16 Flag_of_the_Netherlands-16bit.png + inputPngTrimSpecificColourIncludeAlpha: getPath('Flag_of_the_Netherlands-alpha.png'), // convert Flag_of_the_Netherlands.png -alpha set -background none -channel A -evaluate multiply 0.5 +channel Flag_of_the_Netherlands-alpha.png + inputPngUint32Limit: getPath('65536-uint32-limit.png'), // https://alexandre.alapetite.fr/doc-alex/large-image/ + inputPngWithProPhotoProfile: getPath('prophoto.png'), inputWebP: getPath('4.webp'), // http://www.gstatic.com/webp/gallery/4.webp inputWebPWithTransparency: getPath('5_webp_a.webp'), // http://www.gstatic.com/webp/gallery3/5_webp_a.webp + inputWebPAnimated: getPath('rotating-squares.webp'), // http://www.gstatic.com/webp/gallery3/5_webp_a.webp + inputWebPAnimatedLoop3: getPath('animated-loop-3.webp'), // http://www.gstatic.com/webp/gallery3/5_webp_a.webp + inputWebPAnimatedBigHeight: getPath('big-height.webp'), inputTiff: getPath('G31D.TIF'), // http://www.fileformat.info/format/tiff/sample/e6c9a6e5253348f4aef6d17b534360ab/index.htm + inputTiffMultipage: getPath('G31D_MULTI.TIF'), // gm convert G31D.TIF -resize 50% G31D_2.TIF ; tiffcp G31D.TIF G31D_2.TIF G31D_MULTI.TIF inputTiffCielab: getPath('cielab-dagams.tiff'), // https://github.com/lovell/sharp/issues/646 inputTiffUncompressed: getPath('uncompressed_tiff.tiff'), // https://code.google.com/archive/p/imagetestsuite/wikis/TIFFTestSuite.wiki file: 0c84d07e1b22b76f24cccc70d8788e4a.tif inputTiff8BitDepth: getPath('8bit_depth.tiff'), + inputTifftagPhotoshop: 
getPath('tifftag-photoshop.tiff'), // https://github.com/lovell/sharp/issues/1600 + inputTiffFogra: getPath('fogra-0-100-100-0.tif'), // https://github.com/lovell/sharp/issues/4045 + inputTiffGeo: getPath('bonne.geo.tif'), // https://download.osgeo.org/geotiff/samples/intergraph + + inputJp2: getPath('relax.jp2'), // https://www.fnordware.com/j2k/relax.jp2 + inputJp2TileParts: getPath('relax_tileparts.jp2'), // kdu_expand -i relax.jp2 -o relax-tmp.tif ; kdu_compress -i relax-tmp.tif -o relax_tileparts.jp2 -jp2_space sRGB Clayers=8 -rate 1.0,0.04 Stiles='{128,128}' ORGtparts=L ; rm relax-tmp.tif inputGif: getPath('Crash_test.gif'), // http://upload.wikimedia.org/wikipedia/commons/e/e3/Crash_test.gif inputGifGreyPlusAlpha: getPath('grey-plus-alpha.gif'), // http://i.imgur.com/gZ5jlmE.gif + inputGifAnimated: getPath('rotating-squares.gif'), // CC0 https://loading.io/spinner/blocks/-rotating-squares-preloader-gif + inputGifAnimatedLoop3: getPath('animated-loop-3.gif'), // CC-BY-SA-4.0 Petrus3743 https://commons.wikimedia.org/wiki/File:01-Goldener_Schnitt_Formel-Animation.gif inputSvg: getPath('check.svg'), // http://dev.w3.org/SVG/tools/svgweb/samples/svg-files/check.svg + inputSvgSmallViewBox: getPath('circle.svg'), inputSvgWithEmbeddedImages: getPath('struct-image-04-t.svg'), // https://dev.w3.org/SVG/profiles/1.2T/test/svg/struct-image-04-t.svg + inputAvif: getPath('sdr_cosmos12920_cicp1-13-6_yuv444_full_qp10.avif'), // CC by-nc-nd https://github.com/AOMediaCodec/av1-avif/tree/master/testFiles/Netflix inputJPGBig: getPath('flowers.jpeg'), + inputPngDotAndLines: getPath('dot-and-lines.png'), + inputPngStripesV: getPath('stripesV.png'), inputPngStripesH: getPath('stripesH.png'), @@ -101,72 +137,66 @@ module.exports = { inputV: getPath('vfile.v'), - outputJpg: getPath('output.jpg'), - outputPng: getPath('output.png'), - outputWebP: getPath('output.webp'), - outputV: getPath('output.v'), - outputTiff: getPath('output.tiff'), - 
outputZoinks: getPath('output.zoinks'), // an 'unknown' file extension + inputJpgClahe: getPath('concert.jpg'), // public domain - https://www.flickr.com/photos/mars_/14389236779/ + + testPattern: getPath('test-pattern.png'), + inputPngWithTransparent: getPath('d.png'), // Path for tests requiring human inspection path: getPath, // Path for expected output images - expected: function (filename) { - return getPath(path.join('expected', filename)); - }, + expected: (filename) => getPath(path.join('expected', filename)), // Verify similarity of expected vs actual images via fingerprint // Specify distance threshold using `options={threshold: 42}`, default // `threshold` is 5; - assertSimilar: function (expectedImage, actualImage, options, callback) { + assertSimilar: async (expectedImage, actualImage, options, callback) => { if (typeof options === 'function') { callback = options; options = {}; } - - if (typeof options === 'undefined' && options === null) { + if (typeof options === 'undefined' || options === null) { options = {}; } - if (options.threshold === null || typeof options.threshold === 'undefined') { options.threshold = 5; // ~7% threshold } - if (typeof options.threshold !== 'number') { throw new TypeError('`options.threshold` must be a number'); } - if (typeof callback !== 'function') { - throw new TypeError('`callback` must be a function'); - } - - fingerprint(expectedImage, function (err, expectedFingerprint) { - if (err) return callback(err); - fingerprint(actualImage, function (err, actualFingerprint) { - if (err) return callback(err); - let distance = 0; - for (let i = 0; i < 64; i++) { - if (expectedFingerprint[i] !== actualFingerprint[i]) { - distance++; - } - } - - if (distance > options.threshold) { - return callback(new Error('Expected maximum similarity distance: ' + options.threshold + '. 
Actual: ' + distance + '.')); + try { + const [expectedFingerprint, actualFingerprint] = await Promise.all([ + fingerprint(expectedImage), + fingerprint(actualImage) + ]); + let distance = 0; + for (let i = 0; i < 64; i++) { + if (expectedFingerprint[i] !== actualFingerprint[i]) { + distance++; } - - callback(); - }); - }); + } + if (distance > options.threshold) { + throw new Error(`Expected maximum similarity distance: ${options.threshold}. Actual: ${distance}.`); + } + } catch (err) { + if (callback) { + return callback(err); + } + throw err; + } + if (callback) { + callback(); + } }, - assertMaxColourDistance: function (actualImagePath, expectedImagePath, acceptedDistance) { + assertMaxColourDistance: (actualImagePath, expectedImagePath, acceptedDistance) => { if (typeof actualImagePath !== 'string') { - throw new TypeError('`actualImagePath` must be a string; got ' + actualImagePath); + throw new TypeError(`\`actualImagePath\` must be a string; got ${actualImagePath}`); } if (typeof expectedImagePath !== 'string') { - throw new TypeError('`expectedImagePath` must be a string; got ' + expectedImagePath); + throw new TypeError(`\`expectedImagePath\` must be a string; got ${expectedImagePath}`); } if (typeof acceptedDistance !== 'number') { // Default threshold @@ -174,7 +204,7 @@ module.exports = { } const distance = maxColourDistance(actualImagePath, expectedImagePath); if (distance > acceptedDistance) { - throw new Error('Expected maximum absolute distance of ' + acceptedDistance + ', actual ' + distance); + throw new Error(`Expected maximum absolute distance of ${acceptedDistance}, actual ${distance}`); } } diff --git a/test/fixtures/input.above.composite.premultiplied.png b/test/fixtures/input.above.composite.premultiplied.png new file mode 100644 index 000000000..dfd105e4d Binary files /dev/null and b/test/fixtures/input.above.composite.premultiplied.png differ diff --git a/test/fixtures/input.below.composite.premultiplied.png 
b/test/fixtures/input.below.composite.premultiplied.png new file mode 100644 index 000000000..e2bcb6b05 Binary files /dev/null and b/test/fixtures/input.below.composite.premultiplied.png differ diff --git a/test/fixtures/invalid-illuminant.icc b/test/fixtures/invalid-illuminant.icc new file mode 100644 index 000000000..8ad73be25 Binary files /dev/null and b/test/fixtures/invalid-illuminant.icc differ diff --git a/test/fixtures/p3.png b/test/fixtures/p3.png new file mode 100644 index 000000000..96e10ff15 Binary files /dev/null and b/test/fixtures/p3.png differ diff --git a/test/fixtures/prophoto.png b/test/fixtures/prophoto.png new file mode 100644 index 000000000..e01fcb02a Binary files /dev/null and b/test/fixtures/prophoto.png differ diff --git a/test/fixtures/random.jpg b/test/fixtures/random.jpg new file mode 100644 index 000000000..723c342e8 Binary files /dev/null and b/test/fixtures/random.jpg differ diff --git a/test/fixtures/relax.jp2 b/test/fixtures/relax.jp2 new file mode 100644 index 000000000..1823990f4 Binary files /dev/null and b/test/fixtures/relax.jp2 differ diff --git a/test/fixtures/relax_tileparts.jp2 b/test/fixtures/relax_tileparts.jp2 new file mode 100644 index 000000000..62621c689 Binary files /dev/null and b/test/fixtures/relax_tileparts.jp2 differ diff --git a/test/fixtures/rotating-squares.gif b/test/fixtures/rotating-squares.gif new file mode 100644 index 000000000..4d6202c7e Binary files /dev/null and b/test/fixtures/rotating-squares.gif differ diff --git a/test/fixtures/rotating-squares.webp b/test/fixtures/rotating-squares.webp new file mode 100644 index 000000000..2d6193549 Binary files /dev/null and b/test/fixtures/rotating-squares.webp differ diff --git a/test/fixtures/sdr_cosmos12920_cicp1-13-6_yuv444_full_qp10.avif b/test/fixtures/sdr_cosmos12920_cicp1-13-6_yuv444_full_qp10.avif new file mode 100644 index 000000000..85d30b15c Binary files /dev/null and b/test/fixtures/sdr_cosmos12920_cicp1-13-6_yuv444_full_qp10.avif differ diff 
--git a/test/fixtures/swiss.png b/test/fixtures/swiss.png new file mode 100644 index 000000000..cf310ec60 Binary files /dev/null and b/test/fixtures/swiss.png differ diff --git a/test/fixtures/test-pattern.png b/test/fixtures/test-pattern.png new file mode 100644 index 000000000..315644130 Binary files /dev/null and b/test/fixtures/test-pattern.png differ diff --git a/test/fixtures/testimgl.jpg b/test/fixtures/testimgl.jpg new file mode 100644 index 000000000..82b6b0366 Binary files /dev/null and b/test/fixtures/testimgl.jpg differ diff --git a/test/fixtures/thRandom.jpg b/test/fixtures/thRandom.jpg new file mode 100644 index 000000000..aa01d199e Binary files /dev/null and b/test/fixtures/thRandom.jpg differ diff --git a/test/fixtures/tifftag-photoshop.tiff b/test/fixtures/tifftag-photoshop.tiff new file mode 100644 index 000000000..939954cf6 Binary files /dev/null and b/test/fixtures/tifftag-photoshop.tiff differ diff --git a/test/fixtures/trim-mc.png b/test/fixtures/trim-mc.png new file mode 100644 index 000000000..f170a290f Binary files /dev/null and b/test/fixtures/trim-mc.png differ diff --git a/test/fixtures/truncated.jpg b/test/fixtures/truncated.jpg new file mode 100644 index 000000000..2a3fe7a7b Binary files /dev/null and b/test/fixtures/truncated.jpg differ diff --git a/test/fixtures/truncated.png b/test/fixtures/truncated.png new file mode 100644 index 000000000..3ab39b69a Binary files /dev/null and b/test/fixtures/truncated.png differ diff --git a/test/fixtures/with-alpha.png b/test/fixtures/with-alpha.png new file mode 100644 index 000000000..723dd5850 Binary files /dev/null and b/test/fixtures/with-alpha.png differ diff --git a/test/leak/leak.sh b/test/leak/leak.sh index dc4c09ee6..c1e5a0a33 100755 --- a/test/leak/leak.sh +++ b/test/leak/leak.sh @@ -1,18 +1,22 @@ -#!/bin/sh +#!/usr/bin/env bash +set -e if ! 
type valgrind >/dev/null; then echo "Please install valgrind before running memory leak tests" exit 1 fi -curl -o ./test/leak/libvips.supp https://raw.githubusercontent.com/jcupitt/libvips/master/libvips.supp +curl -s -o ./test/leak/libvips.supp https://raw.githubusercontent.com/libvips/libvips/master/suppressions/valgrind.supp -G_SLICE=always-malloc G_DEBUG=gc-friendly valgrind \ - --suppressions=test/leak/libvips.supp \ - --suppressions=test/leak/sharp.supp \ - --gen-suppressions=yes \ - --leak-check=full \ - --show-leak-kinds=definite,indirect,possible \ - --num-callers=20 \ - --trace-children=yes \ - npm test +TESTS=$(ls test/unit --ignore=svg.js --ignore=text.js) +for test in $TESTS; do + G_SLICE=always-malloc G_DEBUG=gc-friendly VIPS_LEAK=1 VIPS_NOVECTOR=1 valgrind \ + --suppressions=test/leak/libvips.supp \ + --suppressions=test/leak/sharp.supp \ + --gen-suppressions=yes \ + --leak-check=full \ + --show-leak-kinds=definite,indirect \ + --num-callers=20 \ + --trace-children=yes \ + node --zero-fill-buffers --test "test/unit/$test"; +done diff --git a/test/leak/sharp.supp b/test/leak/sharp.supp index 9c01c4001..c61e5c7d2 100644 --- a/test/leak/sharp.supp +++ b/test/leak/sharp.supp @@ -39,6 +39,23 @@ Memcheck:Cond obj:*/libjpeg.so* } +{ + value_jpeg_obj_static + Memcheck:Value8 + obj:*/libvips.so* +} +{ + cond_jpeg_obj_static + Memcheck:Cond + obj:*/libvips.so* +} +{ + param_jpeg_jpeg_finish_compress + Memcheck:Param + write(buf) + ... + fun:jpeg_finish_compress +} # libpng { cond_libpng_png_read_row @@ -140,6 +157,197 @@ ... fun:WebPDecode } +{ + cond_libwebp_generic + Memcheck:Cond + obj:*/libwebp.so.* +} + +# tiff +{ + param_tiff_write_encoded_tile + Memcheck:Param + write(buf) + fun:write + ... + fun:TIFFWriteEncodedTile +} + +# fontconfig +{ + leak_fontconfig_FcConfigSubstituteWithPat + Memcheck:Leak + match-leak-kinds: definite,indirect + ... 
+ fun:FcConfigSubstituteWithPat +} +{ + leak_fontconfig_init + Memcheck:Leak + match-leak-kinds: indirect + fun:calloc + ... + fun:FcInitLoadConfigAndFonts +} +{ + leak_fontconfig_XML_ParseBuffer + Memcheck:Leak + match-leak-kinds: definite + ... + fun:XML_ParseBuffer + obj:*/libfontconfig.so.* +} +{ + leak_fontconfig_XML_ParseBuffer_indirect + Memcheck:Leak + match-leak-kinds: indirect + ... + fun:XML_ParseBuffer + obj:*/libfontconfig.so.* +} +{ + leak_fontconfig_FcInitLoadConfigAndFonts + Memcheck:Leak + match-leak-kinds: definite + fun:malloc + ... + fun:XML_ParseBuffer + ... + fun:FcInitLoadConfigAndFonts +} +{ + leak_fontconfig_FcDefaultSubstitute + Memcheck:Leak + match-leak-kinds: indirect + fun:calloc + ... + fun:FcDefaultSubstitute + ... + fun:pango_itemize_with_base_dir + ... + fun:pango_layout_get_pixel_extents + fun:vips_text_get_extents +} +{ + leak_fontconfig_FcLangSetCreate + Memcheck:Leak + match-leak-kinds: indirect + fun:malloc + fun:FcLangSetCreate + fun:FcLangSetCopy + fun:FcValueSave + ... + fun:FcFontRenderPrepare + fun:FcFontMatch + ... + fun:pango_itemize_with_base_dir + ... + fun:pango_layout_get_pixel_extents + fun:vips_text_get_extents +} + +# heif +{ + cond_heif_encode_image + Memcheck:Cond + ... + fun:heif_context_encode_image +} +{ + value8_heif_encode_image + Memcheck:Value8 + ... + fun:heif_context_encode_image +} +{ + cond_heif_aom_codec_encode + Memcheck:Cond + ... + fun:aom_codec_encode +} +{ + value8_heif_aom_codec_encode + Memcheck:Value8 + ... + fun:aom_codec_encode +} +{ + value1_heif_aom_codec_encode + Memcheck:Value1 + ... + fun:aom_codec_encode +} +{ + cond_heif_av1_encode_frame + Memcheck:Cond + ... + fun:av1_encode_frame +} +{ + value8_heif_av1_encode_frame + Memcheck:Value8 + ... + fun:av1_encode_frame +} +{ + cond_heif_context_write + Memcheck:Cond + ... + fun:heif_context_write +} +{ + value8_heif_context_write + Memcheck:Value8 + ... + fun:heif_context_write +} +{ + cond_heif_context_read + Memcheck:Cond + ... 
+ fun:heif_context_read_from_reader +} +{ + value8_heif_context_read + Memcheck:Value8 + ... + fun:heif_context_read_from_reader +} + +# glib +{ + leak_glib__tls_get_addr + Memcheck:Leak + match-leak-kinds: possible + ... + fun:malloc + fun:allocate_dtv_entry + fun:allocate_and_init + fun:tls_get_addr_tail + fun:__tls_get_addr +} +{ + value_g_utf8_make_valid_strlen + Memcheck:Value8 + fun:strlen + fun:g_utf8_make_valid +} +{ + value_g_utf8_make_valid_strncpy + Memcheck:Value8 + fun:strncpy + fun:g_strndup + ... + fun:g_utf8_make_valid +} +{ + cond_g_utf8_make_valid_strncpy + Memcheck:Cond + fun:strncpy + fun:g_strndup + ... + fun:g_utf8_make_valid +} # libvips { @@ -156,10 +364,34 @@ fun:vips_region_generate } { - cond_libvips_col_sRGB2scRGB_8 + value_libvips_col_sRGB2scRGB_8 Memcheck:Value8 fun:vips_col_sRGB2scRGB_8 } +{ + value_libvips_col_sRGB2scRGB_line_8 + Memcheck:Value8 + fun:vips_sRGB2scRGB_line_8 +} +{ + value_libvips_write_webp + Memcheck:Value8 + ... + fun:write_webp.constprop.1 + fun:vips__webp_write_buffer +} +{ + value_libvips_start_thread + Memcheck:Value8 + obj:*/libvips.so.* + fun:start_thread + fun:clone +} +{ + cond_libvips_vips_cast_gen + Memcheck:Cond + fun:vips_cast_gen +} { cond_libvips_vips_region_fill Memcheck:Cond @@ -167,6 +399,77 @@ fun:vips_region_fill fun:vips_region_prepare } +{ + cond_libvips_vips_region_prepare_to + Memcheck:Cond + ... + fun:vips_region_prepare_to +} +{ + cond_libvips_vips_stats_scan + Memcheck:Cond + fun:vips_stats_scan +} +{ + value_libvips_vips_region_fill + Memcheck:Value8 + ... + fun:vips_region_fill + fun:vips_region_prepare +} +{ + value_libvips_vips_hist_find_uchar_scan + Memcheck:Value8 + fun:vips_hist_find_uchar_scan +} +{ + value_libvips_write_webp_image + Memcheck:Value8 + ... + fun:write_webp_image +} +{ + param_libvips_write_buf + Memcheck:Param + write(buf) + fun:write + ... + fun:start_thread +} +{ + cond_libvips_source_read + Memcheck:Cond + ... 
+ fun:vips_source_read +} +{ + value8_libvips_source_read + Memcheck:Value8 + ... + fun:vips_source_read +} +{ + cond_libvips_target_finish + Memcheck:Cond + ... + fun:vips_target_finish +} +{ + value8_libvips_target_finish + Memcheck:Value8 + ... + fun:vips_target_finish +} +{ + value8_libvips_static + Memcheck:Value8 + obj:*/libvips-cpp.so.* +} +{ + cond_libvips_static + Memcheck:Cond + obj:*/libvips-cpp.so.* +} { leak_libvips_init Memcheck:Leak @@ -175,6 +478,66 @@ ... fun:vips__init } +{ + leak_libvips_thread_pool_new + Memcheck:Leak + match-leak-kinds: possible + fun:calloc + ... + fun:g_system_thread_new +} +{ + leak_libvips_thread_pool_push + Memcheck:Leak + match-leak-kinds: possible + fun:calloc + ... + fun:g_thread_pool_push +} +{ + leak_rsvg_static_data + Memcheck:Leak + match-leak-kinds: definite + fun:malloc + ... + fun:rsvg_handle_new_from_stream_sync +} +{ + leak_rsvg_rsvg_rust_handle_new_from_gfile_sync + Memcheck:Leak + match-leak-kinds: definite + fun:malloc + ... + fun:rsvg_handle_new_from_gfile_sync +} +{ + leak_rsvg_rust_handle_new_from_stream_sync + Memcheck:Leak + match-leak-kinds: possible + fun:malloc + ... + fun:xmlParseElement + ... + fun:rsvg_handle_new_from_stream_sync +} +{ + leak_rsvg_rust_handle_new_from_gfile_sync + Memcheck:Leak + match-leak-kinds: possible + fun:malloc + ... + fun:xmlParseElement + ... + fun:rsvg_handle_new_from_gfile_sync +} +{ + leak_rsvg_rust_280_bytes_static_regex + Memcheck:Leak + match-leak-kinds: possible + fun:malloc + ... + fun:rsvg_handle_get_dimensions_sub +} # libuv warnings { @@ -190,13 +553,26 @@ ... fun:uv__fs_work } +{ + param_libuv_epoll_ctl + Memcheck:Param + epoll_ctl(event) + fun:epoll_ctl + fun:uv__io_poll +} { cond_libuv_work_done Memcheck:Cond ... fun:uv__work_done } - +{ + leak_libuv_FlushForegroundTasks + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node12NodePlatform28FlushForegroundTasksInternalEv +} # nodejs warnings { param_nodejs_write_buffer @@ -212,6 +588,12 @@ ... 
obj:/usr/bin/iojs } +{ + value_node_invoke_params + Memcheck:Value8 + ... + fun:_ZN2v88internal12_GLOBAL__N_16InvokeEPNS0_7IsolateERKNS1_12InvokeParamsE +} { leak_nodejs_ImmutableAsciiSource_CreateFromLiteral Memcheck:Leak @@ -270,6 +652,55 @@ ... fun:_ZN4node17CreateEnvironmentEPN2v87IsolateEP9uv_loop_sNS0_5LocalINS0_7ContextEEEiPKPKciSB_ } +{ + leak_nodejs_CreateEnvironment_IsolateData + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node17CreateEnvironmentEPNS_11IsolateDataEN2v85LocalINS2_7ContextEEERKSt6vectorISsSaISsEESA_NS_16EnvironmentFlags5FlagsENS_8ThreadIdESt10unique_ptrINS_21InspectorParentHandleESt14default_deleteISF_EE +} +{ + leak_nodejs_Environment_Start + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node11Environment5StartEiPKPKciS4_b +} +{ + leak_nodejs_node9inspector5Agent5Start + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node9inspector5Agent5StartEPN2v88PlatformEPKcRKNS_12DebugOptionsE +} +{ + leak_nodejs_node9inspector5Agent5Start_NodePlatform + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node9inspector5Agent5StartEPNS_12NodePlatformEPKcRKNS_12DebugOptionsE +} +{ + leak_nodejs_node9inspector5Agent5StartERKSsRKNS + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node9inspector5Agent5StartERKSsRKNS_12DebugOptionsESt10shared_ptrINS_15ExclusiveAccessINS_8HostPortENS_9MutexBaseINS_16LibuvMutexTraitsEEEEEEb +} +{ + leak_nodejs_node12NodePlatform_TracingController + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node12NodePlatformC1EiP9uv_loop_sPN2v817TracingControllerE +} +{ + leak_nodejs_node11performance24MarkGarbageCollectionEnd + Memcheck:Leak + match-leak-kinds: possible + ... 
+ fun:_ZN4node11performance24MarkGarbageCollectionEndEPN2v87IsolateENS1_6GCTypeENS1_15GCCallbackFlagsEPv +} { leak_nodejs_icu_getAvailableLocales Memcheck:Leak @@ -289,11 +720,120 @@ fun:_ZN2v84base6Thread5StartEv } { - leak_nan_FunctionCallbackInfo + leak_nodejs_thread_TracingController + Memcheck:Leak + match-leak-kinds: possible + fun:calloc + fun:allocate_dtv + fun:_dl_allocate_tls + fun:allocate_stack + ... + fun:_ZN4node12NodePlatformC1EiPN2v817TracingControllerE +} +{ + leak_nodejs_start_isolate_data + Memcheck:Leak + match-leak-kinds: possible + fun:_Znwm + ... + fun:_ZN4node5StartEPN2v87IsolateEPNS_11IsolateDataERKSt6vectorISsSaISsEES9_ +} +{ + leak_nodejs_runtime_stackguard_object_isolate + Memcheck:Leak + match-leak-kinds: possible + fun:_Znwm + ... + fun:_ZN2v88internal18Runtime_StackGuardEiPPNS0_6ObjectEPNS0_7IsolateE +} +{ + leak_nodejs_builtin_handleapicall_object_isolate + Memcheck:Leak + match-leak-kinds: possible + fun:_Znwm + ... + fun:_ZN2v88internal21Builtin_HandleApiCallEiPPNS0_6ObjectEPNS0_7IsolateE +} +{ + param_nodejs_delayed_task_scheduler + Memcheck:Param + epoll_ctl(event) + fun:epoll_ctl + fun:uv__io_poll + fun:uv_run + fun:_ZZN4node20BackgroundTaskRunner20DelayedTaskScheduler5StartEvENUlPvE_4_FUNES2_ +} +{ + param_nodejs_isolate_data + Memcheck:Param + epoll_ctl(event) + fun:epoll_ctl + fun:uv__io_poll + fun:uv_run + fun:_ZN4node5StartEPN2v87IsolateEPNS_11IsolateDataERKSt6vectorISsSaISsEES9_ +} +{ + param_nodejs_try_init_and_run_loop + Memcheck:Param + epoll_ctl(event) + fun:epoll_ctl + fun:uv__io_poll + fun:uv_run + fun:_ZN4node17SyncProcessRunner23TryInitializeAndRunLoopEN2v85LocalINS1_5ValueEEE +} +{ + param_nodejs_run_exit_handlers + Memcheck:Param + epoll_ctl(event) + fun:epoll_ctl + fun:uv__io_poll + fun:uv_run + fun:_ZN4node7tracing5AgentD1Ev + fun:_ZN4node5._215D1Ev + fun:__run_exit_handlers +} +{ + leak_nodejs_crypto_entropy_source + Memcheck:Leak + ... 
+ fun:_ZN4node6crypto13EntropySourceEPhm +} +{ + leak_nodejs_debug_options + Memcheck:Leak + ... + fun:_ZN4node9inspector5Agent5StartERKSsSt10shared_ptrINS_12DebugOptionsEEb +} +{ + leak_nodejs_debug_host_port + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node9inspector5Agent5StartERKSsRKNS_12DebugOptionsESt10shared_ptrINS_8HostPortEEb +} +{ + leak_nodejs_start + Memcheck:Leak + match-leak-kinds: definite + fun:_Znwm + fun:_ZN4node5StartEiPPc +} +{ + leak_nodejs_start_background_task_runner + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN4node20BackgroundTaskRunnerC1Ei +} +{ + leak_napi_module_register Memcheck:Leak match-leak-kinds: definite ... - fun:_ZN3Nan3impL23FunctionCallbackWrapperERKN2v820FunctionCallbackInfoINS1_5ValueEEE + fun:napi_module_register + fun:call_init.part.0 + fun:call_init + fun:_dl_init } { leak_v8_FunctionCallbackInfo @@ -365,3 +905,144 @@ ... fun:_ZN2v88internal8Malloced3NewEm } +{ + leak_v8_inspector10toString + Memcheck:Leak + match-leak-kinds: possible + ... + fun:_ZN12v8_inspector10toString16ERKNS_10StringViewE +} +{ + cond_v8_Builtins_InterpreterEntryTrampoline + Memcheck:Cond + ... + fun:Builtins_InterpreterEntryTrampoline +} +{ + cond_v8_ZN2v88internal18ArrayBufferSweeper9SweepFullEv + Memcheck:Cond + ... + fun:_ZN2v88internal18ArrayBufferSweeper9SweepFullEv +} +{ + cond_v8_ZN4node11Environment27RunAndClearNativeImmediatesEb + Memcheck:Cond + ... + fun:_ZN4node11Environment27RunAndClearNativeImmediatesEb +} +{ + cond_v8_ZN2v88internal18ArrayBufferSweeper10ReleaseAllEv + Memcheck:Cond + ... + fun:_ZN2v88internal18ArrayBufferSweeper10ReleaseAllEv +} +{ + cond_v8_ZN2v88internal8compiler12PipelineImpl13OptimizeGraphEPNS1_7LinkageE + Memcheck:Cond + ... + fun:_ZN2v88internal8compiler12PipelineImpl13OptimizeGraphEPNS1_7LinkageE +} +{ + cond_v8_ZN2v88internal4Heap20HasLowAllocationRateEv + Memcheck:Cond + ... 
+ fun:_ZN2v88internal4Heap20HasLowAllocationRateEv +} +{ + cond_v8_ZN2v88internal4Heap15RecomputeLimitsENS0_16GarbageCollectorENS_4base9TimeTicksE + Memcheck:Cond + ... + fun:_ZN2v88internal4Heap15RecomputeLimitsENS0_16GarbageCollectorENS_4base9TimeTicksE +} +{ + cond_node_Builtins_JSEntry + Memcheck:Cond + ... + fun:Builtins_JSEntry + ... + fun:uv__poll_io_uring +} +{ + cond_node_Builtins_TestEqualStrictHandler + Memcheck:Cond + fun:Builtins_TestEqualStrictHandler + ... + fun:uv__poll_io_uring +} +{ + cond_node_Builtins_TestGreaterThanHandler + Memcheck:Cond + fun:Builtins_TestGreaterThanHandler + ... + fun:uv__poll_io_uring +} +{ + cond_node_AfterStat + Memcheck:Cond + ... + fun:_ZN4node2fs9AfterStatEP7uv_fs_s + ... + fun:uv__poll_io_uring +} +{ + cond_node_AfterMkdirp + Memcheck:Cond + fun:_ZN4node2fs11AfterMkdirpEP7uv_fs_s + fun:_ZN4node24MakeLibuvRequestCallbackI7uv_fs_sPFvPS1_EE7WrapperES2_ + fun:_ZZZN4node2fs11MKDirpAsyncEP9uv_loop_sP7uv_fs_sPKciPFvS4_EENKUlS4_E_clES4_ENUlS4_E_4_FUNES4_ + fun:uv__poll_io_uring +} +{ + cond_v8_ArrayBufferSweeper_Finalize + Memcheck:Cond + fun:_ZN2v88internal18ArrayBufferSweeper8FinalizeEv +} +{ + cond_v8_AdjustAmountOfExternalAllocatedMemory + Memcheck:Cond + fun:_ZN2v87Isolate37AdjustAmountOfExternalAllocatedMemoryEl +} +{ + cond_v8_IncrementalMarkingLimitReached + Memcheck:Cond + fun:_ZN2v88internal4Heap30IncrementalMarkingLimitReachedEv +} +{ + cond_v8_ShouldExpandOldGenerationOnSlowAllocation + Memcheck:Cond + fun:_ZN2v88internal4Heap41ShouldExpandOldGenerationOnSlowAllocationEPNS0_9LocalHeapENS0_16AllocationOriginE +} +{ + cond_v8_ArrayBufferSweeper_SweepingJob_SweepListFull + Memcheck:Cond + fun:_ZN2v88internal18ArrayBufferSweeper11SweepingJob13SweepListFullEPNS0_15ArrayBufferListE +} +{ + cond_v8_ArrayBufferSweeper_SweepingJob_SweepYoung + Memcheck:Cond + fun:_ZN2v88internal18ArrayBufferSweeper11SweepingJob10SweepYoungEv +} +{ + cond_v8_StartIncrementalMarkingIfAllocationLimitIsReachedBackground + Memcheck:Cond + 
fun:_ZN2v88internal4Heap59StartIncrementalMarkingIfAllocationLimitIsReachedBackgroundEv +} +{ + addr_v8_ZN2v88internal12_GLOBAL__N_119HandleApiCallHelperILb0EEENS0 + Memcheck:Addr8 + fun:strncmp + ... + fun:_ZZN4node7binding6DLOpenERKN2v820FunctionCallbackInfoINS1_5ValueEEEENKUlPNS0_4DLibEE_clES8_ + fun:_ZN4node7binding6DLOpenERKN2v820FunctionCallbackInfoINS1_5ValueEEE + fun:_ZN2v88internal12_GLOBAL__N_119HandleApiCallHelperILb0EEENS0_11MaybeHandleINS0_6ObjectEEEPNS0_7IsolateENS0_6HandleINS0_10HeapObjectEEESA_NS8_INS0_20FunctionTemplateInfoEEENS8_IS4_EENS0_16BuiltinArgumentsE +} +{ + addr_node_binding_dlopen_strncmp + Memcheck:Addr8 + fun:strncmp + fun:is_dst + ... + fun:dlopen_implementation + ... + fun:_ZNSt17_Function_handlerIFbPN4node7binding4DLibEEZNS1_6DLOpenERKN2v820FunctionCallbackInfoINS5_5ValueEEEEUlS3_E_E9_M_invokeERKSt9_Any_dataOS3_ +} diff --git a/test/saliency/README.md b/test/saliency/README.md deleted file mode 100644 index 580d7f244..000000000 --- a/test/saliency/README.md +++ /dev/null @@ -1,16 +0,0 @@ -# Crop strategy accuracy - -1. Download the [MSRA Salient Object Database](http://research.microsoft.com/en-us/um/people/jiansun/SalientObject/salient_object.htm) (101MB). -2. Extract each image and its median human-labelled salient region. -3. Generate a test report of percentage deviance of top and left edges for each crop strategy, plus a naive centre gravity crop as "control". 
- -```sh -git clone https://github.com/lovell/sharp.git -cd sharp/test/saliency -./download.sh -node report.js -python -m SimpleHTTPServer -``` - -The test report will then be available at -http://localhost:8000/report.html diff --git a/test/saliency/download.sh b/test/saliency/download.sh deleted file mode 100755 index 747844725..000000000 --- a/test/saliency/download.sh +++ /dev/null @@ -1,25 +0,0 @@ -#!/bin/sh - -# Fetch and parse the MSRA Salient Object Database 'Image set B' -# http://research.microsoft.com/en-us/um/people/jiansun/salientobject/salient_object.htm - -if [ ! -d Image ]; then - if [ ! -f ImageB.zip ]; then - echo "Downloading 5000 images (101MB)" - curl -O http://research.microsoft.com/en-us/um/people/jiansun/salientobject/ImageSetB/ImageB.zip - fi - unzip ImageB.zip -fi - -if [ ! -d UserData ]; then - if [ ! -f UserDataB.zip ]; then - echo "Downloading human-labelled regions" - curl -O http://research.microsoft.com/en-us/um/people/jiansun/salientobject/ImageSetB/UserDataB.zip - fi - unzip UserDataB.zip -fi - -if [ ! 
-f userData.json ]; then - echo "Processing human-labelled regions" - node userData.js -fi diff --git a/test/saliency/humanae/download.js b/test/saliency/humanae/download.js deleted file mode 100644 index 7a43fd421..000000000 --- a/test/saliency/humanae/download.js +++ /dev/null @@ -1,37 +0,0 @@ -'use strict'; - -const fs = require('fs'); -const request = require('request'); -const tumblr = require('tumblr.js'); - -const client = tumblr.createClient({ - consumer_key: '***', - consumer_secret: '***' -}); - -const fetchImages = function (offset) { - console.log(`Fetching offset ${offset}`); - client.posts('humanae', { - type: 'photo', - offset: offset - }, function (err, response) { - if (err) throw err; - if (response.posts.length > 0) { - response.posts.forEach((post) => { - const url = post.photos[0].alt_sizes - .filter((image) => image.width === 100) - .map((image) => image.url)[0]; - const filename = `./images/${post.id}.jpg`; - try { - fs.statSync(filename); - } catch (err) { - if (err.code === 'ENOENT') { - request(url).pipe(fs.createWriteStream(filename)); - } - } - }); - fetchImages(offset + 20); - } - }); -}; -fetchImages(0); diff --git a/test/saliency/humanae/package.json b/test/saliency/humanae/package.json deleted file mode 100644 index f436f5a28..000000000 --- a/test/saliency/humanae/package.json +++ /dev/null @@ -1,9 +0,0 @@ -{ - "name": "sharp-crop-strategy-attention-model-humanae", - "version": "0.0.1", - "private": true, - "dependencies": { - "request": "^2.75.0", - "tumblr.js": "^1.1.1" - } -} diff --git a/test/saliency/humanae/tone.js b/test/saliency/humanae/tone.js deleted file mode 100644 index 65f37d1ab..000000000 --- a/test/saliency/humanae/tone.js +++ /dev/null @@ -1,33 +0,0 @@ -'use strict'; - -const fs = require('fs'); -const childProcess = require('child_process'); - -const a = []; -const b = []; - -fs.readdirSync('./images') - .filter((file) => file.endsWith('.jpg')) - .forEach((file) => { - // Extract one pixel, avoiding first DCT block, 
and return value of A and B channels - const command = `convert ./images/${file}[1x1+8+8] -colorspace lab -format "%[fx:u.g] %[fx:u.b]" info:`; - const result = childProcess.execSync(command, { encoding: 'utf8' }); - const ab = result.split(' '); - a.push(ab[0]); - b.push(ab[1]); - }); - -a.sort((v1, v2) => v1 - v2); -b.sort((v1, v2) => v1 - v2); - -// Convert from 0..1 to -128..128 -const convert = function (v) { - return Math.round(256 * (v - 0.5)); -}; - -const threshold = Math.round(a.length / 100); -console.log(`Trimming lowest/highest ${threshold} for 98th percentile`); - -// Ignore ~2% outliers -console.log(`a ${convert(a[threshold])} - ${convert(a[a.length - threshold])}`); -console.log(`b ${convert(b[threshold])} - ${convert(b[b.length - threshold])}`); diff --git a/test/saliency/report.html b/test/saliency/report.html deleted file mode 100644 index 792d27374..000000000 --- a/test/saliency/report.html +++ /dev/null @@ -1,25 +0,0 @@ - - - - - - - -
- - - diff --git a/test/saliency/report.js b/test/saliency/report.js deleted file mode 100644 index 42c64ad3c..000000000 --- a/test/saliency/report.js +++ /dev/null @@ -1,68 +0,0 @@ -'use strict'; - -const os = require('os'); -const fs = require('fs'); -const path = require('path'); -const async = require('async'); -const sharp = require('../../'); - -const crops = { - centre: sharp.gravity.centre, - entropy: sharp.strategy.entropy, - attention: sharp.strategy.attention -}; -const concurrency = os.cpus().length; - -const scores = {}; - -const incrementScore = function (accuracy, crop) { - if (typeof scores[accuracy] === 'undefined') { - scores[accuracy] = {}; - } - if (typeof scores[accuracy][crop] === 'undefined') { - scores[accuracy][crop] = 0; - } - scores[accuracy][crop]++; -}; - -const userData = require('./userData.json'); -const files = Object.keys(userData); - -async.eachLimit(files, concurrency, function (file, done) { - const filename = path.join(__dirname, 'Image', file); - const salientWidth = userData[file].right - userData[file].left; - const salientHeight = userData[file].bottom - userData[file].top; - sharp(filename).metadata(function (err, metadata) { - if (err) console.log(err); - async.each(Object.keys(crops), function (crop, done) { - async.parallel([ - // Left edge accuracy - function (done) { - sharp(filename).resize(salientWidth, metadata.height).crop(crops[crop]).toBuffer(function (err, data, info) { - const accuracy = Math.round(Math.abs(userData[file].left - info.cropCalcLeft) / (metadata.width - salientWidth) * 100); - incrementScore(accuracy, crop); - done(err); - }); - }, - // Top edge accuracy - function (done) { - sharp(filename).resize(metadata.width, salientHeight).crop(crops[crop]).toBuffer(function (err, data, info) { - const accuracy = Math.round(Math.abs(userData[file].top - info.cropCalcTop) / (metadata.height - salientHeight) * 100); - incrementScore(accuracy, crop); - done(err); - }); - } - ], done); - }, done); - }); -}, 
function () { - const report = []; - Object.keys(scores).forEach(function (accuracy) { - report.push( - Object.assign({ - accuracy: parseInt(accuracy, 10) - }, scores[accuracy]) - ); - }); - fs.writeFileSync('report.json', JSON.stringify(report, null, 2)); -}); diff --git a/test/saliency/userData.js b/test/saliency/userData.js deleted file mode 100644 index c5358d30a..000000000 --- a/test/saliency/userData.js +++ /dev/null @@ -1,71 +0,0 @@ -'use strict'; - -const fs = require('fs'); -const path = require('path'); - -const userDataDir = 'UserData'; - -const images = {}; - -const median = function (values) { - values.sort(function (a, b) { - return a - b; - }); - const half = Math.floor(values.length / 2); - if (values.length % 2) { - return values[half]; - } else { - return Math.floor((values[half - 1] + values[half]) / 2); - } -}; - -// List of files -fs.readdirSync(userDataDir).forEach(function (file) { - // Contents of file - const lines = fs.readFileSync(path.join(userDataDir, file), {encoding: 'utf-8'}).split(/\r\n/); - // First line = number of entries - const entries = parseInt(lines[0], 10); - // Verify number of entries - if (entries !== 500) { - throw new Error('Expecting 500 images in ' + file + ', found ' + entries); - } - // Keep track of which line we're on - let linePos = 2; - for (let i = 0; i < entries; i++) { - // Get data for current image - const filename = lines[linePos].replace(/\\/, path.sep); - linePos = linePos + 2; - const regions = lines[linePos].split('; '); - linePos = linePos + 2; - // Parse human-labelled regions for min/max coords - const lefts = []; - const tops = []; - const rights = []; - const bottoms = []; - regions.forEach(function (region) { - if (region.indexOf(' ') !== -1) { - const coords = region.split(' '); - lefts.push(parseInt(coords[0], 10)); - tops.push(parseInt(coords[1], 10)); - rights.push(parseInt(coords[2], 10)); - bottoms.push(parseInt(coords[3], 10)); - } - }); - // Add image - images[filename] = { - left: 
median(lefts), - top: median(tops), - right: median(rights), - bottom: median(bottoms) - }; - } -}); - -// Verify number of images found -const imageCount = Object.keys(images).length; -if (imageCount === 5000) { - // Write output - fs.writeFileSync('userData.json', JSON.stringify(images, null, 2)); -} else { - throw new Error('Expecting 5000 images, found ' + imageCount); -} diff --git a/test/types/sharp.test-d.ts b/test/types/sharp.test-d.ts new file mode 100644 index 000000000..9364c08c8 --- /dev/null +++ b/test/types/sharp.test-d.ts @@ -0,0 +1,773 @@ +// biome-ignore-all lint/correctness/noUnusedFunctionParameters: types only test file +// biome-ignore-all lint/correctness/noUnusedVariables: types only test file + +import sharp = require('../../'); + +import { createReadStream, createWriteStream } from 'node:fs'; + +const input: Buffer = Buffer.alloc(0); +const readableStream: NodeJS.ReadableStream = createReadStream(input); +const writableStream: NodeJS.WritableStream = createWriteStream(input); + +sharp(input) + .extractChannel('green') + .toFile('input_green.jpg', (err, info) => { + // info.channels === 1 + // input_green.jpg contains the green channel of the input image + }); + +sharp('3-channel-rgb-input.png') + .bandbool(sharp.bool.and) + .toFile('1-channel-output.png', (err, info) => { + // The output will be a single channel image where each pixel `P = R & G & B`. + // If `I(1,1) = [247, 170, 14] = [0b11110111, 0b10101010, 0b00001111]` + // then `O(1,1) = 0b11110111 & 0b10101010 & 0b00001111 = 0b00000010 = 2`. 
+ }); + +sharp('input.png') + .rotate(180) + .resize(300) + .flatten({ background: '#ff6600' }) + .composite([{ input: 'overlay.png', gravity: sharp.gravity.southeast, animated: false, failOn: 'warning' }]) + .sharpen() + .withMetadata() + .withMetadata({ + density: 96, + orientation: 8, + icc: 'some/path', + exif: { IFD0: { Copyright: 'Wernham Hogg' } }, + }) + .webp({ + quality: 90, + }) + .toBuffer() + .then((outputBuffer: Buffer) => { + // outputBuffer contains upside down, 300px wide, alpha channel flattened + // onto orange background, composited with overlay.png with SE gravity, + // sharpened, with metadata, 90% quality WebP image data. Phew! + }); + +sharp('input.png') + .keepMetadata() + .toFile('output.png', (err, info) => { + // output.png is an image containing input.png along with all metadata(EXIF, ICC, XMP, IPTC) from input.png + }) + +sharp('input.jpg') + .resize(300, 200) + .toFile('output.jpg', (err: Error) => { + // output.jpg is a 300 pixels wide and 200 pixels high image + // containing a scaled and cropped version of input.jpg + }); + +sharp('input.jpg').resize({ width: 300 }).blur(false).blur(true).toFile('output.jpg'); + +sharp().blur(); +sharp().blur(1); +sharp().blur({ sigma: 1 }); +sharp().blur({ sigma: 1, precision: 'approximate' }); +sharp().blur({ sigma: 1, minAmplitude: 0.8 }); + +sharp({ + create: { + width: 300, + height: 200, + channels: 4, + background: { r: 255, g: 0, b: 0, alpha: 128 }, + }, +}) + .png() + .toBuffer(); + +let transformer = sharp() + .resize(300) + .on('info', (info: sharp.OutputInfo) => { + console.log(`Image height is ${info.height}`); + }); +readableStream.pipe(transformer).pipe(writableStream); + +console.log(sharp.format); +console.log(sharp.versions); + +sharp.queue.on('change', (queueLength: number) => { + console.log(`Queue contains ${queueLength} task(s)`); +}); + +let pipeline: sharp.Sharp = sharp().rotate(); +pipeline.clone().resize(800, 600).pipe(writableStream); +pipeline.clone().extract({ left: 20, 
top: 20, width: 100, height: 100 }).pipe(writableStream); +readableStream.pipe(pipeline); +// firstWritableStream receives auto-rotated, resized readableStream +// secondWritableStream receives auto-rotated, extracted region of readableStream + +const image: sharp.Sharp = sharp(input); +image + .metadata() + .then((metadata: sharp.Metadata) => { + if (metadata.width) { + return image + .resize(Math.round(metadata.width / 2)) + .webp() + .toBuffer(); + } + }) + .then((data) => { + // data contains a WebP image half the width and height of the original JPEG + }); + +pipeline = sharp() + .rotate() + .resize(undefined, 200) + .toBuffer((err: Error, outputBuffer: Buffer, info: sharp.OutputInfo) => { + // outputBuffer contains 200px high JPEG image data, + // auto-rotated using EXIF Orientation tag + // info.width and info.height contain the dimensions of the resized image + }); +readableStream.pipe(pipeline); + +sharp(input) + .extract({ left: 0, top: 0, width: 100, height: 100 }) + .toFile('output', (err: Error) => { + // Extract a region of the input image, saving in the same format. 
+ }); + +sharp(input) + .extract({ left: 0, top: 0, width: 100, height: 100 }) + .resize(200, 200) + .extract({ left: 0, top: 0, width: 100, height: 100 }) + .toFile('output', (err: Error) => { + // Extract a region, resize, then extract from the resized image + }); + +// Resize to 140 pixels wide, then add 10 transparent pixels +// to the top, left and right edges and 20 to the bottom edge +sharp(input) + .resize(140, null, { background: { r: 0, g: 0, b: 0, alpha: 0 } }) + .extend({ top: 10, bottom: 20, left: 10, right: 10 }); + +sharp(input) + .convolve({ + width: 3, + height: 3, + kernel: [-1, 0, 1, -2, 0, 2, -1, 0, 1], + }) + .raw() + .toBuffer((err: Error, data: Buffer, info: sharp.OutputInfo) => { + // data contains the raw pixel data representing the convolution + // of the input image with the horizontal Sobel operator + }); + +sharp('input.tiff') + .png() + .tile({ + size: 512, + }) + .toFile('output.dz', (err: Error, info: sharp.OutputInfo) => { + // output.dzi is the Deep Zoom XML definition + // output_files contains 512x512 tiles grouped by zoom level + }); + +sharp('input.tiff') + .png() + .tile({ + size: 512, + center: true, + layout: 'iiif3', + id: 'https://my.image.host/iiif', + }) + .toFile('output'); + +sharp(input) + .resize(200, 300, { + fit: 'contain', + position: 'north', + kernel: sharp.kernel.lanczos2, + background: 'white', + }) + .toFile('output.tiff') + .then(() => { + // output.tiff is a 200 pixels wide and 300 pixels high image + // containing a lanczos2/nohalo scaled version, embedded on a white canvas, + // of the image data in inputBuffer + }); + +sharp(input).resize({ kernel: 'mks2013' }); + +transformer = sharp() + .resize(200, 200, { + fit: 'cover', + position: sharp.strategy.entropy, + }) + .on('error', (err: Error) => { + console.log(err); + }); +// Read image data from readableStream +// Write 200px square auto-cropped image data to writableStream +readableStream.pipe(transformer).pipe(writableStream); + +sharp('input.gif') + 
.resize(200, 300, { + fit: 'contain', + position: 'north', + background: { r: 0, g: 0, b: 0, alpha: 0 }, + }) + .toFormat(sharp.format.webp) + .toBuffer((err: Error, outputBuffer: Buffer) => { + if (err) { + throw err; + } + // outputBuffer contains WebP image data of a 200 pixels wide and 300 pixels high + // containing a scaled version, embedded on a transparent canvas, of input.gif + }); + +sharp(input) + .resize(200, 200, { fit: 'inside' }) + .toFormat('jpeg') + .toBuffer() + .then((outputBuffer: Buffer) => { + // outputBuffer contains JPEG image data no wider than 200 pixels and no higher + // than 200 pixels regardless of the inputBuffer image dimensions + }); + +sharp(input) + .resize(100, 100) + .toFormat('jpg') + .toBuffer({ resolveWithObject: false }) + .then((outputBuffer: Buffer) => { + // Resolves with a Buffer object when resolveWithObject is false + }); + +sharp(input) + .resize(100, 100) + .toBuffer({ resolveWithObject: true }) + .then((object: { data: Buffer; info: sharp.OutputInfo }) => { + // Resolve with an object containing data Buffer and an OutputInfo object + // when resolveWithObject is true + }); + +sharp(input) + .resize(640, 480, { withoutEnlargement: true }) + .toFormat('jpeg') + .toBuffer() + .then((outputBuffer: Buffer) => { + // outputBuffer contains JPEG image data no larger than the input + }); + +sharp(input) + .resize(640, 480, { withoutReduction: true }) + .toFormat('jpeg') + .toBuffer() + .then((outputBuffer: Buffer) => { + // outputBuffer contains JPEG image data no smaller than the input + }); + +// Output to tif +sharp(input) + .resize(100, 100) + .toFormat('tif') + .toFormat('tiff') + .toFormat(sharp.format.tif) + .toFormat(sharp.format.tiff) + .toBuffer(); + +const stats = sharp.cache(); + +sharp.cache({ items: 200 }); +sharp.cache({ files: 0 }); +sharp.cache(false); + +const threads = sharp.concurrency(); // 4 +sharp.concurrency(2); // 2 +sharp.concurrency(0); // 4 + +const counters = sharp.counters(); // { queue: 2, 
process: 4 } + +let simd: boolean = sharp.simd(); +// simd is `true` if SIMD is currently enabled + +simd = sharp.simd(true); +// attempts to enable the use of SIMD, returning true if available + +const vipsVersion: string = sharp.versions.vips; + +if (sharp.versions.cairo) { + const cairoVersion: string = sharp.versions.cairo; +} + +sharp('input.gif') + .linear(1) + .linear(1, 0) + .linear(null, 0) + .linear([0.25, 0.5, 0.75], [150, 100, 50]) + + .recomb([ + [0.3588, 0.7044, 0.1368], + [0.299, 0.587, 0.114], + [0.2392, 0.4696, 0.0912], + ]) + + .recomb([ + [1,0,0,0], + [0,1,0,0], + [0,0,1,0], + [0,0,0,1], + ]) + + .modulate({ brightness: 2 }) + .modulate({ hue: 180 }) + .modulate({ lightness: 10 }) + .modulate({ brightness: 0.5, saturation: 0.5, hue: 90 }); + +// From https://sharp.pixelplumbing.com/api-output#examples-9 +// Extract raw RGB pixel data from JPEG input +sharp('input.jpg') + .raw({ depth: 'ushort' }) + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + console.log(data); + console.log(info); + }); + +sharp(input).jpeg().jpeg({}).jpeg({ + progressive: false, + chromaSubsampling: '4:4:4', + trellisQuantisation: false, + overshootDeringing: false, + optimiseScans: false, + optimizeScans: false, + optimiseCoding: false, + optimizeCoding: false, + quantisationTable: 10, + quantizationTable: 10, + mozjpeg: false, + quality: 10, + force: false, +}); + +sharp(input).png().png({}).png({ + progressive: false, + compressionLevel: 10, + adaptiveFiltering: false, + force: false, + quality: 10, + palette: false, + colours: 10, + colors: 10, + dither: 10, +}); + +sharp(input) + .avif() + .avif({}) + .avif({ quality: 50, lossless: false, effort: 5, chromaSubsampling: '4:2:0' }) + .heif() + .heif({}) + .heif({ quality: 50, compression: 'hevc', lossless: false, effort: 5, chromaSubsampling: '4:2:0' }) + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + console.log(data); + console.log(info); + }); + +sharp(input) + .gif() + 
.gif({}) + .gif({ loop: 0, delay: [], force: true }) + .gif({ delay: 30 }) + .gif({ reuse: true }) + .gif({ reuse: false }) + .gif({ progressive: true }) + .gif({ progressive: false }) + .gif({ keepDuplicateFrames: true }) + .gif({ keepDuplicateFrames: false }) + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + console.log(data); + console.log(info); + }); + +sharp(input) + .tiff({ compression: 'packbits' }) + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + console.log(data); + console.log(info); + }); + +sharp('input.jpg') + .stats() + .then(stats => { + const { + sharpness, + dominant: { r, g, b }, + } = stats; + console.log(sharpness); + console.log(`${r}, ${g}, ${b}`); + }); + +// From https://sharp.pixelplumbing.com/api-output#examples-9 +// Extract alpha channel as raw pixel data from PNG input +sharp('input.png').ensureAlpha().ensureAlpha(0).extractChannel(3).toColourspace('b-w').raw().toBuffer(); + +// From https://sharp.pixelplumbing.com/api-constructor#examples-4 +// Convert an animated GIF to an animated WebP +sharp('in.gif', { animated: true }).toFile('out.webp'); + +// From https://github.com/lovell/sharp/issues/2701 +// Type support for limitInputPixels +sharp({ + create: { + background: 'red', + channels: 4, + height: 25000, + width: 25000, + pageHeight: 1000, + }, + limitInputPixels: false, +}) + .toFormat('png') + .toBuffer() + .then(largeImage => sharp(input).composite([{ input: largeImage, limitInputPixels: false }])); + +// Taken from API documentation at +// https://sharp.pixelplumbing.com/api-operation#clahe +// introduced +sharp('input.jpg').clahe({ width: 10, height: 10 }).toFile('output.jpg'); + +sharp('input.jpg').clahe({ width: 10, height: 10, maxSlope: 5 }).toFile('outfile.jpg'); + +// Support `unlimited` input option +sharp('input.png', { unlimited: true }).resize(320, 240).toFile('outfile.png'); + +// Support creating with noise +sharp({ + create: { + background: 'red', + 
channels: 4, + height: 100, + width: 100, + noise: { + type: 'gaussian', + mean: 128, + sigma: 30, + }, + }, +}) + .png() + .toFile('output.png'); + +sharp(new Uint8Array(input.buffer)).toFile('output.jpg'); + +// Support for negate options +sharp('input.png').negate({ alpha: false }).toFile('output.png'); + +// From https://github.com/lovell/sharp/pull/2704 +// Type support for pipelineColourspace +sharp(input) + .pipelineColourspace('rgb16') + .resize(320, 240) + .gamma() + .toColourspace('srgb') // this is the default, but included here for clarity + .toBuffer(); + +// From https://github.com/lovell/sharp/pull/1439 +// Second parameter to gamma operation for different output gamma +sharp(input) + .resize(129, 111) + .gamma(2.2, 3.0) + .toBuffer(err => { + if (err) throw err; + }); + +// Support for raw depth specification +sharp('16bpc.png') + .toColourspace('rgb16') + .raw({ depth: 'ushort' }) + .toBuffer((error, data, { width, height, channels, size }) => { + console.log((size / width / height / channels) * 8); + console.log(new Uint16Array(data.buffer)); + }); + +// Output channels are constrained from 1-4, can be used as raw input +sharp(input) + .toBuffer({ resolveWithObject: true }) + .then(result => { + const newImg = sharp(result.data, { + raw: { + channels: result.info.channels, + width: result.info.width, + height: result.info.height, + }, + }); + + return newImg.toBuffer(); + }); + +// Support for specifying a timeout +sharp('someImage.png').timeout({ seconds: 30 }).resize(300, 300).toBuffer(); + +// Support for `effort` in different formats +sharp('input.tiff').png({ effort: 9 }).toFile('out.png'); +sharp('input.tiff').webp({ effort: 9 }).toFile('out.webp'); +sharp('input.tiff').avif({ effort: 9 }).toFile('out.avif'); +sharp('input.tiff').heif({ effort: 9 }).toFile('out.heif'); +sharp('input.tiff').gif({ effort: 9 }).toFile('out.gif'); + +// Support for `colors`/`colours` for gif output +sharp('input.gif').gif({ colors: 16 
}).toFile('out.gif'); +sharp('input.gif').gif({ colours: 16 }).toFile('out.gif'); + +// Support for `dither` for gif/png output +sharp('input.gif').gif({ dither: 0.5 }).toFile('out.gif'); +sharp('input.gif').png({ dither: 0.5 }).toFile('out.png'); + +// Support for `interFrameMaxError` for gif output +sharp('input.gif').gif({ interFrameMaxError: 0 }).toFile('out.gif'); + +// Support for `interPaletteMaxError` for gif output +sharp('input.gif').gif({ interPaletteMaxError: 0 }).toFile('out.gif'); + +// Support for `resolutionUnit` for tiff output +sharp('input.tiff').tiff({ resolutionUnit: 'cm' }).toFile('out.tiff'); + +// Support for `jp2` output with different options +sharp('input.tiff').jp2().toFile('out.jp2'); +sharp('input.tiff').jp2({ quality: 50 }).toFile('out.jp2'); +sharp('input.tiff').jp2({ lossless: true }).toFile('out.jp2'); +sharp('input.tiff').jp2({ tileWidth: 128, tileHeight: 128 }).toFile('out.jp2'); +sharp('input.tiff').jp2({ chromaSubsampling: '4:2:0' }).toFile('out.jp2'); + +// Support for `jxl` output with different options +sharp('input.tiff').jxl().toFile('out.jxl'); +sharp('input.tiff').jxl({ distance: 15.0 }).toFile('out.jxl'); +sharp('input.tiff').jxl({ quality: 50 }).toFile('out.jxl'); +sharp('input.tiff').jxl({ decodingTier: 4 }).toFile('out.jxl'); +sharp('input.tiff').jxl({ lossless: true }).toFile('out.jxl'); +sharp('input.tiff').jxl({ effort: 7 }).toFile('out.jxl'); + +// Support `minSize` and `mixed` webp options +sharp('input.tiff').webp({ minSize: true, mixed: true }).toFile('out.gif'); + +// 'failOn' input param +sharp('input.tiff', { failOn: 'none' }); +sharp('input.tiff', { failOn: 'truncated' }); +sharp('input.tiff', { failOn: 'error' }); +sharp('input.tiff', { failOn: 'warning' }); + +// Sharpen operation taking an object instead of three params +sharp('input.tiff').sharpen().toBuffer(); +sharp('input.tiff').sharpen({ sigma: 2 }).toBuffer(); +sharp('input.tiff') + .sharpen({ + sigma: 2, + m1: 0, + m2: 3, + x1: 3, + y2: 15, + y3: 
15, + }) + .toBuffer(); + +// Affine operator + interpolator hash +sharp().affine( + [ + [1, 0.3], + [0.1, 0.7], + ], + { + background: 'white', + interpolator: sharp.interpolators.nohalo, + }, +); + +sharp().affine([1, 1, 1, 1], { + background: 'white', + idx: 0, + idy: 0, + odx: 0, + ody: 0, +}); + +const bicubic: string = sharp.interpolators.bicubic; +const bilinear: string = sharp.interpolators.bilinear; +const locallyBoundedBicubic: string = sharp.interpolators.locallyBoundedBicubic; +const nearest: string = sharp.interpolators.nearest; +const nohalo: string = sharp.interpolators.nohalo; +const vertexSplitQuadraticBasisSpline: string = sharp.interpolators.vertexSplitQuadraticBasisSpline; + +// Trimming +sharp(input).trim({ background: '#000' }).toBuffer(); +sharp(input).trim({ threshold: 10, lineArt: true }).toBuffer(); +sharp(input).trim({ background: '#bf1942', threshold: 30 }).toBuffer(); + +// Text input +sharp({ + text: { + text: 'Hello world', + align: 'centre', + dpi: 72, + font: 'Arial', + fontfile: 'path/to/arial.ttf', + height: 500, + width: 500, + rgba: true, + justify: true, + spacing: 10, + wrap: 'word-char', + }, +}) + .png() + .toBuffer({ resolveWithObject: true }) + .then(out => { + console.log(out.info.textAutofitDpi); + }); + +// Text composite +sharp('input.png').composite([ + { + input: { + text: { + text: 'Okay then', + font: 'Comic Sans', + }, + }, + }, +]); + +// From https://github.com/lovell/sharp/pull/1835 +sharp('input.png').composite([ + { + input: { + text: { + text: 'Okay then', + font: 'Comic Sans', + }, + }, + blend: 'color-burn', + top: 0, + left: 0, + premultiplied: true, + }, +]); + +// https://github.com/lovell/sharp/pull/402 +(['fs', 'zip'] as const).forEach(container => { + sharp().tile({ container }); +}); + +// From https://github.com/lovell/sharp/issues/2238 +sharp('input.png').tile({ + basename: 'output.dz.tiles', +}); + +// 
https://github.com/lovell/sharp/issues/3669 +sharp(input).composite([ + { + raw: { + width: 1, + height: 1, + channels: 1, + premultiplied: false, + }, + sequentialRead: false, + unlimited: true, + } +]); + +// Support for webp preset in types +// https://github.com/lovell/sharp/issues/3747 +sharp('input.tiff').webp({ preset: 'photo' }).toFile('out.webp'); +sharp('input.tiff').webp({ preset: 'picture' }).toFile('out.webp'); +sharp('input.tiff').webp({ preset: 'icon' }).toFile('out.webp'); +sharp('input.tiff').webp({ preset: 'drawing' }).toFile('out.webp'); +sharp('input.tiff').webp({ preset: 'text' }).toFile('out.webp'); +sharp('input.tiff').webp({ preset: 'default' }).toFile('out.webp'); + +sharp(input) + .keepExif() + .withExif({ + IFD0: { + k1: 'v1' + } + }) + .withExifMerge({ + IFD1: { + k2: 'v2' + } + }) + .keepXmp() + .withXmp('test') + .keepIccProfile() + .withIccProfile('filename') + .withIccProfile('filename', { attach: false }); + +// Added missing types for OverlayOptions +// https://github.com/lovell/sharp/pull/4048 +sharp(input).composite([ + { + input: 'image.gif', + animated: true, + limitInputPixels: 536805378, + density: 144, + failOn: "warning", + autoOrient: true + } +]) +sharp(input).composite([ + { + input: 'image.png', + animated: false, + limitInputPixels: 178935126, + density: 72, + failOn: "truncated" + } +]) + +// Support format-specific input options +const colour: sharp.Colour = '#fff'; +const color: sharp.Color = '#fff'; +sharp({ pdf: { background: colour } }); +sharp({ pdf: { background: color } }); +sharp({ pdfBackground: colour }); // Deprecated +sharp({ pdfBackground: color }); // Deprecated +sharp({ tiff: { subifd: 3 } }); +sharp({ subifd: 3 }); // Deprecated +sharp({ openSlide: { level: 0 } }); +sharp({ level: 0 }); // Deprecated +sharp({ jp2: { oneshot: true } }); +sharp({ jp2: { oneshot: false } }); +sharp({ svg: { stylesheet: 'test' }}); +sharp({ svg: { highBitdepth: true }}); +sharp({ 
svg: { highBitdepth: false }}); + +// Raw input options +const raw: sharp.Raw = { width: 1, height: 1, channels: 3 }; +sharp({ raw }); +sharp({ raw: { ...raw, premultiplied: true } }); +sharp({ raw: { ...raw, premultiplied: false } }); +sharp({ raw: { ...raw, pageHeight: 1 } }); + +sharp({ autoOrient: true }); +sharp({ autoOrient: false }); +sharp().autoOrient(); + +sharp([input, input]); +sharp([input, input], { + join: { + animated: true + } +}); +sharp([input, input], { + join: { + across: 2, + shim: 5, + background: colour, + halign: 'centre', + valign: 'bottom' + } +}); + +sharp().erode(); +sharp().erode(1); +sharp().dilate(); +sharp().dilate(1); diff --git a/test/types/tsconfig.json b/test/types/tsconfig.json new file mode 100644 index 000000000..d8ec98092 --- /dev/null +++ b/test/types/tsconfig.json @@ -0,0 +1,4 @@ +{ + "module": "commonjs", + "strict": true +} diff --git a/test/unit.mjs b/test/unit.mjs new file mode 100644 index 000000000..0a1aa3c0f --- /dev/null +++ b/test/unit.mjs @@ -0,0 +1,16 @@ +import { readdir } from 'node:fs/promises'; +import { run } from 'node:test'; +import { spec } from 'node:test/reporters'; + +const files = (await readdir('./test/unit')).map((f) => `./test/unit/${f}`); + +run({ + files, + concurrency: true, + timeout: 60000, + coverage: true, + coverageIncludeGlobs: ['lib/*.js'], + branchCoverage: 100, +}) + .compose(new spec()) + .pipe(process.stdout); diff --git a/test/unit/affine.js b/test/unit/affine.js new file mode 100644 index 000000000..43ff8ed92 --- /dev/null +++ b/test/unit/affine.js @@ -0,0 +1,188 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Affine transform', () => { + describe('Invalid input', () => { + it('Missing matrix', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine(); + }); + }); + it('Invalid 1d matrix', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine(['123', 123, 123, 123]); + }); + }); + it('Invalid 2d matrix', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([[123, 123], [null, 123]]); + }); + }); + it('Invalid options parameter type', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([[1, 0], [0, 1]], 'invalid options type'); + }); + }); + it('Invalid background color', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([4, 4, 4, 4], { background: 'not a color' }); + }); + }); + it('Invalid idx offset type', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([[4, 4], [4, 4]], { idx: 'invalid idx type' }); + }); + }); + it('Invalid idy offset type', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([4, 4, 4, 4], { idy: 'invalid idy type' }); + }); + }); + it('Invalid odx offset type', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([[4, 4], [4, 4]], { odx: 'invalid odx type' }); + }); + }); + it('Invalid ody offset type', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([[4, 4], [4, 4]], { ody: 'invalid ody type' }); + }); + }); + it('Invalid interpolator', () => { + assert.throws(() => { + sharp(fixtures.inputJpg) + .affine([[4, 4], [4, 4]], { interpolator: 'cubic' }); + }); + }); + }); + it('Applies identity matrix', done => { + const input = fixtures.inputJpg; + sharp(input) + .affine([[1, 0], [0, 1]]) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(input, data, 
done); + }); + }); + it('Applies resize affine matrix', done => { + const input = fixtures.inputJpg; + const inputWidth = 2725; + const inputHeight = 2225; + sharp(input) + .affine([[0.2, 0], [0, 1.5]]) + .toBuffer((err, data, info) => { + if (err) throw err; + fixtures.assertSimilar(input, data, done); + assert.strictEqual(info.width, Math.ceil(inputWidth * 0.2)); + assert.strictEqual(info.height, Math.ceil(inputHeight * 1.5)); + }); + }); + it('Resizes and applies affine transform', done => { + const input = fixtures.inputJpg; + sharp(input) + .resize(500, 500) + .affine([[0.5, 1], [1, 0.5]]) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(data, fixtures.expected('affine-resize-expected.jpg'), done); + }); + }); + it('Extracts and applies affine transform', done => { + sharp(fixtures.inputJpg) + .extract({ left: 300, top: 300, width: 600, height: 600 }) + .affine([0.3, 0, -0.5, 0.3]) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(data, fixtures.expected('affine-extract-expected.jpg'), done); + }); + }); + it('Rotates and applies affine transform', done => { + sharp(fixtures.inputJpg320x240) + .rotate(90) + .affine([[-1.2, 0], [0, -1.2]]) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(data, fixtures.expected('affine-rotate-expected.jpg'), done); + }); + }); + it('Extracts, rotates and applies affine transform', done => { + sharp(fixtures.inputJpg) + .extract({ left: 1000, top: 1000, width: 200, height: 200 }) + .rotate(45, { background: 'blue' }) + .affine([[2, 1], [2, -0.5]], { background: 'red' }) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('affine-extract-rotate-expected.jpg'), data, done); + }); + }); + it('Applies affine transform with background color', done => { + sharp(fixtures.inputJpg320x240) + .rotate(180) + .affine([[-1.5, 1.2], [-1, 1]], { background: 'red' }) + .toBuffer((err, data) => { + if (err) throw err; + 
fixtures.assertSimilar(fixtures.expected('affine-background-expected.jpg'), data, done); + }); + }); + it('Applies affine transform with background color and output offsets', done => { + sharp(fixtures.inputJpg320x240) + .rotate(180) + .affine([[-2, 1.5], [-1, 2]], { background: 'blue', odx: 40, ody: -100 }) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('affine-background-output-offsets-expected.jpg'), data, done); + }); + }); + it('Applies affine transform with background color and all offsets', done => { + sharp(fixtures.inputJpg320x240) + .rotate(180) + .affine([[-1.2, 1.8], [-1, 2]], { background: 'yellow', idx: 10, idy: -40, odx: 10, ody: -50 }) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('affine-background-all-offsets-expected.jpg'), data, done); + }); + }); + + it('Animated image rejects', () => + assert.rejects(() => sharp(fixtures.inputGifAnimated, { animated: true }) + .affine([1, 1, 1, 1]) + .toBuffer(), + /Affine is not supported for multi-page images/ + ) + ); + + describe('Interpolations', () => { + const input = fixtures.inputJpg320x240; + const inputWidth = 320; + const inputHeight = 240; + for (const interp in sharp.interpolators) { + it(`Performs 2x upscale with ${interp} interpolation`, done => { + sharp(input) + .affine([[2, 0], [0, 2]], { interpolator: sharp.interpolators[interp] }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(info.width, Math.ceil(inputWidth * 2)); + assert.strictEqual(info.height, Math.ceil(inputHeight * 2)); + fixtures.assertSimilar(fixtures.expected(`affine-${sharp.interpolators[interp]}-2x-upscale-expected.jpg`), data, done); + }); + }); + } + }); +}); diff --git a/test/unit/alpha.js b/test/unit/alpha.js index 750f51edc..42e178098 100644 --- a/test/unit/alpha.js +++ b/test/unit/alpha.js @@ -1,15 +1,19 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const fixtures = require('../fixtures'); const sharp = require('../../'); -describe('Alpha transparency', function () { - it('Flatten to black', function (done) { +describe('Alpha transparency', () => { + it('Flatten to black', (_t, done) => { sharp(fixtures.inputPngWithTransparency) .flatten() .resize(400, 300) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(400, info.width); assert.strictEqual(300, info.height); @@ -17,12 +21,14 @@ describe('Alpha transparency', function () { }); }); - it('Flatten to RGB orange', function (done) { + it('Flatten to RGB orange', (_t, done) => { sharp(fixtures.inputPngWithTransparency) - .flatten() - .background({r: 255, g: 102, b: 0}) .resize(400, 300) - .toBuffer(function (err, data, info) { + .flatten({ + background: { r: 255, g: 102, b: 0 } + }) + .jpeg({ chromaSubsampling: '4:4:4' }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(400, info.width); assert.strictEqual(300, info.height); @@ -30,12 +36,12 @@ describe('Alpha transparency', function () { }); }); - it('Flatten to CSS/hex orange', function (done) { + it('Flatten to CSS/hex orange', (_t, done) => { sharp(fixtures.inputPngWithTransparency) - .flatten() - .background('#ff6600') .resize(400, 300) - .toBuffer(function (err, data, info) { + .flatten({ background: '#ff6600' }) + .jpeg({ chromaSubsampling: '4:4:4' }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(400, info.width); assert.strictEqual(300, info.height); @@ -43,25 +49,26 @@ describe('Alpha transparency', function () { }); }); - it('Flatten 16-bit PNG with transparency to orange', function (done) { + it('Flatten 16-bit PNG with transparency to orange', (_t, done) => { const output = fixtures.path('output.flatten-rgb16-orange.jpg'); 
sharp(fixtures.inputPngWithTransparency16bit) - .flatten() - .background({r: 255, g: 102, b: 0}) - .toFile(output, function (err, info) { + .flatten({ + background: { r: 255, g: 102, b: 0 } + }) + .toFile(output, (err, info) => { if (err) throw err; assert.strictEqual(true, info.size > 0); assert.strictEqual(32, info.width); assert.strictEqual(32, info.height); - fixtures.assertMaxColourDistance(output, fixtures.expected('flatten-rgb16-orange.jpg'), 25); + fixtures.assertMaxColourDistance(output, fixtures.expected('flatten-rgb16-orange.jpg'), 10); done(); }); }); - it('Do not flatten', function (done) { + it('Do not flatten', (_t, done) => { sharp(fixtures.inputPngWithTransparency) .flatten(false) - .toBuffer(function (err, data, info) { + .toBuffer((err, _data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(4, info.channels); @@ -69,11 +76,10 @@ describe('Alpha transparency', function () { }); }); - it('Ignored for JPEG', function (done) { + it('Ignored for JPEG', (_t, done) => { sharp(fixtures.inputJpg) - .background('#ff0000') - .flatten() - .toBuffer(function (err, data, info) { + .flatten({ background: '#ff0000' }) + .toBuffer((err, _data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(3, info.channels); @@ -81,35 +87,93 @@ describe('Alpha transparency', function () { }); }); - it('Enlargement with non-nearest neighbor interpolation shouldn’t cause dark edges', function (done) { + it('Flatten with options but without colour does not throw', () => { + assert.doesNotThrow(() => { + sharp().flatten({}); + }); + }); + + it('Flatten to invalid colour throws', () => { + assert.throws(() => { + sharp().flatten({ background: 1 }); + }); + }); + + it('Enlargement with non-nearest neighbor interpolation shouldn’t cause dark edges', () => { const base = 'alpha-premultiply-enlargement-2048x1536-paper.png'; - const actual = fixtures.path('output.' 
+ base); + const actual = fixtures.path(`output.${base}`); const expected = fixtures.expected(base); - sharp(fixtures.inputPngAlphaPremultiplicationSmall) + return sharp(fixtures.inputPngAlphaPremultiplicationSmall) .resize(2048, 1536) - .toFile(actual, function (err) { - if (err) { - done(err); - } else { - fixtures.assertMaxColourDistance(actual, expected, 102); - done(); - } + .toFile(actual) + .then(() => { + fixtures.assertMaxColourDistance(actual, expected, 102); }); }); - it('Reduction with non-nearest neighbor interpolation shouldn’t cause dark edges', function (done) { + it('Reduction with non-nearest neighbor interpolation shouldn’t cause dark edges', () => { const base = 'alpha-premultiply-reduction-1024x768-paper.png'; - const actual = fixtures.path('output.' + base); + const actual = fixtures.path(`output.${base}`); const expected = fixtures.expected(base); - sharp(fixtures.inputPngAlphaPremultiplicationLarge) + return sharp(fixtures.inputPngAlphaPremultiplicationLarge) .resize(1024, 768) - .toFile(actual, function (err) { - if (err) { - done(err); - } else { - fixtures.assertMaxColourDistance(actual, expected, 102); - done(); - } + .toFile(actual) + .then(() => { + fixtures.assertMaxColourDistance(actual, expected, 102); }); }); + + it('Removes alpha from fixtures with transparency, ignores those without', () => Promise.all([ + fixtures.inputPngWithTransparency, + fixtures.inputPngWithTransparency16bit, + fixtures.inputWebPWithTransparency, + fixtures.inputJpg, + fixtures.inputPng, + fixtures.inputWebP + ].map((input) => sharp(input) + .resize(10) + .removeAlpha() + .toBuffer({ resolveWithObject: true }) + .then((result) => { + assert.strictEqual(3, result.info.channels); + })))); + + it('Ensures alpha from fixtures without transparency, ignores those with', () => Promise.all([ + fixtures.inputPngWithTransparency, + fixtures.inputPngWithTransparency16bit, + fixtures.inputWebPWithTransparency, + fixtures.inputJpg, + fixtures.inputPng, + 
fixtures.inputWebP + ].map((input) => sharp(input) + .resize(10) + .ensureAlpha() + .png() + .toBuffer({ resolveWithObject: true }) + .then((result) => { + assert.strictEqual(4, result.info.channels); + })))); + + it('Valid ensureAlpha value used for alpha channel', async () => { + const background = { r: 255, g: 0, b: 0 }; + const [r, g, b, alpha] = await sharp({ + create: { + width: 8, + height: 8, + channels: 3, + background + } + }) + .ensureAlpha(0.5) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r, g, b, alpha }, { ...background, alpha: 127 }); + }); + + it('Invalid ensureAlpha value throws', async () => { + assert.throws(() => { + sharp().ensureAlpha('fail'); + }); + }); }); diff --git a/test/unit/avif.js b/test/unit/avif.js new file mode 100644 index 000000000..6fb08c6fb --- /dev/null +++ b/test/unit/avif.js @@ -0,0 +1,184 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const { inputAvif, inputJpg, inputGifAnimated } = require('../fixtures'); + +describe('AVIF', () => { + it('called without options does not throw an error', () => { + assert.doesNotThrow(() => { + sharp().avif(); + }); + }); + + it('can convert AVIF to JPEG', async () => { + const data = await sharp(inputAvif) + .resize(32) + .jpeg() + .toBuffer(); + const { size, ...metadata } = await sharp(data).metadata(); + void size; + assert.deepStrictEqual(metadata, { + autoOrient: { + height: 13, + width: 32 + }, + channels: 3, + chromaSubsampling: '4:2:0', + density: 72, + depth: 'uchar', + format: 'jpeg', + hasAlpha: false, + hasProfile: false, + // 32 / (2048 / 858) = 13.40625 + // Math.round(13.40625) = 13 + height: 13, + isProgressive: false, + isPalette: false, + space: 'srgb', + width: 32 + }); + }); + + it('can convert JPEG to AVIF', async () => { + const data = await sharp(inputJpg) + .resize(32) + .avif({ effort: 0 
}) + .toBuffer(); + const { size, ...metadata } = await sharp(data).metadata(); + void size; + assert.deepStrictEqual(metadata, { + autoOrient: { + height: 26, + width: 32 + }, + channels: 3, + compression: 'av1', + depth: 'uchar', + format: 'heif', + hasAlpha: false, + hasProfile: false, + height: 26, + isProgressive: false, + isPalette: false, + bitsPerSample: 8, + pagePrimary: 0, + pages: 1, + space: 'srgb', + width: 32 + }); + }); + + it('can passthrough AVIF', async () => { + const data = await sharp(inputAvif) + .resize(32) + .toBuffer(); + const { size, ...metadata } = await sharp(data).metadata(); + void size; + assert.deepStrictEqual(metadata, { + autoOrient: { + height: 13, + width: 32 + }, + channels: 3, + compression: 'av1', + depth: 'uchar', + format: 'heif', + hasAlpha: false, + hasProfile: false, + height: 13, + isProgressive: false, + isPalette: false, + bitsPerSample: 8, + pagePrimary: 0, + pages: 1, + space: 'srgb', + width: 32 + }); + }); + + it('can convert animated GIF to non-animated AVIF', async () => { + const data = await sharp(inputGifAnimated, { animated: true }) + .resize(10) + .avif({ effort: 0 }) + .toBuffer(); + const { size, ...metadata } = await sharp(data).metadata(); + void size; + assert.deepStrictEqual(metadata, { + autoOrient: { + height: 300, + width: 10 + }, + channels: 4, + compression: 'av1', + depth: 'uchar', + format: 'heif', + hasAlpha: true, + hasProfile: false, + height: 300, + isProgressive: false, + isPalette: false, + bitsPerSample: 8, + pagePrimary: 0, + pages: 1, + space: 'srgb', + width: 10 + }); + }); + + it('should cast to uchar', async () => { + const data = await sharp(inputJpg) + .resize(32) + .sharpen() + .avif({ effort: 0 }) + .toBuffer(); + const { size, ...metadata } = await sharp(data).metadata(); + void size; + assert.deepStrictEqual(metadata, { + autoOrient: { + height: 26, + width: 32 + }, + channels: 3, + compression: 'av1', + depth: 'uchar', + format: 'heif', + hasAlpha: false, + hasProfile: false, 
+ height: 26, + isProgressive: false, + isPalette: false, + bitsPerSample: 8, + pagePrimary: 0, + pages: 1, + space: 'srgb', + width: 32 + }); + }); + + it('Invalid width - too large', async () => + assert.rejects( + () => sharp({ create: { width: 16385, height: 16, channels: 3, background: 'red' } }).avif().toBuffer(), + /Processed image is too large for the HEIF format/ + ) + ); + + it('Invalid height - too large', async () => + assert.rejects( + () => sharp({ create: { width: 16, height: 16385, channels: 3, background: 'red' } }).avif().toBuffer(), + /Processed image is too large for the HEIF format/ + ) + ); + + it('Invalid bitdepth value throws error', () => + assert.throws( + () => sharp().avif({ bitdepth: 11 }), + /Expected 8, 10 or 12 for bitdepth but received 11 of type number/ + ) + ); +}); diff --git a/test/unit/bandbool.js b/test/unit/bandbool.js index e6ea9d666..073fb68ba 100644 --- a/test/unit/bandbool.js +++ b/test/unit/bandbool.js @@ -1,48 +1,52 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const fixtures = require('../fixtures'); const sharp = require('../../'); -describe('Bandbool per-channel boolean operations', function () { +describe('Bandbool per-channel boolean operations', () => { [ sharp.bool.and, sharp.bool.or, sharp.bool.eor ] - .forEach(function (op) { - it(op + ' operation', function (done) { - sharp(fixtures.inputPngBooleanNoAlpha) - .bandbool(op) - .toColourspace('b-w') - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(200, info.width); - assert.strictEqual(200, info.height); - assert.strictEqual(1, info.channels); - fixtures.assertSimilar(fixtures.expected('bandbool_' + op + '_result.png'), data, done); - }); + .forEach((op) => { + it(`${op} operation`, (_t, done) => { + sharp(fixtures.inputPngBooleanNoAlpha) + .bandbool(op) + .toColourspace('b-w') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(1, info.channels); + fixtures.assertSimilar(fixtures.expected(`bandbool_${op}_result.png`), data, done); + }); + }); }); - }); - it('sRGB image retains 3 channels', function (done) { + it('sRGB image retains 3 channels', (_t, done) => { sharp(fixtures.inputJpg) .bandbool('and') - .toBuffer(function (err, data, info) { + .toBuffer((err, _data, info) => { if (err) throw err; assert.strictEqual(3, info.channels); done(); }); }); - it('Invalid operation', function () { - assert.throws(function () { + it('Invalid operation', () => { + assert.throws(() => { sharp().bandbool('fail'); }); }); - it('Missing operation', function () { - assert.throws(function () { + it('Missing operation', () => { + assert.throws(() => { sharp().bandbool(); }); }); diff --git a/test/unit/blur.js b/test/unit/blur.js index 72b853036..f5132098b 100644 --- 
a/test/unit/blur.js +++ b/test/unit/blur.js @@ -1,16 +1,20 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Blur', function () { - it('specific radius 1', function (done) { +describe('Blur', () => { + it('specific radius 1', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .blur(1) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -19,11 +23,11 @@ describe('Blur', function () { }); }); - it('specific radius 10', function (done) { + it('specific radius 10', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .blur(10) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -32,11 +36,24 @@ describe('Blur', function () { }); }); - it('specific radius 0.3', function (done) { + it('specific options.sigma 10', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .blur({ sigma: 10 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('blur-10.jpg'), data, done); + }); + }); + + it('specific radius 0.3', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .blur(0.3) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -45,11 +62,11 @@ describe('Blur', function () { }); }); - it('mild blur', function (done) { + it('mild blur', (_t, done) => { 
sharp(fixtures.inputJpg) .resize(320, 240) .blur() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -58,17 +75,17 @@ describe('Blur', function () { }); }); - it('invalid radius', function () { - assert.throws(function () { + it('invalid radius', () => { + assert.throws(() => { sharp(fixtures.inputJpg).blur(0.1); }); }); - it('blurred image is smaller than non-blurred', function (done) { + it('blurred image is smaller than non-blurred', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .blur(false) - .toBuffer(function (err, notBlurred, info) { + .toBuffer((err, notBlurred, info) => { if (err) throw err; assert.strictEqual(true, notBlurred.length > 0); assert.strictEqual('jpeg', info.format); @@ -77,7 +94,7 @@ describe('Blur', function () { sharp(fixtures.inputJpg) .resize(320, 240) .blur(true) - .toBuffer(function (err, blurred, info) { + .toBuffer((err, blurred, info) => { if (err) throw err; assert.strictEqual(true, blurred.length > 0); assert.strictEqual(true, blurred.length < notBlurred.length); @@ -88,4 +105,54 @@ describe('Blur', function () { }); }); }); + + it('invalid precision', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).blur({ sigma: 1, precision: 'invalid' }); + }, /Expected one of: integer, float, approximate for precision but received invalid of type string/); + }); + + it('invalid minAmplitude', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).blur({ sigma: 1, minAmplitude: 0 }); + }, /Expected number between 0.001 and 1 for minAmplitude but received 0 of type number/); + + assert.throws(() => { + sharp(fixtures.inputJpg).blur({ sigma: 1, minAmplitude: 1.01 }); + }, /Expected number between 0.001 and 1 for minAmplitude but received 1.01 of type number/); + }); + + it('specific radius 10 and precision approximate', async () => { + const approximate = await sharp(fixtures.inputJpg) + 
.resize(320, 240) + .blur({ sigma: 10, precision: 'approximate' }) + .toBuffer(); + const integer = await sharp(fixtures.inputJpg) + .resize(320, 240) + .blur(10) + .toBuffer(); + + assert.notDeepEqual(approximate, integer); + await fixtures.assertSimilar(fixtures.expected('blur-10.jpg'), approximate); + }); + + it('specific radius 10 and minAmplitude 0.01', async () => { + const minAmplitudeLow = await sharp(fixtures.inputJpg) + .resize(320, 240) + .blur({ sigma: 10, minAmplitude: 0.01 }) + .toBuffer(); + const minAmplitudeDefault = await sharp(fixtures.inputJpg) + .resize(320, 240) + .blur(10) + .toBuffer(); + + assert.notDeepEqual(minAmplitudeLow, minAmplitudeDefault); + await fixtures.assertSimilar(fixtures.expected('blur-10.jpg'), minAmplitudeLow); + }); + + it('options.sigma is required if options object is passed', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).blur({ precision: 'invalid' }); + }, /Expected number between 0.3 and 1000 for options.sigma but received undefined of type undefined/); + }); }); diff --git a/test/unit/boolean.js b/test/unit/boolean.js index 98a4570ef..f85b376da 100644 --- a/test/unit/boolean.js +++ b/test/unit/boolean.js @@ -1,12 +1,16 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ -const fs = require('fs'); -const assert = require('assert'); +const fs = require('node:fs'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const fixtures = require('../fixtures'); const sharp = require('../../'); -describe('Boolean operation between two images', function () { +describe('Boolean operation between two images', () => { const inputJpgBooleanTestBuffer = fs.readFileSync(fixtures.inputJpgBooleanTest); [ @@ -14,63 +18,63 @@ describe('Boolean operation between two images', function () { sharp.bool.or, sharp.bool.eor ] - .forEach(function (op) { - it(op + ' operation, file', function (done) { - sharp(fixtures.inputJpg) - .resize(320, 240) - .boolean(fixtures.inputJpgBooleanTest, op) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - fixtures.assertSimilar(fixtures.expected('boolean_' + op + '_result.jpg'), data, done); - }); - }); + .forEach((op) => { + it(`${op} operation, file`, (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .boolean(fixtures.inputJpgBooleanTest, op) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected(`boolean_${op}_result.jpg`), data, done); + }); + }); - it(op + ' operation, buffer', function (done) { - sharp(fixtures.inputJpg) - .resize(320, 240) - .boolean(inputJpgBooleanTestBuffer, op) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - fixtures.assertSimilar(fixtures.expected('boolean_' + op + '_result.jpg'), data, done); - }); - }); + it(`${op} operation, buffer`, (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .boolean(inputJpgBooleanTestBuffer, op) + .toBuffer((err, data, info) => { + if (err) throw err; + 
assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected(`boolean_${op}_result.jpg`), data, done); + }); + }); - it(op + ' operation, raw', function (done) { - sharp(fixtures.inputJpgBooleanTest) - .raw() - .toBuffer(function (err, data, info) { - if (err) throw err; - sharp(fixtures.inputJpg) - .resize(320, 240) - .boolean(data, op, { raw: info }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - fixtures.assertSimilar(fixtures.expected('boolean_' + op + '_result.jpg'), data, done); - }); - }); + it(`${op} operation, raw`, (_t, done) => { + sharp(fixtures.inputJpgBooleanTest) + .raw() + .toBuffer((err, data, info) => { + if (err) throw err; + sharp(fixtures.inputJpg) + .resize(320, 240) + .boolean(data, op, { raw: info }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected(`boolean_${op}_result.jpg`), data, done); + }); + }); + }); }); - }); - it('Invalid operation', function () { - assert.throws(function () { + it('Invalid operation', () => { + assert.throws(() => { sharp().boolean(fixtures.inputJpgBooleanTest, 'fail'); }); }); - it('Invalid operation, non-string', function () { - assert.throws(function () { + it('Invalid operation, non-string', () => { + assert.throws(() => { sharp().boolean(fixtures.inputJpgBooleanTest, null); }); }); - it('Missing input', function () { - assert.throws(function () { + it('Missing input', () => { + assert.throws(() => { sharp().boolean(); }); }); diff --git a/test/unit/cache.js b/test/unit/cache.js deleted file mode 100644 index 093b91c09..000000000 --- a/test/unit/cache.js +++ /dev/null @@ -1,9 +0,0 @@ -'use strict'; - -const sharp = require('../../'); - -// Define SHARP_TEST_WITHOUT_CACHE environment variable to prevent use of libvips' cache - 
-beforeEach(function () { - sharp.cache(!process.env.SHARP_TEST_WITHOUT_CACHE); -}); diff --git a/test/unit/clahe.js b/test/unit/clahe.js new file mode 100644 index 000000000..f3d68ebbb --- /dev/null +++ b/test/unit/clahe.js @@ -0,0 +1,143 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../lib'); +const fixtures = require('../fixtures'); + +describe('Clahe', () => { + it('width 5 width 5 maxSlope 0', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 5, height: 5, maxSlope: 0 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-5-5-0.jpg'), data, { threshold: 10 }, done); + }); + }); + + it('width 5 width 5 maxSlope 5', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 5, height: 5, maxSlope: 5 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-5-5-5.jpg'), data, done); + }); + }); + + it('width 11 width 25 maxSlope 14', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 11, height: 25, maxSlope: 14 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-11-25-14.jpg'), data, done); + }); + }); + + it('width 50 width 50 maxSlope 0', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 50, height: 50, maxSlope: 0 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-50-50-0.jpg'), data, done); + }); + }); + + it('width 50 width 50 maxSlope 14', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 50, height: 50, maxSlope: 14 }) + 
.toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-50-50-14.jpg'), data, done); + }); + }); + + it('width 100 width 50 maxSlope 3', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 100, height: 50, maxSlope: 3 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-100-50-3.jpg'), data, done); + }); + }); + + it('width 100 width 100 maxSlope 0', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 100, height: 100, maxSlope: 0 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-100-100-0.jpg'), data, done); + }); + }); + + it('invalid maxSlope', () => { + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, height: 100, maxSlope: -5 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, height: 100, maxSlope: 110 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, height: 100, maxSlope: 5.5 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, height: 100, maxSlope: 'a string' }); + }); + }); + + it('invalid width', () => { + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100.5, height: 100 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: -5, height: 100 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: true, height: 100 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 'string test', height: 100 }); + }); + }); + + it('invalid height', () => { + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, height: 100.5 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, 
height: -5 }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, height: true }); + }); + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe({ width: 100, height: 'string test' }); + }); + }); + + it('invalid options object', () => { + assert.throws(() => { + sharp(fixtures.inputJpgClahe).clahe(100, 100, 5); + }); + }); + + it('uses default maxSlope of 3', (_t, done) => { + sharp(fixtures.inputJpgClahe) + .clahe({ width: 100, height: 50 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + fixtures.assertSimilar(fixtures.expected('clahe-100-50-3.jpg'), data, done); + }); + }); +}); diff --git a/test/unit/clone.js b/test/unit/clone.js index 7e404328d..dae53b20b 100644 --- a/test/unit/clone.js +++ b/test/unit/clone.js @@ -1,26 +1,30 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const fs = require('fs'); -const assert = require('assert'); +const fs = require('node:fs'); +const { afterEach, beforeEach, describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Clone', function () { - beforeEach(function () { +describe('Clone', () => { + beforeEach(() => { sharp.cache(false); }); - afterEach(function () { + afterEach(() => { sharp.cache(true); }); - it('Read from Stream and write to multiple Streams', function (done) { + it('Read from Stream and write to multiple Streams', (_t, done) => { let finishEventsExpected = 2; // Output stream 1 const output1 = fixtures.path('output.multi-stream.1.jpg'); const writable1 = fs.createWriteStream(output1); - writable1.on('finish', function () { - sharp(output1).toBuffer(function (err, data, info) { + writable1.on('finish', () => { + sharp(output1).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual(data.length, 
info.size); @@ -37,8 +41,8 @@ describe('Clone', function () { // Output stream 2 const output2 = fixtures.path('output.multi-stream.2.jpg'); const writable2 = fs.createWriteStream(output2); - writable2.on('finish', function () { - sharp(output2).toBuffer(function (err, data, info) { + writable2.on('finish', () => { + sharp(output2).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual(data.length, info.size); @@ -60,4 +64,40 @@ describe('Clone', function () { // Go fs.createReadStream(fixtures.inputJpg).pipe(rotator); }); + + it('Stream-based input attaches finish event listener to original', () => { + const original = sharp(); + const clone = original.clone(); + assert.strictEqual(1, original.listenerCount('finish')); + assert.strictEqual(0, clone.listenerCount('finish')); + }); + + it('Non Stream-based input does not attach finish event listeners', () => { + const original = sharp(fixtures.inputJpg); + const clone = original.clone(); + assert.strictEqual(0, original.listenerCount('finish')); + assert.strictEqual(0, clone.listenerCount('finish')); + }); + + it('Ensure deep clone of properties, including arrays', async () => { + const alpha = await sharp({ + create: { width: 320, height: 240, channels: 3, background: 'red' } + }).toColourspace('b-w').png().toBuffer(); + + const original = sharp(); + const joiner = original.clone().joinChannel(alpha); + const negater = original.clone().negate(); + + fs.createReadStream(fixtures.inputJpg320x240).pipe(original); + const joined = await joiner.png({ effort: 1 }).toBuffer(); + const negated = await negater.png({ effort: 1 }).toBuffer(); + + const joinedMetadata = await sharp(joined).metadata(); + assert.strictEqual(joinedMetadata.channels, 4); + assert.strictEqual(joinedMetadata.hasAlpha, true); + + const negatedMetadata = await sharp(negated).metadata(); + assert.strictEqual(negatedMetadata.channels, 3); + assert.strictEqual(negatedMetadata.hasAlpha, false); + 
}); }); diff --git a/test/unit/colourspace.js b/test/unit/colourspace.js index d3d041b4b..9c908051c 100644 --- a/test/unit/colourspace.js +++ b/test/unit/colourspace.js @@ -1,19 +1,23 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Colour space conversion', function () { - it('To greyscale', function (done) { +describe('Colour space conversion', () => { + it('To greyscale', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .greyscale() .toFile(fixtures.path('output.greyscale-gamma-0.0.jpg'), done); }); - it('To greyscale with gamma correction', function (done) { + it('To greyscale with gamma correction', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .gamma() @@ -21,19 +25,19 @@ describe('Colour space conversion', function () { .toFile(fixtures.path('output.greyscale-gamma-2.2.jpg'), done); }); - it('Not to greyscale', function (done) { + it('Not to greyscale', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .greyscale(false) .toFile(fixtures.path('output.greyscale-not.jpg'), done); }); - it('Greyscale with single channel output', function (done) { + it('Greyscale with single channel output', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .greyscale() .toColourspace('b-w') - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(1, info.channels); assert.strictEqual(320, info.width); @@ -42,23 +46,20 @@ describe('Colour space conversion', function () { }); }); - if (sharp.format.tiff.input.file && sharp.format.webp.output.buffer) { - it('From 1-bit TIFF to sRGB WebP [slow]', function (done) { - sharp(fixtures.inputTiff) - .webp() - .toBuffer(function (err, data, info) { - if (err) throw err; - 
assert.strictEqual(true, data.length > 0); - assert.strictEqual('webp', info.format); - done(); - }); - }); - } + it('From 1-bit TIFF to sRGB WebP', async () => { + const data = await sharp(fixtures.inputTiff) + .resize(8, 8) + .webp() + .toBuffer(); + + const { format } = await sharp(data).metadata(); + assert.strictEqual(format, 'webp'); + }); - it('From CMYK to sRGB', function (done) { + it('From CMYK to sRGB', (_t, done) => { sharp(fixtures.inputJpgWithCmykProfile) .resize(320) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -67,12 +68,13 @@ describe('Colour space conversion', function () { }); }); - it('From CMYK to sRGB with white background, not yellow', function (done) { + it('From CMYK to sRGB with white background, not yellow', (_t, done) => { sharp(fixtures.inputJpgWithCmykProfile) - .resize(320, 240) - .background('white') - .embed() - .toBuffer(function (err, data, info) { + .resize(320, 240, { + fit: sharp.fit.contain, + background: 'white' + }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -81,10 +83,10 @@ describe('Colour space conversion', function () { }); }); - it('From profile-less CMYK to sRGB', function (done) { + it('From profile-less CMYK to sRGB', (_t, done) => { sharp(fixtures.inputJpgWithCmykNoProfile) .resize(320) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -92,8 +94,99 @@ describe('Colour space conversion', function () { }); }); - it('Invalid input', function () { - assert.throws(function () { + it('Profile-less CMYK roundtrip', async () => { + const [c, m, y, k] = await sharp(fixtures.inputJpgWithCmykNoProfile) + .pipelineColourspace('cmyk') + .toColourspace('cmyk') + .raw() + 
.toBuffer(); + + assert.deepStrictEqual( + { c, m, y, k }, + { c: 55, m: 27, y: 0, k: 0 } + ); + }); + + it('CMYK profile to CMYK profile conversion using perceptual intent', async () => { + const data = await sharp(fixtures.inputTiffFogra) + .resize(320, 240) + .toColourspace('cmyk') + .pipelineColourspace('cmyk') + .withIccProfile(fixtures.path('XCMYK 2017.icc')) + .raw() + .toBuffer(); + + const [c, m, y, k] = data; + assert.deepStrictEqual( + { c, m, y, k }, + { c: 1, m: 239, y: 227, k: 5 } + ); + }); + + it('CMYK profile to CMYK profile with negate', (_t, done) => { + sharp(fixtures.inputTiffFogra) + .resize(320, 240) + .toColourspace('cmyk') + .pipelineColourspace('cmyk') + .withIccProfile(fixtures.path('XCMYK 2017.icc')) + .negate() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar( + fixtures.expected('colourspace.cmyk-to-cmyk-negated.tif'), + data, + { threshold: 0 }, + done + ); + }); + }); + + it('From sRGB with RGB16 pipeline, resize with gamma, to sRGB', (_t, done) => { + sharp(fixtures.inputPngGradients) + .pipelineColourspace('rgb16') + .resize(320) + .gamma() + .toColourspace('srgb') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + fixtures.assertSimilar(fixtures.expected('colourspace-gradients-gamma-resize.png'), data, { + threshold: 0 + }, done); + }); + }); + + it('Convert P3 to sRGB', async () => { + const [r, g, b] = await sharp(fixtures.inputPngP3) + .raw() + .toBuffer(); + assert.strictEqual(r, 255); + assert.strictEqual(g, 0); + assert.strictEqual(b, 0); + }); + + it('Passthrough P3', async () => { + const [r, g, b] = await sharp(fixtures.inputPngP3) + .withMetadata({ icc: 'p3' }) + .raw() + .toBuffer(); + assert.strictEqual(r, 234); + assert.strictEqual(g, 51); + assert.strictEqual(b, 34); + }); + + it('Invalid pipelineColourspace input', () 
=> {
+    assert.throws(() => {
+      sharp(fixtures.inputJpg)
+        .pipelineColorspace(null);
+    }, /Expected string for colourspace but received null of type object/);
+  });
+
+  it('Invalid toColourspace input', () => {
+    assert.throws(() => {
       sharp(fixtures.inputJpg)
         .toColourspace(null);
     });
diff --git a/test/unit/composite.js b/test/unit/composite.js
new file mode 100644
index 000000000..0df0007aa
--- /dev/null
+++ b/test/unit/composite.js
@@ -0,0 +1,536 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+
+const fixtures = require('../fixtures');
+const sharp = require('../../');
+
+const red = { r: 255, g: 0, b: 0, alpha: 0.5 };
+const green = { r: 0, g: 255, b: 0, alpha: 0.5 };
+const blue = { r: 0, g: 0, b: 255, alpha: 0.5 };
+
+const redRect = {
+  create: {
+    width: 80,
+    height: 60,
+    channels: 4,
+    background: red
+  }
+};
+
+const greenRect = {
+  create: {
+    width: 40,
+    height: 40,
+    channels: 4,
+    background: green
+  }
+};
+
+const blueRect = {
+  create: {
+    width: 60,
+    height: 40,
+    channels: 4,
+    background: blue
+  }
+};
+
+const blends = [
+  'over',
+  'xor',
+  'saturate',
+  'dest-over'
+];
+
+// Test
+describe('composite', () => {
+  blends.forEach(blend => {
+    it(`blend ${blend}`, async () => {
+      const filename = `composite.blend.${blend}.png`;
+      const actual = fixtures.path(`output.${filename}`);
+      const expected = fixtures.expected(filename);
+      await sharp(redRect)
+        .composite([{
+          input: blueRect,
+          blend
+        }])
+        .toFile(actual);
+      fixtures.assertMaxColourDistance(actual, expected);
+    });
+  });
+
+  it('premultiplied true', () => {
+    const filename = 'composite.premultiplied.png';
+    const below = fixtures.path(`input.below.${filename}`);
+    const above = fixtures.path(`input.above.${filename}`);
+    const actual = fixtures.path(`output.true.${filename}`);
+    const expected = fixtures.expected(`expected.true.${filename}`);
+    
return sharp(below)
+      .composite([{
+        input: above,
+        blend: 'color-burn',
+        top: 0,
+        left: 0,
+        premultiplied: true
+      }])
+      .toFile(actual)
+      .then(() => {
+        fixtures.assertMaxColourDistance(actual, expected);
+      });
+  });
+
+  it('premultiplied false', () => {
+    const filename = 'composite.premultiplied.png';
+    const below = fixtures.path(`input.below.${filename}`);
+    const above = fixtures.path(`input.above.${filename}`);
+    const actual = fixtures.path(`output.false.${filename}`);
+    const expected = fixtures.expected(`expected.false.${filename}`);
+    return sharp(below)
+      .composite([{
+        input: above,
+        blend: 'color-burn',
+        top: 0,
+        left: 0,
+        premultiplied: false
+      }])
+      .toFile(actual)
+      .then(() => {
+        fixtures.assertMaxColourDistance(actual, expected);
+      });
+  });
+
+  it('premultiplied absent', () => {
+    const filename = 'composite.premultiplied.png';
+    const below = fixtures.path(`input.below.${filename}`);
+    const above = fixtures.path(`input.above.${filename}`);
+    const actual = fixtures.path(`output.absent.${filename}`);
+    const expected = fixtures.expected(`expected.absent.${filename}`);
+    return sharp(below)
+      .composite([{
+        input: above,
+        blend: 'color-burn',
+        top: 0,
+        left: 0
+      }])
+      .toFile(actual)
+      .then(() => {
+        fixtures.assertMaxColourDistance(actual, expected);
+      });
+  });
+
+  it('scrgb pipeline', () => {
+    const filename = 'composite-red-scrgb.png';
+    const actual = fixtures.path(`output.${filename}`);
+    const expected = fixtures.expected(filename);
+    return sharp({
+      create: {
+        width: 32, height: 32, channels: 4, background: red
+      }
+    })
+      .pipelineColourspace('scrgb')
+      .composite([{
+        input: fixtures.inputPngWithTransparency16bit,
+        blend: 'color-burn'
+      }])
+      .toFile(actual)
+      .then(() => {
+        fixtures.assertMaxColourDistance(actual, expected);
+      });
+  });
+
+  it('multiple', async () => {
+    const filename = 'composite-multiple.png';
+    const actual = fixtures.path(`output.${filename}`);
+    const expected = fixtures.expected(filename);
+ await sharp(redRect) + .composite([{ + input: blueRect, + gravity: 'northeast' + }, { + input: greenRect, + gravity: 'southwest' + }]) + .toFile(actual); + fixtures.assertMaxColourDistance(actual, expected); + }); + + it('autoOrient', async () => { + const data = await sharp({ + create: { + width: 600, height: 600, channels: 4, background: { ...red, alpha: 1 } + } + }) + .composite([{ + input: fixtures.inputJpgWithExif, + autoOrient: true + }]) + .jpeg() + .toBuffer(); + + await fixtures.assertSimilar(fixtures.expected('composite-autoOrient.jpg'), data); + }); + + it('zero offset', done => { + sharp(fixtures.inputJpg) + .resize(80) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + top: 0, + left: 0 + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(fixtures.expected('overlay-offset-0.jpg'), data, done); + }); + }); + + it('offset and gravity', done => { + sharp(fixtures.inputJpg) + .resize(80) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + left: 10, + top: 10, + gravity: 4 + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(fixtures.expected('overlay-offset-with-gravity.jpg'), data, done); + }); + }); + + it('negative offset and gravity', done => { + sharp(fixtures.inputJpg) + .resize(400) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + left: -10, + top: -10, + gravity: 4 + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar( + fixtures.expected('overlay-negative-offset-with-gravity.jpg'), data, done); + }); + }); + + it('offset, gravity and tile', done => { + sharp(fixtures.inputJpg) + .resize(80) + .composite([{ + input: 
fixtures.inputPngWithTransparency16bit, + left: 10, + top: 10, + gravity: 4, + tile: true + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(fixtures.expected('overlay-offset-with-gravity-tile.jpg'), data, done); + }); + }); + + it('offset and tile', done => { + sharp(fixtures.inputJpg) + .resize(80) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + left: 10, + top: 10, + tile: true + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(fixtures.expected('overlay-offset-with-tile.jpg'), data, done); + }); + }); + + it('centre gravity should replicate correct number of tiles', async () => { + const red = { r: 255, g: 0, b: 0 }; + const [r, g, b] = await sharp({ + create: { + width: 40, height: 40, channels: 4, background: red + } + }) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + gravity: 'centre', + tile: true + }]) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r, g, b }, red); + }); + + it('cutout via dest-in', done => { + sharp(fixtures.inputJpg) + .resize(300, 300) + .composite([{ + input: Buffer.from(''), + density: 96, + blend: 'dest-in', + cutout: true + }]) + .png() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(300, info.width); + assert.strictEqual(300, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('composite-cutout.png'), data, done); + }); + }); + + describe('numeric gravity', () => { + Object.keys(sharp.gravity).forEach(gravity => { + it(gravity, done => { + sharp(fixtures.inputJpg) + .resize(80) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + gravity + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + 
assert.strictEqual('jpeg', info.format); + assert.strictEqual(80, info.width); + assert.strictEqual(65, info.height); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(fixtures.expected(`overlay-gravity-${gravity}.jpg`), data, done); + }); + }); + }); + }); + + describe('string gravity', () => { + Object.keys(sharp.gravity).forEach(gravity => { + it(gravity, done => { + const expected = fixtures.expected(`overlay-gravity-${gravity}.jpg`); + sharp(fixtures.inputJpg) + .resize(80) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + gravity: sharp.gravity[gravity] + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(80, info.width); + assert.strictEqual(65, info.height); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(expected, data, done); + }); + }); + }); + }); + + describe('tile and gravity', () => { + Object.keys(sharp.gravity).forEach(gravity => { + it(gravity, done => { + const expected = fixtures.expected(`overlay-tile-gravity-${gravity}.jpg`); + sharp(fixtures.inputJpg) + .resize(80) + .composite([{ + input: fixtures.inputPngWithTransparency16bit, + tile: true, + gravity + }]) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(80, info.width); + assert.strictEqual(65, info.height); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(expected, data, done); + }); + }); + }); + }); + + describe('validation', () => { + it('missing images', () => { + assert.throws(() => { + sharp().composite(); + }, /Expected array for images to composite but received undefined of type undefined/); + }); + + it('invalid images', () => { + assert.throws(() => { + sharp().composite(['invalid']); + }, /Expected object for image to composite but received invalid of type string/); + }); + + it('missing input', () => { + assert.throws(() => { + sharp().composite([{}]); + }, /Unsupported 
input/); + }); + + it('invalid blend', () => { + assert.throws(() => { + sharp().composite([{ input: 'test', blend: 'invalid' }]); + }, /Expected valid blend name for blend but received invalid of type string/); + }); + + it('invalid tile', () => { + assert.throws(() => { + sharp().composite([{ input: 'test', tile: 'invalid' }]); + }, /Expected boolean for tile but received invalid of type string/); + }); + + it('invalid premultiplied', () => { + assert.throws(() => { + sharp().composite([{ input: 'test', premultiplied: 'invalid' }]); + }, /Expected boolean for premultiplied but received invalid of type string/); + }); + + it('invalid left', () => { + assert.throws(() => { + sharp().composite([{ input: 'test', left: 0.5 }]); + }, /Expected integer for left but received 0.5 of type number/); + assert.throws(() => { + sharp().composite([{ input: 'test', left: 'invalid' }]); + }, /Expected integer for left but received invalid of type string/); + assert.throws(() => { + sharp().composite([{ input: 'test', left: 'invalid', top: 10 }]); + }, /Expected integer for left but received invalid of type string/); + }); + + it('invalid top', () => { + assert.throws(() => { + sharp().composite([{ input: 'test', top: 0.5 }]); + }, /Expected integer for top but received 0.5 of type number/); + assert.throws(() => { + sharp().composite([{ input: 'test', top: 'invalid' }]); + }, /Expected integer for top but received invalid of type string/); + assert.throws(() => { + sharp().composite([{ input: 'test', top: 'invalid', left: 10 }]); + }, /Expected integer for top but received invalid of type string/); + }); + + it('left but no top', () => { + assert.throws(() => { + sharp().composite([{ input: 'test', left: 1 }]); + }, /Expected both left and top to be set/); + }); + + it('top but no left', () => { + assert.throws(() => { + sharp().composite([{ input: 'test', top: 1 }]); + }, /Expected both left and top to be set/); + }); + + it('invalid gravity', () => { + assert.throws(() => { + 
sharp().composite([{ input: 'test', gravity: 'invalid' }]); + }, /Expected valid gravity for gravity but received invalid of type string/); + }); + }); + + it('Allow offset beyond bottom/right edge', async () => { + const red = { r: 255, g: 0, b: 0 }; + const blue = { r: 0, g: 0, b: 255 }; + + const [r, g, b] = await sharp({ create: { width: 2, height: 2, channels: 4, background: red } }) + .composite([{ + input: { create: { width: 2, height: 2, channels: 4, background: blue } }, + top: 1, + left: 1 + }]) + .raw() + .toBuffer(); + + assert.deepStrictEqual(red, { r, g, b }); + }); + + it('Ensure tiled composition works with resized fit=outside', async () => { + const { info } = await sharp({ + create: { + width: 41, height: 41, channels: 3, background: 'red' + } + }) + .resize({ + width: 10, + height: 40, + fit: 'outside' + }) + .composite([ + { + input: { + create: { + width: 16, height: 16, channels: 3, background: 'green' + } + }, + tile: true + } + ]) + .toBuffer({ resolveWithObject: true }); + + assert.strictEqual(info.width, 40); + assert.strictEqual(info.height, 40); + }); + + it('Ensure implicit unpremultiply after resize but before composite', async () => { + const [r, g, b, a] = await sharp({ + create: { + width: 1, height: 1, channels: 4, background: 'saddlebrown' + } + }) + .resize({ width: 8 }) + .composite([{ + input: Buffer.from([255, 255, 255, 128]), + raw: { width: 1, height: 1, channels: 4 }, + tile: true, + blend: 'dest-in' + }]) + .raw() + .toBuffer(); + + assert.strictEqual(r, 139); + assert.strictEqual(g, 69); + assert.strictEqual(b, 19); + assert.strictEqual(a, 128); + }); + + it('Ensure tiled overlay is fully decoded', async () => { + const tile = await sharp({ + create: { + width: 8, height: 513, channels: 3, background: 'red' + } + }) + .png({ compressionLevel: 0 }) + .toBuffer(); + + const { info } = await sharp({ + create: { + width: 8, height: 514, channels: 3, background: 'green' + } + }) + .composite([{ + input: tile, + tile: true + 
}]) + .toBuffer({ resolveWithObject: true }); + + assert.strictEqual(info.width, 8); + assert.strictEqual(info.height, 514); + }); +}); diff --git a/test/unit/convolve.js b/test/unit/convolve.js index 2d4155490..bc430178a 100644 --- a/test/unit/convolve.js +++ b/test/unit/convolve.js @@ -1,12 +1,16 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Convolve', function () { - it('specific convolution kernel 1', function (done) { +describe('Convolve', () => { + it('specific convolution kernel 1', (_t, done) => { sharp(fixtures.inputPngStripesV) .convolve({ width: 3, @@ -19,7 +23,7 @@ describe('Convolve', function () { 10, 20, 10 ] }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); @@ -28,7 +32,7 @@ describe('Convolve', function () { }); }); - it('specific convolution kernel 2', function (done) { + it('specific convolution kernel 2', (_t, done) => { sharp(fixtures.inputPngStripesH) .convolve({ width: 3, @@ -39,7 +43,7 @@ describe('Convolve', function () { 1, 0, 1 ] }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); @@ -48,7 +52,7 @@ describe('Convolve', function () { }); }); - it('horizontal Sobel operator', function (done) { + it('horizontal Sobel operator', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .convolve({ @@ -60,7 +64,7 @@ describe('Convolve', function () { -1, 0, 1 ] }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, 
info.width); @@ -69,14 +73,14 @@ describe('Convolve', function () { }); }); - describe('invalid kernel specification', function () { - it('missing', function () { - assert.throws(function () { + describe('invalid kernel specification', () => { + it('missing', () => { + assert.throws(() => { sharp(fixtures.inputJpg).convolve({}); }); }); - it('incorrect data format', function () { - assert.throws(function () { + it('incorrect data format', () => { + assert.throws(() => { sharp(fixtures.inputJpg).convolve({ width: 3, height: 3, @@ -84,8 +88,8 @@ describe('Convolve', function () { }); }); }); - it('incorrect dimensions', function () { - assert.throws(function () { + it('incorrect dimensions', () => { + assert.throws(() => { sharp(fixtures.inputJpg).convolve({ width: 3, height: 4, diff --git a/test/unit/crop.js b/test/unit/crop.js deleted file mode 100644 index 9a5a27590..000000000 --- a/test/unit/crop.js +++ /dev/null @@ -1,249 +0,0 @@ -'use strict'; - -const assert = require('assert'); - -const sharp = require('../../'); -const fixtures = require('../fixtures'); - -describe('Crop', function () { - [ - { - name: 'North', - width: 320, - height: 80, - gravity: sharp.gravity.north, - fixture: 'gravity-north.jpg' - }, - { - name: 'East', - width: 80, - height: 320, - gravity: sharp.gravity.east, - fixture: 'gravity-east.jpg' - }, - { - name: 'South', - width: 320, - height: 80, - gravity: sharp.gravity.south, - fixture: 'gravity-south.jpg' - }, - { - name: 'West', - width: 80, - height: 320, - gravity: sharp.gravity.west, - fixture: 'gravity-west.jpg' - }, - { - name: 'Center', - width: 320, - height: 80, - gravity: sharp.gravity.center, - fixture: 'gravity-center.jpg' - }, - { - name: 'Centre', - width: 80, - height: 320, - gravity: sharp.gravity.centre, - fixture: 'gravity-centre.jpg' - }, - { - name: 'Default (centre)', - width: 80, - height: 320, - gravity: undefined, - fixture: 'gravity-centre.jpg' - }, - { - name: 'Northeast', - width: 320, - height: 80, - gravity: 
sharp.gravity.northeast, - fixture: 'gravity-north.jpg' - }, - { - name: 'Northeast', - width: 80, - height: 320, - gravity: sharp.gravity.northeast, - fixture: 'gravity-east.jpg' - }, - { - name: 'Southeast', - width: 320, - height: 80, - gravity: sharp.gravity.southeast, - fixture: 'gravity-south.jpg' - }, - { - name: 'Southeast', - width: 80, - height: 320, - gravity: sharp.gravity.southeast, - fixture: 'gravity-east.jpg' - }, - { - name: 'Southwest', - width: 320, - height: 80, - gravity: sharp.gravity.southwest, - fixture: 'gravity-south.jpg' - }, - { - name: 'Southwest', - width: 80, - height: 320, - gravity: sharp.gravity.southwest, - fixture: 'gravity-west.jpg' - }, - { - name: 'Northwest', - width: 320, - height: 80, - gravity: sharp.gravity.northwest, - fixture: 'gravity-north.jpg' - }, - { - name: 'Northwest', - width: 80, - height: 320, - gravity: sharp.gravity.northwest, - fixture: 'gravity-west.jpg' - } - ].forEach(function (settings) { - it(settings.name + ' gravity', function (done) { - sharp(fixtures.inputJpg) - .resize(settings.width, settings.height) - .crop(settings.gravity) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(settings.width, info.width); - assert.strictEqual(settings.height, info.height); - fixtures.assertSimilar(fixtures.expected(settings.fixture), data, done); - }); - }); - }); - - it('Allows specifying the gravity as a string', function (done) { - sharp(fixtures.inputJpg) - .resize(80, 320) - .crop('east') - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(80, info.width); - assert.strictEqual(320, info.height); - fixtures.assertSimilar(fixtures.expected('gravity-east.jpg'), data, done); - }); - }); - - it('Invalid values fail', function () { - assert.throws(function () { - sharp().crop(9); - }, /Expected valid crop id\/name\/strategy for crop but received 9 of type number/); - assert.throws(function () { - sharp().crop(1.1); - }, /Expected valid crop 
id\/name\/strategy for crop but received 1.1 of type number/); - assert.throws(function () { - sharp().crop(-1); - }, /Expected valid crop id\/name\/strategy for crop but received -1 of type number/); - assert.throws(function () { - sharp().crop('zoinks'); - }, /Expected valid crop id\/name\/strategy for crop but received zoinks of type string/); - }); - - it('Uses default value when none specified', function () { - assert.doesNotThrow(function () { - sharp().crop(); - }); - }); - - describe('Entropy-based strategy', function () { - it('JPEG', function (done) { - sharp(fixtures.inputJpg) - .resize(80, 320) - .crop(sharp.strategy.entropy) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - assert.strictEqual(80, info.width); - assert.strictEqual(320, info.height); - fixtures.assertSimilar(fixtures.expected('crop-strategy-entropy.jpg'), data, done); - }); - }); - - it('PNG', function (done) { - sharp(fixtures.inputPngWithTransparency) - .resize(320, 80) - .crop(sharp.strategy.entropy) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(4, info.channels); - assert.strictEqual(320, info.width); - assert.strictEqual(80, info.height); - fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); - }); - }); - - it('supports the strategy passed as a string', function (done) { - sharp(fixtures.inputPngWithTransparency) - .resize(320, 80) - .crop('entropy') - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(4, info.channels); - assert.strictEqual(320, info.width); - assert.strictEqual(80, info.height); - fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); - }); - }); - }); - - describe('Attention strategy', function () { - it('JPEG', function (done) { - sharp(fixtures.inputJpg) - .resize(80, 
320) - .crop(sharp.strategy.attention) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - assert.strictEqual(80, info.width); - assert.strictEqual(320, info.height); - fixtures.assertSimilar(fixtures.expected('crop-strategy-attention.jpg'), data, done); - }); - }); - - it('PNG', function (done) { - sharp(fixtures.inputPngWithTransparency) - .resize(320, 80) - .crop(sharp.strategy.attention) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(4, info.channels); - assert.strictEqual(320, info.width); - assert.strictEqual(80, info.height); - fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); - }); - }); - - it('supports the strategy passed as a string', function (done) { - sharp(fixtures.inputPngWithTransparency) - .resize(320, 80) - .crop('attention') - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(4, info.channels); - assert.strictEqual(320, info.width); - assert.strictEqual(80, info.height); - fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); - }); - }); - }); -}); diff --git a/test/unit/dilate.js b/test/unit/dilate.js new file mode 100644 index 000000000..f1f982cb6 --- /dev/null +++ b/test/unit/dilate.js @@ -0,0 +1,37 @@ +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Dilate', () => { + it('dilate 1 png', (_t, done) => { + sharp(fixtures.inputPngDotAndLines) + .dilate(1) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(100, info.width); + assert.strictEqual(100, info.height); + fixtures.assertSimilar(fixtures.expected('dilate-1.png'), data, done); + }); + }); + + 
it('dilate 1 png - default width', (_t, done) => { + sharp(fixtures.inputPngDotAndLines) + .dilate() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(100, info.width); + assert.strictEqual(100, info.height); + fixtures.assertSimilar(fixtures.expected('dilate-1.png'), data, done); + }); + }); + + it('invalid dilation width', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).dilate(-1); + }); + }); +}); diff --git a/test/unit/embed.js b/test/unit/embed.js deleted file mode 100644 index 62aef91b3..000000000 --- a/test/unit/embed.js +++ /dev/null @@ -1,135 +0,0 @@ -'use strict'; - -const assert = require('assert'); - -const sharp = require('../../'); -const fixtures = require('../fixtures'); - -describe('Embed', function () { - it('JPEG within PNG, no alpha channel', function (done) { - sharp(fixtures.inputJpg) - .embed() - .resize(320, 240) - .png() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-3-into-3.png'), data, done); - }); - }); - - it('JPEG within WebP, to include alpha channel', function (done) { - sharp(fixtures.inputJpg) - .resize(320, 240) - .background({r: 0, g: 0, b: 0, alpha: 0}) - .embed() - .webp() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('webp', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-3-into-4.webp'), data, done); - }); - }); - - it('PNG with alpha channel', function (done) { - sharp(fixtures.inputPngWithTransparency) - .resize(50, 50) - .embed() - .toBuffer(function (err, data, 
info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(50, info.width); - assert.strictEqual(50, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-4-into-4.png'), data, done); - }); - }); - - it('16-bit PNG with alpha channel', function (done) { - sharp(fixtures.inputPngWithTransparency16bit) - .resize(32, 16) - .embed() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(32, info.width); - assert.strictEqual(16, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-16bit.png'), data, done); - }); - }); - - it('16-bit PNG with alpha channel onto RGBA', function (done) { - sharp(fixtures.inputPngWithTransparency16bit) - .resize(32, 16) - .embed() - .background({r: 0, g: 0, b: 0, alpha: 0}) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(32, info.width); - assert.strictEqual(16, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-16bit-rgba.png'), data, done); - }); - }); - - it('PNG with 2 channels', function (done) { - sharp(fixtures.inputPngWithGreyAlpha) - .resize(32, 16) - .embed() - .background({r: 0, g: 0, b: 0, alpha: 0}) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(32, info.width); - assert.strictEqual(16, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-2channel.png'), data, done); - }); - }); - - it('embed TIFF in LAB colourspace onto RGBA background', function (done) { - sharp(fixtures.inputTiffCielab) - 
.resize(64, 128) - .embed() - .background({r: 255, g: 102, b: 0, alpha: 0.5}) - .png() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(64, info.width); - assert.strictEqual(128, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-lab-into-rgba.png'), data, done); - }); - }); - - it('Enlarge and embed', function (done) { - sharp(fixtures.inputPngWithOneColor) - .embed() - .resize(320, 240) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(fixtures.expected('embed-enlarge.png'), data, done); - }); - }); -}); diff --git a/test/unit/erode.js b/test/unit/erode.js new file mode 100644 index 000000000..0afa874df --- /dev/null +++ b/test/unit/erode.js @@ -0,0 +1,37 @@ +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Erode', () => { + it('erode 1 png', (_t, done) => { + sharp(fixtures.inputPngDotAndLines) + .erode(1) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(100, info.width); + assert.strictEqual(100, info.height); + fixtures.assertSimilar(fixtures.expected('erode-1.png'), data, done); + }); + }); + + it('erode 1 png - default width', (_t, done) => { + sharp(fixtures.inputPngDotAndLines) + .erode() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(100, info.width); + assert.strictEqual(100, info.height); + fixtures.assertSimilar(fixtures.expected('erode-1.png'), data, done); + }); + }); + + 
it('invalid erosion width', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).erode(-1); + }); + }); +}); diff --git a/test/unit/extend.js b/test/unit/extend.js index 853503fae..f9d91112a 100644 --- a/test/unit/extend.js +++ b/test/unit/extend.js @@ -1,59 +1,205 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Extend', function () { - it('extend all sides equally with RGB', function (done) { - sharp(fixtures.inputJpg) - .resize(120) - .background({r: 255, g: 0, b: 0}) - .extend(10) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(140, info.width); - assert.strictEqual(118, info.height); - fixtures.assertSimilar(fixtures.expected('extend-equal.jpg'), data, done); - }); +describe('Extend', () => { + describe('extend all sides equally via a single value', () => { + it('JPEG', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(120) + .extend(10) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(140, info.width); + assert.strictEqual(118, info.height); + fixtures.assertSimilar(fixtures.expected('extend-equal-single.jpg'), data, done); + }); + }); + + it('Animated WebP', (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .resize(120) + .extend(10) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(140, info.width); + assert.strictEqual(140 * 9, info.height); + fixtures.assertSimilar(fixtures.expected('extend-equal-single.webp'), data, done); + }); + }); }); - it('extend sides unequally with RGBA', function (done) { - sharp(fixtures.inputPngWithTransparency16bit) - .resize(120) - .background({r: 0, g: 0, b: 0, alpha: 0}) - .extend({top: 50, bottom: 0, left: 10, right: 35}) - .toBuffer(function 
(err, data, info) { - if (err) throw err; - assert.strictEqual(165, info.width); - assert.strictEqual(170, info.height); - fixtures.assertSimilar(fixtures.expected('extend-unequal.png'), data, done); - }); + ['background', 'copy', 'mirror', 'repeat'].forEach(extendWith => { + it(`extends all sides with animated WebP (${extendWith})`, (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .resize(120) + .extend({ + extendWith, + top: 40, + bottom: 40, + left: 40, + right: 40 + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(200, info.width); + assert.strictEqual(200 * 9, info.height); + fixtures.assertSimilar(fixtures.expected(`extend-equal-${extendWith}.webp`), data, done); + }); + }); + + it(`extend all sides equally with RGB (${extendWith})`, (_t, done) => { + sharp(fixtures.inputJpg) + .resize(120) + .extend({ + extendWith, + top: 10, + bottom: 10, + left: 10, + right: 10, + background: { r: 255, g: 0, b: 0 } + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(140, info.width); + assert.strictEqual(118, info.height); + fixtures.assertSimilar(fixtures.expected(`extend-equal-${extendWith}.jpg`), data, done); + }); + }); + + it(`extend sides unequally with RGBA (${extendWith})`, (_t, done) => { + sharp(fixtures.inputPngWithTransparency16bit) + .resize(120) + .extend({ + extendWith, + top: 50, + left: 10, + right: 35, + background: { r: 0, g: 0, b: 0, alpha: 0 } + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(165, info.width); + assert.strictEqual(170, info.height); + fixtures.assertSimilar(fixtures.expected(`extend-unequal-${extendWith}.png`), data, done); + }); + }); + + it(`PNG with 2 channels (${extendWith})`, (_t, done) => { + sharp(fixtures.inputPngWithGreyAlpha) + .extend({ + extendWith, + top: 50, + bottom: 50, + left: 80, + right: 80, + background: 'transparent' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + 
assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(560, info.width); + assert.strictEqual(400, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected(`extend-2channel-${extendWith}.png`), data, done); + }); + }); }); - it('missing parameter fails', function () { - assert.throws(function () { + it('extend top with mirroring uses ordered read', async () => { + const data = await sharp(fixtures.inputJpg) + .extend({ + extendWith: 'mirror', + top: 1 + }) + .png({ compressionLevel: 0 }) + .toBuffer(); + + const { width, height } = await sharp(data).metadata(); + assert.strictEqual(2725, width); + assert.strictEqual(2226, height); + }); + + it('multi-page extend uses ordered read', async () => { + const multiPageTiff = await sharp(fixtures.inputGifAnimated, { animated: true }) + .resize({ width: 8, height: 48 }) + .tiff() + .toBuffer(); + + const data = await sharp(multiPageTiff, { pages: -1 }) + .extend({ + background: 'red', + top: 1 + }) + .png({ compressionLevel: 0 }) + .toBuffer(); + + const { width, height } = await sharp(data).metadata(); + assert.strictEqual(8, width); + assert.strictEqual(1470, height); + }); + + it('missing parameter fails', () => { + assert.throws(() => { sharp().extend(); }); }); - it('negative fails', function () { - assert.throws(function () { + it('negative fails', () => { + assert.throws(() => { sharp().extend(-1); }); }); - it('partial object fails', function () { - assert.throws(function () { - sharp().extend({top: 1}); - }); + it('invalid top fails', () => { + assert.throws( + () => sharp().extend({ top: 'fail' }), + /Expected positive integer for top but received fail of type string/ + ); + }); + it('invalid bottom fails', () => { + assert.throws( + () => sharp().extend({ bottom: -1 }), + /Expected positive integer for bottom but received -1 of type number/ + ); + }); + it('invalid left fails', () => { + assert.throws( + () => 
sharp().extend({ left: 0.1 }), + /Expected positive integer for left but received 0.1 of type number/ + ); + }); + it('invalid right fails', () => { + assert.throws( + () => sharp().extend({ right: {} }), + /Expected positive integer for right but received \[object Object\] of type object/ + ); + }); + it('invalid extendWith fails', () => { + assert.throws( + () => sharp().extend({ extendWith: 'invalid-value' }), + /Expected one of: background, copy, repeat, mirror for extendWith but received invalid-value of type string/ + ); + }); + it('can set all edges apart from right', () => { + assert.doesNotThrow(() => sharp().extend({ top: 1, left: 2, bottom: 3 })); }); - it('should add alpha channel before extending with a transparent Background', function (done) { + it('should add alpha channel before extending with a transparent Background', (_t, done) => { sharp(fixtures.inputJpgWithLandscapeExif1) - .background({r: 0, g: 0, b: 0, alpha: 0}) + .extend({ + bottom: 10, + right: 10, + background: { r: 0, g: 0, b: 0, alpha: 0 } + }) .toFormat(sharp.format.png) - .extend({top: 0, bottom: 10, left: 0, right: 10}) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(610, info.width); assert.strictEqual(460, info.height); @@ -61,18 +207,25 @@ describe('Extend', function () { }); }); - it('PNG with 2 channels', function (done) { - sharp(fixtures.inputPngWithGreyAlpha) - .background('transparent') - .extend({top: 0, bottom: 20, left: 0, right: 20}) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(420, info.width); - assert.strictEqual(320, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('extend-2channel.png'), data, done); - }); + it('Premultiply background when compositing', async () => { + const background = { r: 191, g: 25, b: 66, alpha: 0.8 }; + 
const data = await sharp({ + create: { + width: 1, height: 1, channels: 4, background: '#fff0' + } + }) + .composite([{ + input: { + create: { + width: 1, height: 1, channels: 4, background + } + } + }]) + .extend({ + left: 1, background + }) + .raw() + .toBuffer(); + assert.deepStrictEqual(Array.from(data), [191, 25, 66, 204, 191, 25, 66, 204]); }); }); diff --git a/test/unit/extract.js b/test/unit/extract.js index 22a70b746..4fee38f08 100644 --- a/test/unit/extract.js +++ b/test/unit/extract.js @@ -1,26 +1,30 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Partial image extraction', function () { - it('JPEG', function (done) { +describe('Partial image extraction', () => { + it('JPEG', (_t, done) => { sharp(fixtures.inputJpg) .extract({ left: 2, top: 2, width: 20, height: 20 }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(20, info.width); assert.strictEqual(20, info.height); - fixtures.assertSimilar(fixtures.expected('extract.jpg'), data, { threshold: 8 }, done); + fixtures.assertSimilar(fixtures.expected('extract.jpg'), data, done); }); }); - it('PNG', function (done) { + it('PNG', (_t, done) => { sharp(fixtures.inputPng) .extract({ left: 200, top: 300, width: 400, height: 200 }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(400, info.width); assert.strictEqual(200, info.height); @@ -28,38 +32,59 @@ describe('Partial image extraction', function () { }); }); - if (sharp.format.webp.output.file) { - it('WebP', function (done) { - sharp(fixtures.inputWebP) - .extract({ left: 100, top: 50, width: 125, height: 200 }) - .toBuffer(function (err, data, info) { + 
it('WebP', (_t, done) => { + sharp(fixtures.inputWebP) + .extract({ left: 100, top: 50, width: 125, height: 200 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(125, info.width); + assert.strictEqual(200, info.height); + fixtures.assertSimilar(fixtures.expected('extract.webp'), data, done); + }); + }); + + describe('Animated WebP', () => { + it('Before resize', (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .extract({ left: 0, top: 30, width: 80, height: 20 }) + .resize(320, 80) + .toBuffer((err, data, info) => { if (err) throw err; - assert.strictEqual(125, info.width); - assert.strictEqual(200, info.height); - fixtures.assertSimilar(fixtures.expected('extract.webp'), data, done); + assert.strictEqual(320, info.width); + assert.strictEqual(80 * 9, info.height); + fixtures.assertSimilar(fixtures.expected('gravity-center-height.webp'), data, done); }); }); - } - - if (sharp.format.tiff.output.file) { - it('TIFF', function (done) { - sharp(fixtures.inputTiff) - .extract({ left: 34, top: 63, width: 341, height: 529 }) - .jpeg() - .toBuffer(function (err, data, info) { + + it('After resize', (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .resize(320, 320) + .extract({ left: 0, top: 120, width: 320, height: 80 }) + .toBuffer((err, data, info) => { if (err) throw err; - assert.strictEqual(341, info.width); - assert.strictEqual(529, info.height); - fixtures.assertSimilar(fixtures.expected('extract.tiff'), data, done); + assert.strictEqual(320, info.width); + assert.strictEqual(80 * 9, info.height); + fixtures.assertSimilar(fixtures.expected('gravity-center-height.webp'), data, done); }); }); - } + }); + + it('TIFF', (_t, done) => { + sharp(fixtures.inputTiff) + .extract({ left: 34, top: 63, width: 341, height: 529 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(341, info.width); + assert.strictEqual(529, info.height); + 
fixtures.assertSimilar(fixtures.expected('extract.tiff'), data, done); + }); + }); - it('Before resize', function (done) { + it('Before resize', (_t, done) => { sharp(fixtures.inputJpg) .extract({ left: 10, top: 10, width: 10, height: 500 }) .resize(100, 100) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(100, info.width); assert.strictEqual(100, info.height); @@ -67,12 +92,13 @@ describe('Partial image extraction', function () { }); }); - it('After resize and crop', function (done) { + it('After resize and crop', (_t, done) => { sharp(fixtures.inputJpg) - .resize(500, 500) - .crop(sharp.gravity.north) + .resize(500, 500, { + position: sharp.gravity.north + }) .extract({ left: 10, top: 10, width: 100, height: 100 }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(100, info.width); assert.strictEqual(100, info.height); @@ -80,13 +106,14 @@ describe('Partial image extraction', function () { }); }); - it('Before and after resize and crop', function (done) { + it('Before and after resize and crop', (_t, done) => { sharp(fixtures.inputJpg) .extract({ left: 0, top: 0, width: 700, height: 700 }) - .resize(500, 500) - .crop(sharp.gravity.north) + .resize(500, 500, { + position: sharp.gravity.north + }) .extract({ left: 10, top: 10, width: 100, height: 100 }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(100, info.width); assert.strictEqual(100, info.height); @@ -94,11 +121,12 @@ describe('Partial image extraction', function () { }); }); - it('Extract then rotate', function (done) { + it('Extract then rotate', (_t, done) => { sharp(fixtures.inputPngWithGreyAlpha) .extract({ left: 20, top: 10, width: 380, height: 280 }) .rotate(90) - .toBuffer(function (err, data, info) { + .jpeg() + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(280, info.width); 
assert.strictEqual(380, info.height); @@ -106,83 +134,202 @@ describe('Partial image extraction', function () { }); }); - it('Rotate then extract', function (done) { + it('Rotate then extract', (_t, done) => { sharp(fixtures.inputPngWithGreyAlpha) .rotate(90) .extract({ left: 20, top: 10, width: 280, height: 380 }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(280, info.width); assert.strictEqual(380, info.height); - fixtures.assertSimilar(fixtures.expected('rotate-extract.jpg'), data, { threshold: 6 }, done); + fixtures.assertSimilar(fixtures.expected('rotate-extract.jpg'), data, done); + }); + }); + + it('Extract then rotate then extract', (_t, done) => { + sharp(fixtures.inputPngWithGreyAlpha) + .extract({ left: 20, top: 10, width: 180, height: 280 }) + .rotate(90) + .extract({ left: 20, top: 10, width: 200, height: 100 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + fixtures.assertSimilar(fixtures.expected('extract-rotate-extract.jpg'), data, done); + }); + }); + + it('Extract then rotate non-90 angle', (_t, done) => { + sharp(fixtures.inputPngWithGreyAlpha) + .extract({ left: 20, top: 10, width: 380, height: 280 }) + .rotate(45) + .jpeg() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(467, info.width); + assert.strictEqual(467, info.height); + fixtures.assertSimilar(fixtures.expected('extract-rotate-45.jpg'), data, done); + }); + }); + + it('Rotate then extract non-90 angle', (_t, done) => { + sharp(fixtures.inputPngWithGreyAlpha) + .rotate(45) + .extract({ left: 20, top: 10, width: 380, height: 280 }) + .jpeg() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(380, info.width); + assert.strictEqual(280, info.height); + fixtures.assertSimilar(fixtures.expected('rotate-extract-45.jpg'), data, done); }); }); - describe('Invalid parameters', 
function () { - describe('using the legacy extract(top,left,width,height) syntax', function () { - it('String top', function () { - assert.throws(function () { + describe('Apply exif orientation and mirroring then extract', () => { + [ + { + name: 'EXIF-1', + image: fixtures.inputJpgWithLandscapeExif1 + }, + { + name: 'EXIF-2', + image: fixtures.inputJpgWithLandscapeExif2 + }, + { + name: 'EXIF-3', + image: fixtures.inputJpgWithLandscapeExif3 + }, + { + name: 'EXIF-4', + image: fixtures.inputJpgWithLandscapeExif4 + }, + { + name: 'EXIF-5', + image: fixtures.inputJpgWithLandscapeExif5 + }, + { + name: 'EXIF-6', + image: fixtures.inputJpgWithLandscapeExif6 + }, + { + name: 'EXIF-7', + image: fixtures.inputJpgWithLandscapeExif7 + }, + { + name: 'EXIF-8', + image: fixtures.inputJpgWithLandscapeExif8 + } + ].forEach(({ name, image }) => { + it(name, (_t, done) => { + sharp(image) + .rotate() + .extract({ left: 0, top: 208, width: 60, height: 40 }) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('rotate-mirror-extract.jpg'), data, done); + }); + }); + }); + }); + + describe('Invalid parameters', () => { + describe('using the legacy extract(top,left,width,height) syntax', () => { + it('String top', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract('spoons', 10, 10, 10); }); }); - it('Non-integral left', function () { - assert.throws(function () { + it('Non-integral left', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract(10, 10.2, 10, 10); }); }); - it('Negative width - negative', function () { - assert.throws(function () { + it('Negative width - negative', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract(10, 10, -10, 10); }); }); - it('Null height', function () { - assert.throws(function () { + it('Null height', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract(10, 10, 10, null); }); }); }); - it('Undefined', function () { - assert.throws(function () { + 
it('Undefined', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract(); }); }); - it('String top', function () { - assert.throws(function () { + it('String top', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract({ left: 10, top: 'spoons', width: 10, height: 10 }); }); }); - it('Non-integral left', function () { - assert.throws(function () { + it('Non-integral left', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract({ left: 10.2, top: 10, width: 10, height: 10 }); }); }); - it('Negative width - negative', function () { - assert.throws(function () { + it('Negative width - negative', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract({ left: 10, top: 10, width: -10, height: 10 }); }); }); - it('Null height', function () { - assert.throws(function () { + it('Null height', () => { + assert.throws(() => { sharp(fixtures.inputJpg).extract({ left: 10, top: 10, width: 10, height: null }); }); }); - it('Bad image area', function (done) { + it('Bad image area', (_t, done) => { sharp(fixtures.inputJpg) .extract({ left: 3000, top: 10, width: 10, height: 10 }) - .toBuffer(function (err) { + .toBuffer((err) => { assert(err instanceof Error); - assert.strictEqual(err.message, 'extract_area: bad extract area\n'); + assert.strictEqual(err.message, 'extract_area: bad extract area'); done(); }); }); + + it('Multiple extract emits warning', () => { + let warningMessage = ''; + const s = sharp(); + s.on('warning', (msg) => { warningMessage = msg; }); + const options = { top: 0, left: 0, width: 1, height: 1 }; + s.extract(options).extract(options); + assert.strictEqual(warningMessage, ''); + s.extract(options); + assert.strictEqual(warningMessage, 'ignoring previous extract options'); + }); + + it('Multiple rotate+extract emits warning', () => { + let warningMessage = ''; + const s = sharp().rotate(); + s.on('warning', (msg) => { warningMessage = msg; }); + const options = { top: 0, left: 0, width: 1, height: 1 }; + 
+      s.extract(options).extract(options);
+      assert.strictEqual(warningMessage, '');
+      s.extract(options);
+      assert.strictEqual(warningMessage, 'ignoring previous extract options');
+    });
+
+    it('Multiple extract+resize emits warning', () => {
+      let warningMessage = '';
+      const s = sharp();
+      s.on('warning', (msg) => { warningMessage = msg; });
+      const options = { top: 0, left: 0, width: 1, height: 1 };
+      s.extract(options).extract(options);
+      assert.strictEqual(warningMessage, '');
+      s.resize(1);
+      assert.strictEqual(warningMessage, 'operation order will be: extract, resize, extract');
+    });
   });
 });
diff --git a/test/unit/extractChannel.js b/test/unit/extractChannel.js
index a12fa7080..761b816df 100644
--- a/test/unit/extractChannel.js
+++ b/test/unit/extractChannel.js
@@ -1,16 +1,20 @@
-'use strict';
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
-const assert = require('assert');
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
 
 const sharp = require('../../');
 const fixtures = require('../fixtures');
 
-describe('Image channel extraction', function () {
-  it('Red channel', function (done) {
+describe('Image channel extraction', () => {
+  it('Red channel', (_t, done) => {
     sharp(fixtures.inputJpg)
       .extractChannel('red')
       .resize(320, 240)
-      .toBuffer(function (err, data, info) {
+      .toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual(320, info.width);
         assert.strictEqual(240, info.height);
@@ -18,11 +22,11 @@ describe('Image channel extraction', function () {
     });
   });
 
-  it('Green channel', function (done) {
+  it('Green channel', (_t, done) => {
     sharp(fixtures.inputJpg)
       .extractChannel('green')
       .resize(320, 240)
-      .toBuffer(function (err, data, info) {
+      .toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual(320, info.width);
         assert.strictEqual(240, info.height);
@@ -30,11 +34,11 @@ describe('Image channel extraction', function () {
     });
   });
 
-  it('Blue channel', function (done) {
+  it('Blue channel', (_t, done) => {
     sharp(fixtures.inputJpg)
       .extractChannel('blue')
       .resize(320, 240)
-      .toBuffer(function (err, data, info) {
+      .toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual(320, info.width);
         assert.strictEqual(240, info.height);
@@ -42,11 +46,11 @@ describe('Image channel extraction', function () {
     });
   });
 
-  it('Blue channel by number', function (done) {
+  it('Blue channel by number', (_t, done) => {
     sharp(fixtures.inputJpg)
       .extractChannel(2)
       .resize(320, 240)
-      .toBuffer(function (err, data, info) {
+      .toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual(320, info.width);
         assert.strictEqual(240, info.height);
@@ -54,26 +58,59 @@ describe('Image channel extraction', function () {
     });
   });
 
-  it('Invalid channel number', function () {
-    assert.throws(function () {
+  it('With colorspace conversion', async () => {
+    const [chroma] = await sharp({ create: { width: 1, height: 1, channels: 3, background: 'red' } })
+      .toColourspace('lch')
+      .extractChannel(1)
+      .toBuffer();
+
+    assert.strictEqual(chroma, 104);
+  });
+
+  it('Alpha from 16-bit PNG', (_t, done) => {
+    const output = fixtures.path('output.extract-alpha-16bit.png');
+    sharp(fixtures.inputPngWithTransparency16bit)
+      .resize(16)
+      .extractChannel(3)
+      .toFile(output, (err) => {
+        if (err) throw err;
+        fixtures.assertMaxColourDistance(output, fixtures.expected('extract-alpha-16bit.png'));
+        done();
+      });
+  });
+
+  it('Alpha from 2-channel input', (_t, done) => {
+    const output = fixtures.path('output.extract-alpha-2-channel.png');
+    sharp(fixtures.inputPngWithGreyAlpha)
+      .extractChannel('alpha')
+      .toFile(output, (err, info) => {
+        if (err) throw err;
+        assert.strictEqual(1, info.channels);
+        fixtures.assertMaxColourDistance(output, fixtures.expected('extract-alpha-2-channel.png'));
+        done();
+      });
+  });
+
+  it('Invalid channel number', () => {
+    assert.throws(() => {
       sharp(fixtures.inputJpg)
         .extractChannel(-1);
     });
   });
 
-  it('No arguments', function () {
-    assert.throws(function () {
+  it('No arguments', () => {
+    assert.throws(() => {
       sharp(fixtures.inputJpg)
         .extractChannel();
     });
   });
 
-  it('Non-existant channel', function (done) {
-    sharp(fixtures.inputPng)
-      .extractChannel(1)
-      .toBuffer(function (err) {
-        assert(err instanceof Error);
-        done();
-      });
-  });
+  it('Non-existent channel', async () =>
+    await assert.rejects(
+      () => sharp({ create: { width: 1, height: 1, channels: 3, background: 'red' } })
+        .extractChannel(3)
+        .toBuffer(),
+      /Cannot extract channel 3 from image with channels 0-2/
+    )
+  );
 });
diff --git a/test/unit/failOn.js b/test/unit/failOn.js
new file mode 100644
index 000000000..c4d44e892
--- /dev/null
+++ b/test/unit/failOn.js
@@ -0,0 +1,115 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+const fs = require('node:fs');
+
+const sharp = require('../../lib');
+const fixtures = require('../fixtures');
+
+describe('failOn', () => {
+  it('handles truncated JPEG', (_t, done) => {
+    sharp(fixtures.inputJpgTruncated, { failOn: 'none' })
+      .resize(32, 24)
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual('jpeg', info.format);
+        assert.strictEqual(32, info.width);
+        assert.strictEqual(24, info.height);
+        fixtures.assertSimilar(fixtures.expected('truncated.jpg'), data, done);
+      });
+  });
+
+  it('handles truncated PNG, emits warnings', (_t, done) => {
+    let isWarningEmitted = false;
+    sharp(fixtures.inputPngTruncated, { failOn: 'none' })
+      .on('warning', (warning) => {
+        assert.ok(
+          ['read gave 2 warnings', 'not enough data', 'end of stream']
+            .some(m => warning.includes(m)));
+        isWarningEmitted = true;
+      })
+      .resize(32, 24)
+      .toBuffer((err, _data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, isWarningEmitted);
+        assert.strictEqual('png', info.format);
+        assert.strictEqual(32, info.width);
+        assert.strictEqual(24, info.height);
+        done();
+      });
+  });
+
+  it('throws for invalid options', () => {
+    assert.throws(
+      () => sharp({ failOn: 'zoinks' }),
+      /Expected one of: none, truncated, error, warning for failOn but received zoinks of type string/
+    );
+    assert.throws(
+      () => sharp({ failOn: 1 }),
+      /Expected one of: none, truncated, error, warning for failOn but received 1 of type number/
+    );
+  });
+
+  it('deprecated failOnError', () => {
+    assert.doesNotThrow(
+      () => sharp({ failOnError: true })
+    );
+    assert.doesNotThrow(
+      () => sharp({ failOnError: false })
+    );
+    assert.throws(
+      () => sharp({ failOnError: 'zoinks' }),
+      /Expected boolean for failOnError but received zoinks of type string/
+    );
+    assert.throws(
+      () => sharp({ failOnError: 1 }),
+      /Expected boolean for failOnError but received 1 of type number/
+    );
+  });
+
+  it('returns errors to callback for truncated JPEG', (_t, done) => {
+    sharp(fixtures.inputJpgTruncated, { failOn: 'truncated' }).toBuffer((err, data, info) => {
+      assert.ok(err.message.includes('VipsJpeg: premature end of'), err);
+      assert.strictEqual(data, undefined);
+      assert.strictEqual(info, undefined);
+      done();
+    });
+  });
+
+  it('returns errors to callback for truncated PNG', (_t, done) => {
+    sharp(fixtures.inputPngTruncated, { failOn: 'truncated' }).toBuffer((err, data, info) => {
+      assert.ok(err.message.includes('read error'), err);
+      assert.strictEqual(data, undefined);
+      assert.strictEqual(info, undefined);
+      done();
+    });
+  });
+
+  it('rejects promises for truncated JPEG', (_t, done) => {
+    sharp(fixtures.inputJpgTruncated, { failOn: 'error' })
+      .toBuffer()
+      .then(() => {
+        throw new Error('Expected rejection');
+      })
+      .catch(err => {
+        done(err.message.includes('VipsJpeg: premature end of') ? undefined : err);
+      });
+  });
+
+  it('handles stream-based input', async () => {
+    const writable = sharp({ failOn: 'none' }).resize(32, 24);
+    fs.createReadStream(fixtures.inputJpgTruncated).pipe(writable);
+    return writable.toBuffer();
+  });
+
+  it('converts warnings to error for GeoTIFF', async () => {
+    await assert.rejects(
+      sharp(fixtures.inputTiffGeo).toBuffer(),
+      /Tag 34737/
+    );
+  });
+});
diff --git a/test/unit/fixtures.js b/test/unit/fixtures.js
index 8fecf2dec..9d4583ea2 100644
--- a/test/unit/fixtures.js
+++ b/test/unit/fixtures.js
@@ -1,25 +1,29 @@
-'use strict';
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
-const assert = require('assert');
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
 
 const fixtures = require('../fixtures');
 
-describe('Test fixtures', function () {
-  describe('assertMaxColourDistance', function () {
-    it('should throw an Error when images have a different number of channels', function () {
-      assert.throws(function () {
+describe('Test fixtures', () => {
+  describe('assertMaxColourDistance', () => {
+    it('should throw an Error when images have a different number of channels', () => {
+      assert.throws(() => {
         fixtures.assertMaxColourDistance(fixtures.inputPngOverlayLayer1, fixtures.inputJpg);
       });
     });
-    it('should throw an Error when images have different dimensions', function () {
-      assert.throws(function () {
+    it('should throw an Error when images have different dimensions', () => {
+      assert.throws(() => {
         fixtures.assertMaxColourDistance(fixtures.inputJpg, fixtures.inputJpgWithExif);
       });
     });
-    it('should accept a zero threshold when comparing an image to itself', function () {
+    it('should accept a zero threshold when comparing an image to itself', () => {
       const image = fixtures.inputPngOverlayLayer0;
       fixtures.assertMaxColourDistance(image, image, 0);
     });
-    it('should accept a numeric threshold for two different images', function () {
+    it('should accept a numeric threshold for two different images', () => {
       fixtures.assertMaxColourDistance(fixtures.inputPngOverlayLayer0, fixtures.inputPngOverlayLayer1, 100);
     });
   });
diff --git a/test/unit/gamma.js b/test/unit/gamma.js
index f5c3e45c3..2e0b3fcde 100644
--- a/test/unit/gamma.js
+++ b/test/unit/gamma.js
@@ -1,15 +1,19 @@
-'use strict';
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
-const assert = require('assert');
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
 
 const sharp = require('../../');
 const fixtures = require('../fixtures');
 
-describe('Gamma correction', function () {
-  it('value of 0.0 (disabled)', function (done) {
+describe('Gamma correction', () => {
+  it('value of 0.0 (disabled)', (_t, done) => {
     sharp(fixtures.inputJpgWithGammaHoliness)
       .resize(129, 111)
-      .toBuffer(function (err, data, info) {
+      .toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual('jpeg', info.format);
         assert.strictEqual(129, info.width);
@@ -18,11 +22,11 @@ describe('Gamma correction', function () {
     });
   });
 
-  it('value of 2.2 (default)', function (done) {
+  it('value of 2.2 (default)', (_t, done) => {
     sharp(fixtures.inputJpgWithGammaHoliness)
      .resize(129, 111)
      .gamma()
-      .toBuffer(function (err, data, info) {
+      .toBuffer((err, data, info) => {
        if (err) throw err;
        assert.strictEqual('jpeg', info.format);
        assert.strictEqual(129, info.width);
@@ -31,11 +35,11 @@ describe('Gamma correction', function () {
     });
   });
 
-  it('value of 3.0', function (done) {
+  it('value of 3.0', (_t, done) => {
     sharp(fixtures.inputJpgWithGammaHoliness)
       .resize(129, 111)
       .gamma(3)
-      .toBuffer(function (err, data, info) {
+      .toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual('jpeg', info.format);
         assert.strictEqual(129, info.width);
@@ -44,21 +48,41 @@ describe('Gamma correction', function () {
     });
   });
 
-  it('alpha transparency', function (done) {
+  it('input value of 2.2, output value of 3.0', (_t, done) => {
+    sharp(fixtures.inputJpgWithGammaHoliness)
+      .resize(129, 111)
+      .gamma(2.2, 3.0)
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual('jpeg', info.format);
+        assert.strictEqual(129, info.width);
+        assert.strictEqual(111, info.height);
+        fixtures.assertSimilar(fixtures.expected('gamma-in-2.2-out-3.0.jpg'), data, { threshold: 6 }, done);
+      });
+  });
+
+  it('alpha transparency', (_t, done) => {
     sharp(fixtures.inputPngOverlayLayer1)
       .resize(320)
       .gamma()
-      .toBuffer(function (err, data, info) {
+      .jpeg()
+      .toBuffer((err, data, info) => {
         if (err) throw err;
-        assert.strictEqual('png', info.format);
+        assert.strictEqual('jpeg', info.format);
         assert.strictEqual(320, info.width);
-        fixtures.assertSimilar(fixtures.expected('gamma-alpha.jpg'), data, { threshold: 20 }, done);
+        fixtures.assertSimilar(fixtures.expected('gamma-alpha.jpg'), data, done);
       });
   });
 
-  it('invalid value', function () {
-    assert.throws(function () {
+  it('invalid first parameter value', () => {
+    assert.throws(() => {
       sharp(fixtures.inputJpgWithGammaHoliness).gamma(4);
     });
   });
+
+  it('invalid second parameter value', () => {
+    assert.throws(() => {
+      sharp(fixtures.inputJpgWithGammaHoliness).gamma(2.2, 4);
+    });
+  });
 });
diff --git a/test/unit/gif.js b/test/unit/gif.js
new file mode 100644
index 000000000..4b7d4a76a
--- /dev/null
+++ b/test/unit/gif.js
@@ -0,0 +1,278 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+const fs = require('node:fs');
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+
+const sharp = require('../../');
+const fixtures = require('../fixtures');
+
+describe('GIF input', () => {
+  it('GIF Buffer to JPEG Buffer', () =>
+    sharp(fs.readFileSync(fixtures.inputGif))
+      .resize(8, 4)
+      .jpeg()
+      .toBuffer({ resolveWithObject: true })
+      .then(({ data, info }) => {
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual(data.length, info.size);
+        assert.strictEqual('jpeg', info.format);
+        assert.strictEqual(8, info.width);
+        assert.strictEqual(4, info.height);
+      })
+  );
+
+  it('2 channel GIF file to PNG Buffer', () =>
+    sharp(fixtures.inputGifGreyPlusAlpha)
+      .resize(8, 4)
+      .png()
+      .toBuffer({ resolveWithObject: true })
+      .then(({ data, info }) => {
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual(data.length, info.size);
+        assert.strictEqual('png', info.format);
+        assert.strictEqual(8, info.width);
+        assert.strictEqual(4, info.height);
+        assert.strictEqual(4, info.channels);
+      })
+  );
+
+  it('Animated GIF first page to non-animated GIF', () =>
+    sharp(fixtures.inputGifAnimated)
+      .toBuffer({ resolveWithObject: true })
+      .then(({ data, info }) => {
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual(data.length, info.size);
+        assert.strictEqual('gif', info.format);
+        assert.strictEqual(80, info.width);
+        assert.strictEqual(80, info.height);
+        assert.strictEqual(4, info.channels);
+        assert.strictEqual(undefined, info.pages);
+        assert.strictEqual(undefined, info.pageHeight);
+      })
+  );
+
+  it('Animated GIF round trip', () =>
+    sharp(fixtures.inputGifAnimated, { pages: -1 })
+      .toBuffer({ resolveWithObject: true })
+      .then(({ data, info }) => {
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual(data.length, info.size);
+        assert.strictEqual('gif', info.format);
+        assert.strictEqual(80, info.width);
+        assert.strictEqual(2400, info.height);
+        assert.strictEqual(4, info.channels);
+        assert.strictEqual(30, info.pages);
+        assert.strictEqual(80, info.pageHeight);
+      })
+  );
+
+  it('GIF with reduced colours, no dither, low effort reduces file size', async () => {
+    const original = await sharp(fixtures.inputJpg)
+      .resize(120, 80)
+      .gif()
+      .toBuffer();
+
+    const reduced = await sharp(fixtures.inputJpg)
+      .resize(120, 80)
+      .gif({
+        colours: 128,
+        dither: 0,
+        effort: 1
+      })
+      .toBuffer();
+
+    assert.strictEqual(true, reduced.length < original.length);
+  });
+
+  it('valid reuse', () => {
+    assert.doesNotThrow(() => sharp().gif({ reuse: true }));
+    assert.doesNotThrow(() => sharp().gif({ reuse: false }));
+  });
+
+  it('invalid reuse throws', () => {
+    assert.throws(
+      () => sharp().gif({ reuse: -1 }),
+      /Expected boolean for gifReuse but received -1 of type number/
+    );
+    assert.throws(
+      () => sharp().gif({ reuse: 'fail' }),
+      /Expected boolean for gifReuse but received fail of type string/
+    );
+  });
+
+  it('progressive changes file size', async () => {
+    const nonProgressive = await sharp(fixtures.inputGif).gif({ progressive: false }).toBuffer();
+    const progressive = await sharp(fixtures.inputGif).gif({ progressive: true }).toBuffer();
+    assert(nonProgressive.length !== progressive.length);
+  });
+
+  it('invalid progressive throws', () => {
+    assert.throws(
+      () => sharp().gif({ progressive: -1 }),
+      /Expected boolean for gifProgressive but received -1 of type number/
+    );
+    assert.throws(
+      () => sharp().gif({ progressive: 'fail' }),
+      /Expected boolean for gifProgressive but received fail of type string/
+    );
+  });
+
+  it('invalid loop throws', () => {
+    assert.throws(() => {
+      sharp().gif({ loop: -1 });
+    });
+    assert.throws(() => {
+      sharp().gif({ loop: 65536 });
+    });
+  });
+
+  it('invalid delay throws', () => {
+    assert.throws(() => {
+      sharp().gif({ delay: -1 });
+    });
+    assert.throws(() => {
+      sharp().gif({ delay: [65536] });
+    });
+  });
+
+  it('invalid colour throws', () => {
+    assert.throws(() => {
+      sharp().gif({ colours: 1 });
+    });
+    assert.throws(() => {
+      sharp().gif({ colours: 'fail' });
+    });
+  });
+
+  it('invalid effort throws', () => {
+    assert.throws(() => {
+      sharp().gif({ effort: 0 });
+    });
+    assert.throws(() => {
+      sharp().gif({ effort: 'fail' });
+    });
+  });
+
+  it('invalid dither throws', () => {
+    assert.throws(() => {
+      sharp().gif({ dither: 1.1 });
+    });
+    assert.throws(() => {
+      sharp().gif({ dither: 'fail' });
+    });
+  });
+
+  it('invalid interFrameMaxError throws', () => {
+    assert.throws(
+      () => sharp().gif({ interFrameMaxError: 33 }),
+      /Expected number between 0.0 and 32.0 for interFrameMaxError but received 33 of type number/
+    );
+    assert.throws(
+      () => sharp().gif({ interFrameMaxError: 'fail' }),
+      /Expected number between 0.0 and 32.0 for interFrameMaxError but received fail of type string/
+    );
+  });
+
+  it('invalid interPaletteMaxError throws', () => {
+    assert.throws(
+      () => sharp().gif({ interPaletteMaxError: 257 }),
+      /Expected number between 0.0 and 256.0 for interPaletteMaxError but received 257 of type number/
+    );
+    assert.throws(
+      () => sharp().gif({ interPaletteMaxError: 'fail' }),
+      /Expected number between 0.0 and 256.0 for interPaletteMaxError but received fail of type string/
+    );
+  });
+
+  it('invalid keepDuplicateFrames throws', () => {
+    assert.throws(
+      () => sharp().gif({ keepDuplicateFrames: -1 }),
+      /Expected boolean for keepDuplicateFrames but received -1 of type number/
+    );
+    assert.throws(
+      () => sharp().gif({ keepDuplicateFrames: 'fail' }),
+      /Expected boolean for keepDuplicateFrames but received fail of type string/
+    );
+  });
+
+  it('should work with streams when only animated is set', (_t, done) => {
+    fs.createReadStream(fixtures.inputGifAnimated)
+      .pipe(sharp({ animated: true }))
+      .gif()
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual('gif', info.format);
+        fixtures.assertSimilar(fixtures.inputGifAnimated, data, done);
+      });
+  });
+
+  it('should work with streams when only pages is set', (_t, done) => {
+    fs.createReadStream(fixtures.inputGifAnimated)
+      .pipe(sharp({ pages: -1 }))
+      .gif()
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual('gif', info.format);
+        fixtures.assertSimilar(fixtures.inputGifAnimated, data, done);
+      });
+  });
+
+  it('should optimise file size via interFrameMaxError', async () => {
+    const input = sharp(fixtures.inputGifAnimated, { animated: true });
+    const before = await input.gif({ interFrameMaxError: 0 }).toBuffer();
+    const after = await input.gif({ interFrameMaxError: 10 }).toBuffer();
+    assert.strict(before.length > after.length);
+  });
+
+  it('should optimise file size via interPaletteMaxError', async () => {
+    const input = sharp(fixtures.inputGifAnimated, { animated: true });
+    const before = await input.gif({ interPaletteMaxError: 0 }).toBuffer();
+    const after = await input.gif({ interPaletteMaxError: 100 }).toBuffer();
+    assert.strict(before.length > after.length);
+  });
+
+  it('should keep duplicate frames via keepDuplicateFrames', async () => {
+    const create = { width: 8, height: 8, channels: 4, background: 'blue' };
+    const input = sharp([{ create }, { create }], { join: { animated: true } });
+
+    const before = await input.gif({ keepDuplicateFrames: false }).toBuffer();
+    const after = await input.gif({ keepDuplicateFrames: true }).toBuffer();
+    assert.strict(before.length < after.length);
+
+    const beforeMeta = await sharp(before).metadata();
+    const afterMeta = await sharp(after).metadata();
+    assert.strictEqual(beforeMeta.pages, 1);
+    assert.strictEqual(afterMeta.pages, 2);
+  });
+
+  it('non-animated input defaults to no-loop', async () => {
+    for (const input of [fixtures.inputGif, fixtures.inputPng]) {
+      const data = await sharp(input)
+        .resize(8)
+        .gif({ effort: 1 })
+        .toBuffer();
+
+      const { format, pages, loop } = await sharp(data).metadata();
+      assert.strictEqual('gif', format);
+      assert.strictEqual(1, pages);
+      assert.strictEqual(1, loop);
+    }
+  });
+
+  it('Animated GIF to animated WebP merges identical frames', async () => {
+    const webp = await sharp(fixtures.inputGifAnimated, { animated: true })
+      .webp()
+      .toBuffer();
+
+    const { delay, loop, pages } = await sharp(webp).metadata();
+    assert.deepStrictEqual([120, 120, 90, 120, 120, 90, 120, 90, 30], delay);
+    assert.strictEqual(0, loop);
+    assert.strictEqual(9, pages);
+  });
+});
diff --git a/test/unit/heif.js b/test/unit/heif.js
new file mode 100644
index 000000000..adb7b2969
--- /dev/null
+++ b/test/unit/heif.js
@@ -0,0 +1,99 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+
+const sharp = require('../../');
+
+describe('HEIF', () => {
+  it('called without options throws an error', () => {
+    assert.throws(() => {
+      sharp().heif();
+    });
+  });
+  it('valid quality does not throw an error', () => {
+    assert.doesNotThrow(() => {
+      sharp().heif({ compression: 'av1', quality: 80 });
+    });
+  });
+  it('invalid quality should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', quality: 101 });
+    });
+  });
+  it('non-numeric quality should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', quality: 'fail' });
+    });
+  });
+  it('valid lossless does not throw an error', () => {
+    assert.doesNotThrow(() => {
+      sharp().heif({ compression: 'av1', lossless: true });
+    });
+  });
+  it('non-boolean lossless should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', lossless: 'fail' });
+    });
+  });
+  it('valid compression does not throw an error', () => {
+    assert.doesNotThrow(() => {
+      sharp().heif({ compression: 'hevc' });
+    });
+  });
+  it('unknown compression should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'fail' });
+    });
+  });
+  it('invalid compression should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 1 });
+    });
+  });
+  it('valid effort does not throw an error', () => {
+    assert.doesNotThrow(() => {
+      sharp().heif({ compression: 'av1', effort: 6 });
+    });
+  });
+  it('out of range effort should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', effort: 10 });
+    });
+  });
+  it('invalid effort should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', effort: 'fail' });
+    });
+  });
+  it('invalid chromaSubsampling should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', chromaSubsampling: 'fail' });
+    });
+  });
+  it('valid chromaSubsampling does not throw an error', () => {
+    assert.doesNotThrow(() => {
+      sharp().heif({ compression: 'av1', chromaSubsampling: '4:4:4' });
+    });
+  });
+  it('valid bitdepth value does not throw an error', () => {
+    const { heif } = sharp.versions;
+    delete sharp.versions.heif;
+    assert.doesNotThrow(() => {
+      sharp().heif({ compression: 'av1', bitdepth: 12 });
+    });
+    sharp.versions.heif = '1.2.3';
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', bitdepth: 10 });
+    }, /Error: Expected 8 for bitdepth when using prebuilt binaries but received 10 of type number/);
+    sharp.versions.heif = heif;
+  });
+  it('invalid bitdepth value should throw an error', () => {
+    assert.throws(() => {
+      sharp().heif({ compression: 'av1', bitdepth: 11 });
+    }, /Error: Expected 8, 10 or 12 for bitdepth but received 11 of type number/);
+  });
+});
diff --git a/test/unit/interpolation.js b/test/unit/interpolation.js
deleted file mode 100644
index 85372f6be..000000000
--- a/test/unit/interpolation.js
+++ /dev/null
@@ -1,99 +0,0 @@
-'use strict';
-
-const assert = require('assert');
-
-const sharp = require('../../');
-const fixtures = require('../fixtures');
-
-describe('Interpolators and kernels', function () {
-  describe('Reducers', function () {
-    [
-      sharp.kernel.nearest,
-      sharp.kernel.cubic,
-      sharp.kernel.lanczos2,
-      sharp.kernel.lanczos3
-    ].forEach(function (kernel) {
-      it(kernel, function (done) {
-        sharp(fixtures.inputJpg)
-          .resize(320, null, { kernel: kernel })
-          .toBuffer(function (err, data, info) {
-            if (err) throw err;
-            assert.strictEqual('jpeg', info.format);
-            assert.strictEqual(320, info.width);
-            fixtures.assertSimilar(fixtures.inputJpg, data, done);
-          });
-      });
-    });
-  });
-
-  describe('Enlargers', function () {
-    [
-      sharp.interpolator.nearest,
-      sharp.interpolator.bilinear,
-      sharp.interpolator.bicubic,
-      sharp.interpolator.nohalo,
-      sharp.interpolator.locallyBoundedBicubic,
-      sharp.interpolator.vertexSplitQuadraticBasisSpline
-    ].forEach(function (interpolator) {
-      describe(interpolator, function () {
-        it('x and y', function (done) {
-          sharp(fixtures.inputTiff8BitDepth)
-            .resize(200, 200, { interpolator: interpolator })
-            .png()
-            .toBuffer(function (err, data, info) {
-              if (err) throw err;
-              assert.strictEqual(200, info.width);
-              assert.strictEqual(200, info.height);
-              done();
-            });
-        });
-        it('x only', function (done) {
-          sharp(fixtures.inputTiff8BitDepth)
-            .resize(200, 21, { interpolator: interpolator })
-            .png()
-            .toBuffer(function (err, data, info) {
-              if (err) throw err;
-              assert.strictEqual(200, info.width);
-              assert.strictEqual(21, info.height);
-              done();
-            });
-        });
-        it('y only', function (done) {
-          sharp(fixtures.inputTiff8BitDepth)
-            .resize(21, 200, { interpolator: interpolator })
-            .png()
-            .toBuffer(function (err, data, info) {
-              if (err) throw err;
-              assert.strictEqual(21, info.width);
-              assert.strictEqual(200, info.height);
-              done();
-            });
-        });
-      });
-    });
-
-    it('nearest with integral factor', function (done) {
-      sharp(fixtures.inputTiff8BitDepth)
-        .resize(210, 210, { interpolator: 'nearest' })
-        .png()
-        .toBuffer(function (err, data, info) {
-          if (err) throw err;
-          assert.strictEqual(210, info.width);
-          assert.strictEqual(210, info.height);
-          done();
-        });
-    });
-  });
-
-  it('unknown kernel throws', function () {
-    assert.throws(function () {
-      sharp().resize(null, null, { kernel: 'unknown' });
-    });
-  });
-
-  it('unknown interpolator throws', function () {
-    assert.throws(function () {
-      sharp().resize(null, null, { interpolator: 'unknown' });
-    });
-  });
-});
diff --git a/test/unit/io.js b/test/unit/io.js
index 6898dc0a3..ad13f48b6 100644
--- a/test/unit/io.js
+++ b/test/unit/io.js
@@ -1,71 +1,75 @@
-'use strict';
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
 
-const fs = require('fs');
-const assert = require('assert');
+const fs = require('node:fs');
+const path = require('node:path');
+const { afterEach, beforeEach, describe, it } = require('node:test');
+const assert = require('node:assert');
 
 const sharp = require('../../');
 const fixtures = require('../fixtures');
 
-describe('Input/output', function () {
-  beforeEach(function () {
+const outputJpg = fixtures.path('output.jpg');
+
+describe('Input/output', () => {
+  beforeEach(() => {
     sharp.cache(false);
   });
-  afterEach(function () {
+  afterEach(() => {
     sharp.cache(true);
   });
 
-  it('Read from File and write to Stream', function (done) {
-    const writable = fs.createWriteStream(fixtures.outputJpg);
-    writable.on('finish', function () {
-      sharp(fixtures.outputJpg).toBuffer(function (err, data, info) {
+  it('Read from File and write to Stream', (_t, done) => {
+    const writable = fs.createWriteStream(outputJpg);
+    writable.on('close', () => {
+      sharp(outputJpg).toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual(true, data.length > 0);
         assert.strictEqual(data.length, info.size);
         assert.strictEqual('jpeg', info.format);
         assert.strictEqual(320, info.width);
         assert.strictEqual(240, info.height);
-        fs.unlinkSync(fixtures.outputJpg);
-        done();
+        fs.rm(outputJpg, done);
       });
     });
     sharp(fixtures.inputJpg).resize(320, 240).pipe(writable);
   });
 
-  it('Read from Buffer and write to Stream', function (done) {
+  it('Read from Buffer and write to Stream', (_t, done) => {
     const inputJpgBuffer = fs.readFileSync(fixtures.inputJpg);
-    const writable = fs.createWriteStream(fixtures.outputJpg);
-    writable.on('finish', function () {
-      sharp(fixtures.outputJpg).toBuffer(function (err, data, info) {
+    const writable = fs.createWriteStream(outputJpg);
+    writable.on('close', () => {
+      sharp(outputJpg).toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual(true, data.length > 0);
         assert.strictEqual(data.length, info.size);
         assert.strictEqual('jpeg', info.format);
         assert.strictEqual(320, info.width);
         assert.strictEqual(240, info.height);
-        fs.unlinkSync(fixtures.outputJpg);
-        done();
+        fs.rm(outputJpg, done);
       });
     });
     sharp(inputJpgBuffer).resize(320, 240).pipe(writable);
   });
 
-  it('Read from Stream and write to File', function (done) {
+  it('Read from Stream and write to File', (_t, done) => {
     const readable = fs.createReadStream(fixtures.inputJpg);
-    const pipeline = sharp().resize(320, 240).toFile(fixtures.outputJpg, function (err, info) {
+    const pipeline = sharp().resize(320, 240).toFile(outputJpg, (err, info) => {
       if (err) throw err;
       assert.strictEqual(true, info.size > 0);
       assert.strictEqual('jpeg', info.format);
       assert.strictEqual(320, info.width);
       assert.strictEqual(240, info.height);
-      fs.unlinkSync(fixtures.outputJpg);
-      done();
+      fs.rm(outputJpg, done);
    });
    readable.pipe(pipeline);
  });
 
-  it('Read from Stream and write to Buffer', function (done) {
+  it('Read from Stream and write to Buffer', (_t, done) => {
     const readable = fs.createReadStream(fixtures.inputJpg);
-    const pipeline = sharp().resize(320, 240).toBuffer(function (err, data, info) {
+    const pipeline = sharp().resize(320, 240).toBuffer((err, data, info) => {
       if (err) throw err;
       assert.strictEqual(true, data.length > 0);
       assert.strictEqual(data.length, info.size);
@@ -77,23 +81,23 @@ describe('Input/output', function () {
     readable.pipe(pipeline);
   });
 
-  it('Read from Stream and write to Buffer via Promise resolved with Buffer', function () {
+  it('Read from Stream and write to Buffer via Promise resolved with Buffer', () => {
     const pipeline = sharp().resize(1, 1);
     fs.createReadStream(fixtures.inputJpg).pipe(pipeline);
     return pipeline
-      .toBuffer({resolveWithObject: false})
-      .then(function (data) {
+      .toBuffer({ resolveWithObject: false })
+      .then((data) => {
        assert.strictEqual(true, data instanceof Buffer);
        assert.strictEqual(true, data.length > 0);
      });
  });
 
-  it('Read from Stream and write to Buffer via Promise resolved with Object', function () {
+  it('Read from Stream and write to Buffer via Promise resolved with Object', () => {
     const pipeline = sharp().resize(1, 1);
     fs.createReadStream(fixtures.inputJpg).pipe(pipeline);
     return pipeline
-      .toBuffer({resolveWithObject: true})
-      .then(function (object) {
+      .toBuffer({ resolveWithObject: true })
+      .then((object) => {
        assert.strictEqual('object', typeof object);
        assert.strictEqual('object', typeof object.info);
        assert.strictEqual('jpeg', object.info.format);
@@ -105,21 +109,18 @@ describe('Input/output', function () {
     });
   });
 
-  it('Read from File and write to Buffer via Promise resolved with Buffer', function () {
-    return sharp(fixtures.inputJpg)
+  it('Read from File and write to Buffer via Promise resolved with Buffer', () => sharp(fixtures.inputJpg)
     .resize(1, 1)
-    .toBuffer({resolveWithObject: false})
-    .then(function (data) {
+    .toBuffer({ resolveWithObject: false })
+    .then((data) => {
      assert.strictEqual(true, data instanceof Buffer);
      assert.strictEqual(true, data.length > 0);
-    });
-  });
+    }));
 
-  it('Read from File and write to Buffer via Promise resolved with Object', function () {
-    return sharp(fixtures.inputJpg)
+  it('Read from File and write to Buffer via Promise resolved with Object', () => sharp(fixtures.inputJpg)
     .resize(1, 1)
-    .toBuffer({resolveWithObject: true})
-    .then(function (object) {
+    .toBuffer({ resolveWithObject: true })
+    .then((object) => {
      assert.strictEqual('object', typeof object);
      assert.strictEqual('object', typeof object.info);
      assert.strictEqual('jpeg', object.info.format);
@@ -128,121 +129,228 @@ describe('Input/output', function () {
      assert.strictEqual(3, object.info.channels);
      assert.strictEqual(true, object.data instanceof Buffer);
      assert.strictEqual(true, object.data.length > 0);
-    });
-  });
+    }));
 
-  it('Read from Stream and write to Stream', function (done) {
+  it('Read from Stream and write to Stream', (_t, done) => {
     const readable = fs.createReadStream(fixtures.inputJpg);
-    const writable = fs.createWriteStream(fixtures.outputJpg);
-    writable.on('finish', function () {
-      sharp(fixtures.outputJpg).toBuffer(function (err, data, info) {
+    const writable = fs.createWriteStream(outputJpg);
+    writable.on('close', () => {
+      sharp(outputJpg).toBuffer((err, data, info) => {
         if (err) throw err;
         assert.strictEqual(true, data.length > 0);
         assert.strictEqual(data.length, info.size);
         assert.strictEqual('jpeg', info.format);
         assert.strictEqual(320, info.width);
         assert.strictEqual(240, info.height);
-        fs.unlinkSync(fixtures.outputJpg);
-        done();
+        fs.rm(outputJpg, done);
       });
     });
     const pipeline = sharp().resize(320, 240);
     readable.pipe(pipeline).pipe(writable);
   });
 
-  it('Stream should emit info event', function (done) {
+  it('Read from ArrayBuffer and write to Buffer', async () => {
+    const uint8array = Uint8Array.from([255, 255, 255, 0, 0, 0]);
+    const arrayBuffer = new ArrayBuffer(uint8array.byteLength);
+    new Uint8Array(arrayBuffer).set(uint8array);
+    const { data, info } = await sharp(arrayBuffer, {
+      raw: {
+        width: 2,
+        height: 1,
+        channels: 3
+      }
+    }).toBuffer({ resolveWithObject: true });
+
+    assert.deepStrictEqual(uint8array, new Uint8Array(data));
+    assert.strictEqual(info.width, 2);
+    assert.strictEqual(info.height, 1);
+  });
+
+  it('Read from Uint8Array and write to Buffer', async () => {
+    const uint8array = Uint8Array.from([255, 255, 255, 0, 0, 0]);
+    const { data, info } = await sharp(uint8array, {
+      raw: {
+        width: 2,
+        height: 1,
+        channels: 3
+      }
+    }).toBuffer({ resolveWithObject: true });
+
+    assert.deepStrictEqual(uint8array, new Uint8Array(data));
+    assert.strictEqual(info.width, 2);
+    assert.strictEqual(info.height, 1);
+  });
+
+  it('Read from Uint8ClampedArray and output to Buffer', async () => {
+    // since a Uint8ClampedArray is the same as Uint8Array but clamps the values
+    // between 0-255 it seemed good to add this also
+    const uint8array = Uint8ClampedArray.from([255, 255, 255, 0, 0, 0]);
+    const { data, info } = await sharp(uint8array, {
+      raw: {
+        width: 2,
+        height: 1,
+        channels: 3
+      }
+    }).toBuffer({ resolveWithObject: true });
+
+    assert.deepStrictEqual(uint8array, new Uint8ClampedArray(data));
+    assert.strictEqual(info.width, 2);
+    assert.strictEqual(info.height, 1);
+  });
+
+  it('Read from Uint8ClampedArray with byteOffset and output to Buffer', async () => {
+    // since a Uint8ClampedArray is the same as Uint8Array but clamps the values
+    // between 0-255 it seemed good to add this also
+    const uint8array = Uint8ClampedArray.from([0, 0, 0, 255, 255, 255, 0, 0, 0, 255, 255, 255]);
+    const uint8ArrayWithByteOffset = new Uint8ClampedArray(uint8array.buffer, 3, 6);
+    const { data, info } = await sharp(uint8ArrayWithByteOffset, {
+      raw: {
+        width: 2,
+        height: 1,
+        channels: 3
+      }
+    }).toBuffer({ resolveWithObject: true });
+
+    assert.deepStrictEqual(Uint8ClampedArray.from([255, 255, 255, 0, 0, 0]), new Uint8ClampedArray(data));
+    assert.strictEqual(info.width, 2);
+    assert.strictEqual(info.height, 1);
+  });
+
+  it('Stream should emit info event', (_t, done) => {
     const readable = fs.createReadStream(fixtures.inputJpg);
-    const writable = fs.createWriteStream(fixtures.outputJpg);
+    const writable = fs.createWriteStream(outputJpg);
     const pipeline = sharp().resize(320, 240);
     let infoEventEmitted = false;
-    pipeline.on('info', function (info)
{ + pipeline.on('info', (info) => { assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); assert.strictEqual(3, info.channels); infoEventEmitted = true; }); - writable.on('finish', function () { + writable.on('close', () => { assert.strictEqual(true, infoEventEmitted); - fs.unlinkSync(fixtures.outputJpg); - done(); + fs.rm(outputJpg, done); + }); + readable.pipe(pipeline).pipe(writable); + }); + + it('Stream should emit close event', (_t, done) => { + const readable = fs.createReadStream(fixtures.inputJpg); + const writable = fs.createWriteStream(outputJpg); + const pipeline = sharp().resize(320, 240); + let closeEventEmitted = false; + pipeline.on('close', () => { + closeEventEmitted = true; + }); + writable.on('close', () => { + assert.strictEqual(true, closeEventEmitted); + fs.rm(outputJpg, done); }); readable.pipe(pipeline).pipe(writable); }); - it('Handle Stream to Stream error ', function (done) { + it('Handle Stream to Stream error ', (_t, done) => { const pipeline = sharp().resize(320, 240); let anErrorWasEmitted = false; - pipeline.on('error', function (err) { + pipeline.on('error', (err) => { anErrorWasEmitted = !!err; - }).on('end', function () { + }).on('end', () => { assert(anErrorWasEmitted); - fs.unlinkSync(fixtures.outputJpg); - done(); + fs.rm(outputJpg, done); }); const readableButNotAnImage = fs.createReadStream(__filename); - const writable = fs.createWriteStream(fixtures.outputJpg); + const writable = fs.createWriteStream(outputJpg); readableButNotAnImage.pipe(pipeline).pipe(writable); }); - it('Handle File to Stream error', function (done) { + it('Handle File to Stream error', (_t, done) => { const readableButNotAnImage = sharp(__filename).resize(320, 240); let anErrorWasEmitted = false; - readableButNotAnImage.on('error', function (err) { + readableButNotAnImage.on('error', (err) => { anErrorWasEmitted = !!err; - }).on('end', function () { + }).on('end', () => { 
assert(anErrorWasEmitted); - fs.unlinkSync(fixtures.outputJpg); - done(); + fs.rm(outputJpg, done); }); - const writable = fs.createWriteStream(fixtures.outputJpg); + const writable = fs.createWriteStream(outputJpg); readableButNotAnImage.pipe(writable); }); - it('Readable side of Stream can start flowing after Writable side has finished', function (done) { + it('Readable side of Stream can start flowing after Writable side has finished', (_t, done) => { const readable = fs.createReadStream(fixtures.inputJpg); - const writable = fs.createWriteStream(fixtures.outputJpg); - writable.on('finish', function () { - sharp(fixtures.outputJpg).toBuffer(function (err, data, info) { + const writable = fs.createWriteStream(outputJpg); + writable.on('close', () => { + sharp(outputJpg).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual(data.length, info.size); assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); - fs.unlinkSync(fixtures.outputJpg); - done(); + fs.rm(outputJpg, done); }); }); const pipeline = sharp().resize(320, 240); readable.pipe(pipeline); - pipeline.on('finish', function () { + pipeline.on('finish', () => { pipeline.pipe(writable); }); }); - it('Sequential read, force JPEG', function (done) { - sharp(fixtures.inputJpg) - .sequentialRead() + it('Non-Stream input generates error when provided Stream-like data', (_t, done) => { + sharp('input')._write('fail', null, (err) => { + assert.strictEqual(err.message, 'Unexpected data on Writable Stream'); + done(); + }); + }); + + it('Non-Buffer chunk on Stream input generates error', (_t, done) => { + sharp()._write('fail', null, (err) => { + assert.strictEqual(err.message, 'Non-Buffer data on Writable Stream'); + done(); + }); + }); + + it('Invalid sequential read option throws', () => { + assert.throws(() => { + sharp({ sequentialRead: 'fail' }); + }, /Expected boolean for sequentialRead but 
received fail of type string/); + }); + + it('Sequential read, force JPEG', () => + sharp(fixtures.inputJpg, { sequentialRead: true }) .resize(320, 240) .toFormat(sharp.format.jpeg) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + assert.strictEqual(data.length > 0, true); assert.strictEqual(data.length, info.size); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - done(); - }); - }); + assert.strictEqual(info.format, 'jpeg'); + assert.strictEqual(info.width, 320); + assert.strictEqual(info.height, 240); + }) + ); - it('Not sequential read, force JPEG', function (done) { - sharp(fixtures.inputJpg) - .sequentialRead(false) + it('Not sequential read, force JPEG', () => + sharp(fixtures.inputJpg, { sequentialRead: false }) .resize(320, 240) .toFormat('jpeg') - .toBuffer(function (err, data, info) { + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + assert.strictEqual(data.length > 0, true); + assert.strictEqual(data.length, info.size); + assert.strictEqual(info.format, 'jpeg'); + assert.strictEqual(info.width, 320); + assert.strictEqual(info.height, 240); + }) + ); + + it('Support output to jpg format', (_t, done) => { + sharp(fixtures.inputPng) + .resize(320, 240) + .toFormat('jpg') + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual(data.length, info.size); @@ -253,866 +361,284 @@ describe('Input/output', function () { }); }); - it('Support output to jpg format', function (done) { - sharp(fixtures.inputPng) + it('Support output to tif format', (_t, done) => { + sharp(fixtures.inputTiff) .resize(320, 240) - .toFormat('jpg') - .toBuffer(function (err, data, info) { + .toFormat('tif') + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); 
assert.strictEqual(data.length, info.size); - assert.strictEqual('jpeg', info.format); + assert.strictEqual('tiff', info.format); assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); done(); }); }); - it('Fail when output File is input File', function (done) { - sharp(fixtures.inputJpg).toFile(fixtures.inputJpg, function (err) { - assert(!!err); - done(); + it('Allow use of toBuffer and toFile with same instance', async () => { + const instance = sharp({ + create: { + width: 8, + height: 8, + channels: 3, + background: 'red' + } }); + await instance.toFile(fixtures.path('output.jpg')); + const data = await instance.toBuffer(); + assert.strictEqual(Buffer.isBuffer(data), true); }); - it('Fail when output File is input File via Promise', function (done) { - sharp(fixtures.inputJpg).toFile(fixtures.inputJpg).then(function (data) { - assert(false); - done(); - }).catch(function (err) { - assert(!!err); + it('Fail when output File is input File', (_t, done) => { + sharp(fixtures.inputJpg).toFile(fixtures.inputJpg, (err) => { + assert(err instanceof Error); + assert.strictEqual('Cannot use same file for input and output', err.message); done(); }); }); - it('Fail when output File is empty', function (done) { - sharp(fixtures.inputJpg).toFile('', function (err) { - assert(!!err); + it('Fail when output File is input File via Promise', (_t, done) => { + sharp(fixtures.inputJpg).toFile(fixtures.inputJpg).then(() => { + done(new Error('Unexpectedly resolved Promise')); + }).catch((err) => { + assert(err instanceof Error); + assert.strictEqual('Cannot use same file for input and output', err.message); done(); }); }); - it('Fail when output File is empty via Promise', function (done) { - sharp(fixtures.inputJpg).toFile('').then(function (data) { - assert(false); + it('Fail when output File is input File (relative output, absolute input)', (_t, done) => { + const relativePath = path.relative(process.cwd(), fixtures.inputJpg); + 
sharp(fixtures.inputJpg).toFile(relativePath, (err) => { + assert(err instanceof Error); + assert.strictEqual('Cannot use same file for input and output', err.message); done(); - }).catch(function (err) { - assert(!!err); + }); + }); + + it('Fail when output File is input File via Promise (relative output, absolute input)', (_t, done) => { + const relativePath = path.relative(process.cwd(), fixtures.inputJpg); + sharp(fixtures.inputJpg).toFile(relativePath).then(() => { + done(new Error('Unexpectedly resolved Promise')); + }).catch((err) => { + assert(err instanceof Error); + assert.strictEqual('Cannot use same file for input and output', err.message); done(); }); }); - it('Fail when input is empty Buffer', function (done) { - sharp(Buffer.alloc(0)).toBuffer().then(function () { - assert(false); + it('Fail when output File is input File (relative input, absolute output)', (_t, done) => { + const relativePath = path.relative(process.cwd(), fixtures.inputJpg); + sharp(relativePath).toFile(fixtures.inputJpg, (err) => { + assert(err instanceof Error); + assert.strictEqual('Cannot use same file for input and output', err.message); done(); - }).catch(function (err) { + }); + }); + + it('Fail when output File is input File via Promise (relative input, absolute output)', (_t, done) => { + const relativePath = path.relative(process.cwd(), fixtures.inputJpg); + sharp(relativePath).toFile(fixtures.inputJpg).then(() => { + done(new Error('Unexpectedly resolved Promise')); + }).catch((err) => { assert(err instanceof Error); + assert.strictEqual('Cannot use same file for input and output', err.message); done(); }); }); - it('Fail when input is invalid Buffer', function (done) { - sharp(Buffer.from([0x1, 0x2, 0x3, 0x4])).toBuffer().then(function () { - assert(false); + it('Fail when output File is empty', (_t, done) => { + sharp(fixtures.inputJpg).toFile('', (err) => { + assert(err instanceof Error); + assert.strictEqual('Missing output file path', err.message); done(); - 
}).catch(function (err) { + }); + }); + + it('Fail when output File is empty via Promise', (_t, done) => { + sharp(fixtures.inputJpg).toFile('').then(() => { + done(new Error('Unexpectedly resolved Promise')); + }).catch((err) => { assert(err instanceof Error); + assert.strictEqual('Missing output file path', err.message); done(); }); }); - describe('Fail for unsupported input', function () { - it('Undefined', function () { - assert.throws(function () { + it('Fail when input is invalid Buffer', async () => + assert.rejects( + () => sharp(Buffer.from([0x1, 0x2, 0x3, 0x4])).toBuffer(), + (err) => { + assert.strictEqual(err.message, 'Input buffer contains unsupported image format'); + assert(err.stack.includes('at Sharp.toBuffer')); + assert(err.stack.includes(__filename)); + return true; + } + ) + ); + + it('Fail when input file path is missing', async () => + assert.rejects( + () => sharp('does-not-exist').toFile('fail'), + (err) => { + assert.strictEqual(err.message, 'Input file is missing: does-not-exist'); + assert(err.stack.includes('at Sharp.toFile')); + assert(err.stack.includes(__filename)); + return true; + } + ) + ); + + describe('Fail for unsupported input', () => { + it('Undefined', () => { + assert.throws(() => { sharp(undefined); }); }); - it('Null', function () { - assert.throws(function () { + it('Null', () => { + assert.throws(() => { sharp(null); }); }); - it('Numeric', function () { - assert.throws(function () { + it('Numeric', () => { + assert.throws(() => { sharp(1); }); }); - it('Boolean', function () { - assert.throws(function () { + it('Boolean', () => { + assert.throws(() => { sharp(true); }); }); - it('Error Object', function () { - assert.throws(function () { + it('Error Object', () => { + assert.throws(() => { sharp(new Error()); }); }); }); - it('Promises/A+', function () { - return sharp(fixtures.inputJpg) - .resize(320, 240) - .toBuffer(); - }); - - it('JPEG quality', function (done) { - sharp(fixtures.inputJpg) - .resize(320, 240) - 
.jpeg({ quality: 70 }) - .toBuffer(function (err, buffer70) { - if (err) throw err; - sharp(fixtures.inputJpg) - .resize(320, 240) - .toBuffer(function (err, buffer80) { - if (err) throw err; - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ quality: 90 }) - .toBuffer(function (err, buffer90) { - if (err) throw err; - assert(buffer70.length < buffer80.length); - assert(buffer80.length < buffer90.length); - done(); - }); - }); - }); - }); - - describe('Invalid JPEG quality', function () { - [-1, 88.2, 'test'].forEach(function (quality) { - it(quality.toString(), function () { - assert.throws(function () { - sharp().jpeg({ quality: quality }); - }); - }); - }); - }); - - it('Progressive JPEG image', function (done) { - sharp(fixtures.inputJpg) + it('Promises/A+', () => sharp(fixtures.inputJpg) .resize(320, 240) - .jpeg({ progressive: false }) - .toBuffer(function (err, nonProgressiveData, nonProgressiveInfo) { - if (err) throw err; - assert.strictEqual(true, nonProgressiveData.length > 0); - assert.strictEqual(nonProgressiveData.length, nonProgressiveInfo.size); - assert.strictEqual('jpeg', nonProgressiveInfo.format); - assert.strictEqual(320, nonProgressiveInfo.width); - assert.strictEqual(240, nonProgressiveInfo.height); - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ progressive: true }) - .toBuffer(function (err, progressiveData, progressiveInfo) { - if (err) throw err; - assert.strictEqual(true, progressiveData.length > 0); - assert.strictEqual(progressiveData.length, progressiveInfo.size); - assert.strictEqual(false, progressiveData.length === nonProgressiveData.length); - assert.strictEqual('jpeg', progressiveInfo.format); - assert.strictEqual(320, progressiveInfo.width); - assert.strictEqual(240, progressiveInfo.height); - done(); - }); - }); - }); - - it('Progressive PNG image', function (done) { - sharp(fixtures.inputJpg) - .resize(320, 240) - .png({ progressive: false }) - .toBuffer(function (err, nonProgressiveData, nonProgressiveInfo) { - 
if (err) throw err; - assert.strictEqual(true, nonProgressiveData.length > 0); - assert.strictEqual(nonProgressiveData.length, nonProgressiveInfo.size); - assert.strictEqual('png', nonProgressiveInfo.format); - assert.strictEqual(320, nonProgressiveInfo.width); - assert.strictEqual(240, nonProgressiveInfo.height); - sharp(nonProgressiveData) - .png({ progressive: true }) - .toBuffer(function (err, progressiveData, progressiveInfo) { - if (err) throw err; - assert.strictEqual(true, progressiveData.length > 0); - assert.strictEqual(progressiveData.length, progressiveInfo.size); - assert.strictEqual(true, progressiveData.length > nonProgressiveData.length); - assert.strictEqual('png', progressiveInfo.format); - assert.strictEqual(320, progressiveInfo.width); - assert.strictEqual(240, progressiveInfo.height); - done(); - }); - }); - }); - - if (sharp.format.webp.output.buffer) { - it('WebP output', function (done) { - sharp(fixtures.inputJpg) - .resize(320, 240) - .toFormat(sharp.format.webp) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('webp', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - done(); - }); - }); - - it('should work for webp alpha quality', function (done) { - sharp(fixtures.inputPngAlphaPremultiplicationSmall) - .webp({alphaQuality: 80}) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('webp', info.format); - fixtures.assertSimilar(fixtures.expected('webp-alpha-80.webp'), data, done); - }); - }); + .toBuffer()); - it('should work for webp lossless', function (done) { - sharp(fixtures.inputPngAlphaPremultiplicationSmall) - .webp({lossless: true}) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('webp', info.format); - 
fixtures.assertSimilar(fixtures.expected('webp-lossless.webp'), data, done); - }); - }); - - it('should work for webp near-lossless', function (done) { - sharp(fixtures.inputPngAlphaPremultiplicationSmall) - .webp({nearLossless: true, quality: 50}) - .toBuffer(function (err50, data50, info50) { - if (err50) throw err50; - assert.strictEqual(true, data50.length > 0); - assert.strictEqual('webp', info50.format); - fixtures.assertSimilar(fixtures.expected('webp-near-lossless-50.webp'), data50, done); - }); - }); - - it('should use near-lossless when both lossless and nearLossless are specified', function (done) { - sharp(fixtures.inputPngAlphaPremultiplicationSmall) - .webp({nearLossless: true, quality: 50, lossless: true}) - .toBuffer(function (err50, data50, info50) { - if (err50) throw err50; - assert.strictEqual(true, data50.length > 0); - assert.strictEqual('webp', info50.format); - fixtures.assertSimilar(fixtures.expected('webp-near-lossless-50.webp'), data50, done); - }); - }); - } - - it('Invalid output format', function (done) { + it('Invalid output format', (_t, done) => { let isValid = false; try { sharp().toFormat('zoinks'); isValid = true; - } catch (e) {} + } catch (_err) {} assert(!isValid); done(); }); - it('File input with corrupt header fails gracefully', function (done) { + it('File input with corrupt header fails gracefully', (_t, done) => { sharp(fixtures.inputJpgWithCorruptHeader) - .toBuffer(function (err) { + .toBuffer((err) => { assert.strictEqual(true, !!err); done(); }); }); - it('Buffer input with corrupt header fails gracefully', function (done) { + it('Buffer input with corrupt header fails gracefully', (_t, done) => { sharp(fs.readFileSync(fixtures.inputJpgWithCorruptHeader)) - .toBuffer(function (err) { + .toBuffer((err) => { + assert.strictEqual(true, !!err); + done(); + }); + }); + + it('Stream input with corrupt header fails gracefully', (_t, done) => { + const transformer = sharp(); + transformer + .toBuffer() + .then(() => { + 
done(new Error('Unexpectedly resolved Promise')); + }) + .catch((err) => { assert.strictEqual(true, !!err); done(); }); + fs + .createReadStream(fixtures.inputJpgWithCorruptHeader) + .pipe(transformer); }); - describe('Output filename with unknown extension', function () { - it('Match JPEG input', function (done) { + describe('Output filename with unknown extension', () => { + const outputZoinks = fixtures.path('output.zoinks'); + + it('Match JPEG input', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 80) - .toFile(fixtures.outputZoinks, function (err, info) { + .toFile(outputZoinks, (err, info) => { if (err) throw err; assert.strictEqual(true, info.size > 0); assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(80, info.height); - fs.unlinkSync(fixtures.outputZoinks); - done(); + fs.rm(outputZoinks, done); }); }); - it('Match PNG input', function (done) { + it('Match PNG input', (_t, done) => { sharp(fixtures.inputPng) .resize(320, 80) - .toFile(fixtures.outputZoinks, function (err, info) { + .toFile(outputZoinks, (err, info) => { if (err) throw err; assert.strictEqual(true, info.size > 0); assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); assert.strictEqual(80, info.height); - fs.unlinkSync(fixtures.outputZoinks); - done(); + fs.rm(outputZoinks, done); }); }); - it('Match WebP input', function (done) { + it('Match WebP input', (_t, done) => { sharp(fixtures.inputWebP) .resize(320, 80) - .toFile(fixtures.outputZoinks, function (err, info) { + .toFile(outputZoinks, (err, info) => { if (err) throw err; assert.strictEqual(true, info.size > 0); assert.strictEqual('webp', info.format); assert.strictEqual(320, info.width); assert.strictEqual(80, info.height); - fs.unlinkSync(fixtures.outputZoinks); - done(); + fs.rm(outputZoinks, done); }); }); - it('Match TIFF input', function (done) { + it('Match TIFF input', (_t, done) => { sharp(fixtures.inputTiff) .resize(320, 80) - 
.toFile(fixtures.outputZoinks, function (err, info) { + .toFile(outputZoinks, (err, info) => { if (err) throw err; assert.strictEqual(true, info.size > 0); assert.strictEqual('tiff', info.format); assert.strictEqual(320, info.width); assert.strictEqual(80, info.height); - fs.unlinkSync(fixtures.outputZoinks); - done(); - }); - }); - - it('Autoconvert GIF input to PNG output', function (done) { - sharp(fixtures.inputGif) - .resize(320, 80) - .toFile(fixtures.outputZoinks, function (err, info) { - if (err) throw err; - assert.strictEqual(true, info.size > 0); - assert.strictEqual('png', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(80, info.height); - fs.unlinkSync(fixtures.outputZoinks); - done(); + fs.rm(outputZoinks, done); }); }); - it('Force JPEG format for PNG input', function (done) { + it('Force JPEG format for PNG input', (_t, done) => { sharp(fixtures.inputPng) .resize(320, 80) .jpeg() - .toFile(fixtures.outputZoinks, function (err, info) { + .toFile(outputZoinks, (err, info) => { if (err) throw err; assert.strictEqual(true, info.size > 0); assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(80, info.height); - fs.unlinkSync(fixtures.outputZoinks); - done(); - }); - }); - }); - - describe('PNG output', function () { - it('compression level is valid', function () { - assert.doesNotThrow(function () { - sharp().png({ compressionLevel: 0 }); - }); - }); - - it('compression level is invalid', function () { - assert.throws(function () { - sharp().png({ compressionLevel: -1 }); - }); - }); - - it('without adaptiveFiltering generates smaller file', function (done) { - // First generate with adaptive filtering - sharp(fixtures.inputPng) - .resize(320, 240) - .png({ adaptiveFiltering: true }) - .toBuffer(function (err, adaptiveData, adaptiveInfo) { - if (err) throw err; - assert.strictEqual(true, adaptiveData.length > 0); - assert.strictEqual(adaptiveData.length, adaptiveInfo.size); - 
assert.strictEqual('png', adaptiveInfo.format); - assert.strictEqual(320, adaptiveInfo.width); - assert.strictEqual(240, adaptiveInfo.height); - // Then generate without - sharp(fixtures.inputPng) - .resize(320, 240) - .png({ adaptiveFiltering: false }) - .toBuffer(function (err, withoutAdaptiveData, withoutAdaptiveInfo) { - if (err) throw err; - assert.strictEqual(true, withoutAdaptiveData.length > 0); - assert.strictEqual(withoutAdaptiveData.length, withoutAdaptiveInfo.size); - assert.strictEqual('png', withoutAdaptiveInfo.format); - assert.strictEqual(320, withoutAdaptiveInfo.width); - assert.strictEqual(240, withoutAdaptiveInfo.height); - assert.strictEqual(true, withoutAdaptiveData.length < adaptiveData.length); - done(); - }); - }); - }); - - it('Invalid PNG adaptiveFiltering value throws error', function () { - assert.throws(function () { - sharp().png({ adaptiveFiltering: 1 }); - }); - }); - }); - - it('Without chroma subsampling generates larger file', function (done) { - // First generate with chroma subsampling (default) - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ chromaSubsampling: '4:2:0' }) - .toBuffer(function (err, withChromaSubsamplingData, withChromaSubsamplingInfo) { - if (err) throw err; - assert.strictEqual(true, withChromaSubsamplingData.length > 0); - assert.strictEqual(withChromaSubsamplingData.length, withChromaSubsamplingInfo.size); - assert.strictEqual('jpeg', withChromaSubsamplingInfo.format); - assert.strictEqual(320, withChromaSubsamplingInfo.width); - assert.strictEqual(240, withChromaSubsamplingInfo.height); - // Then generate without - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ chromaSubsampling: '4:4:4' }) - .toBuffer(function (err, withoutChromaSubsamplingData, withoutChromaSubsamplingInfo) { - if (err) throw err; - assert.strictEqual(true, withoutChromaSubsamplingData.length > 0); - assert.strictEqual(withoutChromaSubsamplingData.length, withoutChromaSubsamplingInfo.size); - assert.strictEqual('jpeg', 
withoutChromaSubsamplingInfo.format); - assert.strictEqual(320, withoutChromaSubsamplingInfo.width); - assert.strictEqual(240, withoutChromaSubsamplingInfo.height); - assert.strictEqual(true, withChromaSubsamplingData.length < withoutChromaSubsamplingData.length); - done(); - }); - }); - }); - - it('Invalid JPEG chromaSubsampling value throws error', function () { - assert.throws(function () { - sharp().jpeg({ chromaSubsampling: '4:2:2' }); - }); - }); - - it('Trellis quantisation', function (done) { - // First generate without - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ trellisQuantisation: false }) - .toBuffer(function (err, withoutData, withoutInfo) { - if (err) throw err; - assert.strictEqual(true, withoutData.length > 0); - assert.strictEqual(withoutData.length, withoutInfo.size); - assert.strictEqual('jpeg', withoutInfo.format); - assert.strictEqual(320, withoutInfo.width); - assert.strictEqual(240, withoutInfo.height); - // Then generate with - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ trellisQuantization: true }) - .toBuffer(function (err, withData, withInfo) { - if (err) throw err; - assert.strictEqual(true, withData.length > 0); - assert.strictEqual(withData.length, withInfo.size); - assert.strictEqual('jpeg', withInfo.format); - assert.strictEqual(320, withInfo.width); - assert.strictEqual(240, withInfo.height); - // Verify image is same (as mozjpeg may not be present) size or less - assert.strictEqual(true, withData.length <= withoutData.length); - done(); - }); - }); - }); - - it('Overshoot deringing', function (done) { - // First generate without - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ overshootDeringing: false }) - .toBuffer(function (err, withoutData, withoutInfo) { - if (err) throw err; - assert.strictEqual(true, withoutData.length > 0); - assert.strictEqual(withoutData.length, withoutInfo.size); - assert.strictEqual('jpeg', withoutInfo.format); - assert.strictEqual(320, withoutInfo.width); - 
assert.strictEqual(240, withoutInfo.height); - // Then generate with - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ overshootDeringing: true }) - .toBuffer(function (err, withData, withInfo) { - if (err) throw err; - assert.strictEqual(true, withData.length > 0); - assert.strictEqual(withData.length, withInfo.size); - assert.strictEqual('jpeg', withInfo.format); - assert.strictEqual(320, withInfo.width); - assert.strictEqual(240, withInfo.height); - done(); - }); - }); - }); - - it('Optimise scans generates different output length', function (done) { - // First generate without - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ optimiseScans: false }) - .toBuffer(function (err, withoutData, withoutInfo) { - if (err) throw err; - assert.strictEqual(true, withoutData.length > 0); - assert.strictEqual(withoutData.length, withoutInfo.size); - assert.strictEqual('jpeg', withoutInfo.format); - assert.strictEqual(320, withoutInfo.width); - assert.strictEqual(240, withoutInfo.height); - // Then generate with - sharp(fixtures.inputJpg) - .resize(320, 240) - .jpeg({ optimizeScans: true }) - .toBuffer(function (err, withData, withInfo) { - if (err) throw err; - assert.strictEqual(true, withData.length > 0); - assert.strictEqual(withData.length, withInfo.size); - assert.strictEqual('jpeg', withInfo.format); - assert.strictEqual(320, withInfo.width); - assert.strictEqual(240, withInfo.height); - // Verify image is of a different size (progressive output even without mozjpeg) - assert.notEqual(withData.length, withoutData.length); - done(); - }); - }); - }); - - it('Convert SVG to PNG at default 72DPI', function (done) { - sharp(fixtures.inputSvg) - .resize(1024) - .extract({left: 290, top: 760, width: 40, height: 40}) - .toFormat('png') - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(40, info.width); - assert.strictEqual(40, info.height); - 
fixtures.assertSimilar(fixtures.expected('svg72.png'), data, function (err) { - if (err) throw err; - sharp(data).metadata(function (err, info) { - if (err) throw err; - assert.strictEqual(72, info.density); - done(); - }); - }); - }); - }); - - it('Convert SVG to PNG at 300DPI', function (done) { - sharp(fixtures.inputSvg, { density: 1200 }) - .resize(1024) - .extract({left: 290, top: 760, width: 40, height: 40}) - .toFormat('png') - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(40, info.width); - assert.strictEqual(40, info.height); - fixtures.assertSimilar(fixtures.expected('svg1200.png'), data, function (err) { - if (err) throw err; - sharp(data).metadata(function (err, info) { - if (err) throw err; - assert.strictEqual(1200, info.density); - done(); - }); - }); - }); - }); - - it('Convert SVG with embedded images to PNG, respecting dimensions, autoconvert to PNG', function (done) { - sharp(fixtures.inputSvgWithEmbeddedImages) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(480, info.width); - assert.strictEqual(360, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.expected('svg-embedded.png'), data, done); - }); - }); - - it('Load TIFF from Buffer', function (done) { - const inputTiffBuffer = fs.readFileSync(fixtures.inputTiff); - sharp(inputTiffBuffer) - .resize(320, 240) - .jpeg() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual(data.length, info.size); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - done(); - }); - }); - - it('Save TIFF to Buffer', function (done) { - sharp(fixtures.inputTiff) - .resize(320, 240) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length 
> 0); - assert.strictEqual(data.length, info.size); - assert.strictEqual('tiff', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - done(); - }); - }); - - it('Invalid WebP quality throws error', function () { - assert.throws(function () { - sharp().webp({ quality: 101 }); - }); - }); - - it('Invalid WebP alpha quality throws error', function () { - assert.throws(function () { - sharp().webp({ alphaQuality: 101 }); - }); - }); - - it('Invalid TIFF quality throws error', function () { - assert.throws(function () { - sharp().tiff({ quality: 101 }); - }); - }); - - it('Missing TIFF quality does not throw error', function () { - assert.doesNotThrow(function () { - sharp().tiff(); - }); - }); - - it('Not squashing TIFF to a bit depth of 1 should not change the file size', function (done) { - const startSize = fs.statSync(fixtures.inputTiff8BitDepth).size; - sharp(fixtures.inputTiff8BitDepth) - .toColourspace('b-w') // can only squash 1 band uchar images - .tiff({ - squash: false, - compression: 'none' - }) - .toFile(fixtures.outputTiff, (err, info) => { - if (err) throw err; - assert.strictEqual('tiff', info.format); - assert(info.size === startSize); - fs.unlink(fixtures.outputTiff, done); - }); - }); - - it('Squashing TIFF to a bit depth of 1 should significantly reduce file size', function (done) { - const startSize = fs.statSync(fixtures.inputTiff8BitDepth).size; - sharp(fixtures.inputTiff8BitDepth) - .toColourspace('b-w') // can only squash 1 band uchar images - .tiff({ - squash: true, - compression: 'none' - }) - .toFile(fixtures.outputTiff, (err, info) => { - if (err) throw err; - assert.strictEqual('tiff', info.format); - assert(info.size < (startSize / 2)); - fs.unlink(fixtures.outputTiff, done); - }); - }); - - it('Invalid TIFF squash value throws error', function () { - assert.throws(function () { - sharp().tiff({ squash: 'true' }); - }); - }); - - it('TIFF setting xres and yres on file', function (done) { - const 
res = 1000.0; // inputTiff has a dpi of 300 (res*2.54) - sharp(fixtures.inputTiff) - .tiff({ - xres: (res), - yres: (res) - }) - .toFile(fixtures.outputTiff, (err, info) => { - if (err) throw err; - assert.strictEqual('tiff', info.format); - sharp(fixtures.outputTiff).metadata(function (err, metadata) { - if (err) throw err; - assert.strictEqual(metadata.density, res * 2.54); // convert to dpi - fs.unlink(fixtures.outputTiff, done); - }); - }); - }); - - it('TIFF setting xres and yres on buffer', function (done) { - const res = 1000.0; // inputTiff has a dpi of 300 (res*2.54) - sharp(fixtures.inputTiff) - .tiff({ - xres: (res), - yres: (res) - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - sharp(data).metadata(function (err, metadata) { - if (err) throw err; - assert.strictEqual(metadata.density, res * 2.54); // convert to dpi - done(); + fs.rm(outputZoinks, done); }); - }); - }); - - it('TIFF invalid xres value should throw an error', function () { - assert.throws(function () { - sharp().tiff({ xres: '1000.0' }); - }); - }); - - it('TIFF invalid yres value should throw an error', function () { - assert.throws(function () { - sharp().tiff({ yres: '1000.0' }); - }); - }); - - it('TIFF lzw compression with horizontal predictor shrinks test file', function (done) { - const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; - sharp(fixtures.inputTiffUncompressed) - .tiff({ - compression: 'lzw', - predictor: 'horizontal' - }) - .toFile(fixtures.outputTiff, (err, info) => { - if (err) throw err; - assert.strictEqual('tiff', info.format); - assert(info.size < startSize); - fs.unlink(fixtures.outputTiff, done); - }); - }); - - it('TIFF deflate compression with horizontal predictor shrinks test file', function (done) { - const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; - sharp(fixtures.inputTiffUncompressed) - .tiff({ - compression: 'deflate', - predictor: 'horizontal' - }) - .toFile(fixtures.outputTiff, (err, info) => { - 
if (err) throw err; - assert.strictEqual('tiff', info.format); - assert(info.size < startSize); - fs.unlink(fixtures.outputTiff, done); - }); - }); - - it('TIFF deflate compression with float predictor shrinks test file', function (done) { - const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; - sharp(fixtures.inputTiffUncompressed) - .tiff({ - compression: 'deflate', - predictor: 'float' - }) - .toFile(fixtures.outputTiff, (err, info) => { - if (err) throw err; - assert.strictEqual('tiff', info.format); - assert(info.size < startSize); - fs.unlink(fixtures.outputTiff, done); - }); - }); - - it('TIFF deflate compression without predictor shrinks test file', function (done) { - const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; - sharp(fixtures.inputTiffUncompressed) - .tiff({ - compression: 'deflate', - predictor: 'none' - }) - .toFile(fixtures.outputTiff, (err, info) => { - if (err) throw err; - assert.strictEqual('tiff', info.format); - assert(info.size < startSize); - fs.unlink(fixtures.outputTiff, done); - }); - }); - - it('TIFF jpeg compression shrinks test file', function (done) { - const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; - sharp(fixtures.inputTiffUncompressed) - .tiff({ - compression: 'jpeg' - }) - .toFile(fixtures.outputTiff, (err, info) => { - if (err) throw err; - assert.strictEqual('tiff', info.format); - assert(info.size < startSize); - fs.unlink(fixtures.outputTiff, done); - }); - }); - - it('TIFF none compression does not throw error', function () { - assert.doesNotThrow(function () { - sharp().tiff({ compression: 'none' }); - }); - }); - - it('TIFF lzw compression does not throw error', function () { - assert.doesNotThrow(function () { - sharp().tiff({ compression: 'lzw' }); - }); - }); - - it('TIFF deflate compression does not throw error', function () { - assert.doesNotThrow(function () { - sharp().tiff({ compression: 'deflate' }); - }); - }); - - it('TIFF invalid compression option 
throws', function () { - assert.throws(function () { - sharp().tiff({ compression: 0 }); - }); - }); - - it('TIFF invalid compression option throws', function () { - assert.throws(function () { - sharp().tiff({ compression: 'a' }); - }); - }); - - it('TIFF invalid predictor option throws', function () { - assert.throws(function () { - sharp().tiff({ predictor: 'a' }); - }); - }); - - it('TIFF horizontal predictor does not throw error', function () { - assert.doesNotThrow(function () { - sharp().tiff({ predictor: 'horizontal' }); - }); - }); - - it('TIFF float predictor does not throw error', function () { - assert.doesNotThrow(function () { - sharp().tiff({ predictor: 'float' }); }); }); - it('TIFF none predictor does not throw error', function () { - assert.doesNotThrow(function () { - sharp().tiff({ predictor: 'none' }); - }); - }); - - it('Input and output formats match when not forcing', function (done) { + it('Input and output formats match when not forcing', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .png({ compressionLevel: 1, force: false }) - .toBuffer(function (err, data, info) { + .toBuffer((err, _data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -1121,41 +647,42 @@ describe('Input/output', function () { }); }); - it('Load GIF from Buffer', function (done) { - const inputGifBuffer = fs.readFileSync(fixtures.inputGif); - sharp(inputGifBuffer) + it('Can force output format with output chaining', () => sharp(fixtures.inputJpg) .resize(320, 240) + .png({ force: true }) + .jpeg({ force: false }) + .toBuffer({ resolveWithObject: true }) + .then((out) => { + assert.strictEqual('png', out.info.format); + })); + + it('toFormat=JPEG takes precedence over WebP extension', (_t, done) => { + const outputWebP = fixtures.path('output.webp'); + sharp(fixtures.inputPng) + .resize(8) .jpeg() - .toBuffer(function (err, data, info) { + .toFile(outputWebP, (err, info) => { if (err) throw err; 
- assert.strictEqual(true, data.length > 0); - assert.strictEqual(data.length, info.size); assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - done(); + fs.rm(outputWebP, done); }); }); - it('Load GIF grey+alpha from file, auto convert to PNG', function (done) { - sharp(fixtures.inputGifGreyPlusAlpha) - .resize(8, 4) - .toBuffer(function (err, data, info) { + it('toFormat=WebP takes precedence over JPEG extension', (_t, done) => { + sharp(fixtures.inputPng) + .resize(8) + .webp() + .toFile(outputJpg, (err, info) => { if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual(data.length, info.size); - assert.strictEqual('png', info.format); - assert.strictEqual(8, info.width); - assert.strictEqual(4, info.height); - assert.strictEqual(4, info.channels); + assert.strictEqual('webp', info.format); done(); }); }); - it('Load Vips V file', function (done) { + it('Load Vips V file', (_t, done) => { sharp(fixtures.inputV) .jpeg() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -1165,255 +692,296 @@ describe('Input/output', function () { }); }); - it('Save Vips V file', function (done) { + it('Save Vips V file', (_t, done) => { + const outputV = fixtures.path('output.v'); sharp(fixtures.inputJpg) - .extract({left: 910, top: 1105, width: 70, height: 60}) - .toFile(fixtures.outputV, function (err, info) { + .extract({ left: 910, top: 1105, width: 70, height: 60 }) + .toFile(outputV, (err, info) => { if (err) throw err; assert.strictEqual(true, info.size > 0); assert.strictEqual('v', info.format); assert.strictEqual(70, info.width); assert.strictEqual(60, info.height); - fs.unlinkSync(fixtures.outputV); - done(); + fs.rm(outputV, done); }); }); - describe('Ouput raw, uncompressed image data', function () { - it('1 channel greyscale image', 
function (done) { - sharp(fixtures.inputJpg) - .greyscale() - .resize(32, 24) - .raw() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(32 * 24 * 1, info.size); - assert.strictEqual(data.length, info.size); - assert.strictEqual('raw', info.format); - assert.strictEqual(32, info.width); - assert.strictEqual(24, info.height); - done(); - }); - }); - it('3 channel colour image without transparency', function (done) { - sharp(fixtures.inputJpg) - .resize(32, 24) - .toFormat('raw') - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(32 * 24 * 3, info.size); - assert.strictEqual(data.length, info.size); - assert.strictEqual('raw', info.format); - assert.strictEqual(32, info.width); - assert.strictEqual(24, info.height); - done(); - }); - }); - it('4 channel colour image with transparency', function (done) { - sharp(fixtures.inputPngWithTransparency) - .resize(32, 24) - .toFormat(sharp.format.raw) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(32 * 24 * 4, info.size); - assert.strictEqual(data.length, info.size); - assert.strictEqual('raw', info.format); - assert.strictEqual(32, info.width); - assert.strictEqual(24, info.height); - done(); - }); - }); - }); - - describe('Limit pixel count of input image', function () { - it('Invalid fails - negative', function (done) { - let isValid = false; - try { - sharp().limitInputPixels(-1); - isValid = true; - } catch (e) {} - assert(!isValid); - done(); - }); - - it('Invalid fails - float', function (done) { - let isValid = false; - try { - sharp().limitInputPixels(12.3); - isValid = true; - } catch (e) {} - assert(!isValid); - done(); - }); - - it('Invalid fails - string', function (done) { - let isValid = false; - try { - sharp().limitInputPixels('fail'); - isValid = true; - } catch (e) {} - assert(!isValid); - done(); - }); - - it('Same size as input works', function (done) { - sharp(fixtures.inputJpg).metadata(function (err, 
metadata) { - if (err) throw err; - sharp(fixtures.inputJpg) - .limitInputPixels(metadata.width * metadata.height) - .toBuffer(function (err) { - assert.strictEqual(true, !err); - done(); - }); - }); - }); - - it('Disabling limit works', function (done) { - sharp(fixtures.inputJpgLarge) - .limitInputPixels(false) - .resize(2) - .toBuffer(function (err) { - assert.strictEqual(true, !err); - done(); - }); - }); + it('can ignore ICC profile', async () => { + const [r1, g1, b1] = await sharp(fixtures.inputJpgWithPortraitExif5, { ignoreIcc: true }) + .extract({ width: 1, height: 1, top: 16, left: 16 }) + .raw() + .toBuffer(); - it('Enabling default limit works and fails with a large image', function (done) { - sharp(fixtures.inputJpgLarge) - .limitInputPixels(true) - .toBuffer(function (err) { - assert.strictEqual(true, !!err); - done(); - }); - }); + const [r2, g2, b2] = await sharp(fixtures.inputJpgWithPortraitExif5, { ignoreIcc: false }) + .extract({ width: 1, height: 1, top: 16, left: 16 }) + .raw() + .toBuffer(); - it('Smaller than input fails', function (done) { - sharp(fixtures.inputJpg).metadata(function (err, metadata) { - if (err) throw err; - sharp(fixtures.inputJpg) - .limitInputPixels((metadata.width * metadata.height) - 1) - .toBuffer(function (err) { - assert.strictEqual(true, !!err); - done(); - }); - }); + assert.deepStrictEqual({ r1, g1, b1, r2, g2, b2 }, { + r1: 60, + r2: 77, + g1: 54, + g2: 69, + b1: 20, + b2: 25 }); }); - describe('Input options', function () { - it('Non-Object options fails', function () { - assert.throws(function () { - sharp(null, 'zoinks'); - }); - }); - it('Invalid density: string', function () { - assert.throws(function () { - sharp(null, { density: 'zoinks' }); - }); - }); - it('Invalid density: float', function () { - assert.throws(function () { - sharp(null, { density: 0.5 }); + describe('Switch off safety limits for certain formats', () => { + it('Valid', () => { + assert.doesNotThrow(() => { + sharp({ unlimited: true }); 
}); }); - it('Ignore unknown attribute', function () { - sharp(null, { unknown: true }); + it('Invalid', () => { + assert.throws(() => { + sharp({ unlimited: -1 }); + }, /Expected boolean for unlimited but received -1 of type number/); }); }); - describe('Raw pixel input', function () { - it('Missing options', function () { - assert.throws(function () { - sharp({ raw: {} }); + describe('Limit pixel count of input image', () => { + it('Invalid fails - negative', () => { + assert.throws(() => { + sharp({ limitInputPixels: -1 }); }); }); - it('Incomplete options', function () { - assert.throws(function () { - sharp({ raw: { width: 1, height: 1 } }); - }); - }); - it('Invalid channels', function () { - assert.throws(function () { - sharp({ raw: { width: 1, height: 1, channels: 5 } }); + + it('Invalid fails - float', () => { + assert.throws(() => { + sharp({ limitInputPixels: 12.3 }); }); }); - it('Invalid height', function () { - assert.throws(function () { - sharp({ raw: { width: 1, height: 0, channels: 4 } }); + + it('Invalid fails - integer overflow', () => { + assert.throws(() => { + sharp({ limitInputPixels: Number.MAX_SAFE_INTEGER + 1 }); }); }); - it('Invalid width', function () { - assert.throws(function () { - sharp({ raw: { width: 'zoinks', height: 1, channels: 4 } }); + + it('Invalid fails - string', () => { + assert.throws(() => { + sharp({ limitInputPixels: 'fail' }); }); }); - it('RGB', function (done) { - // Convert to raw pixel data + + it('Same size as input works', () => sharp(fixtures.inputJpg) - .resize(256) - .raw() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(256, info.width); - assert.strictEqual(209, info.height); - assert.strictEqual(3, info.channels); - // Convert back to JPEG - sharp(data, { - raw: { - width: info.width, - height: info.height, - channels: info.channels - }}) - .jpeg() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(256, info.width); - 
assert.strictEqual(209, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(fixtures.inputJpg, data, done); - }); - }); - }); - it('RGBA', function (done) { - // Convert to raw pixel data - sharp(fixtures.inputPngOverlayLayer1) - .resize(256) - .raw() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(256, info.width); - assert.strictEqual(192, info.height); - assert.strictEqual(4, info.channels); - // Convert back to PNG - sharp(data, { - raw: { - width: info.width, - height: info.height, - channels: info.channels - }}) - .png() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(256, info.width); - assert.strictEqual(192, info.height); - assert.strictEqual(4, info.channels); - fixtures.assertSimilar(fixtures.inputPngOverlayLayer1, data, { threshold: 7 }, done); - }); - }); - }); + .metadata() + .then(({ width, height }) => + sharp(fixtures.inputJpg, { limitInputPixels: width * height }) + .resize(2) + .toBuffer() + ) + ); + + it('Disabling limit works', () => + sharp(fixtures.inputJpgLarge, { limitInputPixels: false }) + .resize(2) + .toBuffer() + ); + + it('Enabling default limit works and fails with a large image', () => + sharp(fixtures.inputJpgLarge, { limitInputPixels: true }) + .toBuffer() + .then(() => { + assert.fail('Expected to fail'); + }) + .catch(err => { + assert.strictEqual(err.message, 'Input image exceeds pixel limit'); + }) + ); + + it('Enabling default limit works and fails for an image with resolution higher than uint32 limit', () => + sharp(fixtures.inputPngUint32Limit, { limitInputPixels: true }) + .toBuffer() + .then(() => { + assert.fail('Expected to fail'); + }) + .catch(err => { + assert.strictEqual(err.message, 'Input image exceeds pixel limit'); + }) + ); + + it('Smaller than input fails', () => + sharp(fixtures.inputJpg) + .metadata() + .then(({ width, height }) => + sharp(fixtures.inputJpg, { limitInputPixels: width * height - 1 }) + 
.toBuffer() + .then(() => { + assert.fail('Expected to fail'); + }) + .catch(err => { + assert.strictEqual(err.message, 'Input image exceeds pixel limit'); + }) + ) + ); + }); + + describe('Input options', () => { + it('Option-less', () => { + sharp(); + }); + it('Ignore unknown attribute', () => { + sharp({ unknown: true }); + }); + it('undefined with options fails', () => { + assert.throws(() => { + sharp(undefined, {}); + }, /Unsupported input 'undefined' of type undefined when also providing options of type object/); + }); + it('null with options fails', () => { + assert.throws(() => { + sharp(null, {}); + }, /Unsupported input 'null' of type object when also providing options of type object/); + }); + it('Non-Object options fails', () => { + assert.throws(() => { + sharp('test', 'zoinks'); + }, /Invalid input options zoinks/); + }); + it('Invalid density: string', () => { + assert.throws(() => { + sharp({ density: 'zoinks' }); + }, /Expected number between 1 and 100000 for density but received zoinks of type string/); + }); + it('Invalid ignoreIcc: string', () => { + assert.throws(() => { + sharp({ ignoreIcc: 'zoinks' }); + }, /Expected boolean for ignoreIcc but received zoinks of type string/); + }); + it('Setting animated property updates pages property', () => { + assert.strictEqual(sharp({ animated: false }).options.input.pages, 1); + assert.strictEqual(sharp({ animated: true }).options.input.pages, -1); + }); + it('Invalid animated property throws', () => { + assert.throws(() => { + sharp({ animated: -1 }); + }, /Expected boolean for animated but received -1 of type number/); + }); + it('Invalid page property throws', () => { + assert.throws(() => { + sharp({ page: -1 }); + }, /Expected integer between 0 and 100000 for page but received -1 of type number/); + }); + it('Invalid pages property throws', () => { + assert.throws(() => { + sharp({ pages: '1' }); + }, /Expected integer between -1 and 100000 for pages but received 1 of type string/); + }); + 
it('Valid openSlide.level property', () => { + sharp({ openSlide: { level: 1 } }); + sharp({ level: 1 }); + }); + it('Invalid openSlide.level property (string) throws', () => { + assert.throws( + () => sharp({ openSlide: { level: '1' } }), + /Expected integer between 0 and 256 for openSlide.level but received 1 of type string/ + ); + assert.throws( + () => sharp({ level: '1' }), + /Expected integer between 0 and 256 for level but received 1 of type string/ + ); + }); + it('Invalid openSlide.level property (negative) throws', () => { + assert.throws( + () => sharp({ openSlide: { level: -1 } }), + /Expected integer between 0 and 256 for openSlide\.level but received -1 of type number/ + ); + assert.throws( + () => sharp({ level: -1 }), + /Expected integer between 0 and 256 for level but received -1 of type number/ + ); + }); + it('Valid tiff.subifd property', () => { + sharp({ tiff: { subifd: 1 } }); + sharp({ subifd: 1 }); + }); + it('Invalid tiff.subifd property (string) throws', () => { + assert.throws( + () => sharp({ tiff: { subifd: '1' } }), + /Expected integer between -1 and 100000 for tiff\.subifd but received 1 of type string/ + ); + assert.throws( + () => sharp({ subifd: '1' }), + /Expected integer between -1 and 100000 for subifd but received 1 of type string/ + ); + }); + it('Invalid tiff.subifd property (float) throws', () => { + assert.throws( + () => sharp({ tiff: { subifd: 1.2 } }), + /Expected integer between -1 and 100000 for tiff\.subifd but received 1.2 of type number/ + ); + assert.throws( + () => sharp({ subifd: 1.2 }), + /Expected integer between -1 and 100000 for subifd but received 1.2 of type number/ + ); + }); + it('Valid pdf.background property (string)', () => { + sharp({ pdf: { background: '#00ff00' } }); + sharp({ pdfBackground: '#00ff00' }); + }); + it('Valid pdf.background property (object)', () => { + sharp({ pdf: { background: { r: 0, g: 255, b: 0 } } }); + sharp({ pdfBackground: { r: 0, g: 255, b: 0 } }); + }); + it('Invalid 
pdf.background property (string) throws', () => { + assert.throws( + () => sharp({ pdf: { background: '00ff00' } }), + /Unable to parse color from string/ + ); + assert.throws( + () => sharp({ pdfBackground: '00ff00' }), + /Unable to parse color from string/ + ); + }); + it('Invalid pdf.background property (number) throws', () => { + assert.throws( + () => sharp({ pdf: { background: 255 } }), + /Expected object or string for background/ + ); + assert.throws( + () => sharp({ pdf: { background: 255 } }), + /Expected object or string for background/ + ); + }); + it('Invalid pdf.background property (object)', () => { + assert.throws( + () => sharp({ pdf: { background: { red: 0, green: 255, blue: 0 } } }), + /Unable to parse color from object/ + ); + assert.throws( + () => sharp({ pdfBackground: { red: 0, green: 255, blue: 0 } }), + /Unable to parse color from object/ + ); + }); + }); + + it('Fails when writing to missing directory', async () => { + const create = { + width: 8, + height: 8, + channels: 3, + background: { r: 0, g: 0, b: 0 } + }; + await assert.rejects( + () => sharp({ create }).toFile('does-not-exist/out.jpg'), + /unable to open for write/ + ); }); - describe('create new image', function () { - it('RGB', function (done) { + describe('create new image', () => { + it('RGB', (_t, done) => { const create = { width: 10, height: 20, channels: 3, background: { r: 0, g: 255, b: 0 } }; - sharp({ create: create }) + sharp({ create }) .jpeg() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(create.width, info.width); assert.strictEqual(create.height, info.height); @@ -1422,16 +990,16 @@ describe('Input/output', function () { fixtures.assertSimilar(fixtures.expected('create-rgb.jpg'), data, done); }); }); - it('RGBA', function (done) { + it('RGBA', (_t, done) => { const create = { width: 20, height: 10, channels: 4, background: { r: 255, g: 0, b: 0, alpha: 128 } }; - sharp({ create: create }) + 
sharp({ create }) .png() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(create.width, info.width); assert.strictEqual(create.height, info.height); @@ -1440,40 +1008,40 @@ describe('Input/output', function () { fixtures.assertSimilar(fixtures.expected('create-rgba.png'), data, done); }); }); - it('Invalid channels', function () { + it('Invalid channels', () => { const create = { width: 10, height: 20, channels: 2, background: { r: 0, g: 0, b: 0 } }; - assert.throws(function () { - sharp({ create: create }); + assert.throws(() => { + sharp({ create }); }); }); - it('Missing background', function () { + it('Missing background', () => { const create = { width: 10, height: 20, channels: 3 }; - assert.throws(function () { - sharp({ create: create }); + assert.throws(() => { + sharp({ create }); }); }); }); - it('Queue length change events', function (done) { + it('Queue length change events', (_t, done) => { let eventCounter = 0; - const queueListener = function (queueLength) { + const queueListener = (queueLength) => { assert.strictEqual(true, queueLength === 0 || queueLength === 1); eventCounter++; }; sharp.queue.on('change', queueListener); sharp(fixtures.inputJpg) .resize(320, 240) - .toBuffer(function (err) { - process.nextTick(function () { + .toBuffer((err) => { + process.nextTick(() => { sharp.queue.removeListener('change', queueListener); if (err) throw err; assert.strictEqual(2, eventCounter); @@ -1482,32 +1050,46 @@ describe('Input/output', function () { }); }); - it('Info event data', function (done) { + it('Info event data', (_t, done) => { const readable = fs.createReadStream(fixtures.inputJPGBig); const inPipeline = sharp() .resize(840, 472) .raw() - .on('info', function (info) { + .on('info', (info) => { assert.strictEqual(840, info.width); assert.strictEqual(472, info.height); assert.strictEqual(3, info.channels); }); - const badPipeline = sharp(null, {raw: {width: 840, height: 500, 
channels: 3}}) + const badPipeline = sharp({ raw: { width: 840, height: 500, channels: 3 } }) .toFormat('jpeg') - .toBuffer(function (err, data, info) { + .toBuffer((err) => { assert.strictEqual(err.message.indexOf('memory area too small') > 0, true); const readable = fs.createReadStream(fixtures.inputJPGBig); const inPipeline = sharp() .resize(840, 472) .raw(); - const goodPipeline = sharp(null, {raw: {width: 840, height: 472, channels: 3}}) + const goodPipeline = sharp({ raw: { width: 840, height: 472, channels: 3 } }) .toFormat('jpeg') - .toBuffer(function (err, data, info) { - if (err) throw err; - done(); - }); + .toBuffer(done); readable.pipe(inPipeline).pipe(goodPipeline); }); readable.pipe(inPipeline).pipe(badPipeline); }); + + it('supports wide-character filenames', async () => { + const filename = fixtures.path('output.图片.jpg'); + const create = { + width: 8, + height: 8, + channels: 3, + background: 'green' + }; + await sharp({ create }).toFile(filename); + + const { width, height, channels, format } = await sharp(filename).metadata(); + assert.strictEqual(width, 8); + assert.strictEqual(height, 8); + assert.strictEqual(channels, 3); + assert.strictEqual(format, 'jpeg'); + }); }); diff --git a/test/unit/join.js b/test/unit/join.js new file mode 100644 index 000000000..c95c35626 --- /dev/null +++ b/test/unit/join.js @@ -0,0 +1,130 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Join input images together', () => { + it('Join two images horizontally', async () => { + const data = await sharp([ + fixtures.inputPngPalette, + { create: { width: 68, height: 68, channels: 3, background: 'green' } } + ], { join: { across: 2 } }).toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.format, 'png'); + assert.strictEqual(metadata.width, 136); + assert.strictEqual(metadata.height, 68); + assert.strictEqual(metadata.space, 'srgb'); + assert.strictEqual(metadata.channels, 3); + assert.strictEqual(metadata.hasAlpha, false); + }); + + it('Join two images vertically with shim and alpha channel', async () => { + const data = await sharp([ + fixtures.inputPngPalette, + { create: { width: 68, height: 68, channels: 4, background: 'green' } } + ], { join: { across: 1, shim: 8 } }).toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.format, 'png'); + assert.strictEqual(metadata.width, 68); + assert.strictEqual(metadata.height, 144); + assert.strictEqual(metadata.space, 'srgb'); + assert.strictEqual(metadata.channels, 4); + assert.strictEqual(metadata.hasAlpha, true); + }); + + it('Join four images in 2x2 grid, with centre alignment', async () => { + const output = fixtures.path('output.join2x2.png'); + const info = await sharp([ + fixtures.inputPngPalette, + { create: { width: 128, height: 128, channels: 3, background: 'green' } }, + { create: { width: 128, height: 128, channels: 3, background: 'red' } }, + fixtures.inputPngPalette + ], { join: { across: 2, halign: 'centre', valign: 'centre', background: 'blue' } }) + .toFile(output); + + fixtures.assertMaxColourDistance(output, fixtures.expected('join2x2.png')); + + assert.strictEqual(info.format, 'png'); + 
assert.strictEqual(info.width, 256); + assert.strictEqual(info.height, 256); + assert.strictEqual(info.channels, 3); + }); + + it('Join two images as animation', async () => { + const data = await sharp([ + fixtures.inputPngPalette, + { create: { width: 68, height: 68, channels: 3, background: 'green' } } + ], { join: { animated: true } }).gif().toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.format, 'gif'); + assert.strictEqual(metadata.width, 68); + assert.strictEqual(metadata.height, 68); + assert.strictEqual(metadata.pages, 2); + }); + + it('Empty array of inputs throws', () => { + assert.throws( + () => sharp([]), + /Expected at least two images to join/ + ); + }); + it('Attempt to recursively join throws', () => { + assert.throws( + () => sharp([fixtures.inputJpg, [fixtures.inputJpg, fixtures.inputJpg]]), + /Recursive join is unsupported/ + ); + }); + it('Attempt to set join props on non-array input throws', () => { + assert.throws( + () => sharp(fixtures.inputJpg, { join: { across: 2 } }), + /Expected input to be an array of images to join/ + ); + }); + it('Invalid animated throws', () => { + assert.throws( + () => sharp([fixtures.inputJpg, fixtures.inputJpg], { join: { animated: 'fail' } }), + /Expected boolean for join.animated but received fail of type string/ + ); + }); + it('Invalid across throws', () => { + assert.throws( + () => sharp([fixtures.inputJpg, fixtures.inputJpg], { join: { across: 'fail' } }), + /Expected integer between 1 and 100000 for join.across but received fail of type string/ + ); + assert.throws( + () => sharp([fixtures.inputJpg, fixtures.inputJpg], { join: { across: 0 } }), + /Expected integer between 1 and 100000 for join.across but received 0 of type number/ + ); + }); + it('Invalid shim throws', () => { + assert.throws( + () => sharp([fixtures.inputJpg, fixtures.inputJpg], { join: { shim: 'fail' } }), + /Expected integer between 0 and 100000 for join.shim but received fail of type 
string/ + ); + assert.throws( + () => sharp([fixtures.inputJpg, fixtures.inputJpg], { join: { shim: -1 } }), + /Expected integer between 0 and 100000 for join.shim but received -1 of type number/ + ); + }); + it('Invalid halign', () => { + assert.throws( + () => sharp([fixtures.inputJpg, fixtures.inputJpg], { join: { halign: 'fail' } }), + /Expected valid alignment for join.halign but received fail of type string/ + ); + }); + it('Invalid valign', () => { + assert.throws( + () => sharp([fixtures.inputJpg, fixtures.inputJpg], { join: { valign: 'fail' } }), + /Expected valid alignment for join.valign but received fail of type string/ + ); + }); +}); diff --git a/test/unit/joinChannel.js b/test/unit/joinChannel.js index dd993d2cb..feebb1337 100644 --- a/test/unit/joinChannel.js +++ b/test/unit/joinChannel.js @@ -1,18 +1,22 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); -const fs = require('fs'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); +const fs = require('node:fs'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Image channel insertion', function () { - it('Grayscale to RGB, buffer', function (done) { +describe('Image channel insertion', () => { + it('Grayscale to RGB, buffer', (_t, done) => { sharp(fixtures.inputPng) // gray -> red .resize(320, 240) .joinChannel(fixtures.inputPngTestJoinChannel) // new green channel - .joinChannel(fixtures.inputPngStripesH) // new blue channel - .toBuffer(function (err, data, info) { + .joinChannel(fixtures.inputPngStripesH) // new blue channel + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -21,12 +25,12 @@ describe('Image channel insertion', function () { }); }); - it('Grayscale to RGB, file', function (done) { + it('Grayscale to RGB, file', (_t, done) => { 
sharp(fixtures.inputPng) // gray -> red .resize(320, 240) .joinChannel(fs.readFileSync(fixtures.inputPngTestJoinChannel)) // new green channel - .joinChannel(fs.readFileSync(fixtures.inputPngStripesH)) // new blue channel - .toBuffer(function (err, data, info) { + .joinChannel(fs.readFileSync(fixtures.inputPngStripesH)) // new blue channel + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -35,7 +39,7 @@ describe('Image channel insertion', function () { }); }); - it('Grayscale to RGBA, buffer', function (done) { + it('Grayscale to RGBA, buffer', (_t, done) => { sharp(fixtures.inputPng) // gray -> red .resize(320, 240) .joinChannel([ @@ -44,7 +48,7 @@ describe('Image channel insertion', function () { fixtures.inputPngStripesV ]) // new green + blue + alpha channel .toColourspace(sharp.colourspace.srgb) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -53,7 +57,7 @@ describe('Image channel insertion', function () { }); }); - it('Grayscale to RGBA, file', function (done) { + it('Grayscale to RGBA, file', (_t, done) => { sharp(fixtures.inputPng) // gray -> red .resize(320, 240) .joinChannel([ @@ -62,7 +66,7 @@ describe('Image channel insertion', function () { fs.readFileSync(fixtures.inputPngStripesV) // new alpha channel ]) .toColourspace('srgb') - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -71,7 +75,7 @@ describe('Image channel insertion', function () { }); }); - it('Grayscale to CMYK, buffers', function (done) { + it('Grayscale to CMYK, buffers', (_t, done) => { sharp(fixtures.inputPng) // gray -> magenta .resize(320, 240) .joinChannel([ @@ -81,7 +85,7 @@ describe('Image channel insertion', function () { ]) .toColorspace('cmyk') 
.toFormat('jpeg') - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -90,12 +94,12 @@ describe('Image channel insertion', function () { }); }); - it('Join raw buffers to RGB', function (done) { + it('Join raw buffers to RGB', (_t, done) => { Promise.all([ sharp(fixtures.inputPngTestJoinChannel).toColourspace('b-w').raw().toBuffer(), sharp(fixtures.inputPngStripesH).toColourspace('b-w').raw().toBuffer() ]) - .then(function (buffers) { + .then((buffers) => { sharp(fixtures.inputPng) .resize(320, 240) .joinChannel(buffers, { @@ -105,7 +109,7 @@ describe('Image channel insertion', function () { channels: 1 } }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -113,12 +117,12 @@ describe('Image channel insertion', function () { fixtures.assertSimilar(fixtures.expected('joinChannel-rgb.jpg'), data, done); }); }) - .catch(function (err) { + .catch((err) => { throw err; }); }); - it('Grayscale to RGBA, files, two arrays', function (done) { + it('Grayscale to RGBA, files, two arrays', (_t, done) => { sharp(fixtures.inputPng) // gray -> red .resize(320, 240) .joinChannel([fs.readFileSync(fixtures.inputPngTestJoinChannel)]) // new green channel @@ -127,7 +131,7 @@ describe('Image channel insertion', function () { fs.readFileSync(fixtures.inputPngStripesV) // new alpha channel ]) .toColourspace('srgb') - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -136,20 +140,20 @@ describe('Image channel insertion', function () { }); }); - it('Invalid raw buffer description', function () { - assert.throws(function () { - sharp().joinChannel(fs.readFileSync(fixtures.inputPng), {raw: {}}); + it('Invalid raw buffer 
description', () => { + assert.throws(() => { + sharp().joinChannel(fs.readFileSync(fixtures.inputPng), { raw: {} }); }); }); - it('Invalid input', function () { - assert.throws(function () { + it('Invalid input', () => { + assert.throws(() => { sharp(fixtures.inputJpg).joinChannel(1); }); }); - it('No arguments', function () { - assert.throws(function () { + it('No arguments', () => { + assert.throws(() => { sharp(fixtures.inputJpg).joinChannel(); }); }); diff --git a/test/unit/jp2.js b/test/unit/jp2.js new file mode 100644 index 000000000..536e0a040 --- /dev/null +++ b/test/unit/jp2.js @@ -0,0 +1,131 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const fs = require('node:fs'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('JP2 output', () => { + if (!sharp.format.jp2k.input.buffer) { + it('JP2 output should fail due to missing OpenJPEG', () => + assert.rejects(async () => + sharp(fixtures.inputJpg) + .jp2() + .toBuffer(), + /JP2 output requires libvips with support for OpenJPEG/ + ) + ); + + it('JP2 file output should fail due to missing OpenJPEG', () => + assert.rejects(async () => sharp(fixtures.inputJpg).toFile('test.jp2'), + /JP2 output requires libvips with support for OpenJPEG/ + ) + ); + + it('File with JP2-like suffix should not fail due to missing OpenJPEG', () => { + const output = fixtures.path('output.failj2c'); + return assert.doesNotReject( + async () => sharp(fixtures.inputPngWithOneColor).toFile(output) + ); + }); + } else { + it('JP2 Buffer to PNG Buffer', () => { + return sharp(fs.readFileSync(fixtures.inputJp2)) + .resize(8, 15) + .png() + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + assert.strictEqual(true, data.length > 0); + assert.strictEqual(data.length, info.size); + assert.strictEqual('png', info.format); + assert.strictEqual(8, info.width);
+ assert.strictEqual(15, info.height); + assert.strictEqual(3, info.channels); + }); + }); + + it('JP2 quality', (_t, done) => { + sharp(fixtures.inputJp2) + .resize(320, 240) + .jp2({ quality: 70 }) + .toBuffer((err, buffer70) => { + if (err) throw err; + sharp(fixtures.inputJp2) + .resize(320, 240) + .toBuffer((err, buffer80) => { + if (err) throw err; + assert(buffer70.length < buffer80.length); + done(); + }); + }); + }); + + it('Without chroma subsampling generates larger file', (_t, done) => { + // First generate with chroma subsampling (default) + sharp(fixtures.inputJp2) + .resize(320, 240) + .jp2({ chromaSubsampling: '4:2:0' }) + .toBuffer((err, withChromaSubsamplingData, withChromaSubsamplingInfo) => { + if (err) throw err; + assert.strictEqual(true, withChromaSubsamplingData.length > 0); + assert.strictEqual(withChromaSubsamplingData.length, withChromaSubsamplingInfo.size); + assert.strictEqual('jp2', withChromaSubsamplingInfo.format); + assert.strictEqual(320, withChromaSubsamplingInfo.width); + assert.strictEqual(240, withChromaSubsamplingInfo.height); + // Then generate without + sharp(fixtures.inputJp2) + .resize(320, 240) + .jp2({ chromaSubsampling: '4:4:4' }) + .toBuffer((err, withoutChromaSubsamplingData, withoutChromaSubsamplingInfo) => { + if (err) throw err; + assert.strictEqual(true, withoutChromaSubsamplingData.length > 0); + assert.strictEqual(withoutChromaSubsamplingData.length, withoutChromaSubsamplingInfo.size); + assert.strictEqual('jp2', withoutChromaSubsamplingInfo.format); + assert.strictEqual(320, withoutChromaSubsamplingInfo.width); + assert.strictEqual(240, withoutChromaSubsamplingInfo.height); + assert.strictEqual(true, withChromaSubsamplingData.length <= withoutChromaSubsamplingData.length); + done(); + }); + }); + }); + + it('can use the jp2Oneshot option to handle multi-part tiled JPEG 2000 file', async () => { + const outputJpg = fixtures.path('output.jpg'); + await assert.rejects( + () => 
sharp(fixtures.inputJp2TileParts).toFile(outputJpg) + ); + await assert.doesNotReject(async () => { + await sharp(fixtures.inputJp2TileParts, { jp2Oneshot: true }).toFile(outputJpg); + const { format, width, height } = await sharp(outputJpg).metadata(); + assert.strictEqual(format, 'jpeg'); + assert.strictEqual(width, 320); + assert.strictEqual(height, 240); + }); + }); + + it('Invalid JP2 chromaSubsampling value throws error', () => { + assert.throws( + () => sharp().jp2({ chromaSubsampling: '4:2:2' }), + /Expected one of: 4:2:0, 4:4:4 for chromaSubsampling but received 4:2:2 of type string/ + ); + }); + } + + it('valid JP2 oneshot value does not throw error', () => { + assert.doesNotThrow( + () => sharp({ jp2: { oneshot: true } }) + ); + }); + + it('invalid JP2 oneshot value throws error', () => { + assert.throws( + () => sharp({ jp2: { oneshot: 'fail' } }), + /Expected boolean for jp2.oneshot but received fail of type string/ + ); + }); +}); diff --git a/test/unit/jpeg.js b/test/unit/jpeg.js new file mode 100644 index 000000000..9a802cb8f --- /dev/null +++ b/test/unit/jpeg.js @@ -0,0 +1,317 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('JPEG', () => { + it('JPEG quality', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ quality: 70 }) + .toBuffer((err, buffer70) => { + if (err) throw err; + sharp(fixtures.inputJpg) + .resize(320, 240) + .toBuffer((err, buffer80) => { + if (err) throw err; + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ quality: 90 }) + .toBuffer((err, buffer90) => { + if (err) throw err; + assert(buffer70.length < buffer80.length); + assert(buffer80.length < buffer90.length); + done(); + }); + }); + }); + }); + + describe('Invalid JPEG quality', () => { + [-1, 88.2, 'test'].forEach((quality) => { + it(quality.toString(), () => { + assert.throws(() => { + sharp().jpeg({ quality }); + }); + }); + }); + }); + + describe('Invalid JPEG quantisation table', () => { + [-1, 88.2, 'test'].forEach((table) => { + it(table.toString(), () => { + assert.throws(() => { + sharp().jpeg({ quantisationTable: table }); + }); + }); + }); + }); + + it('Progressive JPEG image', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ progressive: false }) + .toBuffer((err, nonProgressiveData, nonProgressiveInfo) => { + if (err) throw err; + assert.strictEqual(true, nonProgressiveData.length > 0); + assert.strictEqual(nonProgressiveData.length, nonProgressiveInfo.size); + assert.strictEqual('jpeg', nonProgressiveInfo.format); + assert.strictEqual(320, nonProgressiveInfo.width); + assert.strictEqual(240, nonProgressiveInfo.height); + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ progressive: true }) + .toBuffer((err, progressiveData, progressiveInfo) => { + if (err) throw err; + assert.strictEqual(true, progressiveData.length > 0); + assert.strictEqual(progressiveData.length, progressiveInfo.size); + assert.strictEqual(false, 
progressiveData.length === nonProgressiveData.length); + assert.strictEqual('jpeg', progressiveInfo.format); + assert.strictEqual(320, progressiveInfo.width); + assert.strictEqual(240, progressiveInfo.height); + done(); + }); + }); + }); + + it('Without chroma subsampling generates larger file', (_t, done) => { + // First generate with chroma subsampling (default) + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ chromaSubsampling: '4:2:0' }) + .toBuffer((err, withChromaSubsamplingData, withChromaSubsamplingInfo) => { + if (err) throw err; + assert.strictEqual(true, withChromaSubsamplingData.length > 0); + assert.strictEqual(withChromaSubsamplingData.length, withChromaSubsamplingInfo.size); + assert.strictEqual('jpeg', withChromaSubsamplingInfo.format); + assert.strictEqual(320, withChromaSubsamplingInfo.width); + assert.strictEqual(240, withChromaSubsamplingInfo.height); + // Then generate without + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ chromaSubsampling: '4:4:4' }) + .toBuffer((err, withoutChromaSubsamplingData, withoutChromaSubsamplingInfo) => { + if (err) throw err; + assert.strictEqual(true, withoutChromaSubsamplingData.length > 0); + assert.strictEqual(withoutChromaSubsamplingData.length, withoutChromaSubsamplingInfo.size); + assert.strictEqual('jpeg', withoutChromaSubsamplingInfo.format); + assert.strictEqual(320, withoutChromaSubsamplingInfo.width); + assert.strictEqual(240, withoutChromaSubsamplingInfo.height); + assert.strictEqual(true, withChromaSubsamplingData.length < withoutChromaSubsamplingData.length); + done(); + }); + }); + }); + + it('Invalid JPEG chromaSubsampling value throws error', () => { + assert.throws(() => { + sharp().jpeg({ chromaSubsampling: '4:2:2' }); + }); + }); + + it('Trellis quantisation', (_t, done) => { + // First generate without + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ trellisQuantisation: false }) + .toBuffer((err, withoutData, withoutInfo) => { + if (err) throw err; + 
assert.strictEqual(true, withoutData.length > 0); + assert.strictEqual(withoutData.length, withoutInfo.size); + assert.strictEqual('jpeg', withoutInfo.format); + assert.strictEqual(320, withoutInfo.width); + assert.strictEqual(240, withoutInfo.height); + // Then generate with + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ trellisQuantization: true }) + .toBuffer((err, withData, withInfo) => { + if (err) throw err; + assert.strictEqual(true, withData.length > 0); + assert.strictEqual(withData.length, withInfo.size); + assert.strictEqual('jpeg', withInfo.format); + assert.strictEqual(320, withInfo.width); + assert.strictEqual(240, withInfo.height); + // Verify image is same (as mozjpeg may not be present) size or less + assert.strictEqual(true, withData.length <= withoutData.length); + done(); + }); + }); + }); + + it('Overshoot deringing', (_t, done) => { + // First generate without + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ overshootDeringing: false }) + .toBuffer((err, withoutData, withoutInfo) => { + if (err) throw err; + assert.strictEqual(true, withoutData.length > 0); + assert.strictEqual(withoutData.length, withoutInfo.size); + assert.strictEqual('jpeg', withoutInfo.format); + assert.strictEqual(320, withoutInfo.width); + assert.strictEqual(240, withoutInfo.height); + // Then generate with + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ overshootDeringing: true }) + .toBuffer((err, withData, withInfo) => { + if (err) throw err; + assert.strictEqual(true, withData.length > 0); + assert.strictEqual(withData.length, withInfo.size); + assert.strictEqual('jpeg', withInfo.format); + assert.strictEqual(320, withInfo.width); + assert.strictEqual(240, withInfo.height); + done(); + }); + }); + }); + + it('Optimise scans generates different output length', (_t, done) => { + // First generate without + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ optimiseScans: false }) + .toBuffer((err, withoutData, withoutInfo) => { + if (err) 
throw err; + assert.strictEqual(true, withoutData.length > 0); + assert.strictEqual(withoutData.length, withoutInfo.size); + assert.strictEqual('jpeg', withoutInfo.format); + assert.strictEqual(320, withoutInfo.width); + assert.strictEqual(240, withoutInfo.height); + // Then generate with + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ optimizeScans: true }) + .toBuffer((err, withData, withInfo) => { + if (err) throw err; + assert.strictEqual(true, withData.length > 0); + assert.strictEqual(withData.length, withInfo.size); + assert.strictEqual('jpeg', withInfo.format); + assert.strictEqual(320, withInfo.width); + assert.strictEqual(240, withInfo.height); + // Verify image is of a different size (progressive output even without mozjpeg) + assert.notStrictEqual(withData.length, withoutData.length); + done(); + }); + }); + }); + + it('Optimise coding generates smaller output length', (_t, done) => { + // First generate with optimize coding enabled (default) + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg() + .toBuffer((err, withOptimiseCoding, withInfo) => { + if (err) throw err; + assert.strictEqual(true, withOptimiseCoding.length > 0); + assert.strictEqual(withOptimiseCoding.length, withInfo.size); + assert.strictEqual('jpeg', withInfo.format); + assert.strictEqual(320, withInfo.width); + assert.strictEqual(240, withInfo.height); + // Then generate with coding disabled + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ optimizeCoding: false }) + .toBuffer((err, withoutOptimiseCoding, withoutInfo) => { + if (err) throw err; + assert.strictEqual(true, withoutOptimiseCoding.length > 0); + assert.strictEqual(withoutOptimiseCoding.length, withoutInfo.size); + assert.strictEqual('jpeg', withoutInfo.format); + assert.strictEqual(320, withoutInfo.width); + assert.strictEqual(240, withoutInfo.height); + // Verify optimised image is of a smaller size + assert.strictEqual(true, withOptimiseCoding.length < withoutOptimiseCoding.length); + done(); + }); + 
}); + }); + + it('Specifying quantisation table provides different JPEG', (_t, done) => { + // First generate with default quantisation table + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ optimiseCoding: false }) + .toBuffer((err, withDefaultQuantisationTable, withInfo) => { + if (err) throw err; + assert.strictEqual(true, withDefaultQuantisationTable.length > 0); + assert.strictEqual(withDefaultQuantisationTable.length, withInfo.size); + assert.strictEqual('jpeg', withInfo.format); + assert.strictEqual(320, withInfo.width); + assert.strictEqual(240, withInfo.height); + // Then generate with different quantisation table + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ optimiseCoding: false, quantisationTable: 3 }) + .toBuffer((err, withQuantTable3, withoutInfo) => { + if (err) throw err; + assert.strictEqual(true, withQuantTable3.length > 0); + assert.strictEqual(withQuantTable3.length, withoutInfo.size); + assert.strictEqual('jpeg', withoutInfo.format); + assert.strictEqual(320, withoutInfo.width); + assert.strictEqual(240, withoutInfo.height); + + // Verify image is same (as mozjpeg may not be present) size or less + assert.strictEqual(true, withQuantTable3.length <= withDefaultQuantisationTable.length); + done(); + }); + }); + }); + + it('Specifying quantization table provides different JPEG', (_t, done) => { + // First generate with default quantization table + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ optimiseCoding: false }) + .toBuffer((err, withDefaultQuantizationTable, withInfo) => { + if (err) throw err; + assert.strictEqual(true, withDefaultQuantizationTable.length > 0); + assert.strictEqual(withDefaultQuantizationTable.length, withInfo.size); + assert.strictEqual('jpeg', withInfo.format); + assert.strictEqual(320, withInfo.width); + assert.strictEqual(240, withInfo.height); + // Then generate with different quantization table + sharp(fixtures.inputJpg) + .resize(320, 240) + .jpeg({ optimiseCoding: false, 
quantizationTable: 3 }) + .toBuffer((err, withQuantTable3, withoutInfo) => { + if (err) throw err; + assert.strictEqual(true, withQuantTable3.length > 0); + assert.strictEqual(withQuantTable3.length, withoutInfo.size); + assert.strictEqual('jpeg', withoutInfo.format); + assert.strictEqual(320, withoutInfo.width); + assert.strictEqual(240, withoutInfo.height); + + // Verify image is same (as mozjpeg may not be present) size or less + assert.strictEqual(true, withQuantTable3.length <= withDefaultQuantizationTable.length); + done(); + }); + }); + }); + + it('Can use mozjpeg defaults', async () => { + const withoutData = await sharp(fixtures.inputJpg) + .resize(32, 24) + .jpeg({ mozjpeg: false }) + .toBuffer(); + const withoutMeta = await sharp(withoutData).metadata(); + assert.strictEqual(false, withoutMeta.isProgressive); + + const withData = await sharp(fixtures.inputJpg) + .resize(32, 24) + .jpeg({ mozjpeg: true }) + .toBuffer(); + const withMeta = await sharp(withData).metadata(); + assert.strictEqual(true, withMeta.isProgressive); + }); + + it('Invalid mozjpeg value throws error', () => { + assert.throws(() => sharp().jpeg({ mozjpeg: 'fail' })); + }); +}); diff --git a/test/unit/jxl.js b/test/unit/jxl.js new file mode 100644 index 000000000..f70de1cbb --- /dev/null +++ b/test/unit/jxl.js @@ -0,0 +1,101 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); + +describe('JXL', () => { + it('called without options does not throw an error', () => { + assert.doesNotThrow(() => { + sharp().jxl(); + }); + }); + it('valid distance does not throw an error', () => { + assert.doesNotThrow(() => { + sharp().jxl({ distance: 2.3 }); + }); + }); + it('invalid distance should throw an error', () => { + assert.throws(() => { + sharp().jxl({ distance: 15.1 }); + }); + }); + it('non-numeric distance should throw an error', () => { + assert.throws(() => { + sharp().jxl({ distance: 'fail' }); + }); + }); + it('valid quality > 30 does not throw an error', () => { + const s = sharp(); + assert.doesNotThrow(() => { + s.jxl({ quality: 80 }); + }); + assert.strictEqual(s.options.jxlDistance, 1.9); + }); + it('valid quality < 30 does not throw an error', () => { + const s = sharp(); + assert.doesNotThrow(() => { + s.jxl({ quality: 20 }); + }); + assert.strictEqual(s.options.jxlDistance, 9.066666666666666); + }); + it('valid quality does not throw an error', () => { + assert.doesNotThrow(() => { + sharp().jxl({ quality: 80 }); + }); + }); + it('invalid quality should throw an error', () => { + assert.throws(() => { + sharp().jxl({ quality: 101 }); + }); + }); + it('non-numeric quality should throw an error', () => { + assert.throws(() => { + sharp().jxl({ quality: 'fail' }); + }); + }); + it('valid decodingTier does not throw an error', () => { + assert.doesNotThrow(() => { + sharp().jxl({ decodingTier: 2 }); + }); + }); + it('invalid decodingTier should throw an error', () => { + assert.throws(() => { + sharp().jxl({ decodingTier: 5 }); + }); + }); + it('non-numeric decodingTier should throw an error', () => { + assert.throws(() => { + sharp().jxl({ decodingTier: 'fail' }); + }); + }); + it('valid lossless does not throw an error', () => { + assert.doesNotThrow(() => { + 
sharp().jxl({ lossless: true }); + }); + }); + it('non-boolean lossless should throw an error', () => { + assert.throws(() => { + sharp().jxl({ lossless: 'fail' }); + }); + }); + it('valid effort does not throw an error', () => { + assert.doesNotThrow(() => { + sharp().jxl({ effort: 6 }); + }); + }); + it('out of range effort should throw an error', () => { + assert.throws(() => { + sharp().jxl({ effort: 10 }); + }); + }); + it('invalid effort should throw an error', () => { + assert.throws(() => { + sharp().jxl({ effort: 'fail' }); + }); + }); +}); diff --git a/test/unit/libvips.js b/test/unit/libvips.js new file mode 100644 index 000000000..8ed822458 --- /dev/null +++ b/test/unit/libvips.js @@ -0,0 +1,196 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { after, before, describe, it } = require('node:test'); +const assert = require('node:assert'); +const fs = require('node:fs'); +const semver = require('semver'); +const libvips = require('../../lib/libvips'); + +const originalPlatform = process.platform; + +const setPlatform = (platform) => { + Object.defineProperty(process, 'platform', { value: platform }); +}; + +const restorePlatform = () => { + setPlatform(originalPlatform); +}; + +describe('libvips binaries', () => { + describe('Windows platform', () => { + before(() => { setPlatform('win32'); }); + after(restorePlatform); + + it('pkgConfigPath returns empty string', () => { + assert.strictEqual('', libvips.pkgConfigPath()); + }); + it('globalLibvipsVersion returns empty string', () => { + assert.strictEqual('', libvips.globalLibvipsVersion()); + }); + it('useGlobalLibvips is always false', () => { + assert.strictEqual(false, libvips.useGlobalLibvips()); + }); + }); + + describe('non-Windows platforms', () => { + before(() => { setPlatform('linux'); }); + after(restorePlatform); + + it('pkgConfigPath returns a string', () => { + const pkgConfigPath = libvips.pkgConfigPath();
assert.strictEqual('string', typeof pkgConfigPath); + }); + it('globalLibvipsVersion returns a string', () => { + const globalLibvipsVersion = libvips.globalLibvipsVersion(); + assert.strictEqual('string', typeof globalLibvipsVersion); + }); + it('useGlobalLibvips returns a boolean', () => { + const useGlobalLibvips = libvips.useGlobalLibvips(); + assert.strictEqual('boolean', typeof useGlobalLibvips); + }); + }); + + describe('platform agnostic', () => { + it('minimumLibvipsVersion returns a valid semver', () => { + const minimumLibvipsVersion = libvips.minimumLibvipsVersion; + assert.strictEqual('string', typeof minimumLibvipsVersion); + assert.notStrictEqual(null, semver.valid(minimumLibvipsVersion)); + }); + it('useGlobalLibvips can be ignored via an env var', () => { + process.env.SHARP_IGNORE_GLOBAL_LIBVIPS = 1; + + const useGlobalLibvips = libvips.useGlobalLibvips(); + assert.strictEqual(false, useGlobalLibvips); + + delete process.env.SHARP_IGNORE_GLOBAL_LIBVIPS; + }); + it('useGlobalLibvips can be forced via an env var', () => { + process.env.SHARP_FORCE_GLOBAL_LIBVIPS = 1; + + const useGlobalLibvips = libvips.useGlobalLibvips(); + assert.strictEqual(true, useGlobalLibvips); + + let logged = false; + const logger = (message) => { + assert.strictEqual(message, 'Detected SHARP_FORCE_GLOBAL_LIBVIPS, skipping search for globally-installed libvips'); + logged = true; + }; + const useGlobalLibvipsWithLogger = libvips.useGlobalLibvips(logger); + assert.strictEqual(true, useGlobalLibvipsWithLogger); + assert.strictEqual(true, logged); + + delete process.env.SHARP_FORCE_GLOBAL_LIBVIPS; + }); + }); + + describe('Build time platform detection', () => { + it('Can override platform with npm_config_platform and npm_config_libc', function () { + process.env.npm_config_platform = 'testplatform'; + process.env.npm_config_libc = 'testlibc'; + const platformArch = libvips.buildPlatformArch(); + if (platformArch === 'wasm32') { + return this.skip(); + } + const [platform]
= platformArch.split('-'); + assert.strictEqual(platform, 'testplatformtestlibc'); + delete process.env.npm_config_platform; + delete process.env.npm_config_libc; + }); + it('Can override arch with npm_config_arch', function () { + process.env.npm_config_arch = 'test'; + const platformArch = libvips.buildPlatformArch(); + if (platformArch === 'wasm32') { + return this.skip(); + } + const [, arch] = platformArch.split('-'); + assert.strictEqual(arch, 'test'); + delete process.env.npm_config_arch; + }); + }); + + describe('Build time directories', () => { + it('sharp-libvips include', () => { + const dir = libvips.buildSharpLibvipsIncludeDir(); + if (dir) { + assert.strictEqual(fs.statSync(dir).isDirectory(), true); + } + }); + it('sharp-libvips cplusplus', () => { + const dir = libvips.buildSharpLibvipsCPlusPlusDir(); + if (dir) { + assert.strictEqual(fs.statSync(dir).isDirectory(), true); + } + }); + it('sharp-libvips lib', () => { + const dir = libvips.buildSharpLibvipsLibDir(); + if (dir) { + assert.strictEqual(fs.statSync(dir).isDirectory(), true); + } + }); + }); + + describe('Runtime detection', () => { + it('platform', () => { + const [platform] = libvips.runtimePlatformArch().split('-'); + assert.strict(['darwin', 'freebsd', 'linux', 'linuxmusl', 'win32'].includes(platform)); + }); + it('arch', () => { + const [, arch] = libvips.runtimePlatformArch().split('-'); + assert.strict(['arm', 'arm64', 'ia32', 'x64', 'ppc64'].includes(arch)); + }); + it('isUnsupportedNodeRuntime', () => { + assert.strictEqual(libvips.isUnsupportedNodeRuntime(), undefined); + }); + }); + + describe('logger', () => { + const consoleLog = console.log; + const consoleError = console.error; + + after(() => { + console.log = consoleLog; + console.error = consoleError; + }); + + it('logs an info message', (_t, done) => { + console.log = (msg) => { + assert.strictEqual(msg, 'sharp: progress'); + done(); + }; + libvips.log('progress'); + }); + + it('logs an error message', (_t, done) => { + 
console.error = (msg) => { + assert.strictEqual(msg, 'sharp: Installation error: problem'); + done(); + }; + libvips.log(new Error('problem')); + }); + }); + + describe('yarn locator hash', () => { + it('known platform', () => { + const cc = process.env.CC; + delete process.env.CC; + process.env.npm_config_platform = 'linux'; + process.env.npm_config_arch = 's390x'; + process.env.npm_config_libc = ''; + const locatorHash = libvips.yarnLocator(); + assert.strictEqual(locatorHash, '4ab19140fd'); + delete process.env.npm_config_platform; + delete process.env.npm_config_arch; + delete process.env.npm_config_libc; + process.env.CC = cc; + }); + it('unknown platform', () => { + process.env.npm_config_platform = 'unknown-platform'; + const locatorHash = libvips.yarnLocator(); + assert.strictEqual(locatorHash, ''); + delete process.env.npm_config_platform; + }); + }); +}); diff --git a/test/unit/linear.js b/test/unit/linear.js new file mode 100644 index 000000000..c6d29a7de --- /dev/null +++ b/test/unit/linear.js @@ -0,0 +1,137 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +describe('Linear adjustment', () => { + const blackPoint = 70; + const whitePoint = 203; + const a = 255 / (whitePoint - blackPoint); + const b = -blackPoint * a; + + it('applies linear levels adjustment w/o alpha ch', (_t, done) => { + sharp(fixtures.inputJpgWithLowContrast) + .linear(a, b) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('low-contrast-linear.jpg'), data, done); + }); + }); + + it('applies slope level adjustment w/o alpha ch', (_t, done) => { + sharp(fixtures.inputJpgWithLowContrast) + .linear(a) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('low-contrast-slope.jpg'), data, done); + }); + }); + + it('applies offset level adjustment w/o alpha ch', (_t, done) => { + sharp(fixtures.inputJpgWithLowContrast) + .linear(null, b) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('low-contrast-offset.jpg'), data, done); + }); + }); + + it('applies linear levels adjustment w alpha ch', (_t, done) => { + sharp(fixtures.inputPngOverlayLayer1) + .resize(240) + .linear(a, b) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('alpha-layer-1-fill-linear.png'), data, done); + }); + }); + + it('applies linear levels adjustment to 16-bit w alpha ch', (_t, done) => { + sharp(fixtures.inputPngWithTransparency16bit) + .linear(a, b) + .png({ compressionLevel: 0 }) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('linear-16bit.png'), data, done); + }); + }); + + it('applies slope level adjustment w alpha ch', (_t, done) => { + sharp(fixtures.inputPngOverlayLayer1) + .resize(240) + .linear(a) + .toBuffer((err, data) => { + if (err) throw err; 
+ fixtures.assertSimilar(fixtures.expected('alpha-layer-1-fill-slope.png'), data, done); + }); + }); + + it('applies offset level adjustment w alpha ch', (_t, done) => { + sharp(fixtures.inputPngOverlayLayer1) + .resize(240) + .linear(null, b) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('alpha-layer-1-fill-offset.png'), data, done); + }); + }); + + it('per channel level adjustment', (_t, done) => { + sharp(fixtures.inputWebP) + .linear([0.25, 0.5, 0.75], [150, 100, 50]).toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('linear-per-channel.jpg'), data, done); + }); + }); + + it('output is integer, not float, RGB', async () => { + const data = await sharp({ create: { width: 1, height: 1, channels: 3, background: 'red' } }) + .linear(1, 0) + .tiff({ compression: 'none' }) + .toBuffer(); + + const { channels, depth } = await sharp(data).metadata(); + assert.strictEqual(channels, 3); + assert.strictEqual(depth, 'uchar'); + }); + + it('output is integer, not float, RGBA', async () => { + const data = await sharp({ create: { width: 1, height: 1, channels: 4, background: '#ff000077' } }) + .linear(1, 0) + .tiff({ compression: 'none' }) + .toBuffer(); + + const { channels, depth } = await sharp(data).metadata(); + assert.strictEqual(channels, 4); + assert.strictEqual(depth, 'uchar'); + }); + + it('Invalid linear arguments', () => { + assert.throws( + () => sharp().linear('foo'), + /Expected number or array of numbers for a but received foo of type string/ + ); + assert.throws( + () => sharp().linear(undefined, { bar: 'baz' }), + /Expected number or array of numbers for b but received \[object Object\] of type object/ + ); + assert.throws( + () => sharp().linear([], [1]), + /Expected number or array of numbers for a but received {2}of type object/ + ); + assert.throws( + () => sharp().linear([1, 2], [1]), + /Expected a and b to be arrays of the same length/ + ); + assert.throws( + () 
=> sharp().linear([1]), + /Expected a and b to be arrays of the same length/ + ); + }); +}); diff --git a/test/unit/median.js b/test/unit/median.js new file mode 100644 index 000000000..14d63d566 --- /dev/null +++ b/test/unit/median.js @@ -0,0 +1,55 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); + +const row = [0, 3, 15, 63, 127, 255]; +const input = Buffer.from(Array.from(row, () => row).flat()); +const raw = { + width: 6, + height: 6, + channels: 1 +}; + +describe('Median filter', () => { + it('default window (3x3)', async () => { + const data = await sharp(input, { raw }) + .median() + .toColourspace('b-w') + .raw() + .toBuffer(); + + assert.deepStrictEqual(data.subarray(0, 6), Buffer.from(row)); + }); + + it('3x3 window', async () => { + const data = await sharp(input, { raw }) + .median(3) + .toColourspace('b-w') + .raw() + .toBuffer(); + + assert.deepStrictEqual(data.subarray(0, 6), Buffer.from(row)); + }); + + it('5x5 window', async () => { + const data = await sharp(input, { raw }) + .median(5) + .toColourspace('b-w') + .raw() + .toBuffer(); + + assert.deepStrictEqual(data.subarray(0, 6), Buffer.from(row)); + }); + + it('invalid radius', () => { + assert.throws(() => { + sharp().median(0.1); + }); + }); +}); diff --git a/test/unit/metadata.js b/test/unit/metadata.js index cb8db649d..300380edd 100644 --- a/test/unit/metadata.js +++ b/test/unit/metadata.js @@ -1,24 +1,33 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ -const fs = require('fs'); -const assert = require('assert'); +const fs = require('node:fs'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const exifReader = require('exif-reader'); const icc = require('icc'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Image metadata', function () { - it('JPEG', function (done) { - sharp(fixtures.inputJpg).metadata(function (err, metadata) { +const create = { width: 1, height: 1, channels: 3, background: 'red' }; + +describe('Image metadata', () => { + it('JPEG', (_t, done) => { + sharp(fixtures.inputJpg).metadata((err, metadata) => { if (err) throw err; assert.strictEqual('jpeg', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(2725, metadata.width); assert.strictEqual(2225, metadata.height); assert.strictEqual('srgb', metadata.space); assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); - assert.strictEqual('undefined', typeof metadata.density); + assert.strictEqual(true, ['undefined', 'number'].includes(typeof metadata.density)); + assert.strictEqual('4:2:0', metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); @@ -28,16 +37,19 @@ describe('Image metadata', function () { }); }); - it('JPEG with EXIF/ICC', function (done) { - sharp(fixtures.inputJpgWithExif).metadata(function (err, metadata) { + it('JPEG with EXIF/ICC', (_t, done) => { + sharp(fixtures.inputJpgWithExif).metadata((err, metadata) => { if (err) throw err; assert.strictEqual('jpeg', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(450, metadata.width); assert.strictEqual(600, metadata.height); assert.strictEqual('srgb', metadata.space); 
assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); assert.strictEqual(72, metadata.density); + assert.strictEqual('4:2:0', metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(true, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual(8, metadata.orientation); @@ -46,8 +58,8 @@ describe('Image metadata', function () { assert.strictEqual(true, metadata.exif instanceof Buffer); const exif = exifReader(metadata.exif); assert.strictEqual('object', typeof exif); - assert.strictEqual('object', typeof exif.image); - assert.strictEqual('number', typeof exif.image.XResolution); + assert.strictEqual('object', typeof exif.Image); + assert.strictEqual('number', typeof exif.Image.XResolution); // ICC assert.strictEqual('object', typeof metadata.icc); assert.strictEqual(true, metadata.icc instanceof Buffer); @@ -58,54 +70,136 @@ describe('Image metadata', function () { }); }); - it('TIFF', function (done) { - sharp(fixtures.inputTiff).metadata(function (err, metadata) { + it('JPEG with IPTC/XMP', (_t, done) => { + sharp(fixtures.inputJpgWithIptcAndXmp).metadata((err, metadata) => { + if (err) throw err; + // IPTC + assert.strictEqual('object', typeof metadata.iptc); + assert.strictEqual(true, metadata.iptc instanceof Buffer); + assert.strictEqual(18250, metadata.iptc.byteLength); + assert.strictEqual(metadata.iptc.indexOf(Buffer.from('Photoshop')), 0); + // XMP + assert.strictEqual('object', typeof metadata.xmp); + assert.strictEqual(true, metadata.xmp instanceof Buffer); + assert.strictEqual(12466, metadata.xmp.byteLength); + assert.strictEqual(metadata.xmp.indexOf(Buffer.from('<?xpacket begin=')), 0); + done(); + }); + }); + + it('TIFF', (_t, done) => { + sharp(fixtures.inputTiff).metadata((err, metadata) => { if (err) throw err; assert.strictEqual('tiff', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(2464, metadata.width);
assert.strictEqual(3248, metadata.height); assert.strictEqual('b-w', metadata.space); assert.strictEqual(1, metadata.channels); assert.strictEqual('uchar', metadata.depth); assert.strictEqual(300, metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual(1, metadata.orientation); + assert.strictEqual(2464, metadata.autoOrient.width); + assert.strictEqual(3248, metadata.autoOrient.height); assert.strictEqual('undefined', typeof metadata.exif); assert.strictEqual('undefined', typeof metadata.icc); + assert.strictEqual('undefined', typeof metadata.xmp); + assert.strictEqual('undefined', typeof metadata.xmpAsString); + assert.strictEqual('inch', metadata.resolutionUnit); done(); }); }); - it('PNG', function (done) { - sharp(fixtures.inputPng).metadata(function (err, metadata) { + it('Multipage TIFF', (_t, done) => { + sharp(fixtures.inputTiffMultipage).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual('tiff', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); + assert.strictEqual(2464, metadata.width); + assert.strictEqual(3248, metadata.height); + assert.strictEqual('b-w', metadata.space); + assert.strictEqual(1, metadata.channels); + assert.strictEqual('uchar', metadata.depth); + assert.strictEqual(300, metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); + assert.strictEqual(2, metadata.pages); + assert.strictEqual(false, metadata.hasProfile); + assert.strictEqual(false, metadata.hasAlpha); + assert.strictEqual(1, metadata.orientation); + assert.strictEqual('undefined', typeof metadata.exif); + assert.strictEqual('undefined', typeof metadata.icc); + done(); + }); + }); + + it('PNG', (_t, done) => { + sharp(fixtures.inputPng).metadata((err, 
metadata) => { if (err) throw err; assert.strictEqual('png', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(2809, metadata.width); assert.strictEqual(2074, metadata.height); assert.strictEqual('b-w', metadata.space); assert.strictEqual(1, metadata.channels); assert.strictEqual('uchar', metadata.depth); assert.strictEqual(300, metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); + assert.strictEqual(2809, metadata.autoOrient.width); + assert.strictEqual(2074, metadata.autoOrient.height); assert.strictEqual('undefined', typeof metadata.exif); assert.strictEqual('undefined', typeof metadata.icc); done(); }); }); - it('Transparent PNG', function (done) { - sharp(fixtures.inputPngWithTransparency).metadata(function (err, metadata) { + it('PNG with comment', (_t, done) => { + sharp(fixtures.inputPngTestJoinChannel).metadata((err, metadata) => { if (err) throw err; assert.strictEqual('png', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); + assert.strictEqual(320, metadata.width); + assert.strictEqual(240, metadata.height); + assert.strictEqual('b-w', metadata.space); + assert.strictEqual(1, metadata.channels); + assert.strictEqual('uchar', metadata.depth); + assert.strictEqual(72, metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); + assert.strictEqual(false, metadata.hasProfile); + assert.strictEqual(false, metadata.hasAlpha); + assert.strictEqual('undefined', typeof metadata.orientation); + assert.strictEqual('undefined', typeof metadata.exif); + assert.strictEqual('undefined', typeof metadata.icc); + assert.strictEqual(1, metadata.comments.length); + 
assert.strictEqual('Comment', metadata.comments[0].keyword); + assert.strictEqual('Created with GIMP', metadata.comments[0].text); + done(); + }); + }); + + it('Transparent PNG', (_t, done) => { + sharp(fixtures.inputPngWithTransparency).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual('png', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(2048, metadata.width); assert.strictEqual(1536, metadata.height); assert.strictEqual('srgb', metadata.space); assert.strictEqual(4, metadata.channels); assert.strictEqual('uchar', metadata.depth); assert.strictEqual(72, metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(true, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); @@ -115,16 +209,69 @@ describe('Image metadata', function () { }); }); - it('WebP', function (done) { - sharp(fixtures.inputWebP).metadata(function (err, metadata) { + it('PNG with greyscale bKGD chunk - 8 bit', async () => { + const data = await sharp(fixtures.inputPng8BitGreyBackground).metadata(); + assert.deepStrictEqual(data, { + background: { + gray: 0 + }, + bitsPerSample: 8, + channels: 2, + density: 72, + depth: 'uchar', + format: 'png', + hasAlpha: true, + hasProfile: false, + height: 32, + isPalette: false, + isProgressive: false, + space: 'b-w', + width: 32, + autoOrient: { + width: 32, + height: 32 + } + }); + }); + + it('PNG with greyscale bKGD chunk - 16 bit', async () => { + const data = await sharp(fixtures.inputPng16BitGreyBackground).metadata(); + assert.deepStrictEqual(data, { + background: { + gray: 67 + }, + bitsPerSample: 16, + channels: 2, + density: 72, + depth: 'ushort', + format: 'png', + hasAlpha: true, + hasProfile: false, + height: 32, + isPalette: false, + isProgressive: false, + space: 'grey16', + width: 32, + autoOrient: 
{ + width: 32, + height: 32 + } + }); + }); + + it('WebP', (_t, done) => { + sharp(fixtures.inputWebP).metadata((err, metadata) => { if (err) throw err; assert.strictEqual('webp', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(1024, metadata.width); assert.strictEqual(772, metadata.height); assert.strictEqual('srgb', metadata.space); assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); assert.strictEqual('undefined', typeof metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); @@ -134,34 +281,109 @@ describe('Image metadata', function () { }); }); - it('GIF via giflib', function (done) { - sharp(fixtures.inputGif).metadata(function (err, metadata) { + it('Animated WebP', () => + sharp(fixtures.inputWebPAnimated) + .metadata() + .then(({ + format, width, height, space, channels, depth, + isProgressive, pages, loop, delay, hasProfile, + hasAlpha + }) => { + assert.strictEqual(format, 'webp'); + assert.strictEqual(width, 80); + assert.strictEqual(height, 80); + assert.strictEqual(space, 'srgb'); + assert.strictEqual(channels, 4); + assert.strictEqual(depth, 'uchar'); + assert.strictEqual(isProgressive, false); + assert.strictEqual(pages, 9); + assert.strictEqual(loop, 0); + assert.deepStrictEqual(delay, [120, 120, 90, 120, 120, 90, 120, 90, 30]); + assert.strictEqual(hasProfile, false); + assert.strictEqual(hasAlpha, true); + }) + ); + + it('Animated WebP with all pages', () => + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .metadata() + .then(({ + format, width, height, space, channels, depth, + isProgressive, pages, pageHeight, loop, delay, + hasProfile, hasAlpha + }) => { + assert.strictEqual(format, 'webp'); + assert.strictEqual(width, 
80); + assert.strictEqual(height, 720); + assert.strictEqual(space, 'srgb'); + assert.strictEqual(channels, 4); + assert.strictEqual(depth, 'uchar'); + assert.strictEqual(isProgressive, false); + assert.strictEqual(pages, 9); + assert.strictEqual(pageHeight, 80); + assert.strictEqual(loop, 0); + assert.deepStrictEqual(delay, [120, 120, 90, 120, 120, 90, 120, 90, 30]); + assert.strictEqual(hasProfile, false); + assert.strictEqual(hasAlpha, true); + }) + ); + + it('Animated WebP with limited looping', () => + sharp(fixtures.inputWebPAnimatedLoop3) + .metadata() + .then(({ + format, width, height, space, channels, depth, + isProgressive, pages, loop, delay, hasProfile, + hasAlpha + }) => { + assert.strictEqual(format, 'webp'); + assert.strictEqual(width, 370); + assert.strictEqual(height, 285); + assert.strictEqual(space, 'srgb'); + assert.strictEqual(channels, 4); + assert.strictEqual(depth, 'uchar'); + assert.strictEqual(isProgressive, false); + assert.strictEqual(pages, 10); + assert.strictEqual(loop, 3); + assert.deepStrictEqual(delay, [...Array(9).fill(3000), 15000]); + assert.strictEqual(hasProfile, false); + assert.strictEqual(hasAlpha, true); + }) + ); + + it('GIF', (_t, done) => { + sharp(fixtures.inputGif).metadata((err, metadata) => { if (err) throw err; assert.strictEqual('gif', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(800, metadata.width); assert.strictEqual(533, metadata.height); assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); assert.strictEqual('undefined', typeof metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); - assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); assert.strictEqual('undefined', typeof metadata.exif); assert.strictEqual('undefined', typeof 
metadata.icc); + assert.deepStrictEqual(metadata.background, { r: 138, g: 148, b: 102 }); done(); }); }); - it('GIF grey+alpha via giflib', function (done) { - sharp(fixtures.inputGifGreyPlusAlpha).metadata(function (err, metadata) { + it('GIF grey+alpha', (_t, done) => { + sharp(fixtures.inputGifGreyPlusAlpha).metadata((err, metadata) => { if (err) throw err; assert.strictEqual('gif', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(2, metadata.width); assert.strictEqual(1, metadata.height); - assert.strictEqual(2, metadata.channels); + assert.strictEqual(4, metadata.channels); assert.strictEqual('uchar', metadata.depth); assert.strictEqual('undefined', typeof metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); - assert.strictEqual(true, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); assert.strictEqual('undefined', typeof metadata.exif); assert.strictEqual('undefined', typeof metadata.icc); @@ -169,15 +391,86 @@ describe('Image metadata', function () { }); }); - it('File in, Promise out', function (done) { - sharp(fixtures.inputJpg).metadata().then(function (metadata) { + it('Animated GIF', () => + sharp(fixtures.inputGifAnimated) + .metadata() + .then(({ + format, width, height, space, channels, depth, + isProgressive, pages, loop, delay, background, + hasProfile, hasAlpha + }) => { + assert.strictEqual(format, 'gif'); + assert.strictEqual(width, 80); + assert.strictEqual(height, 80); + assert.strictEqual(space, 'srgb'); + assert.strictEqual(channels, 4); + assert.strictEqual(depth, 'uchar'); + assert.strictEqual(isProgressive, false); + assert.strictEqual(pages, 30); + assert.strictEqual(loop, 0); + assert.deepStrictEqual(delay, Array(30).fill(30)); + assert.deepStrictEqual(background, { r: 0, g: 0, b: 0 }); + assert.strictEqual(hasProfile, false); 
+ assert.strictEqual(hasAlpha, true); + }) + ); + + it('Animated GIF with limited looping', () => + sharp(fixtures.inputGifAnimatedLoop3) + .metadata() + .then(({ + format, width, height, space, channels, depth, + isProgressive, pages, loop, delay, hasProfile, + hasAlpha + }) => { + assert.strictEqual(format, 'gif'); + assert.strictEqual(width, 370); + assert.strictEqual(height, 285); + assert.strictEqual(space, 'srgb'); + assert.strictEqual(channels, 4); + assert.strictEqual(depth, 'uchar'); + assert.strictEqual(isProgressive, false); + assert.strictEqual(pages, 10); + assert.strictEqual(loop, 3); + assert.deepStrictEqual(delay, [...Array(9).fill(3000), 15000]); + assert.strictEqual(hasProfile, false); + assert.strictEqual(hasAlpha, true); + }) + ); + + it('vips', () => + sharp(fixtures.inputV) + .metadata() + .then(metadata => { + assert.strictEqual('vips', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); + assert.strictEqual(70, metadata.width); + assert.strictEqual(60, metadata.height); + assert.strictEqual(3, metadata.channels); + assert.strictEqual('uchar', metadata.depth); + assert.strictEqual(72, metadata.density); + assert.strictEqual('undefined', typeof metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); + assert.strictEqual(false, metadata.hasProfile); + assert.strictEqual(false, metadata.hasAlpha); + assert.strictEqual('undefined', typeof metadata.orientation); + assert.strictEqual('undefined', typeof metadata.exif); + assert.strictEqual('undefined', typeof metadata.icc); + }) + ); + + it('File in, Promise out', (_t, done) => { + sharp(fixtures.inputJpg).metadata().then((metadata) => { assert.strictEqual('jpeg', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(2725, metadata.width); assert.strictEqual(2225, metadata.height); assert.strictEqual('srgb', metadata.space); assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); 
- assert.strictEqual('undefined', typeof metadata.density); + assert.strictEqual(true, ['undefined', 'number'].includes(typeof metadata.density)); + assert.strictEqual('4:2:0', metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); @@ -187,49 +480,94 @@ describe('Image metadata', function () { }); }); - it('Non-existent file in, Promise out', function (done) { - sharp('fail').metadata().then(function (metadata) { - throw new Error('Non-existent file'); - }, function (err) { - assert.ok(!!err); - done(); - }); + it('Non-existent file in, Promise out', async () => + assert.rejects( + () => sharp('fail').metadata(), + (err) => { + assert.strictEqual(err.message, 'Input file is missing: fail'); + assert(err.stack.includes('at Sharp.metadata')); + assert(err.stack.includes(__filename)); + return true; + } + ) + ); + + it('Invalid stream in, callback out', (_t, done) => { + fs.createReadStream(__filename).pipe( + sharp().metadata((err) => { + assert.strictEqual(err.message, 'Input buffer contains unsupported image format'); + assert(err.stack.includes('at Sharp.metadata')); + assert(err.stack.includes(__filename)); + done(); + }) + ); }); - it('Stream in, Promise out', function (done) { + it('Stream in, Promise out', (_t, done) => { const readable = fs.createReadStream(fixtures.inputJpg); const pipeline = sharp(); - pipeline.metadata().then(function (metadata) { + pipeline.metadata().then((metadata) => { assert.strictEqual('jpeg', metadata.format); + assert.strictEqual(829183, metadata.size); assert.strictEqual(2725, metadata.width); assert.strictEqual(2225, metadata.height); assert.strictEqual('srgb', metadata.space); assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); - assert.strictEqual('undefined', typeof metadata.density); + assert.strictEqual(true, 
['undefined', 'number'].includes(typeof metadata.density)); + assert.strictEqual('4:2:0', metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); assert.strictEqual('undefined', typeof metadata.exif); assert.strictEqual('undefined', typeof metadata.icc); done(); - }).catch(function (err) { - throw err; - }); + }).catch(done); readable.pipe(pipeline); }); - it('Stream', function (done) { + it('Stream in, rejected Promise out', () => { + const pipeline = sharp(); + fs + .createReadStream(__filename) + .pipe(pipeline); + + return pipeline + .metadata() + .then( + () => Promise.reject(new Error('Expected metadata to reject')), + err => assert.strictEqual(err.message, 'Input buffer contains unsupported image format') + ); + }); + + it('Stream in, finish event fires before metadata is requested', (_t, done) => { + const create = { width: 1, height: 1, channels: 3, background: 'red' }; + const image1 = sharp({ create }).png().pipe(sharp()); + const image2 = sharp({ create }).png().pipe(sharp()); + setTimeout(async () => { + const data1 = await image1.metadata(); + assert.strictEqual('png', data1.format); + const data2 = await image2.metadata(); + assert.strictEqual('png', data2.format); + done(); + }, 500); + }); + + it('Stream', (_t, done) => { const readable = fs.createReadStream(fixtures.inputJpg); - const pipeline = sharp().metadata(function (err, metadata) { + const pipeline = sharp().metadata((err, metadata) => { if (err) throw err; assert.strictEqual('jpeg', metadata.format); + assert.strictEqual(829183, metadata.size); assert.strictEqual(2725, metadata.width); assert.strictEqual(2225, metadata.height); assert.strictEqual('srgb', metadata.space); assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); - assert.strictEqual('undefined', typeof
metadata.density); + assert.strictEqual(true, ['undefined', 'number'].includes(typeof metadata.density)); + assert.strictEqual('4:2:0', metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); @@ -240,23 +578,26 @@ describe('Image metadata', function () { readable.pipe(pipeline); }); - it('Resize to half width using metadata', function (done) { + it('Resize to half width using metadata', (_t, done) => { const image = sharp(fixtures.inputJpg); - image.metadata(function (err, metadata) { + image.metadata((err, metadata) => { if (err) throw err; assert.strictEqual('jpeg', metadata.format); + assert.strictEqual('undefined', typeof metadata.size); assert.strictEqual(2725, metadata.width); assert.strictEqual(2225, metadata.height); assert.strictEqual('srgb', metadata.space); assert.strictEqual(3, metadata.channels); assert.strictEqual('uchar', metadata.depth); - assert.strictEqual('undefined', typeof metadata.density); + assert.strictEqual(true, ['undefined', 'number'].includes(typeof metadata.density)); + assert.strictEqual('4:2:0', metadata.chromaSubsampling); + assert.strictEqual(false, metadata.isProgressive); assert.strictEqual(false, metadata.hasProfile); assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual('undefined', typeof metadata.orientation); assert.strictEqual('undefined', typeof metadata.exif); assert.strictEqual('undefined', typeof metadata.icc); - image.resize(Math.floor(metadata.width / 2)).toBuffer(function (err, data, info) { + image.resize(Math.floor(metadata.width / 2)).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual(1362, info.width); @@ -266,23 +607,27 @@ describe('Image metadata', function () { }); }); - it('Keep EXIF metadata and add sRGB profile after a resize', function (done) { + it('Keep 
EXIF metadata and add sRGB profile after a resize', (_t, done) => { sharp(fixtures.inputJpgWithExif) .resize(320, 240) .withMetadata() - .toBuffer(function (err, buffer) { + .toBuffer((err, buffer) => { if (err) throw err; - sharp(buffer).metadata(function (err, metadata) { + sharp(buffer).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(true, metadata.hasProfile); assert.strictEqual(8, metadata.orientation); + assert.strictEqual(320, metadata.width); + assert.strictEqual(240, metadata.height); + assert.strictEqual(240, metadata.autoOrient.width); + assert.strictEqual(320, metadata.autoOrient.height); assert.strictEqual('object', typeof metadata.exif); assert.strictEqual(true, metadata.exif instanceof Buffer); // EXIF const exif = exifReader(metadata.exif); assert.strictEqual('object', typeof exif); - assert.strictEqual('object', typeof exif.image); - assert.strictEqual('number', typeof exif.image.XResolution); + assert.strictEqual('object', typeof exif.Image); + assert.strictEqual('number', typeof exif.Image.XResolution); // ICC assert.strictEqual('object', typeof metadata.icc); assert.strictEqual(true, metadata.icc instanceof Buffer); @@ -296,13 +641,174 @@ describe('Image metadata', function () { }); }); - it('Remove EXIF metadata after a resize', function (done) { + it('keep existing ICC profile', async () => { + const data = await sharp(fixtures.inputJpgWithExif) + .keepIccProfile() + .toBuffer(); + + const metadata = await sharp(data).metadata(); + const { description } = icc.parse(metadata.icc); + assert.strictEqual(description, 'Generic RGB Profile'); + }); + + it('keep existing CMYK input profile for CMYK output', async () => { + const data = await sharp(fixtures.inputJpgWithCmykProfile) + .keepIccProfile() + .toColourspace('cmyk') + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.channels, 4); + const { description } = icc.parse(metadata.icc); + assert.strictEqual(description, 'U.S. 
Web Coated (SWOP) v2'); + }); + + it('transform with but discard existing RGB input profile for CMYK output', async () => { + const data = await sharp(fixtures.inputJpgWithExif) + .keepIccProfile() + .toColourspace('cmyk') + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.channels, 4); + assert.strictEqual(metadata.icc, undefined); + }); + + it('keep existing ICC profile, avoid colour transform', async () => { + const [r, g, b] = await sharp(fixtures.inputPngWithProPhotoProfile) + .keepIccProfile() + .raw() + .toBuffer(); + + assert.strictEqual(r, 131); + assert.strictEqual(g, 141); + assert.strictEqual(b, 192); + }); + + it('keep existing CMYK ICC profile', async () => { + const data = await sharp(fixtures.inputJpgWithCmykProfile) + .pipelineColourspace('cmyk') + .toColourspace('cmyk') + .keepIccProfile() + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.channels, 4); + const { description } = icc.parse(metadata.icc); + assert.strictEqual(description, 'U.S. 
Web Coated (SWOP) v2'); + }); + + it('transform to ICC profile and attach', async () => { + const data = await sharp({ create }) + .png() + .withIccProfile('p3', { attach: true }) + .toBuffer(); + + const metadata = await sharp(data).metadata(); + const { description } = icc.parse(metadata.icc); + assert.strictEqual(description, 'sP3C'); + }); + + it('transform to ICC profile but do not attach', async () => { + const data = await sharp({ create }) + .png() + .withIccProfile('p3', { attach: false }) + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(3, metadata.channels); + assert.strictEqual(undefined, metadata.icc); + }); + + it('transform to invalid ICC profile emits warning', async () => { + const img = sharp({ create }) + .png() + .withIccProfile(fixtures.path('invalid-illuminant.icc')); + + const warningsEmitted = []; + img.on('warning', (warning) => { + warningsEmitted.push(warning); + }); + + const data = await img.toBuffer(); + assert.strict(warningsEmitted.includes('Invalid profile')); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(3, metadata.channels); + assert.strictEqual(undefined, metadata.icc); + }); + + it('Apply CMYK output ICC profile', (_t, done) => { + const output = fixtures.path('output.icc-cmyk.jpg'); + sharp(fixtures.inputJpg) + .resize(64) + .withIccProfile('cmyk') + .toFile(output, (err) => { + if (err) throw err; + sharp(output).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual(true, metadata.hasProfile); + assert.strictEqual('cmyk', metadata.space); + assert.strictEqual(4, metadata.channels); + // ICC + assert.strictEqual('object', typeof metadata.icc); + assert.strictEqual(true, metadata.icc instanceof Buffer); + const profile = icc.parse(metadata.icc); + assert.strictEqual('object', typeof profile); + assert.strictEqual('CMYK', profile.colorSpace); + assert.strictEqual('Relative', profile.intent); + assert.strictEqual('Printer', profile.deviceClass); + 
}); + fixtures.assertSimilar(output, fixtures.expected('icc-cmyk.jpg'), { threshold: 1 }, done); + }); + }); + + it('Apply custom output ICC profile', (_t, done) => { + const output = fixtures.path('output.hilutite.jpg'); + sharp(fixtures.inputJpg) + .resize(64) + .withIccProfile(fixtures.path('hilutite.icm')) + .toFile(output, (err) => { + if (err) throw err; + fixtures.assertMaxColourDistance(output, fixtures.expected('hilutite.jpg'), 9); + done(); + }); + }); + + it('Include metadata in output, enabled via empty object', () => + sharp(fixtures.inputJpgWithExif) + .withMetadata({}) + .toBuffer() + .then((buffer) => sharp(buffer) + .metadata() + .then(metadata => { + assert.strictEqual(true, metadata.hasProfile); + assert.strictEqual(8, metadata.orientation); + assert.strictEqual('object', typeof metadata.exif); + assert.strictEqual(true, metadata.exif instanceof Buffer); + // EXIF + const exif = exifReader(metadata.exif); + assert.strictEqual('object', typeof exif); + assert.strictEqual('object', typeof exif.Image); + assert.strictEqual('number', typeof exif.Image.XResolution); + // ICC + assert.strictEqual('object', typeof metadata.icc); + assert.strictEqual(true, metadata.icc instanceof Buffer); + const profile = icc.parse(metadata.icc); + assert.strictEqual('object', typeof profile); + assert.strictEqual('RGB', profile.colorSpace); + assert.strictEqual('Perceptual', profile.intent); + assert.strictEqual('Monitor', profile.deviceClass); + }) + ) + ); + + it('Remove EXIF metadata after a resize', (_t, done) => { sharp(fixtures.inputJpgWithExif) .resize(320, 240) - .withMetadata(false) - .toBuffer(function (err, buffer) { + .toBuffer((err, buffer) => { if (err) throw err; - sharp(buffer).metadata(function (err, metadata) { + sharp(buffer).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(false, metadata.hasProfile); assert.strictEqual('undefined', typeof metadata.orientation); @@ -313,12 +819,12 @@ describe('Image metadata', function () { }); 
}); - it('Remove metadata from PNG output', function (done) { + it('Remove metadata from PNG output', (_t, done) => { sharp(fixtures.inputJpgWithExif) .png() - .toBuffer(function (err, buffer) { + .toBuffer((err, buffer) => { if (err) throw err; - sharp(buffer).metadata(function (err, metadata) { + sharp(buffer).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(false, metadata.hasProfile); assert.strictEqual('undefined', typeof metadata.orientation); @@ -329,42 +835,513 @@ describe('Image metadata', function () { }); }); - it('File input with corrupt header fails gracefully', function (done) { + it('Add EXIF metadata to JPEG', async () => { + const data = await sharp({ create }) + .jpeg() + .withMetadata({ + exif: { + IFD0: { Software: 'sharp' }, + IFD2: { ExposureTime: '0.2' } + } + }) + .toBuffer(); + + const { exif } = await sharp(data).metadata(); + const parsedExif = exifReader(exif); + assert.strictEqual(parsedExif.Image.Software, 'sharp'); + assert.strictEqual(parsedExif.Photo.ExposureTime, 0.2); + }); + + it('Set density of JPEG', async () => { + const data = await sharp({ create }) + .withMetadata({ + density: 300 + }) + .jpeg() + .toBuffer(); + + const { density } = await sharp(data).metadata(); + assert.strictEqual(density, 300); + }); + + it('Set density of PNG', async () => { + const data = await sharp({ create }) + .withMetadata({ + density: 96 + }) + .png() + .toBuffer(); + + const { density } = await sharp(data).metadata(); + assert.strictEqual(density, 96); + }); + + it('chromaSubsampling 4:4:4:4 CMYK JPEG', () => sharp(fixtures.inputJpgWithCmykProfile) + .metadata() + .then((metadata) => { + assert.strictEqual('4:4:4:4', metadata.chromaSubsampling); + })); + + it('chromaSubsampling 4:4:4 RGB JPEG', () => sharp(fixtures.inputJpg) + .resize(10, 10) + .jpeg({ chromaSubsampling: '4:4:4' }) + .toBuffer() + .then((data) => sharp(data) + .metadata() + .then((metadata) => { + assert.strictEqual('4:4:4', metadata.chromaSubsampling); + 
}))); + + it('isProgressive JPEG', () => sharp(fixtures.inputJpg) + .resize(10, 10) + .jpeg({ progressive: true }) + .toBuffer() + .then((data) => sharp(data) + .metadata() + .then((metadata) => { + assert.strictEqual(true, metadata.isProgressive); + }))); + + it('isProgressive PNG', () => sharp(fixtures.inputJpg) + .resize(10, 10) + .png({ progressive: true }) + .toBuffer() + .then((data) => sharp(data) + .metadata() + .then((metadata) => { + assert.strictEqual(true, metadata.isProgressive); + }))); + + it('16-bit TIFF with TIFFTAG_PHOTOSHOP metadata', () => + sharp(fixtures.inputTifftagPhotoshop) + .metadata() + .then(metadata => { + assert.strictEqual(metadata.format, 'tiff'); + assert.strictEqual(metadata.width, 317); + assert.strictEqual(metadata.height, 211); + assert.strictEqual(metadata.space, 'rgb16'); + assert.strictEqual(metadata.channels, 3); + assert.strictEqual(typeof metadata.tifftagPhotoshop, 'object'); + assert.strictEqual(metadata.tifftagPhotoshop instanceof Buffer, true); + assert.strictEqual(metadata.tifftagPhotoshop.length, 6634); + }) + ); + + it('AVIF', async () => { + const metadata = await sharp(fixtures.inputAvif).metadata(); + assert.deepStrictEqual(metadata, { + format: 'heif', + width: 2048, + height: 858, + space: 'srgb', + channels: 3, + depth: 'uchar', + isProgressive: false, + isPalette: false, + bitsPerSample: 8, + pages: 1, + pagePrimary: 0, + compression: 'av1', + hasProfile: false, + hasAlpha: false, + autoOrient: { + width: 2048, + height: 858 + } + }); + }); + + it('withMetadata adds default sRGB profile', async () => { + const data = await sharp(fixtures.inputJpg) + .resize(32, 24) + .withMetadata() + .toBuffer(); + + const metadata = await sharp(data).metadata(); + const { colorSpace, deviceClass, intent } = icc.parse(metadata.icc); + assert.strictEqual(colorSpace, 'RGB'); + assert.strictEqual(deviceClass, 'Monitor'); + assert.strictEqual(intent, 'Perceptual'); + }); + + it('withMetadata adds default sRGB profile to RGB16', 
async () => { + const data = await sharp({ create }) + .toColorspace('rgb16') + .png() + .withMetadata() + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.depth, 'ushort'); + + const { description } = icc.parse(metadata.icc); + assert.strictEqual(description, 'sRGB'); + }); + + it('withMetadata adds P3 profile to 16-bit PNG', async () => { + const data = await sharp({ create }) + .toColorspace('rgb16') + .png() + .withMetadata({ icc: 'p3' }) + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual(metadata.depth, 'ushort'); + + const { description } = icc.parse(metadata.icc); + assert.strictEqual(description, 'sP3C'); + }); + + it('File input with corrupt header fails gracefully', (_t, done) => { sharp(fixtures.inputJpgWithCorruptHeader) - .metadata(function (err) { + .metadata((err) => { assert.strictEqual(true, !!err); + assert.ok(err.message.includes('Input file has corrupt header: VipsJpeg: premature end of'), err); done(); }); }); - it('Buffer input with corrupt header fails gracefully', function (done) { + it('Buffer input with corrupt header fails gracefully', (_t, done) => { sharp(fs.readFileSync(fixtures.inputJpgWithCorruptHeader)) - .metadata(function (err) { + .metadata((err) => { assert.strictEqual(true, !!err); + assert.ok(err.message.includes('Input buffer has corrupt header: VipsJpeg: premature end of'), err); done(); }); }); - describe('Invalid withMetadata parameters', function () { - it('String orientation', function () { - assert.throws(function () { - sharp().withMetadata({orientation: 'zoinks'}); + it('Lossless JPEG', async () => { + const metadata = await sharp(fixtures.inputJpgLossless).metadata(); + assert.deepStrictEqual(metadata, { + format: 'jpeg', + width: 227, + height: 149, + space: 'srgb', + channels: 3, + depth: 'uchar', + density: 72, + chromaSubsampling: '4:4:4', + isProgressive: false, + isPalette: false, + hasProfile: false, + hasAlpha: false, + 
autoOrient: { width: 227, height: 149 } + }); + }); + + it('keepExif maintains all EXIF metadata', async () => { + const data1 = await sharp({ create }) + .withExif({ + IFD0: { + Copyright: 'Test 1', + Software: 'sharp' + } + }) + .jpeg() + .toBuffer(); + + const data2 = await sharp(data1) + .keepExif() + .toBuffer(); + + const md2 = await sharp(data2).metadata(); + const exif2 = exifReader(md2.exif); + assert.strictEqual(exif2.Image.Copyright, 'Test 1'); + assert.strictEqual(exif2.Image.Software, 'sharp'); + }); + + it('withExif replaces all EXIF metadata', async () => { + const data1 = await sharp({ create }) + .withExif({ + IFD0: { + Copyright: 'Test 1', + Software: 'sharp' + } + }) + .jpeg() + .toBuffer(); + + const md1 = await sharp(data1).metadata(); + const exif1 = exifReader(md1.exif); + assert.strictEqual(exif1.Image.Copyright, 'Test 1'); + assert.strictEqual(exif1.Image.Software, 'sharp'); + + const data2 = await sharp(data1) + .withExif({ + IFD0: { + Copyright: 'Test 2' + } + }) + .toBuffer(); + + const md2 = await sharp(data2).metadata(); + const exif2 = exifReader(md2.exif); + assert.strictEqual(exif2.Image.Copyright, 'Test 2'); + assert.strictEqual(exif2.Image.Software, undefined); + }); + + it('withExifMerge merges all EXIF metadata', async () => { + const data1 = await sharp({ create }) + .withExif({ + IFD0: { + Copyright: 'Test 1' + } + }) + .jpeg() + .toBuffer(); + + const md1 = await sharp(data1).metadata(); + const exif1 = exifReader(md1.exif); + assert.strictEqual(exif1.Image.Copyright, 'Test 1'); + assert.strictEqual(exif1.Image.Software, undefined); + + const data2 = await sharp(data1) + .withExifMerge({ + IFD0: { + Copyright: 'Test 2', + Software: 'sharp' + + } + }) + .toBuffer(); + + const md2 = await sharp(data2).metadata(); + const exif2 = exifReader(md2.exif); + assert.strictEqual(exif2.Image.Copyright, 'Test 2'); + assert.strictEqual(exif2.Image.Software, 'sharp'); + }); + + describe('XMP metadata tests', () => { + it('withMetadata 
preserves existing XMP metadata from input', async () => { + const data = await sharp(fixtures.inputJpgWithIptcAndXmp) + .resize(320, 240) + .withMetadata() + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual('object', typeof metadata.xmp); + assert.strictEqual(true, metadata.xmp instanceof Buffer); + assert.strictEqual(true, metadata.xmp.length > 0); + // Check that XMP starts with the expected XML declaration + assert.strictEqual(metadata.xmp.indexOf(Buffer.from('<?xpacket begin=')), 0); + }); + + it('keepXmp preserves XMP metadata', async () => { + const data = await sharp(fixtures.inputJpgWithIptcAndXmp) + .resize(320, 240) + .keepXmp() + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual('object', typeof metadata.xmp); + assert.strictEqual(true, metadata.xmp instanceof Buffer); + assert.strictEqual(true, metadata.xmp.length > 0); + // Check that XMP starts with the expected XML declaration + assert.strictEqual(metadata.xmp.indexOf(Buffer.from('<?xpacket begin=')), 0); + }); + + it('withXmp replaces existing XMP with custom XMP string', async () => { + const customXmp = '<x:xmpmeta xmlns:x="adobe:ns:meta/"><rdf:RDF><rdf:Description><dc:creator>Test Creator</dc:creator><dc:title>Test Title</dc:title></rdf:Description></rdf:RDF></x:xmpmeta>'; + + const data = await sharp(fixtures.inputJpgWithIptcAndXmp) + .resize(320, 240) + .withXmp(customXmp) + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual('object', typeof metadata.xmp); + assert.strictEqual(true, metadata.xmp instanceof Buffer); + + // Check that the XMP contains our custom content + const xmpString = metadata.xmp.toString(); + assert.strictEqual(true, xmpString.includes('Test Creator')); + assert.strictEqual(true, xmpString.includes('Test Title')); + }); + + it('withXmp with custom XMP buffer on image without existing XMP', async () => { + const customXmp = '<x:xmpmeta xmlns:x="adobe:ns:meta/"><rdf:RDF><rdf:Description><dc:description>Added via Sharp</dc:description></rdf:Description></rdf:RDF></x:xmpmeta>'; + + const data = await sharp(fixtures.inputJpg) + .resize(320, 240) + .withXmp(customXmp) + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual('object', typeof metadata.xmp); + assert.strictEqual(true, metadata.xmp instanceof Buffer); + + // Check that the XMP contains our custom content + const xmpString =
metadata.xmp.toString(); + assert.strictEqual(true, xmpString.includes('Added via Sharp')); + }); + + it('withXmp with valid XMP metadata for different image formats', async () => { + const customXmp = '<x:xmpmeta xmlns:x="adobe:ns:meta/"><rdf:RDF><rdf:Description><dc:description>test</dc:description><dc:subject>metadata</dc:subject></rdf:Description></rdf:RDF></x:xmpmeta>'; + + // Test with JPEG output + const jpegData = await sharp(fixtures.inputJpg) + .resize(100, 100) + .jpeg() + .withXmp(customXmp) + .toBuffer(); + + const jpegMetadata = await sharp(jpegData).metadata(); + assert.strictEqual('object', typeof jpegMetadata.xmp); + assert.strictEqual(true, jpegMetadata.xmp instanceof Buffer); + assert.strictEqual(true, jpegMetadata.xmp.toString().includes('test')); + + // Test with PNG output (PNG should also support XMP metadata) + const pngData = await sharp(fixtures.inputJpg) + .resize(100, 100) + .png() + .withXmp(customXmp) + .toBuffer(); + + const pngMetadata = await sharp(pngData).metadata(); + // PNG format should preserve XMP metadata when using withXmp + assert.strictEqual('object', typeof pngMetadata.xmp); + assert.strictEqual(true, pngMetadata.xmp instanceof Buffer); + assert.strictEqual(true, pngMetadata.xmp.toString().includes('test')); + + // Test with WebP output (WebP should also support XMP metadata) + const webpData = await sharp(fixtures.inputJpg) + .resize(100, 100) + .webp() + .withXmp(customXmp) + .toBuffer(); + + const webpMetadata = await sharp(webpData).metadata(); + // WebP format should preserve XMP metadata when using withXmp + assert.strictEqual('object', typeof webpMetadata.xmp); + assert.strictEqual(true, webpMetadata.xmp instanceof Buffer); + assert.strictEqual(true, webpMetadata.xmp.toString().includes('test')); + }); + + it('XMP metadata persists through multiple operations', async () => { + const customXmp = '<x:xmpmeta xmlns:x="adobe:ns:meta/"><rdf:RDF><rdf:Description><dc:description>persistent-test</dc:description></rdf:Description></rdf:RDF></x:xmpmeta>'; + + const data = await sharp(fixtures.inputJpg) + .resize(320, 240) + .withXmp(customXmp) + .rotate(90) + .blur(1) + .sharpen() + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual('object', typeof metadata.xmp); +
assert.strictEqual(true, metadata.xmp instanceof Buffer); + assert.strictEqual(true, metadata.xmp.toString().includes('persistent-test')); + }); + + it('withXmp XMP works with WebP format specifically', async () => { + const webpXmp = '<x:xmpmeta xmlns:x="adobe:ns:meta/"><rdf:RDF><rdf:Description><dc:creator>WebP Creator</dc:creator><dc:format>image/webp</dc:format></rdf:Description></rdf:RDF></x:xmpmeta>'; + + const data = await sharp(fixtures.inputJpg) + .resize(120, 80) + .webp({ quality: 80 }) + .withXmp(webpXmp) + .toBuffer(); + + const metadata = await sharp(data).metadata(); + assert.strictEqual('webp', metadata.format); + assert.strictEqual('object', typeof metadata.xmp); + assert.strictEqual(true, metadata.xmp instanceof Buffer); + + const xmpString = metadata.xmp.toString(); + assert.strictEqual(true, xmpString.includes('WebP Creator')); + assert.strictEqual(true, xmpString.includes('image/webp')); + }); + + it('withXmp XMP validation - non-string input', () => { + assert.throws( + () => sharp().withXmp(123), + /Expected non-empty string for xmp but received 123 of type number/ + ); + }); + + it('withXmp XMP validation - null input', () => { + assert.throws( + () => sharp().withXmp(null), + /Expected non-empty string for xmp but received null of type object/ + ); + }); + + it('withXmp XMP validation - empty string', () => { + assert.throws( + () => sharp().withXmp(''), + /Expected non-empty string for xmp/ + ); + }); + }); + + describe('Invalid parameters', () => { + it('String orientation', () => { + assert.throws(() => { + sharp().withMetadata({ orientation: 'zoinks' }); }); }); - it('Negative orientation', function () { - assert.throws(function () { - sharp().withMetadata({orientation: -1}); + it('Negative orientation', () => { + assert.throws(() => { + sharp().withMetadata({ orientation: -1 }); }); }); - it('Zero orientation', function () { - assert.throws(function () { + it('Zero orientation', () => { + assert.throws(() => { + sharp().withMetadata({ orientation: 0 }); }); }); - it('Too large orientation', function () { - assert.throws(function () { - sharp().withMetadata({orientation: 9}); + it('Too
large orientation', () => { + assert.throws(() => { + sharp().withMetadata({ orientation: 9 }); }); }); + it('Non-numeric density', () => { + assert.throws(() => { + sharp().withMetadata({ density: '1' }); + }); + }); + it('Negative density', () => { + assert.throws(() => { + sharp().withMetadata({ density: -1 }); + }); + }); + it('Non string icc', () => { + assert.throws(() => { + sharp().withMetadata({ icc: true }); + }); + }); + it('Non object exif', () => { + assert.throws(() => { + sharp().withMetadata({ exif: false }); + }); + }); + it('Non string value in object exif', () => { + assert.throws(() => { + sharp().withMetadata({ exif: { ifd0: false } }); + }); + }); + it('Non string value in nested object exif', () => { + assert.throws(() => { + sharp().withMetadata({ exif: { ifd0: { fail: false } } }); + }); + }); + it('withIccProfile invalid profile', () => { + assert.throws( + () => sharp().withIccProfile(false), + /Expected string for icc but received false of type boolean/ + ); + }); + it('withIccProfile missing attach', () => { + assert.doesNotThrow( + () => sharp().withIccProfile('test', {}) + ); + }); + it('withIccProfile invalid attach', () => { + assert.throws( + () => sharp().withIccProfile('test', { attach: 1 }), + /Expected boolean for attach but received 1 of type number/ + ); + }); }); }); diff --git a/test/unit/modulate.js b/test/unit/modulate.js new file mode 100644 index 000000000..40c5b83f1 --- /dev/null +++ b/test/unit/modulate.js @@ -0,0 +1,187 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const sharp = require('../../'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); +const fixtures = require('../fixtures'); + +describe('Modulate', () => { + describe('Invalid options', () => { + [ + null, + undefined, + 10, + { brightness: -1 }, + { brightness: '50%' }, + { brightness: null }, + { saturation: -1 }, + { saturation: '50%' }, + { saturation: null }, + { hue: '50deg' }, + { hue: 1.5 }, + { hue: null }, + { lightness: '+50' }, + { lightness: null } + ].forEach((options) => { + it('should throw', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).modulate(options); + }); + }); + }); + }); + + it('should be able to hue-rotate', async () => { + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .modulate({ hue: 120 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 41, g: 107, b: 57 }, { r, g, b }); + }); + + it('should be able to brighten', async () => { + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .modulate({ brightness: 2 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 255, g: 173, b: 168 }, { r, g, b }); + }); + + it('should be able to darken', async () => { + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .modulate({ brightness: 0.5 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 97, g: 17, b: 25 }, { r, g, b }); + }); + + it('should be able to saturate', async () => { + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .modulate({ saturation: 2 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 198, g: 0, b: 43 }, { r, g, b }); + }); + + it('should be able to desaturate', async () => 
{ + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .modulate({ saturation: 0.5 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 127, g: 83, b: 81 }, { r, g, b }); + }); + + it('should be able to lighten', async () => { + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .modulate({ lightness: 10 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 182, g: 93, b: 92 }, { r, g, b }); + }); + + it('should be able to modulate all channels', async () => { + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .modulate({ brightness: 2, saturation: 0.5, hue: 180 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 149, g: 209, b: 214 }, { r, g, b }); + }); + + it('should be able to use linear and modulate together', async () => { + const contrast = 1.5; + const brightness = 0.5; + + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 153, g: 68, b: 68 } + } + }) + .linear(contrast, -(128 * contrast) + 128) + .modulate({ brightness }) + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r: 81, g: 0, b: 0 }, { r, g, b }); + }); + + describe('hue-rotate', () => { + [30, 60, 90, 120, 150, 180, 210, 240, 270, 300, 330, 360].forEach(angle => { + it(`should hue rotate by ${angle} deg`, async () => { + const base = `modulate-hue-angle-${angle}.png`; + const actual = fixtures.path(`output.${base}`); + const expected = fixtures.expected(base); + + await sharp(fixtures.testPattern) + .resize(320) + .modulate({ hue: angle }) + .png({ compressionLevel: 0 }) + .toFile(actual) + .then(() => { + fixtures.assertMaxColourDistance(actual, expected, 3); + }); + }); + }); + }); +}); diff --git a/test/unit/negate.js b/test/unit/negate.js index aff4fd909..87196a592 100644 
--- a/test/unit/negate.js +++ b/test/unit/negate.js @@ -1,16 +1,20 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Negate', function () { - it('negate (jpeg)', function (done) { +describe('Negate', () => { + it('negate (jpeg)', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .negate() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -19,11 +23,11 @@ describe('Negate', function () { }); }); - it('negate (png)', function (done) { + it('negate (png)', (_t, done) => { sharp(fixtures.inputPng) .resize(320, 240) .negate() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); @@ -32,11 +36,11 @@ describe('Negate', function () { }); }); - it('negate (png, trans)', function (done) { + it('negate (png, trans)', (_t, done) => { sharp(fixtures.inputPngWithTransparency) .resize(320, 240) .negate() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); @@ -45,11 +49,11 @@ describe('Negate', function () { }); }); - it('negate (png, alpha)', function (done) { + it('negate (png, alpha)', (_t, done) => { sharp(fixtures.inputPngWithGreyAlpha) .resize(320, 240) .negate() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); @@ -58,11 +62,11 @@ describe('Negate', function () { }); }); - it('negate (webp)', function (done) { + it('negate 
(webp)', (_t, done) => { sharp(fixtures.inputWebP) .resize(320, 240) .negate() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('webp', info.format); assert.strictEqual(320, info.width); @@ -71,11 +75,11 @@ describe('Negate', function () { }); }); - it('negate (webp, trans)', function (done) { + it('negate (webp, trans)', (_t, done) => { sharp(fixtures.inputWebPWithTransparency) .resize(320, 240) .negate() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('webp', info.format); assert.strictEqual(320, info.width); @@ -84,11 +88,11 @@ describe('Negate', function () { }); }); - it('negate (true)', function (done) { + it('negate (true)', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .negate(true) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -97,14 +101,114 @@ describe('Negate', function () { }); }); - it('negate (false)', function (done) { + it('negate (false)', (_t, done) => { const output = fixtures.path('output.unmodified-by-negate.png'); sharp(fixtures.inputJpgWithLowContrast) .negate(false) - .toFile(output, function (err, info) { + .toFile(output, (err) => { if (err) throw err; fixtures.assertMaxColourDistance(output, fixtures.inputJpgWithLowContrast, 0); done(); }); }); + + it('negate ({alpha: true})', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .negate({ alpha: true }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('negate.jpg'), data, done); + }); + }); + + it('negate non-alpha channels (png)', (_t, done) => { + sharp(fixtures.inputPng) + .resize(320, 240) + .negate({ alpha: false }) + 
.toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('negate-preserve-alpha.png'), data, done); + }); + }); + + it('negate non-alpha channels (png, trans)', (_t, done) => { + sharp(fixtures.inputPngWithTransparency) + .resize(320, 240) + .negate({ alpha: false }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('negate-preserve-alpha-trans.png'), data, done); + }); + }); + + it('negate non-alpha channels (png, alpha)', (_t, done) => { + sharp(fixtures.inputPngWithGreyAlpha) + .resize(320, 240) + .negate({ alpha: false }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('negate-preserve-alpha-grey.png'), data, done); + }); + }); + + it('negate non-alpha channels (webp)', (_t, done) => { + sharp(fixtures.inputWebP) + .resize(320, 240) + .negate({ alpha: false }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('webp', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('negate-preserve-alpha.webp'), data, done); + }); + }); + + it('negate non-alpha channels (webp, trans)', (_t, done) => { + sharp(fixtures.inputWebPWithTransparency) + .resize(320, 240) + .negate({ alpha: false }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('webp', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('negate-preserve-alpha-trans.webp'), data, done); + }); + 
}); + + it('negate create', async () => { + const [r, g, b] = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: { r: 10, g: 20, b: 30 } + } + }) + .negate() + .raw() + .toBuffer(); + + assert.deepStrictEqual({ r, g, b }, { r: 245, g: 235, b: 225 }); + }); + + it('invalid alpha value', () => { + assert.throws(() => { + sharp(fixtures.inputWebPWithTransparency).negate({ alpha: 'non-bool' }); + }); + }); }); diff --git a/test/unit/noise.js b/test/unit/noise.js new file mode 100644 index 000000000..c95006b6d --- /dev/null +++ b/test/unit/noise.js @@ -0,0 +1,308 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Gaussian noise', () => { + it('generate single-channel gaussian noise', (_t, done) => { + const output = fixtures.path('output.noise-1-channel.png'); + const noise = sharp({ + create: { + width: 1024, + height: 768, + channels: 1, // b-w + noise: { + type: 'gaussian', + mean: 128, + sigma: 30 + } + } + }).toColourspace('b-w'); + noise.toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(1024, info.width); + assert.strictEqual(768, info.height); + assert.strictEqual(1, info.channels); + sharp(output).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual('b-w', metadata.space); + assert.strictEqual('uchar', metadata.depth); + done(); + }); + }); + }); + + it('generate 3-channels gaussian noise', (_t, done) => { + const output = fixtures.path('output.noise-3-channels.png'); + const noise = sharp({ + create: { + width: 1024, + height: 768, + channels: 3, // sRGB + noise: { + type: 'gaussian', + mean: 128, + sigma: 30 + } + } + }); + noise.toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + 
assert.strictEqual(1024, info.width); + assert.strictEqual(768, info.height); + assert.strictEqual(3, info.channels); + sharp(output).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual('srgb', metadata.space); + assert.strictEqual('uchar', metadata.depth); + done(); + }); + }); + }); + + it('overlay 3-channels gaussian noise over image', (_t, done) => { + const output = fixtures.path('output.noise-image.jpg'); + const noise = sharp({ + create: { + width: 320, + height: 240, + channels: 3, + noise: { + type: 'gaussian', + mean: 0, + sigma: 5 + } + } + }); + noise.toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(3, info.channels); + sharp(fixtures.inputJpg) + .resize(320, 240) + .composite([ + { + input: data, + blend: 'exclusion', + raw: { + width: info.width, + height: info.height, + channels: info.channels + } + } + ]) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + assert.strictEqual(3, info.channels); + // perceptual hashing detects that images are the same (difference is <=1%) + fixtures.assertSimilar(output, fixtures.inputJpg, (err) => { + if (err) throw err; + done(); + }); + }); + }); + }); + + it('overlay strong single-channel (sRGB) gaussian noise with 25% transparency over transparent png image', (_t, done) => { + const output = fixtures.path('output.noise-image-transparent.png'); + const width = 320; + const height = 240; + const rawData = { + width, + height, + channels: 1 + }; + const noise = sharp({ + create: { + width, + height, + channels: 1, + noise: { + type: 'gaussian', + mean: 200, + sigma: 30 + } + } + }); + noise + .toColourspace('b-w') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(1, info.channels); + sharp(data, { raw: rawData }) + .joinChannel(data, { raw: rawData }) // r channel + .joinChannel(data, { raw: rawData }) // b channel 
+ .joinChannel(Buffer.alloc(width * height, 64), { raw: rawData }) // alpha channel + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(4, info.channels); + sharp(fixtures.inputPngRGBWithAlpha) + .resize(width, height) + .composite([ + { + input: data, + blend: 'exclusion', + raw: { + width: info.width, + height: info.height, + channels: info.channels + } + } + ]) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(width, info.width); + assert.strictEqual(height, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(output, fixtures.inputPngRGBWithAlpha, { threshold: 10 }, (err) => { + if (err) throw err; + done(); + }); + }); + }); + }); + }); + + it('animated noise', async () => { + const gif = await sharp({ + create: { + width: 16, + height: 64, + pageHeight: 16, + channels: 3, + noise: { type: 'gaussian' } + } + }) + .gif() + .toBuffer(); + + const { width, height, pages, delay } = await sharp(gif).metadata(); + assert.strictEqual(width, 16); + assert.strictEqual(height, 16); + assert.strictEqual(pages, 4); + assert.strictEqual(delay.length, 4); + }); + + it('no create object properties specified', () => { + assert.throws(() => { + sharp({ + create: {} + }); + }); + }); + + it('invalid noise object', () => { + assert.throws(() => { + sharp({ + create: { + width: 100, + height: 100, + channels: 3, + noise: 'gaussian' + } + }); + }); + }); + + it('unknown type of noise', () => { + assert.throws(() => { + sharp({ + create: { + width: 100, + height: 100, + channels: 3, + noise: { + type: 'unknown' + } + } + }); + }); + }); + + it('gaussian noise, invalid amount of channels', () => { + assert.throws(() => { + sharp({ + create: { + width: 100, + height: 100, + channels: 5, + noise: { + type: 'gaussian', + mean: 5, + sigma: 10 + } + } + }); + }); + }); + + it('gaussian noise, invalid mean', () => { + assert.throws(() => { + sharp({ + create: { + 
width: 100, + height: 100, + channels: 1, + noise: { + type: 'gaussian', + mean: -1.5, + sigma: 10 + } + } + }); + }); + }); + + it('gaussian noise, invalid sigma', () => { + assert.throws(() => { + sharp({ + create: { + width: 100, + height: 100, + channels: 1, + noise: { + type: 'gaussian', + mean: 0, + sigma: -1.5 + } + } + }); + }); + }); + + it('Invalid pageHeight', () => { + const create = { + width: 8, + height: 8, + channels: 4, + noise: { type: 'gaussian' } + }; + assert.throws( + () => sharp({ create: { ...create, pageHeight: 'zoinks' } }), + /Expected positive integer for create\.pageHeight but received zoinks of type string/ + ); + assert.throws( + () => sharp({ create: { ...create, pageHeight: -1 } }), + /Expected positive integer for create\.pageHeight but received -1 of type number/ + ); + assert.throws( + () => sharp({ create: { ...create, pageHeight: 9 } }), + /Expected positive integer for create\.pageHeight but received 9 of type number/ + ); + assert.throws( + () => sharp({ create: { ...create, pageHeight: 3 } }), + /Expected create\.height 8 to be a multiple of create\.pageHeight 3/ + ); + }); +}); diff --git a/test/unit/normalize.js b/test/unit/normalize.js index fcf416316..2c0cf32c6 100644 --- a/test/unit/normalize.js +++ b/test/unit/normalize.js @@ -1,63 +1,67 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -const assertNormalized = function (data) { +const assertNormalized = (data) => { let min = 255; let max = 0; for (let i = 0; i < data.length; i++) { min = Math.min(min, data[i]); max = Math.max(max, data[i]); } - assert.strictEqual(0, min); - assert.strictEqual(255, max); + assert.strictEqual(0, min, 'min too high'); + assert.ok(max > 248, 'max too low'); }; -describe('Normalization', function () { - it('spreads rgb image values between 0 and 255', function (done) { +describe('Normalization', () => { + it('spreads rgb image values between 0 and 255', (_t, done) => { sharp(fixtures.inputJpgWithLowContrast) .normalise() .raw() - .toBuffer(function (err, data, info) { + .toBuffer((err, data) => { if (err) throw err; assertNormalized(data); done(); }); }); - it('spreads grayscaled image values between 0 and 255', function (done) { + it('spreads grayscaled image values between 0 and 255', (_t, done) => { sharp(fixtures.inputJpgWithLowContrast) - .gamma() .greyscale() - .normalize(true) + .normalize() .raw() - .toBuffer(function (err, data, info) { + .toBuffer((err, data) => { if (err) throw err; assertNormalized(data); done(); }); }); - it('stretches greyscale images with alpha channel', function (done) { + it('stretches greyscale images with alpha channel', (_t, done) => { sharp(fixtures.inputPngWithGreyAlpha) .normalise() .raw() - .toBuffer(function (err, data, info) { + .toBuffer((err, data) => { if (err) throw err; assertNormalized(data); done(); }); }); - it('keeps an existing alpha channel', function (done) { + it('keeps an existing alpha channel', (_t, done) => { sharp(fixtures.inputPngWithTransparency) + .resize(8, 8) .normalize() - .toBuffer(function (err, data) { + .toBuffer((err, data) => { if (err) throw err; - 
sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) return done(err); assert.strictEqual(4, metadata.channels); assert.strictEqual(true, metadata.hasAlpha); @@ -67,12 +71,13 @@ describe('Normalization', function () { }); }); - it('keeps the alpha channel of greyscale images intact', function (done) { + it('keeps the alpha channel of greyscale images intact', (_t, done) => { sharp(fixtures.inputPngWithGreyAlpha) + .resize(8, 8) .normalise() - .toBuffer(function (err, data) { + .toBuffer((err, data) => { if (err) throw err; - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) return done(err); assert.strictEqual(true, metadata.hasAlpha); assert.strictEqual(4, metadata.channels); @@ -82,25 +87,79 @@ describe('Normalization', function () { }); }); - it('does not alter images with only one color', function (done) { + it('does not alter images with only one color', (_t, done) => { const output = fixtures.path('output.unmodified-png-with-one-color.png'); sharp(fixtures.inputPngWithOneColor) .normalize() - .toFile(output, function (err, info) { + .toFile(output, (err) => { if (err) done(err); fixtures.assertMaxColourDistance(output, fixtures.inputPngWithOneColor, 0); done(); }); }); - it('works with 16-bit RGBA images', function (done) { + it('works with 16-bit RGBA images', (_t, done) => { sharp(fixtures.inputPngWithTransparency16bit) .normalise() .raw() - .toBuffer(function (err, data, info) { + .toBuffer((err, data) => { + if (err) throw err; + assertNormalized(data); + done(); + }); + }); + + it('should handle luminance range', (_t, done) => { + sharp(fixtures.inputJpgWithLowContrast) + .normalise({ lower: 10, upper: 70 }) + .raw() + .toBuffer((err, data) => { if (err) throw err; assertNormalized(data); done(); }); }); + + it('should allow lower without upper', () => { + assert.doesNotThrow(() => sharp().normalize({ lower: 2 })); + }); + it('should allow upper without 
lower', () => { + assert.doesNotThrow(() => sharp().normalize({ upper: 98 })); + }); + it('should throw when lower is out of range', () => { + assert.throws( + () => sharp().normalise({ lower: -10 }), + /Expected number between 0 and 99 for lower but received -10 of type number/ + ); + }); + it('should throw when upper is out of range', () => { + assert.throws( + () => sharp().normalise({ upper: 110 }), + /Expected number between 1 and 100 for upper but received 110 of type number/ + ); + }); + it('should throw when lower is not a number', () => { + assert.throws( + () => sharp().normalise({ lower: 'fail' }), + /Expected number between 0 and 99 for lower but received fail of type string/ + ); + }); + it('should throw when upper is not a number', () => { + assert.throws( + () => sharp().normalise({ upper: 'fail' }), + /Expected number between 1 and 100 for upper but received fail of type string/ + ); + }); + it('should throw when the lower and upper are equal', () => { + assert.throws( + () => sharp().normalise({ lower: 2, upper: 2 }), + /Expected lower to be less than upper for range but received 2 >= 2/ + ); + }); + it('should throw when the lower is greater than upper', () => { + assert.throws( + () => sharp().normalise({ lower: 3, upper: 2 }), + /Expected lower to be less than upper for range but received 3 >= 2/ + ); + }); }); diff --git a/test/unit/overlay.js b/test/unit/overlay.js deleted file mode 100644 index 7e60c232c..000000000 --- a/test/unit/overlay.js +++ /dev/null @@ -1,598 +0,0 @@ -'use strict'; - -const fs = require('fs'); -const assert = require('assert'); - -const fixtures = require('../fixtures'); -const sharp = require('../../'); - -// Helpers -const getPaths = function (baseName, extension) { - if (typeof extension === 'undefined') { - extension = 'png'; - } - return { - actual: fixtures.path('output.' + baseName + '.' + extension), - expected: fixtures.expected(baseName + '.' 
+ extension) - }; -}; - -// Test -describe('Overlays', function () { - it('Overlay transparent PNG file on solid background', function (done) { - const paths = getPaths('alpha-layer-01'); - - sharp(fixtures.inputPngOverlayLayer0) - .overlayWith(fixtures.inputPngOverlayLayer1) - .toFile(paths.actual, function (error) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected); - done(); - }); - }); - - it('Overlay transparent PNG Buffer on solid background', function (done) { - const paths = getPaths('alpha-layer-01'); - - sharp(fixtures.inputPngOverlayLayer0) - .overlayWith(fs.readFileSync(fixtures.inputPngOverlayLayer1)) - .toFile(paths.actual, function (error) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected); - done(); - }); - }); - - it('Overlay low-alpha transparent PNG on solid background', function (done) { - const paths = getPaths('alpha-layer-01-low-alpha'); - - sharp(fixtures.inputPngOverlayLayer0) - .overlayWith(fixtures.inputPngOverlayLayer1LowAlpha) - .toFile(paths.actual, function (error) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected); - done(); - }); - }); - - it('Composite three transparent PNGs into one', function (done) { - const paths = getPaths('alpha-layer-012'); - - sharp(fixtures.inputPngOverlayLayer0) - .overlayWith(fixtures.inputPngOverlayLayer1) - .toBuffer(function (error, data) { - if (error) return done(error); - sharp(data) - .overlayWith(fixtures.inputPngOverlayLayer2) - .toFile(paths.actual, function (error) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected); - done(); - }); - }); - }); - - it('Composite two transparent PNGs into one', function (done) { - const paths = getPaths('alpha-layer-12'); - - sharp(fixtures.inputPngOverlayLayer1) - .overlayWith(fixtures.inputPngOverlayLayer2) - .toFile(paths.actual, function (error) { - if (error) return 
done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected); - done(); - }); - }); - - it('Composite two low-alpha transparent PNGs into one', function (done) { - const paths = getPaths('alpha-layer-12-low-alpha'); - - sharp(fixtures.inputPngOverlayLayer1LowAlpha) - .overlayWith(fixtures.inputPngOverlayLayer2LowAlpha) - .toFile(paths.actual, function (error) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected, 2); - done(); - }); - }); - - it('Composite three low-alpha transparent PNGs into one', function (done) { - const paths = getPaths('alpha-layer-012-low-alpha'); - - sharp(fixtures.inputPngOverlayLayer0) - .overlayWith(fixtures.inputPngOverlayLayer1LowAlpha) - .toBuffer(function (error, data) { - if (error) return done(error); - - sharp(data) - .overlayWith(fixtures.inputPngOverlayLayer2LowAlpha) - .toFile(paths.actual, function (error) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected); - done(); - }); - }); - }); - - it('Composite rgb+alpha PNG onto JPEG', function (done) { - const paths = getPaths('overlay-jpeg-with-rgb', 'jpg'); - - sharp(fixtures.inputJpg) - .resize(2048, 1536) - .overlayWith(fixtures.inputPngOverlayLayer1) - .toFile(paths.actual, function (error, info) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected, 102); - done(); - }); - }); - - it('Composite greyscale+alpha PNG onto JPEG', function (done) { - const paths = getPaths('overlay-jpeg-with-greyscale', 'jpg'); - - sharp(fixtures.inputJpg) - .resize(400, 300) - .overlayWith(fixtures.inputPngWithGreyAlpha) - .toFile(paths.actual, function (error, info) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected, 102); - done(); - }); - }); - - if (sharp.format.webp.input.file) { - it('Composite WebP onto JPEG', function (done) { - const paths = getPaths('overlay-jpeg-with-webp', 'jpg'); - - 
sharp(fixtures.inputJpg) - .resize(300, 300) - .overlayWith(fixtures.inputWebPWithTransparency) - .toFile(paths.actual, function (error, info) { - if (error) return done(error); - fixtures.assertMaxColourDistance(paths.actual, paths.expected, 102); - done(); - }); - }); - } - - it('Composite JPEG onto PNG, no premultiply', function (done) { - sharp(fixtures.inputPngOverlayLayer1) - .overlayWith(fixtures.inputJpgWithLandscapeExif1) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(false, info.premultiplied); - done(); - }); - }); - - it('Composite opaque JPEG onto JPEG, no premultiply', function (done) { - sharp(fixtures.inputJpg) - .overlayWith(fixtures.inputJpgWithLandscapeExif1) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(false, info.premultiplied); - done(); - }); - }); - - it('Fail when overlay is larger', function (done) { - sharp(fixtures.inputJpg) - .resize(320) - .overlayWith(fixtures.inputPngOverlayLayer1) - .toBuffer(function (error) { - assert.strictEqual(true, error instanceof Error); - done(); - }); - }); - - it('Fail with empty String parameter', function () { - assert.throws(function () { - sharp().overlayWith(''); - }); - }); - - it('Fail with non-String parameter', function () { - assert.throws(function () { - sharp().overlayWith(1); - }); - }); - - it('Fail with unsupported gravity', function () { - assert.throws(function () { - sharp() - .overlayWith(fixtures.inputPngOverlayLayer1, { - gravity: 9 - }); - }); - }); - - it('Empty options', function () { - assert.doesNotThrow(function () { - sharp().overlayWith(fixtures.inputPngOverlayLayer1, {}); - }); - }); - - describe('Overlay with numeric gravity', function () { - Object.keys(sharp.gravity).forEach(function (gravity) { - it(gravity, function (done) { - const expected = fixtures.expected('overlay-gravity-' + gravity + '.jpg'); - sharp(fixtures.inputJpg) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { 
- gravity: gravity - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(65, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - }); - }); - - describe('Overlay with string-based gravity', function () { - Object.keys(sharp.gravity).forEach(function (gravity) { - it(gravity, function (done) { - const expected = fixtures.expected('overlay-gravity-' + gravity + '.jpg'); - sharp(fixtures.inputJpg) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - gravity: sharp.gravity[gravity] - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(65, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - }); - }); - - describe('Overlay with tile enabled and gravity', function () { - Object.keys(sharp.gravity).forEach(function (gravity) { - it(gravity, function (done) { - const expected = fixtures.expected('overlay-tile-gravity-' + gravity + '.jpg'); - sharp(fixtures.inputJpg) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - tile: true, - gravity: gravity - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(65, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - }); - }); - - describe('Overlay with top-left offsets', function () { - it('Overlay with 10px top & 10px left offsets', function (done) { - const expected = fixtures.expected('overlay-valid-offsets-10-10.jpg'); - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - top: 10, - left: 10 - }) - 
.toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Overlay with 100px top & 300px left offsets', function (done) { - const expected = fixtures.expected('overlay-valid-offsets-100-300.jpg'); - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - top: 100, - left: 300 - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Overlay with only top offset', function () { - assert.throws(function () { - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - top: 1000 - }); - }); - }); - - it('Overlay with only left offset', function () { - assert.throws(function () { - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - left: 1000 - }); - }); - }); - - it('Overlay with negative offsets', function () { - assert.throws(function () { - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - top: -1000, - left: -1000 - }); - }); - }); - - it('Overlay with 0 offset', function (done) { - const expected = fixtures.expected('overlay-offset-0.jpg'); - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - top: 0, - left: 0 - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Overlay with offset and gravity', function (done) { - const expected = fixtures.expected('overlay-offset-with-gravity.jpg'); - sharp(fixtures.inputJpg) - .resize(400) - 
.overlayWith(fixtures.inputPngWithTransparency16bit, { - left: 10, - top: 10, - gravity: 4 - - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Overlay with offset and gravity and tile', function (done) { - const expected = fixtures.expected('overlay-offset-with-gravity-tile.jpg'); - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - left: 10, - top: 10, - gravity: 4, - tile: true - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Overlay with offset and tile', function (done) { - const expected = fixtures.expected('overlay-offset-with-tile.jpg'); - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - left: 10, - top: 10, - tile: true - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Overlay with invalid cutout option', function () { - assert.throws(function () { - sharp().overlayWith('ignore', { cutout: 1 }); - }); - }); - - it('Overlay with invalid tile option', function () { - assert.throws(function () { - sharp().overlayWith('ignore', { tile: 1 }); - }); - }); - - it('Overlay with very large offset', function (done) { - const expected = fixtures.expected('overlay-very-large-offset.jpg'); - sharp(fixtures.inputJpg) - .resize(400) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - left: 10000, - top: 10000 - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3, info.channels); - 
fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Overlay 100x100 with 50x50 so bottom edges meet', function (done) { - sharp(fixtures.inputJpg) - .resize(50, 50) - .toBuffer(function (err, overlay) { - if (err) throw err; - sharp(fixtures.inputJpgWithLandscapeExif1) - .resize(100, 100) - .overlayWith(overlay, { - top: 50, - left: 40 - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(100, info.width); - assert.strictEqual(100, info.height); - fixtures.assertSimilar(fixtures.expected('overlay-bottom-edges-meet.jpg'), data, done); - }); - }); - }); - }); - - it('With tile enabled and image rotated 90 degrees', function (done) { - const expected = fixtures.expected('overlay-tile-rotated90.jpg'); - sharp(fixtures.inputJpg) - .rotate(90) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - tile: true - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(98, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('With tile enabled and image rotated 90 degrees and gravity northwest', function (done) { - const expected = fixtures.expected('overlay-tile-rotated90-gravity-northwest.jpg'); - sharp(fixtures.inputJpg) - .rotate(90) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - tile: true, - gravity: 'northwest' - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(98, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - describe('Overlay with cutout enabled and gravity', function () { - Object.keys(sharp.gravity).forEach(function (gravity) { - it(gravity, function (done) { - const 
expected = fixtures.expected('overlay-cutout-gravity-' + gravity + '.jpg'); - sharp(fixtures.inputJpg) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - cutout: true, - gravity: gravity - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(65, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - }); - }); - - it('With cutout enabled and image rotated 90 degrees', function (done) { - const expected = fixtures.expected('overlay-cutout-rotated90.jpg'); - sharp(fixtures.inputJpg) - .rotate(90) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - cutout: true - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(98, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('With cutout enabled and image rotated 90 degrees and gravity northwest', function (done) { - const expected = fixtures.expected('overlay-cutout-rotated90-gravity-northwest.jpg'); - sharp(fixtures.inputJpg) - .rotate(90) - .resize(80) - .overlayWith(fixtures.inputPngWithTransparency16bit, { - cutout: true, - gravity: 'northwest' - }) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(80, info.width); - assert.strictEqual(98, info.height); - assert.strictEqual(3, info.channels); - fixtures.assertSimilar(expected, data, done); - }); - }); - - it('Composite RGBA raw buffer onto JPEG', function (done) { - sharp(fixtures.inputPngOverlayLayer1) - .raw() - .toBuffer(function (err, data, info) { - if (err) throw err; - sharp(fixtures.inputJpg) - .resize(2048, 1536) - .overlayWith(data, { raw: info }) - .toBuffer(function (err, data, info) { 
- if (err) throw err; - assert.strictEqual(true, info.premultiplied); - fixtures.assertSimilar(fixtures.expected('overlay-jpeg-with-rgb.jpg'), data, done); - }); - }); - }); - - it('Returns an error when called with an invalid file', function (done) { - sharp(fixtures.inputJpg) - .overlayWith('notfound.png') - .toBuffer(function (err) { - assert(err instanceof Error); - done(); - }); - }); - - it('Composite JPEG onto JPEG, no premultiply', function (done) { - sharp(fixtures.inputJpg) - .resize(480, 320) - .overlayWith(fixtures.inputJpgBooleanTest) - .png() - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(480, info.width); - assert.strictEqual(320, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual(false, info.premultiplied); - fixtures.assertSimilar(fixtures.expected('overlay-jpeg-with-jpeg.jpg'), data, done); - }); - }); -}); diff --git a/test/unit/png.js b/test/unit/png.js new file mode 100644 index 000000000..81800e1a7 --- /dev/null +++ b/test/unit/png.js @@ -0,0 +1,253 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+const fs = require('node:fs');
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+
+const sharp = require('../../');
+const fixtures = require('../fixtures');
+
+describe('PNG', () => {
+  it('compression level is valid', () => {
+    assert.doesNotThrow(() => {
+      sharp().png({ compressionLevel: 0 });
+    });
+  });
+
+  it('compression level is invalid', () => {
+    assert.throws(() => {
+      sharp().png({ compressionLevel: -1 });
+    });
+  });
+
+  it('default compressionLevel generates smaller file than compressionLevel=0', (_t, done) => {
+    // First generate with default compressionLevel
+    sharp(fixtures.inputPng)
+      .resize(320, 240)
+      .png()
+      .toBuffer((err, defaultData, defaultInfo) => {
+        if (err) throw err;
+        assert.strictEqual(true, defaultData.length > 0);
+        assert.strictEqual('png', defaultInfo.format);
+        // Then generate with compressionLevel=0
+        sharp(fixtures.inputPng)
+          .resize(320, 240)
+          .png({ compressionLevel: 0 })
+          .toBuffer((err, largerData, largerInfo) => {
+            if (err) throw err;
+            assert.strictEqual(true, largerData.length > 0);
+            assert.strictEqual('png', largerInfo.format);
+            assert.strictEqual(true, defaultData.length < largerData.length);
+            done();
+          });
+      });
+  });
+
+  it('without adaptiveFiltering generates smaller file', (_t, done) => {
+    // First generate with adaptive filtering
+    sharp(fixtures.inputPng)
+      .resize(320, 240)
+      .png({ adaptiveFiltering: true })
+      .toBuffer((err, adaptiveData, adaptiveInfo) => {
+        if (err) throw err;
+        assert.strictEqual(true, adaptiveData.length > 0);
+        assert.strictEqual(adaptiveData.length, adaptiveInfo.size);
+        assert.strictEqual('png', adaptiveInfo.format);
+        assert.strictEqual(320, adaptiveInfo.width);
+        assert.strictEqual(240, adaptiveInfo.height);
+        // Then generate without
+        sharp(fixtures.inputPng)
+          .resize(320, 240)
+          .png({ adaptiveFiltering: false })
+          .toBuffer((err, withoutAdaptiveData, withoutAdaptiveInfo) => {
+            if (err) throw err;
+            assert.strictEqual(true, withoutAdaptiveData.length > 0);
+            assert.strictEqual(withoutAdaptiveData.length, withoutAdaptiveInfo.size);
+            assert.strictEqual('png', withoutAdaptiveInfo.format);
+            assert.strictEqual(320, withoutAdaptiveInfo.width);
+            assert.strictEqual(240, withoutAdaptiveInfo.height);
+            assert.strictEqual(true, withoutAdaptiveData.length < adaptiveData.length);
+            done();
+          });
+      });
+  });
+
+  it('Invalid PNG adaptiveFiltering value throws error', () => {
+    assert.throws(() => {
+      sharp().png({ adaptiveFiltering: 1 });
+    });
+  });
+
+  it('Progressive PNG image', (_t, done) => {
+    sharp(fixtures.inputJpg)
+      .resize(320, 240)
+      .png({ progressive: false })
+      .toBuffer((err, nonProgressiveData, nonProgressiveInfo) => {
+        if (err) throw err;
+        assert.strictEqual(true, nonProgressiveData.length > 0);
+        assert.strictEqual(nonProgressiveData.length, nonProgressiveInfo.size);
+        assert.strictEqual('png', nonProgressiveInfo.format);
+        assert.strictEqual(320, nonProgressiveInfo.width);
+        assert.strictEqual(240, nonProgressiveInfo.height);
+        sharp(nonProgressiveData)
+          .png({ progressive: true })
+          .toBuffer((err, progressiveData, progressiveInfo) => {
+            if (err) throw err;
+            assert.strictEqual(true, progressiveData.length > 0);
+            assert.strictEqual(progressiveData.length, progressiveInfo.size);
+            assert.strictEqual(true, progressiveData.length > nonProgressiveData.length);
+            assert.strictEqual('png', progressiveInfo.format);
+            assert.strictEqual(320, progressiveInfo.width);
+            assert.strictEqual(240, progressiveInfo.height);
+            done();
+          });
+      });
+  });
+
+  it('16-bit grey+alpha PNG identity transform', () => {
+    const actual = fixtures.path('output.16-bit-grey-alpha-identity.png');
+    return sharp(fixtures.inputPng16BitGreyAlpha)
+      .toFile(actual)
+      .then(() => {
+        fixtures.assertMaxColourDistance(actual, fixtures.expected('16-bit-grey-alpha-identity.png'));
+      });
+  });
+
+  it('16-bit grey+alpha PNG roundtrip', async () => {
+    const after = await sharp(fixtures.inputPng16BitGreyAlpha)
+      .toColourspace('grey16')
+      .toBuffer();
+
+    const [alphaMeanBefore, alphaMeanAfter] = (
+      await Promise.all([
+        sharp(fixtures.inputPng16BitGreyAlpha).stats(),
+        sharp(after).stats()
+      ])
+    )
+      .map(stats => stats.channels[1].mean);
+
+    assert.strictEqual(alphaMeanAfter, alphaMeanBefore);
+  });
+
+  it('palette decode/encode roundtrip', async () => {
+    const data = await sharp(fixtures.inputPngPalette)
+      .png({ effort: 1, palette: true })
+      .toBuffer();
+
+    const { size, ...metadata } = await sharp(data).metadata();
+    void size;
+    assert.deepStrictEqual(metadata, {
+      autoOrient: {
+        height: 68,
+        width: 68
+      },
+      format: 'png',
+      width: 68,
+      height: 68,
+      space: 'srgb',
+      channels: 3,
+      density: 72,
+      depth: 'uchar',
+      isProgressive: false,
+      isPalette: true,
+      bitsPerSample: 8,
+      paletteBitDepth: 8,
+      hasProfile: false,
+      hasAlpha: false
+    });
+  });
+
+  it('Valid PNG libimagequant palette value does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().png({ palette: false });
+    });
+  });
+
+  it('Invalid PNG libimagequant palette value throws error', () => {
+    assert.throws(() => {
+      sharp().png({ palette: 'fail' });
+    });
+  });
+
+  it('Valid PNG libimagequant quality value produces image of same size or smaller', () => {
+    const inputPngBuffer = fs.readFileSync(fixtures.inputPng);
+    return Promise.all([
+      sharp(inputPngBuffer).resize(10).png({ effort: 1, quality: 80 }).toBuffer(),
+      sharp(inputPngBuffer).resize(10).png({ effort: 1, quality: 100 }).toBuffer()
+    ]).then((data) => {
+      assert.strictEqual(true, data[0].length <= data[1].length);
+    });
+  });
+
+  it('Invalid PNG libimagequant quality value throws error', () => {
+    assert.throws(() => {
+      sharp().png({ quality: 101 });
+    });
+  });
+
+  it('Invalid effort value throws error', () => {
+    assert.throws(() => {
+      sharp().png({ effort: 0.1 });
+    });
+  });
+
+  it('Valid PNG libimagequant colours value produces image of same size or smaller', () => {
+    const inputPngBuffer = fs.readFileSync(fixtures.inputPng);
+    return Promise.all([
+      sharp(inputPngBuffer).resize(10).png({ colours: 100 }).toBuffer(),
+      sharp(inputPngBuffer).resize(10).png({ colours: 200 }).toBuffer()
+    ]).then((data) => {
+      assert.strictEqual(true, data[0].length <= data[1].length);
+    });
+  });
+
+  it('Invalid PNG libimagequant colours value throws error', () => {
+    assert.throws(() => {
+      sharp().png({ colours: -1 });
+    });
+  });
+
+  it('Invalid PNG libimagequant colors value throws error', () => {
+    assert.throws(() => {
+      sharp().png({ colors: 0.1 });
+    });
+  });
+
+  it('Can set bitdepth of PNG without palette', async () => {
+    const data = await sharp({
+      create: {
+        width: 8, height: 8, channels: 3, background: 'red'
+      }
+    })
+      .toColourspace('b-w')
+      .png({ colours: 2, palette: false })
+      .toBuffer();
+
+    const { channels, isPalette, bitsPerSample, paletteBitDepth, size, space } = await sharp(data).metadata();
+    assert.strictEqual(channels, 1);
+    assert.strictEqual(isPalette, false);
+    assert.strictEqual(bitsPerSample, 1);
+    assert.strictEqual(paletteBitDepth, undefined);
+    assert.strictEqual(size, 89);
+    assert.strictEqual(space, 'b-w');
+  });
+
+  it('Valid PNG libimagequant dither value produces image of same size or smaller', () => {
+    const inputPngBuffer = fs.readFileSync(fixtures.inputPng);
+    return Promise.all([
+      sharp(inputPngBuffer).resize(10).png({ dither: 0.1 }).toBuffer(),
+      sharp(inputPngBuffer).resize(10).png({ dither: 0.9 }).toBuffer()
+    ]).then((data) => {
+      assert.strictEqual(true, data[0].length <= data[1].length);
+    });
+  });
+
+  it('Invalid PNG libimagequant dither value throws error', () => {
+    assert.throws(() => {
+      sharp().png({ dither: 'fail' });
+    });
+  });
+});
diff --git a/test/unit/raw.js b/test/unit/raw.js
new file mode 100644
index 000000000..a3e9d563d
--- /dev/null
+++ b/test/unit/raw.js
@@ -0,0 +1,370 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+
+const sharp = require('../../');
+const fixtures = require('../fixtures');
+
+describe('Raw pixel data', () => {
+  describe('Raw pixel input', () => {
+    it('Empty data', () => {
+      assert.throws(() => {
+        sharp(Buffer.from(''));
+      }, /empty/);
+      assert.throws(() => {
+        sharp(new ArrayBuffer(0));
+      }, /empty/);
+      assert.throws(() => {
+        sharp(new Uint8Array(0));
+      }, /empty/);
+      assert.throws(() => {
+        sharp(new Uint8ClampedArray(0));
+      }, /empty/);
+    });
+
+    it('Missing options', () => {
+      assert.throws(() => {
+        sharp({ raw: {} });
+      });
+    });
+
+    it('Incomplete options', () => {
+      assert.throws(() => {
+        sharp({ raw: { width: 1, height: 1 } });
+      });
+    });
+
+    it('Invalid channels', () => {
+      assert.throws(() => {
+        sharp({ raw: { width: 1, height: 1, channels: 5 } });
+      });
+    });
+
+    it('Invalid height', () => {
+      assert.throws(() => {
+        sharp({ raw: { width: 1, height: 0, channels: 4 } });
+      });
+    });
+
+    it('Invalid width', () => {
+      assert.throws(() => {
+        sharp({ raw: { width: 'zoinks', height: 1, channels: 4 } });
+      });
+    });
+
+    it('Invalid premultiplied', () => {
+      assert.throws(
+        () => sharp({ raw: { width: 1, height: 1, channels: 4, premultiplied: 'zoinks' } }),
+        /Expected boolean for raw\.premultiplied but received zoinks of type string/
+      );
+    });
+
+    it('Invalid pageHeight', () => {
+      const width = 8;
+      const height = 8;
+      const channels = 4;
+      assert.throws(
+        () => sharp({ raw: { width, height, channels, pageHeight: 'zoinks' } }),
+        /Expected positive integer for raw\.pageHeight but received zoinks of type string/
+      );
+      assert.throws(
+        () => sharp({ raw: { width, height, channels, pageHeight: -1 } }),
+        /Expected positive integer for raw\.pageHeight but received -1 of type number/
+      );
+      assert.throws(
+        () => sharp({ raw: { width, height, channels, pageHeight: 9 } }),
+        /Expected positive integer for raw\.pageHeight but received 9 of type number/
+      );
+      assert.throws(
+        () => sharp({ raw: { width, height, channels, pageHeight: 3 } }),
+        /Expected raw\.height 8 to be a multiple of raw\.pageHeight 3/
+      );
+    });
+
+    it('RGB', (_t, done) => {
+      // Convert to raw pixel data
+      sharp(fixtures.inputJpg)
+        .resize(256)
+        .raw()
+        .toBuffer((err, data, info) => {
+          if (err) throw err;
+          assert.strictEqual(256, info.width);
+          assert.strictEqual(209, info.height);
+          assert.strictEqual(3, info.channels);
+          // Convert back to JPEG
+          sharp(data, {
+            raw: {
+              width: info.width,
+              height: info.height,
+              channels: info.channels
+            }
+          })
+            .jpeg()
+            .toBuffer((err, data, info) => {
+              if (err) throw err;
+              assert.strictEqual(256, info.width);
+              assert.strictEqual(209, info.height);
+              assert.strictEqual(3, info.channels);
+              fixtures.assertSimilar(fixtures.inputJpg, data, done);
+            });
+        });
+    });
+
+    it('RGBA', (_t, done) => {
+      // Convert to raw pixel data
+      sharp(fixtures.inputPngOverlayLayer1)
+        .resize(256)
+        .raw()
+        .toBuffer((err, data, info) => {
+          if (err) throw err;
+          assert.strictEqual(256, info.width);
+          assert.strictEqual(192, info.height);
+          assert.strictEqual(4, info.channels);
+          // Convert back to PNG
+          sharp(data, {
+            raw: {
+              width: info.width,
+              height: info.height,
+              channels: info.channels
+            }
+          })
+            .png()
+            .toBuffer((err, data, info) => {
+              if (err) throw err;
+              assert.strictEqual(256, info.width);
+              assert.strictEqual(192, info.height);
+              assert.strictEqual(4, info.channels);
+              fixtures.assertSimilar(fixtures.inputPngOverlayLayer1, data, { threshold: 7 }, done);
+            });
+        });
+    });
+
+    it('RGBA premultiplied', (_t, done) => {
+      // Convert to raw pixel data
+      sharp(fixtures.inputPngSolidAlpha)
+        .resize(256)
+        .raw()
+        .toBuffer((err, data, info) => {
+          if (err) throw err;
+          assert.strictEqual(256, info.width);
+          assert.strictEqual(192, info.height);
+          assert.strictEqual(4, info.channels);
+
+          const originalData = Buffer.from(data);
+
+          // Premultiply image data
+          for (let i = 0; i < data.length; i += 4) {
+            const alpha = data[i + 3];
+            const norm = alpha / 255;
+
+            if (alpha < 255) {
+              data[i] = Math.round(data[i] * norm);
+              data[i + 1] = Math.round(data[i + 1] * norm);
+              data[i + 2] = Math.round(data[i + 2] * norm);
+            }
+          }
+
+          // Convert back to raw pixel data, unpremultiplying on output
+          sharp(data, {
+            raw: {
+              width: info.width,
+              height: info.height,
+              channels: info.channels,
+              premultiplied: true
+            }
+          })
+            .raw()
+            .toBuffer((err, data, info) => {
+              if (err) throw err;
+              assert.strictEqual(256, info.width);
+              assert.strictEqual(192, info.height);
+              assert.strictEqual(4, info.channels);
+              assert.equal(data.compare(originalData), 0, 'output buffer matches unpremultiplied input buffer');
+              done();
+            });
+        });
+    });
+
+    it('JPEG to raw Stream and back again', (_t, done) => {
+      const width = 32;
+      const height = 24;
+      const writable = sharp({
+        raw: {
+          width,
+          height,
+          channels: 3
+        }
+      });
+      writable
+        .jpeg()
+        .toBuffer((err, _data, info) => {
+          if (err) throw err;
+          assert.strictEqual('jpeg', info.format);
+          assert.strictEqual(32, info.width);
+          assert.strictEqual(24, info.height);
+          done();
+        });
+      sharp(fixtures.inputJpg)
+        .resize(width, height)
+        .raw()
+        .pipe(writable);
+    });
+  });
+
+  describe('Output raw, uncompressed image data', () => {
+    it('1 channel greyscale image', (_t, done) => {
+      sharp(fixtures.inputJpg)
+        .greyscale()
+        .resize(32, 24)
+        .raw()
+        .toBuffer((err, data, info) => {
+          if (err) throw err;
+          assert.strictEqual(32 * 24 * 1, info.size);
+          assert.strictEqual(data.length, info.size);
+          assert.strictEqual('raw', info.format);
+          assert.strictEqual(32, info.width);
+          assert.strictEqual(24, info.height);
+          assert.strictEqual(1, info.channels);
+          done();
+        });
+    });
+
+    it('3 channel colour image without transparency', (_t, done) => {
+      sharp(fixtures.inputJpg)
+        .resize(32, 24)
+        .toFormat('raw')
+        .toBuffer((err, data, info) => {
+          if (err) throw err;
+          assert.strictEqual(32 * 24 * 3, info.size);
+          assert.strictEqual(data.length, info.size);
+          assert.strictEqual('raw', info.format);
+          assert.strictEqual(32, info.width);
+          assert.strictEqual(24, info.height);
+          done();
+        });
+    });
+
+    it('4 channel colour image with transparency', (_t, done) => {
+      sharp(fixtures.inputPngWithTransparency)
+        .resize(32, 24)
+        .toFormat(sharp.format.raw)
+        .toBuffer((err, data, info) => {
+          if (err) throw err;
+          assert.strictEqual(32 * 24 * 4, info.size);
+          assert.strictEqual(data.length, info.size);
+          assert.strictEqual('raw', info.format);
+          assert.strictEqual(32, info.width);
+          assert.strictEqual(24, info.height);
+          done();
+        });
+    });
+
+    it('Extract A from RGBA', () =>
+      sharp(fixtures.inputPngWithTransparency)
+        .resize(32, 24)
+        .extractChannel(3)
+        .toColourspace('b-w')
+        .raw()
+        .toBuffer({ resolveWithObject: true })
+        .then(({ info }) => {
+          assert.strictEqual('raw', info.format);
+          assert.strictEqual(1, info.channels);
+          assert.strictEqual(32 * 24, info.size);
+        })
+    );
+  });
+
+  describe('Raw pixel depths', () => {
+    it('Invalid depth', () => {
+      assert.throws(() => {
+        sharp(Buffer.alloc(3), { raw: { width: 1, height: 1, channels: 3 } })
+          .raw({ depth: 'zoinks' });
+      });
+    });
+
+    for (const { type, depth, bits } of [
+      { type: Uint8Array, depth: undefined, bits: 8 },
+      { type: Uint8Array, depth: 'uchar', bits: 8 },
+      { type: Uint8ClampedArray, depth: 'uchar', bits: 8 },
+      { type: Int8Array, depth: 'char', bits: 8 },
+      { type: Uint16Array, depth: 'ushort', bits: 16 },
+      { type: Int16Array, depth: 'short', bits: 16 },
+      { type: Uint32Array, depth: 'uint', bits: 32 },
+      { type: Int32Array, depth: 'int', bits: 32 },
+      { type: Float32Array, depth: 'float', bits: 32 },
+      { type: Float64Array, depth: 'double', bits: 64 }
+    ]) {
+      it(type.name, () =>
+        sharp(new type(3), { raw: { width: 1, height: 1, channels: 3 } })
+          .raw({ depth })
+          .toBuffer({ resolveWithObject: true })
+          .then(({ data, info }) => {
+            assert.strictEqual(1, info.width);
+            assert.strictEqual(1,
info.height); + assert.strictEqual(3, info.channels); + if (depth !== undefined) { + assert.strictEqual(depth, info.depth); + } + assert.strictEqual(data.length / 3, bits / 8); + }) + ); + } + }); + + it('Animated', async () => { + const gif = await sharp( + Buffer.alloc(8), + { raw: { width: 1, height: 2, channels: 4, pageHeight: 1 }, animated: true } + ) + .gif({ keepDuplicateFrames: true }) + .toBuffer(); + + const { width, height, pages, delay } = await sharp(gif).metadata(); + assert.strictEqual(width, 1); + assert.strictEqual(height, 1); + assert.strictEqual(pages, 2); + assert.strictEqual(delay.length, 2); + }); + + describe('16-bit roundtrip', () => { + it('grey', async () => { + const grey = 42000; + const png = await sharp( + Uint16Array.from([grey]), + { raw: { width: 1, height: 1, channels: 1 } } + ) + .toColourspace('grey16') + .png({ compressionLevel: 0 }) + .toBuffer(); + const raw = await sharp(png) + .toColourspace('grey16') + .raw({ depth: 'ushort' }) + .toBuffer(); + + assert.strictEqual(raw.readUint16LE(0), grey); + }); + + it('RGB', async () => { + const rgb = [10946, 28657, 46368]; + const png = await sharp( + Uint16Array.from(rgb), + { raw: { width: 1, height: 1, channels: 3 } } + ) + .toColourspace('rgb16') + .png({ compressionLevel: 0 }) + .toBuffer(); + const raw = await sharp(png) + .toColourspace('rgb16') + .raw({ depth: 'ushort' }) + .toBuffer(); + + assert.strictEqual(raw.readUint16LE(0), rgb[0]); + assert.strictEqual(raw.readUint16LE(2), rgb[1]); + assert.strictEqual(raw.readUint16LE(4), rgb[2]); + }); + }); +}); diff --git a/test/unit/recomb.js b/test/unit/recomb.js new file mode 100644 index 000000000..ec05eb502 --- /dev/null +++ b/test/unit/recomb.js @@ -0,0 +1,174 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +const sepia = [ + [0.3588, 0.7044, 0.1368], + [0.299, 0.587, 0.114], + [0.2392, 0.4696, 0.0912] +]; + +describe('Recomb', () => { + it('applies a sepia filter using recomb', (_t, done) => { + const output = fixtures.path('output.recomb-sepia.jpg'); + sharp(fixtures.inputJpgWithLandscapeExif1) + .recomb(sepia) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(600, info.width); + assert.strictEqual(450, info.height); + fixtures.assertMaxColourDistance( + output, + fixtures.expected('Landscape_1-recomb-sepia.jpg'), + 17 + ); + done(); + }); + }); + + it('applies a sepia filter using recomb to a PNG with alpha', (_t, done) => { + const output = fixtures.path('output.recomb-sepia.png'); + sharp(fixtures.inputPngAlphaPremultiplicationSmall) + .recomb(sepia) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(1024, info.width); + assert.strictEqual(768, info.height); + fixtures.assertMaxColourDistance( + output, + fixtures.expected('alpha-recomb-sepia.png'), + 17 + ); + done(); + }); + }); + + it('recomb with a single channel input', async () => { + const { info } = await sharp(Buffer.alloc(64), { + raw: { + width: 8, + height: 8, + channels: 1 + } + }) + .recomb(sepia) + .toBuffer({ resolveWithObject: true }); + + assert.strictEqual(3, info.channels); + }); + + it('applies a different sepia filter using recomb', (_t, done) => { + const output = fixtures.path('output.recomb-sepia2.jpg'); + sharp(fixtures.inputJpgWithLandscapeExif1) + .recomb([ + [0.393, 0.769, 0.189], + [0.349, 0.686, 0.168], + [0.272, 0.534, 0.131] + ]) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + 
assert.strictEqual(600, info.width); + assert.strictEqual(450, info.height); + fixtures.assertMaxColourDistance( + output, + fixtures.expected('Landscape_1-recomb-sepia2.jpg'), + 17 + ); + done(); + }); + }); + it('increases the saturation of the image', (_t, done) => { + const saturationLevel = 1; + const output = fixtures.path('output.recomb-saturation.jpg'); + sharp(fixtures.inputJpgWithLandscapeExif1) + .recomb([ + [ + saturationLevel + 1 - 0.2989, + -0.587 * saturationLevel, + -0.114 * saturationLevel + ], + [ + -0.2989 * saturationLevel, + saturationLevel + 1 - 0.587, + -0.114 * saturationLevel + ], + [ + -0.2989 * saturationLevel, + -0.587 * saturationLevel, + saturationLevel + 1 - 0.114 + ] + ]) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(600, info.width); + assert.strictEqual(450, info.height); + fixtures.assertMaxColourDistance( + output, + fixtures.expected('Landscape_1-recomb-saturation.jpg'), + 37 + ); + done(); + }); + }); + + it('applies opacity 30% to the image', (_t, done) => { + const output = fixtures.path('output.recomb-opacity.png'); + sharp(fixtures.inputPngWithTransparent) + .recomb([ + [1, 0, 0, 0], + [0, 1, 0, 0], + [0, 0, 1, 0], + [0, 0, 0, 0.3] + ]) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(48, info.width); + assert.strictEqual(48, info.height); + fixtures.assertMaxColourDistance( + output, + fixtures.expected('d-opacity-30.png'), + 17 + ); + done(); + }); + }); + + describe('invalid matrix specification', () => { + it('missing', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).recomb(); + }); + }); + it('incorrect flat data', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).recomb([1, 2, 3, 4, 5, 6, 7, 8, 9]); + }); + }); + it('incorrect sub size', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).recomb([ + [1, 2, 3, 4], + [5, 6, 7, 8], + [1, 2, 9, 6] + 
]); + }); + }); + it('incorrect top size', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).recomb([[1, 2, 3, 4], [5, 6, 7, 8]]); + }); + }); + }); +}); diff --git a/test/unit/resize-contain.js b/test/unit/resize-contain.js new file mode 100644 index 000000000..36cc4518e --- /dev/null +++ b/test/unit/resize-contain.js @@ -0,0 +1,839 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Resize fit=contain', () => { + it('Allows specifying the position as a string', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240, { + fit: 'contain', + position: 'center' + }) + .png() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertSimilar(fixtures.expected('embed-3-into-3.png'), data, done); + }); + }); + + it('JPEG within PNG, no alpha channel', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240, { fit: 'contain' }) + .png() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-3-into-3.png'), data, done); + }); + }); + + it('JPEG within WebP, to include alpha channel', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240, { + fit: 'contain', + background: { r: 0, g: 0, b: 0, alpha: 0 } + }) + .webp() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('webp', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + assert.strictEqual(4, info.channels); + 
fixtures.assertSimilar(fixtures.expected('embed-3-into-4.webp'), data, done); + }); + }); + + it('PNG with alpha channel', (_t, done) => { + sharp(fixtures.inputPngWithTransparency) + .resize(50, 50, { fit: 'contain' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(50, info.width); + assert.strictEqual(50, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-4-into-4.png'), data, done); + }); + }); + + it('16-bit PNG with alpha channel', (_t, done) => { + sharp(fixtures.inputPngWithTransparency16bit) + .resize(32, 16, { fit: 'contain' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(32, info.width); + assert.strictEqual(16, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-16bit.png'), data, done); + }); + }); + + it('16-bit PNG with alpha channel onto RGBA', (_t, done) => { + sharp(fixtures.inputPngWithTransparency16bit) + .resize(32, 16, { + fit: 'contain', + background: { r: 0, g: 0, b: 0, alpha: 0 } + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(32, info.width); + assert.strictEqual(16, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-16bit-rgba.png'), data, done); + }); + }); + + it('PNG with 2 channels', (_t, done) => { + sharp(fixtures.inputPngWithGreyAlpha) + .resize(32, 16, { + fit: 'contain', + background: { r: 0, g: 0, b: 0, alpha: 0 } + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(32, info.width); + assert.strictEqual(16, 
info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-2channel.png'), data, done); + }); + }); + + it('TIFF in LAB colourspace onto RGBA background', (_t, done) => { + sharp(fixtures.inputTiffCielab) + .resize(64, 128, { + fit: 'contain', + background: { r: 255, g: 102, b: 0, alpha: 0.5 } + }) + .png() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(64, info.width); + assert.strictEqual(128, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-lab-into-rgba.png'), data, done); + }); + }); + + it('Enlarge', (_t, done) => { + sharp(fixtures.inputPngWithOneColor) + .resize(320, 240, { fit: 'contain' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + assert.strictEqual(3, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-enlarge.png'), data, done); + }); + }); + + describe('Animated WebP', () => { + it('Width only', (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .resize(320, 240, { + fit: 'contain', + background: { r: 255, g: 0, b: 0 } + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('webp', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240 * 9, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-animated-width.webp'), data, done); + }); + }); + + it('Height only', (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .resize(240, 320, { + fit: 'contain', + background: { r: 255, g: 0, b: 0 } + }) + .toBuffer((err, data, info) => { + if (err) throw err; + 
assert.strictEqual(true, data.length > 0); + assert.strictEqual('webp', info.format); + assert.strictEqual(240, info.width); + assert.strictEqual(320 * 9, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('embed-animated-height.webp'), data, done); + }); + }); + }); + + it('Invalid position values should fail', () => { + [-1, 8.1, 9, 1000000, false, 'vallejo'].forEach((position) => { + assert.throws(() => { + sharp().resize(null, null, { fit: 'contain', position }); + }); + }); + }); + + it('Position horizontal top', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'top' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a2-n.png'), data, done); + }); + }); + + it('Position horizontal right top', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'right top' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a3-ne.png'), data, done); + }); + }); + + it('Position horizontal right', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'right' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + 
assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a4-e.png'), data, done); + }); + }); + + it('Position horizontal right bottom', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'right bottom' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a5-se.png'), data, done); + }); + }); + + it('Position horizontal bottom', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'bottom' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a6-s.png'), data, done); + }); + }); + + it('Position horizontal left bottom', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'left bottom' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a7-sw.png'), data, done); + }); + }); + + 
it('Position horizontal left', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'left' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a8-w.png'), data, done); + }); + }); + + it('Position horizontal left top', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'left top' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a1-nw.png'), data, done); + }); + }); + + it('Position horizontal north', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.north + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a2-n.png'), data, done); + }); + }); + + it('Position horizontal northeast', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.northeast + }) + .toBuffer((err, data, info) => { + if (err) throw err; + 
assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a3-ne.png'), data, done); + }); + }); + + it('Position horizontal east', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.east + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a4-e.png'), data, done); + }); + }); + + it('Position horizontal southeast', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.southeast + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a5-se.png'), data, done); + }); + }); + + it('Position horizontal south', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.south + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + 
fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a6-s.png'), data, done); + }); + }); + + it('Position horizontal southwest', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.southwest + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a7-sw.png'), data, done); + }); + }); + + it('Position horizontal west', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.west + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a8-w.png'), data, done); + }); + }); + + it('Position horizontal northwest', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.northwest + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a1-nw.png'), data, done); + }); + }); + + it('Position horizontal center', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 100, { + fit: sharp.fit.contain, + background: { r: 
0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.center + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(100, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/a9-c.png'), data, done); + }); + }); + + it('Position vertical top', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'top' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/2-n.png'), data, done); + }); + }); + + it('Position vertical right top', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'right top' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/3-ne.png'), data, done); + }); + }); + + it('Position vertical right', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'right' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + 
assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/4-e.png'), data, done); + }); + }); + + it('Position vertical right bottom', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'right bottom' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/5-se.png'), data, done); + }); + }); + + it('Position vertical bottom', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'bottom' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/6-s.png'), data, done); + }); + }); + + it('Position vertical left bottom', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'left bottom' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/7-sw.png'), data, done); + }); + }); + + it('Position vertical left', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: 
{ r: 0, g: 0, b: 0, alpha: 0 }, + position: 'left' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/8-w.png'), data, done); + }); + }); + + it('Position vertical left top', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: 'left top' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/1-nw.png'), data, done); + }); + }); + + it('Position vertical north', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.north + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/2-n.png'), data, done); + }); + }); + + it('Position vertical northeast', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.northeast + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, 
info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/3-ne.png'), data, done); + }); + }); + + it('Position vertical east', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.east + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/4-e.png'), data, done); + }); + }); + + it('Position vertical southeast', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.southeast + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/5-se.png'), data, done); + }); + }); + + it('Position vertical south', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.south + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/6-s.png'), data, done); + }); + }); + + it('Position vertical southwest', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + 
fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.southwest + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/7-sw.png'), data, done); + }); + }); + + it('Position vertical west', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.west + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/8-w.png'), data, done); + }); + }); + + it('Position vertical northwest', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.northwest + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/1-nw.png'), data, done); + }); + }); + + it('Position vertical center', (_t, done) => { + sharp(fixtures.inputPngEmbed) + .resize(200, 200, { + fit: sharp.fit.contain, + background: { r: 0, g: 0, b: 0, alpha: 0 }, + position: sharp.gravity.center + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + 
assert.strictEqual(200, info.width); + assert.strictEqual(200, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('./embedgravitybird/9-c.png'), data, done); + }); + }); + + it('multiple alpha channels', async () => { + const create = { + width: 20, + height: 12, + channels: 4, + background: 'green' + }; + const multipleAlphaChannels = await sharp({ create }) + .joinChannel({ create }) + .tiff({ compression: 'deflate' }) + .toBuffer(); + + const data = await sharp(multipleAlphaChannels) + .resize({ + width: 8, + height: 8, + fit: 'contain', + background: 'blue' + }) + .tiff({ compression: 'deflate' }) + .toBuffer(); + const { format, width, height, space, channels } = await sharp(data).metadata(); + assert.deepStrictEqual(format, 'tiff'); + assert.deepStrictEqual(width, 8); + assert.deepStrictEqual(height, 8); + assert.deepStrictEqual(space, 'srgb'); + assert.deepStrictEqual(channels, 8); + }); +}); diff --git a/test/unit/resize-cover.js b/test/unit/resize-cover.js new file mode 100644 index 000000000..4e2dda737 --- /dev/null +++ b/test/unit/resize-cover.js @@ -0,0 +1,455 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Resize fit=cover', () => { + [ + // Position + { + name: 'Position: top', + width: 320, + height: 80, + gravity: sharp.position.top, + fixture: 'gravity-north.jpg' + }, + { + name: 'Position: right', + width: 80, + height: 320, + gravity: sharp.position.right, + fixture: 'gravity-east.jpg' + }, + { + name: 'Position: bottom', + width: 320, + height: 80, + gravity: sharp.position.bottom, + fixture: 'gravity-south.jpg' + }, + { + name: 'Position: left', + width: 80, + height: 320, + gravity: sharp.position.left, + fixture: 'gravity-west.jpg' + }, + { + name: 'Position: right top (top)', + width: 320, + height: 80, + gravity: sharp.position['right top'], + fixture: 'gravity-north.jpg' + }, + { + name: 'Position: right top (right)', + width: 80, + height: 320, + gravity: sharp.position['right top'], + fixture: 'gravity-east.jpg' + }, + { + name: 'Position: right bottom (bottom)', + width: 320, + height: 80, + gravity: sharp.position['right bottom'], + fixture: 'gravity-south.jpg' + }, + { + name: 'Position: right bottom (right)', + width: 80, + height: 320, + gravity: sharp.position['right bottom'], + fixture: 'gravity-east.jpg' + }, + { + name: 'Position: left bottom (bottom)', + width: 320, + height: 80, + gravity: sharp.position['left bottom'], + fixture: 'gravity-south.jpg' + }, + { + name: 'Position: left bottom (left)', + width: 80, + height: 320, + gravity: sharp.position['left bottom'], + fixture: 'gravity-west.jpg' + }, + { + name: 'Position: left top (top)', + width: 320, + height: 80, + gravity: sharp.position['left top'], + fixture: 'gravity-north.jpg' + }, + { + name: 'Position: left top (left)', + width: 80, + height: 320, + gravity: sharp.position['left top'], + fixture: 'gravity-west.jpg' + }, + // Gravity + { + name: 'Gravity: 
north', + width: 320, + height: 80, + gravity: sharp.gravity.north, + fixture: 'gravity-north.jpg' + }, + { + name: 'Gravity: east', + width: 80, + height: 320, + gravity: sharp.gravity.east, + fixture: 'gravity-east.jpg' + }, + { + name: 'Gravity: south', + width: 320, + height: 80, + gravity: sharp.gravity.south, + fixture: 'gravity-south.jpg' + }, + { + name: 'Gravity: west', + width: 80, + height: 320, + gravity: sharp.gravity.west, + fixture: 'gravity-west.jpg' + }, + { + name: 'Gravity: center', + width: 320, + height: 80, + gravity: sharp.gravity.center, + fixture: 'gravity-center.jpg' + }, + { + name: 'Gravity: centre', + width: 80, + height: 320, + gravity: sharp.gravity.centre, + fixture: 'gravity-centre.jpg' + }, + { + name: 'Default (centre)', + width: 80, + height: 320, + gravity: undefined, + fixture: 'gravity-centre.jpg' + }, + { + name: 'Gravity: northeast (north)', + width: 320, + height: 80, + gravity: sharp.gravity.northeast, + fixture: 'gravity-north.jpg' + }, + { + name: 'Gravity: northeast (east)', + width: 80, + height: 320, + gravity: sharp.gravity.northeast, + fixture: 'gravity-east.jpg' + }, + { + name: 'Gravity: southeast (south)', + width: 320, + height: 80, + gravity: sharp.gravity.southeast, + fixture: 'gravity-south.jpg' + }, + { + name: 'Gravity: southeast (east)', + width: 80, + height: 320, + gravity: sharp.gravity.southeast, + fixture: 'gravity-east.jpg' + }, + { + name: 'Gravity: southwest (south)', + width: 320, + height: 80, + gravity: sharp.gravity.southwest, + fixture: 'gravity-south.jpg' + }, + { + name: 'Gravity: southwest (west)', + width: 80, + height: 320, + gravity: sharp.gravity.southwest, + fixture: 'gravity-west.jpg' + }, + { + name: 'Gravity: northwest (north)', + width: 320, + height: 80, + gravity: sharp.gravity.northwest, + fixture: 'gravity-north.jpg' + }, + { + name: 'Gravity: northwest (west)', + width: 80, + height: 320, + gravity: sharp.gravity.northwest, + fixture: 'gravity-west.jpg' + } + 
].forEach((settings) => { + it(settings.name, (_t, done) => { + sharp(fixtures.inputJpg) + .resize(settings.width, settings.height, { + fit: sharp.fit.cover, + position: settings.gravity + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(settings.width, info.width); + assert.strictEqual(settings.height, info.height); + fixtures.assertSimilar(fixtures.expected(settings.fixture), data, done); + }); + }); + }); + + it('Allows specifying the gravity as a string', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(80, 320, { + fit: sharp.fit.cover, + position: 'east' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(80, info.width); + assert.strictEqual(320, info.height); + fixtures.assertSimilar(fixtures.expected('gravity-east.jpg'), data, done); + }); + }); + + it('Invalid position values fail', () => { + assert.throws(() => { + sharp().resize(null, null, { fit: 'cover', position: 9 }); + }, /Expected valid position\/gravity\/strategy for position but received 9 of type number/); + assert.throws(() => { + sharp().resize(null, null, { fit: 'cover', position: 1.1 }); + }, /Expected valid position\/gravity\/strategy for position but received 1.1 of type number/); + assert.throws(() => { + sharp().resize(null, null, { fit: 'cover', position: -1 }); + }, /Expected valid position\/gravity\/strategy for position but received -1 of type number/); + assert.throws(() => { + sharp().resize(null, null, { fit: 'cover', position: 'zoinks' }).crop(); + }, /Expected valid position\/gravity\/strategy for position but received zoinks of type string/); + }); + + it('Uses default value when none specified', () => { + assert.doesNotThrow(() => { + sharp().resize(null, null, { fit: 'cover' }); + }); + }); + + it('Skip crop when post-resize dimensions are at target', () => sharp(fixtures.inputJpg) + .resize(1600, 1200) + .toBuffer() + .then((input) => sharp(input) + .resize(1110, null, { + fit: sharp.fit.cover, + position: 
sharp.strategy.attention + }) + .toBuffer({ resolveWithObject: true }) + .then((result) => { + assert.strictEqual(1110, result.info.width); + assert.strictEqual(832, result.info.height); + assert.strictEqual(undefined, result.info.cropOffsetLeft); + assert.strictEqual(undefined, result.info.cropOffsetTop); + }))); + + describe('Animated WebP', () => { + it('Width only', (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .resize(80, 320, { fit: sharp.fit.cover }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(80, info.width); + assert.strictEqual(320 * 9, info.height); + fixtures.assertSimilar(fixtures.expected('gravity-center-width.webp'), data, done); + }); + }); + + it('Height only', (_t, done) => { + sharp(fixtures.inputWebPAnimated, { pages: -1 }) + .resize(320, 80, { fit: sharp.fit.cover }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(80 * 9, info.height); + fixtures.assertSimilar(fixtures.expected('gravity-center-height.webp'), data, done); + }); + }); + }); + + describe('Entropy-based strategy', () => { + it('JPEG', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(80, 320, { + fit: 'cover', + position: sharp.strategy.entropy + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3, info.channels); + assert.strictEqual(80, info.width); + assert.strictEqual(320, info.height); + assert.strictEqual(-117, info.cropOffsetLeft); + assert.strictEqual(0, info.cropOffsetTop); + fixtures.assertSimilar(fixtures.expected('crop-strategy-entropy.jpg'), data, done); + }); + }); + + it('PNG', (_t, done) => { + sharp(fixtures.inputPngWithTransparency) + .resize(320, 80, { + fit: 'cover', + position: sharp.strategy.entropy + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(4, info.channels); + 
assert.strictEqual(320, info.width); + assert.strictEqual(80, info.height); + assert.strictEqual(0, info.cropOffsetLeft); + assert.strictEqual(-80, info.cropOffsetTop); + fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); + }); + }); + + it('supports the strategy passed as a string', (_t, done) => { + sharp(fixtures.inputPngWithTransparency) + .resize(320, 80, { + fit: 'cover', + position: 'entropy' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(4, info.channels); + assert.strictEqual(320, info.width); + assert.strictEqual(80, info.height); + assert.strictEqual(0, info.cropOffsetLeft); + assert.strictEqual(-80, info.cropOffsetTop); + fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); + }); + }); + + it('Animated image rejects', () => + assert.rejects(() => sharp(fixtures.inputGifAnimated, { animated: true }) + .resize({ + width: 100, + height: 8, + position: sharp.strategy.entropy + }) + .toBuffer(), + /Resize strategy is not supported for multi-page images/ + ) + ); + }); + + describe('Attention strategy', () => { + it('JPEG', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(80, 320, { + fit: 'cover', + position: sharp.strategy.attention + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3, info.channels); + assert.strictEqual(80, info.width); + assert.strictEqual(320, info.height); + assert.strictEqual(-107, info.cropOffsetLeft); + assert.strictEqual(0, info.cropOffsetTop); + assert.strictEqual(588, info.attentionX); + assert.strictEqual(640, info.attentionY); + fixtures.assertSimilar(fixtures.expected('crop-strategy-attention.jpg'), data, done); + }); + }); + + it('PNG', (_t, done) => { + sharp(fixtures.inputPngWithTransparency) + .resize(320, 80, { + fit: 'cover', + position: sharp.strategy.attention + }) + .toBuffer((err, data, info) => { + if (err) throw 
err; + assert.strictEqual('png', info.format); + assert.strictEqual(4, info.channels); + assert.strictEqual(320, info.width); + assert.strictEqual(80, info.height); + assert.strictEqual(0, info.cropOffsetLeft); + assert.strictEqual(0, info.cropOffsetTop); + assert.strictEqual(0, info.attentionX); + assert.strictEqual(0, info.attentionY); + fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); + }); + }); + + it('WebP', (_t, done) => { + sharp(fixtures.inputWebP) + .resize(320, 80, { + fit: 'cover', + position: sharp.strategy.attention + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('webp', info.format); + assert.strictEqual(3, info.channels); + assert.strictEqual(320, info.width); + assert.strictEqual(80, info.height); + assert.strictEqual(0, info.cropOffsetLeft); + assert.strictEqual(-161, info.cropOffsetTop); + assert.strictEqual(288, info.attentionX); + assert.strictEqual(745, info.attentionY); + fixtures.assertSimilar(fixtures.expected('crop-strategy.webp'), data, done); + }); + }); + + it('supports the strategy passed as a string', (_t, done) => { + sharp(fixtures.inputPngWithTransparency) + .resize(320, 80, { + fit: 'cover', + position: 'attention' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(4, info.channels); + assert.strictEqual(320, info.width); + assert.strictEqual(80, info.height); + assert.strictEqual(0, info.cropOffsetLeft); + assert.strictEqual(0, info.cropOffsetTop); + fixtures.assertSimilar(fixtures.expected('crop-strategy.png'), data, done); + }); + }); + + it('Animated image rejects', () => + assert.rejects(() => sharp(fixtures.inputGifAnimated, { animated: true }) + .resize({ + width: 100, + height: 8, + position: sharp.strategy.attention + }) + .toBuffer(), + /Resize strategy is not supported for multi-page images/ + ) + ); + }); +}); diff --git a/test/unit/resize.js b/test/unit/resize.js index 
47958dad7..e0a001543 100644 --- a/test/unit/resize.js +++ b/test/unit/resize.js @@ -1,13 +1,17 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Resize dimensions', function () { - it('Exact crop', function (done) { - sharp(fixtures.inputJpg).resize(320, 240).toBuffer(function (err, data, info) { +describe('Resize dimensions', () => { + it('Exact crop', (_t, done) => { + sharp(fixtures.inputJpg).resize(320, 240).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -17,8 +21,8 @@ describe('Resize dimensions', function () { }); }); - it('Fixed width', function (done) { - sharp(fixtures.inputJpg).resize(320).toBuffer(function (err, data, info) { + it('Fixed width', (_t, done) => { + sharp(fixtures.inputJpg).resize(320).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -28,8 +32,8 @@ describe('Resize dimensions', function () { }); }); - it('Fixed height', function (done) { - sharp(fixtures.inputJpg).resize(null, 320).toBuffer(function (err, data, info) { + it('Fixed height', (_t, done) => { + sharp(fixtures.inputJpg).resize(null, 320).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -39,8 +43,8 @@ describe('Resize dimensions', function () { }); }); - it('Identity transform', function (done) { - sharp(fixtures.inputJpg).toBuffer(function (err, data, info) { + it('Identity transform', (_t, done) => { + sharp(fixtures.inputJpg).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', 
info.format); @@ -50,10 +54,10 @@ describe('Resize dimensions', function () { }); }); - it('Upscale', function (done) { + it('Upscale', (_t, done) => { sharp(fixtures.inputJpg) .resize(3000) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -63,64 +67,90 @@ describe('Resize dimensions', function () { }); }); - it('Invalid width - NaN', function () { - assert.throws(function () { + it('Invalid width - NaN', () => { + assert.throws(() => { sharp().resize('spoons', 240); }, /Expected positive integer for width but received spoons of type string/); }); - it('Invalid height - NaN', function () { - assert.throws(function () { + it('Invalid height - NaN', () => { + assert.throws(() => { sharp().resize(320, 'spoons'); }, /Expected positive integer for height but received spoons of type string/); }); - it('Invalid width - float', function () { - assert.throws(function () { + it('Invalid width - float', () => { + assert.throws(() => { sharp().resize(1.5, 240); }, /Expected positive integer for width but received 1.5 of type number/); }); - it('Invalid height - float', function () { - assert.throws(function () { + it('Invalid height - float', () => { + assert.throws(() => { sharp().resize(320, 1.5); }, /Expected positive integer for height but received 1.5 of type number/); }); - it('Invalid width - too large', function (done) { + it('Invalid width - via options', () => { + assert.throws(() => { + sharp().resize({ width: 1.5, height: 240 }); + }, /Expected positive integer for width but received 1.5 of type number/); + }); + + it('Invalid height - via options', () => { + assert.throws(() => { + sharp().resize({ width: 320, height: 1.5 }); + }, /Expected positive integer for height but received 1.5 of type number/); + }); + + it('Invalid width - too large', (_t, done) => { sharp(fixtures.inputJpg) .resize(0x4000, 1) .webp() - .toBuffer(function 
(err) { + .toBuffer((err) => { assert.strictEqual(true, err instanceof Error); assert.strictEqual('Processed image is too large for the WebP format', err.message); done(); }); }); - it('Invalid height - too large', function (done) { + it('Invalid height - too large', (_t, done) => { sharp(fixtures.inputJpg) .resize(1, 0x4000) .webp() - .toBuffer(function (err) { + .toBuffer((err) => { assert.strictEqual(true, err instanceof Error); assert.strictEqual('Processed image is too large for the WebP format', err.message); done(); }); }); - it('WebP shrink-on-load rounds to zero, ensure recalculation is correct', function (done) { + it('Webp resize then extract large image', (_t, done) => { + sharp(fixtures.inputWebP) + .resize(0x4000, 0x4000) + .extract({ top: 0x2000, left: 0x2000, width: 256, height: 256 }) + .webp() + .toBuffer((err, _data, info) => { + if (err) throw err; + assert.strictEqual('webp', info.format); + assert.strictEqual(256, info.width); + assert.strictEqual(256, info.height); + done(); + }); + }); + + it('WebP shrink-on-load rounds to zero, ensure recalculation is correct', (_t, done) => { sharp(fixtures.inputJpg) .resize(1080, 607) .webp() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('webp', info.format); assert.strictEqual(1080, info.width); assert.strictEqual(607, info.height); sharp(data) .resize(233, 131) - .toBuffer(function (err, data, info) { + .toBuffer((err, _data, info) => { if (err) throw err; assert.strictEqual('webp', info.format); assert.strictEqual(233, info.width); @@ -130,12 +160,30 @@ describe('Resize dimensions', function () { }); }); - it('TIFF embed known to cause rounding errors', function (done) { + it('JPEG shrink-on-load with 90 degree rotation, ensure recalculation is correct', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(1920, 1280) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(1920, info.width); + 
assert.strictEqual(1280, info.height); + sharp(data) + .rotate(90) + .resize(533, 800) + .toBuffer((err, _data, info) => { + if (err) throw err; + assert.strictEqual(533, info.width); + assert.strictEqual(800, info.height); + done(); + }); + }); + }); + + it('TIFF embed known to cause rounding errors', (_t, done) => { sharp(fixtures.inputTiff) - .resize(240, 320) - .embed() + .resize(240, 320, { fit: sharp.fit.contain }) .jpeg() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -145,11 +193,11 @@ describe('Resize dimensions', function () { }); }); - it('TIFF known to cause rounding errors', function (done) { + it('TIFF known to cause rounding errors', (_t, done) => { sharp(fixtures.inputTiff) .resize(240, 320) .jpeg() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -159,12 +207,11 @@ describe('Resize dimensions', function () { }); }); - it('Max width or height considering ratio (portrait)', function (done) { + it('fit=inside, portrait', (_t, done) => { sharp(fixtures.inputTiff) - .resize(320, 320) - .max() + .resize(320, 320, { fit: sharp.fit.inside }) .jpeg() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -174,12 +221,11 @@ describe('Resize dimensions', function () { }); }); - it('Min width or height considering ratio (portrait)', function (done) { + it('fit=outside, portrait', (_t, done) => { sharp(fixtures.inputTiff) - .resize(320, 320) - .min() + .resize(320, 320, { fit: sharp.fit.outside }) .jpeg() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', 
info.format); @@ -189,11 +235,10 @@ describe('Resize dimensions', function () { }); }); - it('Max width or height considering ratio (landscape)', function (done) { + it('fit=inside, landscape', (_t, done) => { sharp(fixtures.inputJpg) - .resize(320, 320) - .max() - .toBuffer(function (err, data, info) { + .resize(320, 320, { fit: sharp.fit.inside }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -203,39 +248,42 @@ describe('Resize dimensions', function () { }); }); - it('Provide only one dimension with max, should default to crop', function (done) { + it('fit=outside, landscape', (_t, done) => { sharp(fixtures.inputJpg) - .resize(320) - .max() - .toBuffer(function (err, data, info) { + .resize(320, 320, { fit: sharp.fit.outside }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(261, info.height); + assert.strictEqual(392, info.width); + assert.strictEqual(320, info.height); done(); }); }); - it('Min width or height considering ratio (landscape)', function (done) { + it('fit=inside, provide only one dimension', (_t, done) => { sharp(fixtures.inputJpg) - .resize(320, 320) - .min() - .toBuffer(function (err, data, info) { + .resize({ + width: 320, + fit: sharp.fit.inside + }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); - assert.strictEqual(392, info.width); - assert.strictEqual(320, info.height); + assert.strictEqual(320, info.width); + assert.strictEqual(261, info.height); done(); }); }); - it('Provide only one dimension with min, should default to crop', function (done) { + it('fit=outside, provide only one dimension', (_t, done) => { sharp(fixtures.inputJpg) - .resize(320) - .min() - .toBuffer(function (err, data, info) { + 
.resize({ + width: 320, + fit: sharp.fit.outside + }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -245,11 +293,13 @@ describe('Resize dimensions', function () { }); }); - it('Do not enlarge when input width is already less than output width', function (done) { + it('Do not enlarge when input width is already less than output width', (_t, done) => { sharp(fixtures.inputJpg) - .resize(2800) - .withoutEnlargement() - .toBuffer(function (err, data, info) { + .resize({ + width: 2800, + withoutEnlargement: true + }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -259,11 +309,13 @@ describe('Resize dimensions', function () { }); }); - it('Do not enlarge when input height is already less than output height', function (done) { + it('Do not enlarge when input height is already less than output height', (_t, done) => { sharp(fixtures.inputJpg) - .resize(null, 2300) - .withoutEnlargement() - .toBuffer(function (err, data, info) { + .resize({ + height: 2300, + withoutEnlargement: true + }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -273,11 +325,47 @@ describe('Resize dimensions', function () { }); }); - it('Do enlarge when input width is less than output width', function (done) { + it('Do crop when fit = cover and withoutEnlargement = true and width >= outputWidth, and height < outputHeight', (_t, done) => { sharp(fixtures.inputJpg) - .resize(2800) - .withoutEnlargement(false) - .toBuffer(function (err, data, info) { + .resize({ + width: 3000, + height: 1000, + withoutEnlargement: true + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(2725, info.width); + 
assert.strictEqual(1000, info.height); + done(); + }); + }); + + it('Do crop when fit = cover and withoutEnlargement = true and width < outputWidth, and height >= outputHeight', (_t, done) => { + sharp(fixtures.inputJpg) + .resize({ + width: 1500, + height: 2226, + withoutEnlargement: true + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(1500, info.width); + assert.strictEqual(2225, info.height); + done(); + }); + }); + + it('Do enlarge when input width is less than output width', (_t, done) => { + sharp(fixtures.inputJpg) + .resize({ + width: 2800, + withoutEnlargement: false + }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -287,142 +375,226 @@ describe('Resize dimensions', function () { }); }); - it('Downscale width and height, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(320, 320).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(320, info.height); - done(); - }); + it('Do enlarge when withoutReduction is true and input width is less than output width', (_t, done) => { + sharp(fixtures.inputJpg) + .resize({ + width: 2800, + withoutReduction: true + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(2800, info.width); + assert.strictEqual(2286, info.height); + done(); + }); }); - it('Downscale width, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(320).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); -
assert.strictEqual(320, info.width); - assert.strictEqual(2225, info.height); - done(); - }); + it('Do enlarge when input height is less than output height', (_t, done) => { + sharp(fixtures.inputJpg) + .resize({ + height: 2300, + withoutReduction: true + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(2817, info.width); + assert.strictEqual(2300, info.height); + done(); + }); }); - it('Downscale height, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(null, 320).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(320, info.height); - done(); - }); + it('Do enlarge when withoutReduction is false and input width is less than output width', (_t, done) => { + sharp(fixtures.inputJpg) + .resize({ + width: 2800, + withoutReduction: false + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(2800, info.width); + assert.strictEqual(2286, info.height); + done(); + }); }); - it('Upscale width and height, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(3000, 3000).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3000, info.width); - assert.strictEqual(3000, info.height); - done(); - }); + it('Do not resize when both withoutEnlargement and withoutReduction are true', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 320, { fit: 'fill', withoutEnlargement: true, withoutReduction: true }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); +
assert.strictEqual('jpeg', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + done(); + }); }); - it('Upscale width, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(3000).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3000, info.width); - assert.strictEqual(2225, info.height); - done(); - }); + it('Do not reduce size when fit = outside and withoutReduction is true and input height > outputHeight and input width > outputWidth', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 320, { fit: 'outside', withoutReduction: true }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + done(); + }); }); - it('Upscale height, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(null, 3000).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(3000, info.height); - done(); - }); + it('Do resize when fit = outside and withoutReduction is true and input height < outputHeight and input width < outputWidth', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(3000, 3000, { fit: 'outside', withoutReduction: true }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3674, info.width); + assert.strictEqual(3000, info.height); + done(); + }); }); - it('Downscale width, upscale height, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(320,
3000).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(3000, info.height); - done(); - }); + it('fit=fill, downscale width and height', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 320, { fit: 'fill' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(320, info.height); + done(); + }); }); - it('Upscale width, downscale height, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).resize(3000, 320).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(3000, info.width); - assert.strictEqual(320, info.height); - done(); - }); + it('fit=fill, downscale width', (_t, done) => { + sharp(fixtures.inputJpg) + .resize({ + width: 320, + fit: 'fill' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(2225, info.height); + done(); + }); }); - it('Identity transform, ignoring aspect ratio', function (done) { - sharp(fixtures.inputJpg).ignoreAspectRatio().toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - done(); - }); + it('fit=fill, downscale height', (_t, done) => { + sharp(fixtures.inputJpg) + .resize({ + height: 320, + fit: 'fill' + }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + 
assert.strictEqual('jpeg', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(320, info.height); + done(); + }); }); - it('Centre vs corner convention return different results', function (done) { + it('fit=fill, upscale width and height', (_t, done) => { sharp(fixtures.inputJpg) - .resize(32, 24, { centreSampling: false }) - .greyscale() - .raw() - .toBuffer(function (err, cornerData) { - if (err) throw err; - assert.strictEqual(768, cornerData.length); - sharp(fixtures.inputJpg) - .resize(32, 24, { centerSampling: true }) - .greyscale() - .raw() - .toBuffer(function (err, centreData) { - if (err) throw err; - assert.strictEqual(768, centreData.length); - assert.notStrictEqual(0, cornerData.compare(centreData)); - done(); - }); + .resize(3000, 3000, { fit: 'fill' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3000, info.width); + assert.strictEqual(3000, info.height); + done(); }); }); - it('Invalid centreSampling option', function () { - assert.throws(function () { - sharp().resize(32, 24, { centreSampling: 1 }); - }); + it('fit=fill, upscale width', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(3000, null, { fit: 'fill' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3000, info.width); + assert.strictEqual(2225, info.height); + done(); + }); }); - it('Dimensions that result in differing even shrinks on each axis', function (done) { + it('fit=fill, upscale height', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(null, 3000, { fit: 'fill' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(3000, info.height); + done(); + }); + }); + + 
it('fit=fill, downscale width, upscale height', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 3000, { fit: 'fill' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(3000, info.height); + done(); + }); + }); + + it('fit=fill, upscale width, downscale height', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(3000, 320, { fit: 'fill' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(3000, info.width); + assert.strictEqual(320, info.height); + done(); + }); + }); + + it('fit=fill, identity transform', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(null, null, { fit: 'fill' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + done(); + }); + }); + + it('Dimensions that result in differing even shrinks on each axis', (_t, done) => { sharp(fixtures.inputJpg) .resize(645, 399) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(645, info.width); assert.strictEqual(399, info.height); sharp(data) .resize(150, 100) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(150, info.width); assert.strictEqual(100, info.height); @@ -431,21 +603,196 @@ describe('Resize dimensions', function () { }); }); - it('Dimensions that result in differing odd shrinks on each axis', function (done) { - return sharp(fixtures.inputJpg) + it('Dimensions that result in differing odd shrinks on each axis', (_t, done) => sharp(fixtures.inputJpg) .resize(600, 399) - .toBuffer(function (err, data, info) { + 
.toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(600, info.width); assert.strictEqual(399, info.height); sharp(data) .resize(200) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(200, info.width); assert.strictEqual(133, info.height); fixtures.assertSimilar(fixtures.expected('resize-diff-shrink-odd.jpg'), data, done); }); + })); + + [ + true, + false + ].forEach((value) => { + it(`fastShrinkOnLoad: ${value} does not cause image shifts`, (_t, done) => { + sharp(fixtures.inputJpgCenteredImage) + .resize(9, 8, { fastShrinkOnLoad: value }) + .png() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(9, info.width); + assert.strictEqual(8, info.height); + fixtures.assertSimilar(fixtures.expected('fast-shrink-on-load.png'), data, done); + }); + }); + }); + + [ + sharp.kernel.nearest, + sharp.kernel.cubic, + sharp.kernel.mitchell, + sharp.kernel.lanczos2, + sharp.kernel.lanczos3 + ].forEach((kernel) => { + it(`kernel ${kernel}`, (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, null, { kernel }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + fixtures.assertSimilar(fixtures.inputJpg, data, done); + }); + }); + }); + + it('nearest upsampling with integral factor', (_t, done) => { + sharp(fixtures.inputTiff8BitDepth) + .resize(210, 210, { kernel: 'nearest' }) + .png() + .toBuffer((err, _data, info) => { + if (err) throw err; + assert.strictEqual(210, info.width); + assert.strictEqual(210, info.height); + done(); }); }); + + it('Ensure shortest edge (height) is at least 1 pixel', () => sharp({ + create: { + width: 10, + height: 2, + channels: 3, + background: 'red' + } + }) + .resize(2) + .toBuffer({ resolveWithObject: true }) + .then((output) => { + assert.strictEqual(2, output.info.width); + assert.strictEqual(1, output.info.height); + })); + + it('Ensure 
shortest edge (width) is at least 1 pixel', () => sharp({ + create: { + width: 2, + height: 10, + channels: 3, + background: 'red' + } + }) + .resize(null, 2) + .toBuffer({ resolveWithObject: true }) + .then((output) => { + assert.strictEqual(1, output.info.width); + assert.strictEqual(2, output.info.height); + })); + + it('Ensure embedded shortest edge (height) is at least 1 pixel', () => sharp({ + create: { + width: 200, + height: 1, + channels: 3, + background: 'red' + } + }) + .resize({ width: 50, height: 50, fit: sharp.fit.contain }) + .toBuffer({ resolveWithObject: true }) + .then((output) => { + assert.strictEqual(50, output.info.width); + assert.strictEqual(50, output.info.height); + })); + + it('Ensure embedded shortest edge (width) is at least 1 pixel', () => sharp({ + create: { + width: 1, + height: 200, + channels: 3, + background: 'red' + } + }) + .resize({ width: 50, height: 50, fit: sharp.fit.contain }) + .toBuffer({ resolveWithObject: true }) + .then((output) => { + assert.strictEqual(50, output.info.width); + assert.strictEqual(50, output.info.height); + })); + + it('Skip shrink-on-load where one dimension <4px', async () => { + const jpeg = await sharp({ + create: { + width: 100, + height: 3, + channels: 3, + background: 'red' + } + }) + .jpeg() + .toBuffer(); + + const { info } = await sharp(jpeg) + .resize(8) + .toBuffer({ resolveWithObject: true }); + + assert.strictEqual(info.width, 8); + assert.strictEqual(info.height, 1); + }); + + it('Skip JPEG shrink-on-load for known libjpeg rounding errors', async () => { + const input = await sharp({ + create: { + width: 1000, + height: 667, + channels: 3, + background: 'red' + } + }) + .jpeg() + .toBuffer(); + + const output = await sharp(input) + .resize({ width: 500 }) + .toBuffer(); + + const { width, height } = await sharp(output).metadata(); + assert.strictEqual(width, 500); + assert.strictEqual(height, 334); + }); + + it('unknown kernel throws', () => { + assert.throws(() => { + 
sharp().resize(null, null, { kernel: 'unknown' }); + }); + }); + + it('unknown fit throws', () => { + assert.throws(() => { + sharp().resize(null, null, { fit: 'unknown' }); + }); + }); + + it('unknown position throws', () => { + assert.throws(() => { + sharp().resize(null, null, { position: 'unknown' }); + }); + }); + + it('Multiple resize emits warning', () => { + let warningMessage = ''; + const s = sharp(); + s.on('warning', (msg) => { warningMessage = msg; }); + s.resize(1); + assert.strictEqual(warningMessage, ''); + s.resize(2); + assert.strictEqual(warningMessage, 'ignoring previous resize options'); + }); }); diff --git a/test/unit/rotate.js b/test/unit/rotate.js index 6ddc95875..ae8b755c0 100644 --- a/test/unit/rotate.js +++ b/test/unit/rotate.js @@ -1,42 +1,181 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Rotation', function () { - ['Landscape', 'Portrait'].forEach(function (orientation) { - [1, 2, 3, 4, 5, 6, 7, 8].forEach(function (exifTag) { - it('Input image has Orientation EXIF tag value of (' + exifTag + '), auto-rotate', function (done) { - sharp(fixtures['inputJpgWith' + orientation + 'Exif' + exifTag]) - .rotate() - .resize(320) - .toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(orientation === 'Landscape' ? 240 : 427, info.height); - fixtures.assertSimilar(fixtures.expected(orientation + '_' + exifTag + '-out.jpg'), data, done); +describe('Rotation', () => { + ['autoOrient', 'constructor'].forEach((rotateMethod) => { + describe(`Auto orientation via ${rotateMethod}:`, () => { + const options = rotateMethod === 'constructor' ? 
{ autoOrient: true } : {}; + + ['Landscape', 'Portrait'].forEach((orientation) => { + [1, 2, 3, 4, 5, 6, 7, 8].forEach((exifTag) => { + const input = fixtures[`inputJpgWith${orientation}Exif${exifTag}`]; + const expectedOutput = fixtures.expected(`${orientation}_${exifTag}-out.jpg`); + it(`${orientation} image with EXIF Orientation ${exifTag}: Auto-rotate`, (_t, done) => { + const [expectedWidth, expectedHeight] = orientation === 'Landscape' ? [600, 450] : [450, 600]; + + const img = sharp(input, options); + rotateMethod === 'autoOrient' && img.autoOrient(); + + img.toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(info.width, expectedWidth); + assert.strictEqual(info.height, expectedHeight); + fixtures.assertSimilar(expectedOutput, data, done); + }); + }); + + it(`${orientation} image with EXIF Orientation ${exifTag}: Auto-rotate then resize`, (_t, done) => { + const [expectedWidth, expectedHeight] = orientation === 'Landscape' ? [320, 240] : [320, 427]; + + const img = sharp(input, options); + rotateMethod === 'autoOrient' && img.autoOrient(); + + img.resize({ width: 320 }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(info.width, expectedWidth); + assert.strictEqual(info.height, expectedHeight); + fixtures.assertSimilar(expectedOutput, data, done); + }); + }); + + if (rotateMethod !== 'constructor') { + it(`${orientation} image with EXIF Orientation ${exifTag}: Resize then auto-rotate`, (_t, done) => { + const [expectedWidth, expectedHeight] = orientation === 'Landscape' + ? (exifTag < 5) ? 
[320, 240] : [320, 240] + : [320, 427]; + + const img = sharp(input, options) + .resize({ width: 320 }); + + rotateMethod === 'autoOrient' && img.autoOrient(); + img.toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(info.width, expectedWidth); + assert.strictEqual(info.height, expectedHeight); + fixtures.assertSimilar(expectedOutput, data, done); + }); + }); + } + + [true, false].forEach((doResize) => { + [90, 180, 270, 45].forEach((angle) => { + const [inputWidth, inputHeight] = orientation === 'Landscape' ? [600, 450] : [450, 600]; + const expectedOutput = fixtures.expected(`${orientation}_${exifTag}_rotate${angle}-out.jpg`); + it(`${orientation} image with EXIF Orientation ${exifTag}: Auto-rotate then rotate ${angle} ${doResize ? 'and resize' : ''}`, (_t, done) => { + const [width, height] = (angle === 45 ? [742, 742] : [inputWidth, inputHeight]).map((x) => doResize ? Math.floor(x / 1.875) : x); + const [expectedWidth, expectedHeight] = angle % 180 === 0 ? [width, height] : [height, width]; + + const img = sharp(input, options); + rotateMethod === 'autoOrient' && img.autoOrient(); + + img.rotate(angle); + doResize && img.resize(expectedWidth); + + img.toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(info.width, expectedWidth); + assert.strictEqual(info.height, expectedHeight); + fixtures.assertSimilar(expectedOutput, data, done); + }); + }); + }); + + [[true, true], [true, false], [false, true]].forEach(([flip, flop]) => { + const [inputWidth, inputHeight] = orientation === 'Landscape' ? [600, 450] : [450, 600]; + const flipFlopFileName = [flip && 'flip', flop && 'flop'].filter(Boolean).join('_'); + const flipFlopTestName = [flip && 'flip', flop && 'flop'].filter(Boolean).join(' & '); + it(`${orientation} image with EXIF Orientation ${exifTag}: Auto-rotate then ${flipFlopTestName} ${doResize ? 
'and resize' : ''}`, (_t, done) => { + const expectedOutput = fixtures.expected(`${orientation}_${exifTag}_${flipFlopFileName}-out.jpg`); + + const img = sharp(input, options); + + rotateMethod === 'autoOrient' && img.autoOrient(); + + flip && img.flip(); + flop && img.flop(); + doResize && img.resize(orientation === 'Landscape' ? 320 : 240); + + img.toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(info.width, inputWidth / (doResize ? 1.875 : 1)); + assert.strictEqual(info.height, inputHeight / (doResize ? 1.875 : 1)); + fixtures.assertSimilar(expectedOutput, data, done); + }); + }); + }); }); + }); }); }); }); - it('Rotate by 90 degrees, respecting output input size', function (done) { - sharp(fixtures.inputJpg).rotate(90).resize(320, 240).toBuffer(function (err, data, info) { - if (err) throw err; - assert.strictEqual(true, data.length > 0); - assert.strictEqual('jpeg', info.format); - assert.strictEqual(320, info.width); - assert.strictEqual(240, info.height); - done(); - }); + it('Rotate by 30 degrees with semi-transparent background', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320) + .rotate(30, { background: { r: 255, g: 0, b: 0, alpha: 0.5 } }) + .png() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(408, info.width); + assert.strictEqual(386, info.height); + fixtures.assertSimilar(fixtures.expected('rotate-transparent-bg.png'), data, done); + }); }); - [-3690, -450, -90, 90, 450, 3690].forEach(function (angle) { - it('Rotate by any 90-multiple angle (' + angle + 'deg)', function (done) { - sharp(fixtures.inputJpg320x240).rotate(angle).toBuffer(function (err, data, info) { + it('Rotate by 30 degrees with solid background', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320) + .rotate(30, { background: { r: 255, g: 0, b: 0 } }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('jpeg', info.format); + 
assert.strictEqual(408, info.width); + assert.strictEqual(386, info.height); + fixtures.assertSimilar(fixtures.expected('rotate-solid-bg.jpg'), data, done); + }); + }); + + it('Rotate by 90 degrees, respecting output size', (_t, done) => { + sharp(fixtures.inputJpg) + .rotate(90) + .resize(320, 240) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + done(); + }); + }); + + it('Resize then rotate by 30 degrees, respecting resized input size', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .rotate(30) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(397, info.width); + assert.strictEqual(368, info.height); + done(); + }); + }); + + [-3690, -450, -90, 90, 450, 3690].forEach((angle) => { + it(`Rotate by any 90-multiple angle (${angle}deg)`, (_t, done) => { + sharp(fixtures.inputJpg320x240).rotate(angle).toBuffer((err, _data, info) => { if (err) throw err; assert.strictEqual(240, info.width); assert.strictEqual(320, info.height); @@ -45,9 +184,20 @@ describe('Rotation', function () { }); }); - [-3780, -540, 0, 180, 540, 3780].forEach(function (angle) { - it('Rotate by any 180-multiple angle (' + angle + 'deg)', function (done) { - sharp(fixtures.inputJpg320x240).rotate(angle).toBuffer(function (err, data, info) { + [-3750, -510, -150, 30, 390, 3630].forEach((angle) => { + it(`Rotate by any 30-multiple angle (${angle}deg)`, (_t, done) => { + sharp(fixtures.inputJpg320x240).rotate(angle).toBuffer((err, _data, info) => { + if (err) throw err; + assert.strictEqual(397, info.width); + assert.strictEqual(368, info.height); + done(); + }); + }); + }); + + [-3780, -540, 0, 180, 540, 3780].forEach((angle) => { + it(`Rotate by any 180-multiple angle (${angle}deg)`, (_t, 
done) => { + sharp(fixtures.inputJpg320x240).rotate(angle).toBuffer((err, _data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); @@ -56,16 +206,15 @@ describe('Rotation', function () { }); }); - it('Rotate by 270 degrees, square output ignoring aspect ratio', function (done) { + it('Rotate by 270 degrees, square output ignoring aspect ratio', (_t, done) => { sharp(fixtures.inputJpg) - .resize(240, 240) - .ignoreAspectRatio() + .resize(240, 240, { fit: sharp.fit.fill }) .rotate(270) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(240, info.width); assert.strictEqual(240, info.height); - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(240, metadata.width); assert.strictEqual(240, metadata.height); @@ -74,16 +223,49 @@ describe('Rotation', function () { }); }); - it('Rotate by 270 degrees, rectangular output ignoring aspect ratio', function (done) { + it('Rotate by 315 degrees, square output ignoring aspect ratio', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(240, 240, { fit: sharp.fit.fill }) + .rotate(315) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(339, info.width); + assert.strictEqual(339, info.height); + sharp(data).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual(339, metadata.width); + assert.strictEqual(339, metadata.height); + done(); + }); + }); + }); + + it('Rotate by 270 degrees, rectangular output ignoring aspect ratio', (_t, done) => { sharp(fixtures.inputJpg) - .resize(320, 240) - .ignoreAspectRatio() .rotate(270) - .toBuffer(function (err, data, info) { + .resize(320, 240, { fit: sharp.fit.fill }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + sharp(data).metadata((err, metadata) => { + if 
(err) throw err; + assert.strictEqual(320, metadata.width); + assert.strictEqual(240, metadata.height); + done(); + }); + }); + }); + + it('Auto-rotate by 270 degrees, rectangular output ignoring aspect ratio', (_t, done) => { + sharp(fixtures.inputJpgWithLandscapeExif8) + .resize(320, 240, { fit: sharp.fit.fill }) + .rotate() + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(320, metadata.width); assert.strictEqual(240, metadata.height); @@ -92,17 +274,34 @@ describe('Rotation', function () { }); }); - it('Input image has Orientation EXIF tag but do not rotate output', function (done) { + it('Rotate by 30 degrees, rectangular output ignoring aspect ratio', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240, { fit: sharp.fit.fill }) + .rotate(30) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(397, info.width); + assert.strictEqual(368, info.height); + sharp(data).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual(397, metadata.width); + assert.strictEqual(368, metadata.height); + done(); + }); + }); + }); + + it('Input image has Orientation EXIF tag but do not rotate output', (_t, done) => { sharp(fixtures.inputJpgWithExif) .resize(320) .withMetadata() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(427, info.height); - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(8, metadata.orientation); done(); @@ -110,11 +309,11 @@ describe('Rotation', function () { }); }); - it('Input image has Orientation EXIF tag value of 8 (270 
degrees), auto-rotate', function (done) { + it('Input image has Orientation EXIF tag value of 8 (270 degrees), auto-rotate', (_t, done) => { sharp(fixtures.inputJpgWithExif) .rotate() .resize(320) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -123,17 +322,17 @@ describe('Rotation', function () { }); }); - it('Override EXIF Orientation tag metadata after auto-rotate', function (done) { + it('Override EXIF Orientation tag metadata after auto-rotate', (_t, done) => { sharp(fixtures.inputJpgWithExif) .rotate() .resize(320) - .withMetadata({orientation: 3}) - .toBuffer(function (err, data, info) { + .withMetadata({ orientation: 3 }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(3, metadata.orientation); fixtures.assertSimilar(fixtures.expected('exif-8.jpg'), data, done); @@ -141,17 +340,17 @@ describe('Rotation', function () { }); }); - it('Input image has Orientation EXIF tag value of 5 (270 degrees + flip), auto-rotate', function (done) { + it('Input image has Orientation EXIF tag value of 5 (270 degrees + flip), auto-rotate', (_t, done) => { sharp(fixtures.inputJpgWithExifMirroring) .rotate() .resize(320) .withMetadata() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(240, info.height); - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(1, metadata.orientation); fixtures.assertSimilar(fixtures.expected('exif-5.jpg'), data, done); @@ -159,8 +358,8 @@ 
describe('Rotation', function () { }); }); - it('Attempt to auto-rotate using image that has no EXIF', function (done) { - sharp(fixtures.inputJpg).rotate().resize(320).toBuffer(function (err, data, info) { + it('Attempt to auto-rotate using image that has no EXIF', (_t, done) => { + sharp(fixtures.inputJpg).rotate().resize(320).toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -170,12 +369,12 @@ describe('Rotation', function () { }); }); - it('Attempt to auto-rotate image format without EXIF support', function (done) { + it('Attempt to auto-rotate image format without EXIF support', (_t, done) => { sharp(fixtures.inputPng) .rotate() .resize(320) .jpeg() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('jpeg', info.format); @@ -185,23 +384,100 @@ describe('Rotation', function () { }); }); - it('Rotate to an invalid angle, should fail', function () { - assert.throws(function () { - sharp(fixtures.inputJpg).rotate(1); + it('Rotate with a string argument, should fail', () => { + assert.throws(() => { + sharp(fixtures.inputJpg).rotate('not-a-number'); }); }); - it('Flip - vertical', function (done) { + it('Animated image rotate-then-extract rejects', () => + assert.rejects(() => sharp(fixtures.inputGifAnimated, { animated: true }) + .rotate(1) + .extract({ + top: 1, + left: 1, + width: 10, + height: 10 + }) + .toBuffer(), + /Rotate is not supported for multi-page images/ + ) + ); + + it('Animated image extract-then-rotate rejects', () => + assert.rejects(() => sharp(fixtures.inputGifAnimated, { animated: true }) + .extract({ + top: 1, + left: 1, + width: 10, + height: 10 + }) + .rotate(1) + .toBuffer(), + /Rotate is not supported for multi-page images/ + ) + ); + + it('Animated image rotate 180', () => + assert.doesNotReject(() => sharp(fixtures.inputGifAnimated, { animated: 
true }) + .rotate(180) + .toBuffer() + ) + ); + + it('Animated image rotate non-180 rejects', () => + assert.rejects(() => sharp(fixtures.inputGifAnimated, { animated: true }) + .rotate(90) + .toBuffer(), + /Rotate is not supported for multi-page images/ + ) + ); + + it('Multiple rotate emits warning', () => { + let warningMessage = ''; + const s = sharp(); + s.on('warning', (msg) => { warningMessage = msg; }); + s.rotate(90); + assert.strictEqual(warningMessage, ''); + s.rotate(180); + assert.strictEqual(warningMessage, 'ignoring previous rotate options'); + }); + + it('Multiple rotate: last one wins (cardinal)', (_t, done) => { + sharp(fixtures.inputJpg) + .rotate(45) + .rotate(90) + .toBuffer((err, _data, info) => { + if (err) throw err; + assert.strictEqual(2225, info.width); + assert.strictEqual(2725, info.height); + done(); + }); + }); + + it('Multiple rotate: last one wins (non cardinal)', (_t, done) => { + sharp(fixtures.inputJpg) + .rotate(90) + .rotate(45) + .toBuffer((err, _data, info) => { + if (err) throw err; + assert.strictEqual(3500, info.width); + assert.strictEqual(3500, info.height); + done(); + }); + }); + + it('Flip - vertical', (_t, done) => { sharp(fixtures.inputJpg) .resize(320) .flip() .withMetadata() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(261, info.height); - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(1, metadata.orientation); fixtures.assertSimilar(fixtures.expected('flip.jpg'), data, done); @@ -209,17 +485,17 @@ describe('Rotation', function () { }); }); - it('Flop - horizontal', function (done) { + it('Flop - horizontal', (_t, done) => { sharp(fixtures.inputJpg) .resize(320) .flop() .withMetadata() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; 
assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); assert.strictEqual(261, info.height); - sharp(data).metadata(function (err, metadata) { + sharp(data).metadata((err, metadata) => { if (err) throw err; assert.strictEqual(1, metadata.orientation); fixtures.assertSimilar(fixtures.expected('flop.jpg'), data, done); @@ -227,11 +503,12 @@ describe('Rotation', function () { }); }); - it('Flip and flop', function (done) { + it('Flip and flop', (_t, done) => { sharp(fixtures.inputJpg) .resize(320) + .flip() .flop() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -240,12 +517,12 @@ describe('Rotation', function () { }); }); - it('Neither flip nor flop', function (done) { + it('Neither flip nor flop', (_t, done) => { sharp(fixtures.inputJpg) .resize(320) .flip(false) .flop(false) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -254,12 +531,12 @@ describe('Rotation', function () { }); }); - it('Auto-rotate and flip', function (done) { + it('Auto-rotate and flip', (_t, done) => { sharp(fixtures.inputJpgWithExif) .rotate() .flip() .resize(320) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -268,12 +545,12 @@ describe('Rotation', function () { }); }); - it('Auto-rotate and flop', function (done) { + it('Auto-rotate and flop', (_t, done) => { sharp(fixtures.inputJpgWithExif) .rotate() .flop() .resize(320) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -281,4 +558,122 @@ describe('Rotation', function () { 
fixtures.assertSimilar(fixtures.expected('rotate-and-flop.jpg'), data, done); }); }); + + it('Auto-rotate and shrink-on-load', async () => { + const [r, g, b] = await sharp(fixtures.inputJpgWithLandscapeExif3) + .rotate() + .resize(8) + .raw() + .toBuffer(); + + assert.strictEqual(r, 61); + assert.strictEqual(g, 74); + assert.strictEqual(b, 51); + }); + + it('Flip and rotate ordering', async () => { + const [r, g, b] = await sharp(fixtures.inputJpgWithPortraitExif5) + .flip() + .rotate(90) + .raw() + .toBuffer(); + + assert.strictEqual(r, 55); + assert.strictEqual(g, 65); + assert.strictEqual(b, 31); + }); + + it('Flip, rotate and resize ordering', async () => { + const [r, g, b] = await sharp(fixtures.inputJpgWithPortraitExif5) + .flip() + .rotate(90) + .resize(449) + .raw() + .toBuffer(); + + assert.strictEqual(r, 54); + assert.strictEqual(g, 64); + assert.strictEqual(b, 30); + }); + + it('Resize after affine-based rotation does not overcompute', async () => + sharp({ + create: { + width: 4640, + height: 2610, + channels: 3, + background: 'black' + } + }) + .rotate(28) + .resize({ width: 640, height: 360 }) + .raw() + .timeout({ seconds: 5 }) + .toBuffer() + ); + + it('Rotate 90 then resize with inside fit', async () => { + const data = await sharp({ create: { width: 16, height: 8, channels: 3, background: 'red' } }) + .rotate(90) + .resize({ width: 6, fit: 'inside' }) + .png({ compressionLevel: 0 }) + .toBuffer(); + + const { width, height } = await sharp(data).metadata(); + assert.strictEqual(width, 6); + assert.strictEqual(height, 12); + }); + + it('Resize with inside fit then rotate 90', async () => { + const data = await sharp({ create: { width: 16, height: 8, channels: 3, background: 'red' } }) + .resize({ width: 6, fit: 'inside' }) + .rotate(90) + .png({ compressionLevel: 0 }) + .toBuffer(); + + const { width, height } = await sharp(data).metadata(); + assert.strictEqual(width, 3); + assert.strictEqual(height, 6); + }); + + it('Shrink-on-load with 
autoOrient', async () => { + const data = await sharp(fixtures.inputJpgWithLandscapeExif6) + .resize(8) + .autoOrient() + .avif({ effort: 0 }) + .toBuffer(); + + const { width, height, orientation } = await sharp(data).metadata(); + assert.strictEqual(width, 8); + assert.strictEqual(height, 6); + assert.strictEqual(orientation, undefined); + }); + + it('Auto-orient and rotate 45', async () => { + const data = await sharp(fixtures.inputJpgWithLandscapeExif2, { autoOrient: true }) + .rotate(45) + .toBuffer(); + + const { width, height } = await sharp(data).metadata(); + assert.strictEqual(width, 742); + assert.strictEqual(height, 742); + }); + + it('Auto-orient, extract and rotate 45', async () => { + const data = await sharp(fixtures.inputJpgWithLandscapeExif2, { autoOrient: true }) + .extract({ left: 20, top: 20, width: 200, height: 100 }) + .rotate(45) + .toBuffer(); + + const { width, height } = await sharp(data).metadata(); + assert.strictEqual(width, 212); + assert.strictEqual(height, 212); + }); + + it('Invalid autoOrient throws', () => + assert.throws( + () => sharp({ autoOrient: 'fail' }), + /Expected boolean for autoOrient but received fail of type string/ + ) + ); }); diff --git a/test/unit/sharpen.js b/test/unit/sharpen.js index ab3b134e5..9ea354a71 100644 --- a/test/unit/sharpen.js +++ b/test/unit/sharpen.js @@ -1,16 +1,20 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Sharpen', function () { - it('specific radius 10 (sigma 6)', function (done) { +describe('Sharpen', () => { + it('specific radius 10 (sigma 6)', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .sharpen(6) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -19,11 +23,11 @@ describe('Sharpen', function () { }); }); - it('specific radius 3 (sigma 1.5) and levels 0.5, 2.5', function (done) { + it('specific radius 3 (sigma 1.5) and levels 0.5, 2.5', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .sharpen(1.5, 0.5, 2.5) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -32,11 +36,11 @@ describe('Sharpen', function () { }); }); - it('specific radius 5 (sigma 3.5) and levels 2, 4', function (done) { + it('specific radius 5 (sigma 3.5) and levels 2, 4', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .sharpen(3.5, 2, 4) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -45,12 +49,28 @@ describe('Sharpen', function () { }); }); + it('sigma=3.5, m1=2, m2=4', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .sharpen({ sigma: 3.5, m1: 2, m2: 4 }) + .toBuffer() + .then(data => fixtures.assertSimilar(fixtures.expected('sharpen-5-2-4.jpg'), data, done)); + }); + + it('sigma=3.5, m1=2, m2=4, x1=2, y2=5, y3=25', (_t, done) => { + sharp(fixtures.inputJpg) + .resize(320, 240) + .sharpen({ sigma: 3.5, m1: 2, m2: 4, x1: 
2, y2: 5, y3: 25 }) + .toBuffer() + .then(data => fixtures.assertSimilar(fixtures.expected('sharpen-5-2-4.jpg'), data, done)); + }); + if (!process.env.SHARP_TEST_WITHOUT_CACHE) { - it('specific radius/levels with alpha channel', function (done) { + it('specific radius/levels with alpha channel', (_t, done) => { sharp(fixtures.inputPngWithTransparency) .resize(320, 240) .sharpen(5, 4, 8) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(4, info.channels); @@ -61,11 +81,11 @@ describe('Sharpen', function () { }); } - it('mild sharpen', function (done) { + it('mild sharpen', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .sharpen() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -74,29 +94,59 @@ describe('Sharpen', function () { }); }); - it('invalid sigma', function () { - assert.throws(function () { + it('invalid sigma', () => { + assert.throws(() => { sharp(fixtures.inputJpg).sharpen(-1.5); }); }); - it('invalid flat', function () { - assert.throws(function () { + it('invalid flat', () => { + assert.throws(() => { sharp(fixtures.inputJpg).sharpen(1, -1); }); }); - it('invalid jagged', function () { - assert.throws(function () { + it('invalid jagged', () => { + assert.throws(() => { sharp(fixtures.inputJpg).sharpen(1, 1, -1); }); }); - it('sharpened image is larger than non-sharpened', function (done) { + it('invalid options.sigma', () => assert.throws( + () => sharp().sharpen({ sigma: -1 }), + /Expected number between 0\.000001 and 10 for options\.sigma but received -1 of type number/ + )); + + it('invalid options.m1', () => assert.throws( + () => sharp().sharpen({ sigma: 1, m1: -1 }), + /Expected number between 0 and 1000000 for options\.m1 but received -1 of type number/ + )); + + it('invalid options.m2', () => 
assert.throws( + () => sharp().sharpen({ sigma: 1, m2: -1 }), + /Expected number between 0 and 1000000 for options\.m2 but received -1 of type number/ + )); + + it('invalid options.x1', () => assert.throws( + () => sharp().sharpen({ sigma: 1, x1: -1 }), + /Expected number between 0 and 1000000 for options\.x1 but received -1 of type number/ + )); + + it('invalid options.y2', () => assert.throws( + () => sharp().sharpen({ sigma: 1, y2: -1 }), + /Expected number between 0 and 1000000 for options\.y2 but received -1 of type number/ + )); + + it('invalid options.y3', () => assert.throws( + () => sharp().sharpen({ sigma: 1, y3: -1 }), + /Expected number between 0 and 1000000 for options\.y3 but received -1 of type number/ + )); + + it('sharpened image is larger than non-sharpened', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .sharpen(false) - .toBuffer(function (err, notSharpened, info) { + .toBuffer((err, notSharpened, info) => { if (err) throw err; assert.strictEqual(true, notSharpened.length > 0); assert.strictEqual('jpeg', info.format); @@ -105,7 +155,7 @@ describe('Sharpen', function () { sharp(fixtures.inputJpg) .resize(320, 240) .sharpen(true) - .toBuffer(function (err, sharpened, info) { + .toBuffer((err, sharpened, info) => { if (err) throw err; assert.strictEqual(true, sharpened.length > 0); assert.strictEqual(true, sharpened.length > notSharpened.length); diff --git a/test/unit/stats.js b/test/unit/stats.js new file mode 100644 index 000000000..9164787e3 --- /dev/null +++ b/test/unit/stats.js @@ -0,0 +1,752 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const fs = require('node:fs'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +// Test Helpers +const threshold = 0.001; +function isInAcceptableRange (actual, expected) { + return actual >= ((1 - threshold) * expected) && actual <= ((1 + threshold) * expected); +} +function isInRange (actual, min, max) { + return actual >= min && actual <= max; +} +function isInteger (val) { + return Number.isInteger(val); +} + +describe('Image Stats', () => { + it('JPEG', (_t, done) => { + sharp(fixtures.inputJpg).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 7.332915340666659)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 0.7883011147075762)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(40, r); + assert.strictEqual(40, g); + assert.strictEqual(40, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 615101275)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 83061892917)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 101.44954540768993)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 58.373870588815414)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2725)); + assert.strictEqual(true, 
isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 2725)); + + // green channel + assert.strictEqual(0, stats.channels[1].min); + assert.strictEqual(255, stats.channels[1].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].sum, 462824115)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].squaresSum, 47083677255)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].mean, 76.33425255128337)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].stdev, 44.03023262954866)); + assert.strictEqual(true, isInteger(stats.channels[1].minX)); + assert.strictEqual(true, isInRange(stats.channels[1].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].minY)); + assert.strictEqual(true, isInRange(stats.channels[1].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].maxX)); + assert.strictEqual(true, isInRange(stats.channels[1].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].maxY)); + assert.strictEqual(true, isInRange(stats.channels[1].maxY, 0, 2725)); + + // blue channel + assert.strictEqual(0, stats.channels[2].min); + assert.strictEqual(255, stats.channels[2].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].sum, 372986756)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].squaresSum, 32151543524)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].mean, 61.51724663436759)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].stdev, 38.96702865090125)); + assert.strictEqual(true, isInteger(stats.channels[2].minX)); + assert.strictEqual(true, isInRange(stats.channels[2].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].minY)); + assert.strictEqual(true, isInRange(stats.channels[2].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxX)); + assert.strictEqual(true, isInRange(stats.channels[2].maxX, 
0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxY)); + assert.strictEqual(true, isInRange(stats.channels[2].maxY, 0, 2725)); + + done(); + }); + }); + + it('PNG without transparency', (_t, done) => { + sharp(fixtures.inputPng).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 0.3409031108021736)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 9.111356137722868)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(248, r); + assert.strictEqual(248, g); + assert.strictEqual(248, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 1391368230)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 354798898650)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 238.8259925648822)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 62.15121915523771)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 2809)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 2074)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2809)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 2074)); + done(); + }); + }); + + it('PNG with transparency', (_t, done) => { + sharp(fixtures.inputPngWithTransparency).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(false, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 0.06778064835816622)); + assert.strictEqual(true, 
isInAcceptableRange(stats.sharpness, 2.522916068931278)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(248, r); + assert.strictEqual(248, g); + assert.strictEqual(248, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 795678795)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 202898092725)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 252.9394769668579)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 22.829537532816)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 1536)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 1536)); + + // green channel + assert.strictEqual(0, stats.channels[1].min); + assert.strictEqual(255, stats.channels[1].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].sum, 795678795)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].squaresSum, 202898092725)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].mean, 252.9394769668579)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].stdev, 22.829537532816)); + assert.strictEqual(true, isInteger(stats.channels[1].minX)); + assert.strictEqual(true, isInRange(stats.channels[1].minX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[1].minY)); + assert.strictEqual(true, isInRange(stats.channels[1].minY, 0, 1536)); + assert.strictEqual(true, 
isInteger(stats.channels[1].maxX)); + assert.strictEqual(true, isInRange(stats.channels[1].maxX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[1].maxY)); + assert.strictEqual(true, isInRange(stats.channels[1].maxY, 0, 1536)); + + // blue channel + assert.strictEqual(0, stats.channels[2].min); + assert.strictEqual(255, stats.channels[2].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].sum, 795678795)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].squaresSum, 202898092725)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].mean, 252.9394769668579)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].stdev, 22.829537532816)); + assert.strictEqual(true, isInteger(stats.channels[2].minX)); + assert.strictEqual(true, isInRange(stats.channels[2].minX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[2].minY)); + assert.strictEqual(true, isInRange(stats.channels[2].minY, 0, 1536)); + assert.strictEqual(true, isInteger(stats.channels[2].maxX)); + assert.strictEqual(true, isInRange(stats.channels[2].maxX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[2].maxY)); + assert.strictEqual(true, isInRange(stats.channels[2].maxY, 0, 1536)); + + // alpha channel + assert.strictEqual(0, stats.channels[3].min); + assert.strictEqual(255, stats.channels[3].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].sum, 5549142)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].squaresSum, 1333571132)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].mean, 1.7640247344970703)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].stdev, 20.51387814157297)); + assert.strictEqual(true, isInteger(stats.channels[3].minX)); + assert.strictEqual(true, isInRange(stats.channels[3].minX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[3].minY)); + assert.strictEqual(true, isInRange(stats.channels[3].minY, 0, 
1536)); + assert.strictEqual(true, isInteger(stats.channels[3].maxX)); + assert.strictEqual(true, isInRange(stats.channels[3].maxX, 0, 2048)); + assert.strictEqual(true, isInteger(stats.channels[3].maxY)); + assert.strictEqual(true, isInRange(stats.channels[3].maxY, 0, 1536)); + + done(); + }); + }); + + it('PNG fully transparent', (_t, done) => { + sharp(fixtures.inputPngCompleteTransparency).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(false, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 0)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 0)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(72, r); + assert.strictEqual(104, g); + assert.strictEqual(72, b); + + // alpha channel + assert.strictEqual(0, stats.channels[3].min); + assert.strictEqual(0, stats.channels[3].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].sum, 0)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].squaresSum, 0)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].mean, 0)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].stdev, 0)); + assert.strictEqual(true, isInteger(stats.channels[3].minX)); + assert.strictEqual(true, isInRange(stats.channels[3].minX, 0, 300)); + assert.strictEqual(true, isInteger(stats.channels[3].minY)); + assert.strictEqual(true, isInRange(stats.channels[3].minY, 0, 300)); + assert.strictEqual(true, isInteger(stats.channels[3].maxX)); + assert.strictEqual(true, isInRange(stats.channels[3].maxX, 0, 300)); + assert.strictEqual(true, isInteger(stats.channels[3].maxY)); + assert.strictEqual(true, isInRange(stats.channels[3].maxY, 0, 300)); + + done(); + }); + }); + + it('Tiff', (_t, done) => { + sharp(fixtures.inputTiff).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 0.3851250782608986)); + 
assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 10.312521863719589)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(248, r); + assert.strictEqual(248, g); + assert.strictEqual(248, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 1887266220)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 481252886100)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 235.81772349417824)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 67.25712856093298)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 2464)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 3248)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2464)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 3248)); + + done(); + }); + }); + + it('WebP', (_t, done) => { + sharp(fixtures.inputWebP).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 7.51758075132966)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 9.971384105278734)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(40, r); + assert.strictEqual(136, g); + assert.strictEqual(200, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(true, isInRange(stats.channels[0].max, 254, 255)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 83291370)); + assert.strictEqual(true, 
isInAcceptableRange(stats.channels[0].squaresSum, 11379783198)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 105.36169496842616)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 57.39412151419967)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 1024)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 772)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 1024)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 772)); + + // green channel + assert.strictEqual(0, stats.channels[1].min); + assert.strictEqual(true, isInRange(stats.channels[1].max, 254, 255)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].sum, 120877425)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].squaresSum, 20774687595)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].mean, 152.9072025279307)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].stdev, 53.84143349689916)); + assert.strictEqual(true, isInteger(stats.channels[1].minX)); + assert.strictEqual(true, isInRange(stats.channels[1].minX, 0, 1024)); + assert.strictEqual(true, isInteger(stats.channels[1].minY)); + assert.strictEqual(true, isInRange(stats.channels[1].minY, 0, 772)); + assert.strictEqual(true, isInteger(stats.channels[1].maxX)); + assert.strictEqual(true, isInRange(stats.channels[1].maxX, 0, 1024)); + assert.strictEqual(true, isInteger(stats.channels[1].maxY)); + assert.strictEqual(true, isInRange(stats.channels[1].maxY, 0, 772)); + + // blue channel + assert.strictEqual(0, stats.channels[2].min); + assert.strictEqual(true, isInRange(stats.channels[2].max, 254, 255)); + assert.strictEqual(true, 
isInAcceptableRange(stats.channels[2].sum, 138938859)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].squaresSum, 28449125593)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].mean, 175.75450711423252)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].stdev, 71.39929031070358)); + assert.strictEqual(true, isInteger(stats.channels[2].minX)); + assert.strictEqual(true, isInRange(stats.channels[2].minX, 0, 1024)); + assert.strictEqual(true, isInteger(stats.channels[2].minY)); + assert.strictEqual(true, isInRange(stats.channels[2].minY, 0, 772)); + assert.strictEqual(true, isInteger(stats.channels[2].maxX)); + assert.strictEqual(true, isInRange(stats.channels[2].maxX, 0, 1024)); + assert.strictEqual(true, isInteger(stats.channels[2].maxY)); + assert.strictEqual(true, isInRange(stats.channels[2].maxY, 0, 772)); + + done(); + }); + }); + + it('GIF', (_t, done) => { + sharp(fixtures.inputGif).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 6.08118048729375)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 2.936767879098001)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(120, r); + assert.strictEqual(136, g); + assert.strictEqual(88, b); + + // red channel + assert.strictEqual(35, stats.channels[0].min); + assert.strictEqual(254, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 56088385)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 8002132113)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 131.53936444652908)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 38.26389131415863)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 800)); + assert.strictEqual(true, 
isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 533)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 800)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 533)); + + // green channel + assert.strictEqual(43, stats.channels[1].min); + assert.strictEqual(255, stats.channels[1].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].sum, 58612156)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].squaresSum, 8548344254)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].mean, 137.45815196998123)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].stdev, 33.955424103758205)); + assert.strictEqual(true, isInteger(stats.channels[1].minX)); + assert.strictEqual(true, isInRange(stats.channels[1].minX, 0, 800)); + assert.strictEqual(true, isInteger(stats.channels[1].minY)); + assert.strictEqual(true, isInRange(stats.channels[1].minY, 0, 533)); + assert.strictEqual(true, isInteger(stats.channels[1].maxX)); + assert.strictEqual(true, isInRange(stats.channels[1].maxX, 0, 800)); + assert.strictEqual(true, isInteger(stats.channels[1].maxY)); + assert.strictEqual(true, isInRange(stats.channels[1].maxY, 0, 533)); + + // blue channel + assert.strictEqual(51, stats.channels[2].min); + assert.strictEqual(254, stats.channels[2].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].sum, 49628525)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].squaresSum, 6450556071)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].mean, 116.38959896810506)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].stdev, 39.7669551046809)); + assert.strictEqual(true, isInteger(stats.channels[2].minX)); + assert.strictEqual(true, isInRange(stats.channels[2].minX, 0, 
800)); + assert.strictEqual(true, isInteger(stats.channels[2].minY)); + assert.strictEqual(true, isInRange(stats.channels[2].minY, 0, 533)); + assert.strictEqual(true, isInteger(stats.channels[2].maxX)); + assert.strictEqual(true, isInRange(stats.channels[2].maxX, 0, 800)); + assert.strictEqual(true, isInteger(stats.channels[2].maxY)); + assert.strictEqual(true, isInRange(stats.channels[2].maxY, 0, 533)); + + done(); + }); + }); + + it('Grayscale GIF with alpha', (_t, done) => { + sharp(fixtures.inputGifGreyPlusAlpha).stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(false, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 1)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 15.870619016486861)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(8, r); + assert.strictEqual(8, g); + assert.strictEqual(8, b); + + // gray channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(101, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 101)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 10201)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 50.5)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 71.4177848998413)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 2)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 1)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 1)); + + // alpha channel + assert.strictEqual(0, stats.channels[3].min); + assert.strictEqual(255, stats.channels[3].max); + 
assert.strictEqual(true, isInAcceptableRange(stats.channels[3].sum, 255)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].squaresSum, 65025)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].mean, 127.5)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[3].stdev, 180.31222920256963)); + assert.strictEqual(true, isInteger(stats.channels[3].minX)); + assert.strictEqual(true, isInRange(stats.channels[3].minX, 0, 2)); + assert.strictEqual(true, isInteger(stats.channels[3].minY)); + assert.strictEqual(true, isInRange(stats.channels[3].minY, 0, 1)); + assert.strictEqual(true, isInteger(stats.channels[3].maxX)); + assert.strictEqual(true, isInRange(stats.channels[3].maxX, 0, 2)); + assert.strictEqual(true, isInteger(stats.channels[3].maxY)); + assert.strictEqual(true, isInRange(stats.channels[3].maxY, 0, 1)); + + done(); + }); + }); + + it('CMYK input without profile', () => + sharp(fixtures.inputJpgWithCmykNoProfile) + .stats() + .then(stats => { + assert.strictEqual(4, stats.channels.length); + assert.strictEqual(true, stats.isOpaque); + }) + ); + + it('Dominant colour', () => + sharp(fixtures.inputJpgBooleanTest) + .stats() + .then(({ dominant }) => { + const { r, g, b } = dominant; + assert.strictEqual(r, 8); + assert.strictEqual(g, 136); + assert.strictEqual(b, 248); + }) + ); + + it('Entropy and sharpness of 1x1 input are zero', async () => { + const { entropy, sharpness } = await sharp({ + create: { + width: 1, + height: 1, + channels: 3, + background: 'red' + } + }).stats(); + assert.strictEqual(entropy, 0); + assert.strictEqual(sharpness, 0); + }); + + it('Stream in, Callback out', (_t, done) => { + const readable = fs.createReadStream(fixtures.inputJpg); + const pipeline = sharp().stats((err, stats) => { + if (err) throw err; + + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 7.332915340666659)); + assert.strictEqual(true, 
isInAcceptableRange(stats.sharpness, 0.788301114707569)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(40, r); + assert.strictEqual(40, g); + assert.strictEqual(40, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 615101275)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 83061892917)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 101.44954540768993)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 58.373870588815414)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 2725)); + + // green channel + assert.strictEqual(0, stats.channels[1].min); + assert.strictEqual(255, stats.channels[1].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].sum, 462824115)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].squaresSum, 47083677255)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].mean, 76.33425255128337)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].stdev, 44.03023262954866)); + assert.strictEqual(true, isInteger(stats.channels[1].minX)); + assert.strictEqual(true, isInRange(stats.channels[1].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].minY)); + assert.strictEqual(true, isInRange(stats.channels[1].minY, 0, 2725)); + assert.strictEqual(true, 
isInteger(stats.channels[1].maxX)); + assert.strictEqual(true, isInRange(stats.channels[1].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].maxY)); + assert.strictEqual(true, isInRange(stats.channels[1].maxY, 0, 2725)); + + // blue channel + assert.strictEqual(0, stats.channels[2].min); + assert.strictEqual(255, stats.channels[2].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].sum, 372986756)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].squaresSum, 32151543524)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].mean, 61.51724663436759)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].stdev, 38.96702865090125)); + assert.strictEqual(true, isInteger(stats.channels[2].minX)); + assert.strictEqual(true, isInRange(stats.channels[2].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].minY)); + assert.strictEqual(true, isInRange(stats.channels[2].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxX)); + assert.strictEqual(true, isInRange(stats.channels[2].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxY)); + assert.strictEqual(true, isInRange(stats.channels[2].maxY, 0, 2725)); + + done(); + }); + readable.pipe(pipeline); + }); + + it('Stream in, Promise out', () => { + const pipeline = sharp(); + + fs.createReadStream(fixtures.inputJpg).pipe(pipeline); + + return pipeline.stats().then((stats) => { + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 7.332915340666659)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 0.788301114707569)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(40, r); + assert.strictEqual(40, g); + assert.strictEqual(40, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(true, 
isInAcceptableRange(stats.channels[0].sum, 615101275)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 83061892917)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 101.44954540768993)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 58.373870588815414)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, isInRange(stats.channels[0].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 2725)); + + // green channel + assert.strictEqual(0, stats.channels[1].min); + assert.strictEqual(255, stats.channels[1].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].sum, 462824115)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].squaresSum, 47083677255)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].mean, 76.33425255128337)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].stdev, 44.03023262954866)); + assert.strictEqual(true, isInteger(stats.channels[1].minX)); + assert.strictEqual(true, isInRange(stats.channels[1].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].minY)); + assert.strictEqual(true, isInRange(stats.channels[1].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].maxX)); + assert.strictEqual(true, isInRange(stats.channels[1].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].maxY)); + assert.strictEqual(true, isInRange(stats.channels[1].maxY, 0, 2725)); + + // blue channel + assert.strictEqual(0, stats.channels[2].min); + assert.strictEqual(255, 
stats.channels[2].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].sum, 372986756)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].squaresSum, 32151543524)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].mean, 61.51724663436759)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].stdev, 38.96702865090125)); + assert.strictEqual(true, isInteger(stats.channels[2].minX)); + assert.strictEqual(true, isInRange(stats.channels[2].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].minY)); + assert.strictEqual(true, isInRange(stats.channels[2].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxX)); + assert.strictEqual(true, isInRange(stats.channels[2].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxY)); + assert.strictEqual(true, isInRange(stats.channels[2].maxY, 0, 2725)); + }).catch((err) => { + throw err; + }); + }); + + it('File in, Promise out', () => sharp(fixtures.inputJpg).stats().then((stats) => { + assert.strictEqual(true, stats.isOpaque); + assert.strictEqual(true, isInAcceptableRange(stats.entropy, 7.332915340666659)); + assert.strictEqual(true, isInAcceptableRange(stats.sharpness, 0.788301114707569)); + + const { r, g, b } = stats.dominant; + assert.strictEqual(40, r); + assert.strictEqual(40, g); + assert.strictEqual(40, b); + + // red channel + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].sum, 615101275)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].squaresSum, 83061892917)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].mean, 101.44954540768993)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[0].stdev, 58.373870588815414)); + assert.strictEqual(true, isInteger(stats.channels[0].minX)); + assert.strictEqual(true, 
isInRange(stats.channels[0].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].minY)); + assert.strictEqual(true, isInRange(stats.channels[0].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].maxX)); + assert.strictEqual(true, isInRange(stats.channels[0].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[0].maxY)); + assert.strictEqual(true, isInRange(stats.channels[0].maxY, 0, 2725)); + + // green channel + assert.strictEqual(0, stats.channels[1].min); + assert.strictEqual(255, stats.channels[1].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].sum, 462824115)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].squaresSum, 47083677255)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].mean, 76.33425255128337)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[1].stdev, 44.03023262954866)); + assert.strictEqual(true, isInteger(stats.channels[1].minX)); + assert.strictEqual(true, isInRange(stats.channels[1].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].minY)); + assert.strictEqual(true, isInRange(stats.channels[1].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].maxX)); + assert.strictEqual(true, isInRange(stats.channels[1].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[1].maxY)); + assert.strictEqual(true, isInRange(stats.channels[1].maxY, 0, 2725)); + + // blue channel + assert.strictEqual(0, stats.channels[2].min); + assert.strictEqual(255, stats.channels[2].max); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].sum, 372986756)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].squaresSum, 32151543524)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].mean, 61.51724663436759)); + assert.strictEqual(true, isInAcceptableRange(stats.channels[2].stdev, 38.96702865090125)); + assert.strictEqual(true, 
isInteger(stats.channels[2].minX)); + assert.strictEqual(true, isInRange(stats.channels[2].minX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].minY)); + assert.strictEqual(true, isInRange(stats.channels[2].minY, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxX)); + assert.strictEqual(true, isInRange(stats.channels[2].maxX, 0, 2725)); + assert.strictEqual(true, isInteger(stats.channels[2].maxY)); + assert.strictEqual(true, isInRange(stats.channels[2].maxY, 0, 2725)); + }).catch((err) => { + throw err; + })); + + it('Blurred image has lower sharpness than original', () => { + const original = sharp(fixtures.inputJpg).stats(); + const blurred = sharp(fixtures.inputJpg).blur().toBuffer().then(blur => sharp(blur).stats()); + + return Promise + .all([original, blurred]) + .then(([original, blurred]) => { + assert.strictEqual(true, isInAcceptableRange(original.sharpness, 0.789046400439488)); + assert.strictEqual(true, isInAcceptableRange(blurred.sharpness, 0.47985138441709047)); + }); + }); + + it('File input with corrupt header fails gracefully', (_t, done) => { + sharp(fixtures.inputJpgWithCorruptHeader) + .stats((err) => { + assert(err.message.includes('Input file has corrupt header')); + assert(err.stack.includes('at Sharp.stats')); + assert(err.stack.includes(__filename)); + done(); + }); + }); + + it('Stream input with corrupt header fails gracefully', (_t, done) => { + fs.createReadStream(fixtures.inputJpgWithCorruptHeader).pipe( + sharp() + .stats((err) => { + assert(err.message.includes('Input buffer has corrupt header')); + assert(err.stack.includes('at Sharp.stats')); + assert(err.stack.includes(__filename)); + done(); + }) + ); + }); + + it('File input with corrupt header fails gracefully, Promise out', () => sharp(fixtures.inputJpgWithCorruptHeader) + .stats().then(() => { + throw new Error('Corrupt Header file'); + }).catch((err) => { + assert.ok(!!err); + })); + + it('File input with corrupt header fails gracefully, 
Stream In, Promise Out', () => { + const pipeline = sharp(); + + fs.createReadStream(fixtures.inputJpgWithCorruptHeader).pipe(pipeline); + + return pipeline + .stats().then(() => { + throw new Error('Corrupt Header file'); + }).catch((err) => { + assert.ok(!!err); + }); + }); + + it('Buffer input with corrupt header fails gracefully', (_t, done) => { + sharp(fs.readFileSync(fixtures.inputJpgWithCorruptHeader)) + .stats((err) => { + assert.strictEqual(true, !!err); + done(); + }); + }); + + it('Non-existent file in, Promise out', (_t, done) => { + sharp('fail').stats().then(() => { + throw new Error('Non-existent file'); + }, (err) => { + assert.ok(!!err); + done(); + }); + }); + + it('Sequential read option is ignored', async () => { + const { isOpaque } = await sharp(fixtures.inputJpg, { sequentialRead: true }).stats(); + assert.strictEqual(isOpaque, true); + }); +}); diff --git a/test/unit/svg.js b/test/unit/svg.js new file mode 100644 index 000000000..d11b61895 --- /dev/null +++ b/test/unit/svg.js @@ -0,0 +1,198 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const fs = require('node:fs'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('SVG input', () => { + it('Convert SVG to PNG at default 72DPI', (_t, done) => { + sharp(fixtures.inputSvg) + .resize(1024) + .extract({ left: 290, top: 760, width: 40, height: 40 }) + .toFormat('png') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(40, info.width); + assert.strictEqual(40, info.height); + fixtures.assertSimilar(fixtures.expected('svg72.png'), data, (err) => { + if (err) throw err; + sharp(data).metadata((err, info) => { + if (err) throw err; + assert.strictEqual(72, info.density); + done(); + }); + }); + }); + }); + + it('Convert SVG to PNG at 1200DPI', (_t, done) => { + sharp(fixtures.inputSvg, { density: 1200 }) + .resize(1024) + .extract({ left: 290, top: 760, width: 40, height: 40 }) + .toFormat('png') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(40, info.width); + assert.strictEqual(40, info.height); + fixtures.assertSimilar(fixtures.expected('svg1200.png'), data, (err) => { + if (err) throw err; + sharp(data).metadata((err, info) => { + if (err) throw err; + assert.strictEqual(1200, info.density); + done(); + }); + }); + }); + }); + + it('Convert SVG to PNG at DPI larger than 2400', (_t, done) => { + const size = 1024; + sharp(fixtures.inputSvgSmallViewBox).metadata((err, metadata) => { + if (err) throw err; + const density = (size / Math.max(metadata.width, metadata.height)) * metadata.density; + sharp(fixtures.inputSvgSmallViewBox, { density }) + .resize(size) + .toFormat('png') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(size, info.width); + assert.strictEqual(size, 
info.height); + fixtures.assertSimilar(fixtures.expected('circle.png'), data, (err) => { + if (err) throw err; + sharp(data).metadata((err, info) => { + if (err) throw err; + assert.strictEqual(9216, info.density); + done(); + }); + }); + }); + }); + }); + + it('Convert SVG to PNG utilizing scale-on-load', (_t, done) => { + const size = 1024; + sharp(fixtures.inputSvgSmallViewBox) + .resize(size) + .toFormat('png') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(size, info.width); + assert.strictEqual(size, info.height); + fixtures.assertSimilar(fixtures.expected('circle.png'), data, (err) => { + if (err) throw err; + sharp(data).metadata((err, info) => { + if (err) throw err; + assert.strictEqual(72, info.density); + done(); + }); + }); + }); + }); + + it('Convert SVG to PNG at 14.4DPI', (_t, done) => { + sharp(fixtures.inputSvg, { density: 14.4 }) + .toFormat('png') + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(20, info.width); + assert.strictEqual(20, info.height); + fixtures.assertSimilar(fixtures.expected('svg14.4.png'), data, (err) => { + if (err) throw err; + done(); + }); + }); + }); + + it('Convert SVG with embedded images to PNG, respecting dimensions, autoconvert to PNG', (_t, done) => { + sharp(fixtures.inputSvgWithEmbeddedImages) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('png', info.format); + assert.strictEqual(480, info.width); + assert.strictEqual(360, info.height); + assert.strictEqual(4, info.channels); + fixtures.assertSimilar(fixtures.expected('svg-embedded.png'), data, done); + }); + }); + + it('Converts SVG with truncated embedded PNG', async () => { + const truncatedPng = fs.readFileSync(fixtures.inputPngTruncated).toString('base64'); + const svg = `<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="294" height="240"> + <image width="294" height="240" xlink:href="data:image/png;base64,${truncatedPng}" /> + </svg>`; + + const { info } = await sharp(Buffer.from(svg)).toBuffer({ resolveWithObject: true }); + 
assert.strictEqual(info.format, 'png'); + assert.strictEqual(info.width, 294); + assert.strictEqual(info.height, 240); + assert.strictEqual(info.channels, 4); + }); + + it('Can apply custom CSS', async () => { + const svg = `<svg xmlns="http://www.w3.org/2000/svg" width="11" height="11"> + <circle cx="5" cy="5" r="5"/> + </svg>`; + const stylesheet = 'circle { fill: red }'; + + const [r, g, b, a] = await sharp(Buffer.from(svg), { svg: { stylesheet } }) + .extract({ left: 5, top: 5, width: 1, height: 1 }) + .raw() + .toBuffer(); + + assert.deepStrictEqual([r, g, b, a], [255, 0, 0, 255]); + }); + + it('Invalid stylesheet input option throws', () => + assert.throws( + () => sharp({ svg: { stylesheet: 123 } }), + /Expected string for svg\.stylesheet but received 123 of type number/ + ) + ); + + it('Valid highBitdepth input option does not throw', () => + assert.doesNotThrow( + () => sharp({ svg: { highBitdepth: true } }) + ) + ); + + it('Invalid highBitdepth input option throws', () => + assert.throws( + () => sharp({ svg: { highBitdepth: 123 } }), + /Expected boolean for svg\.highBitdepth but received 123 of type number/ + ) + ); + + it('Fails to render SVG larger than 32767x32767', () => + assert.rejects( + () => sharp(Buffer.from('<svg xmlns="http://www.w3.org/2000/svg" width="32768" height="32768"/>')).toBuffer(), + /Input SVG image exceeds 32767x32767 pixel limit/ + ) + ); + + it('Fails to render scaled SVG larger than 32767x32767', () => + assert.rejects( + () => sharp(Buffer.from('<svg xmlns="http://www.w3.org/2000/svg" width="32767" height="32767"/>')).resize(32768).toBuffer(), + /Input SVG image will exceed 32767x32767 pixel limit when scaled/ + ) + ); + + it('Detects SVG passed as a string', () => + assert.rejects( + () => sharp('<svg viewBox="0 0 1 1"></svg>').toBuffer(), + /Input file is missing, did you mean/ + ) + ); +}); diff --git a/test/unit/text.js b/test/unit/text.js new file mode 100644 index 000000000..029b45f1f --- /dev/null +++ b/test/unit/text.js @@ -0,0 +1,333 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); +const { inRange } = require('../../lib/is'); + +describe('Text to image', () => { + it('text with default values', async (t) => { + const output = fixtures.path('output.text-default.png'); + const text = sharp({ + text: { + text: 'Hello, world !' + } + }); + if (!sharp.versions.pango) { + return t.skip(); + } + const info = await text.png().toFile(output); + assert.strictEqual('png', info.format); + assert.strictEqual(3, info.channels); + assert.strictEqual(false, info.premultiplied); + assert.ok(info.width > 10); + assert.ok(info.height > 8); + const metadata = await sharp(output).metadata(); + assert.strictEqual('uchar', metadata.depth); + assert.strictEqual('srgb', metadata.space); + assert.strictEqual(72, metadata.density); + const stats = await sharp(output).stats(); + assert.strictEqual(0, stats.channels[0].min); + assert.strictEqual(255, stats.channels[0].max); + assert.strictEqual(0, stats.channels[1].min); + assert.strictEqual(255, stats.channels[1].max); + assert.strictEqual(0, stats.channels[2].min); + assert.strictEqual(255, stats.channels[2].max); + assert.ok(info.textAutofitDpi > 0); + }); + + it('text with width and height', async (t) => { + const output = fixtures.path('output.text-width-height.png'); + const text = sharp({ + text: { + text: 'Hello, world!', + width: 500, + height: 400 + } + }); + if (!sharp.versions.pango) { + return t.skip(); + } + const info = await text.toFile(output); + assert.strictEqual('png', info.format); + assert.strictEqual(3, info.channels); + assert.ok(inRange(info.width, 400, 600), `Actual width ${info.width}`); + assert.ok(inRange(info.height, 290, 500), `Actual height ${info.height}`); + assert.ok(inRange(info.textAutofitDpi, 900, 1300), `Actual textAutofitDpi ${info.textAutofitDpi}`); + }); + + it('text with 
dpi', async (t) => { + const output = fixtures.path('output.text-dpi.png'); + const dpi = 300; + const text = sharp({ + text: { + text: 'Hello, world!', + dpi + } + }); + if (!sharp.versions.pango) { + return t.skip(); + } + const info = await text.toFile(output); + assert.strictEqual('png', info.format); + const metadata = await sharp(output).metadata(); + assert.strictEqual(dpi, metadata.density); + }); + + it('text with color and pango markup', async (t) => { + const output = fixtures.path('output.text-color-pango.png'); + const dpi = 300; + const text = sharp({ + text: { + text: '<span foreground="red">red</span><span foreground="blue">blue</span>', + rgba: true, + dpi + } + }); + if (!sharp.versions.pango) { + return t.skip(); + } + const info = await text.toFile(output); + assert.strictEqual('png', info.format); + assert.strictEqual(4, info.channels); + const metadata = await sharp(output).metadata(); + assert.strictEqual(dpi, metadata.density); + assert.strictEqual('uchar', metadata.depth); + assert.strictEqual(true, metadata.hasAlpha); + }); + + it('text with font', async (t) => { + const output = fixtures.path('output.text-with-font.png'); + const text = sharp({ + text: { + text: 'Hello, world!', + font: 'sans 100' + } + }); + if (!sharp.versions.pango) { + return t.skip(); + } + const info = await text.toFile(output); + assert.strictEqual('png', info.format); + assert.strictEqual(3, info.channels); + assert.ok(info.width > 30); + assert.ok(info.height > 10); + }); + + it('text with justify and composite', async (t) => { + const output = fixtures.path('output.text-composite.png'); + const width = 500; + const dpi = 300; + const text = sharp(fixtures.inputJpg) + .resize(width) + .composite([{ + input: { + text: { + text: 'Watermark is cool', + width: 300, + height: 300, + justify: true, + align: 'right', + spacing: 50, + rgba: true + } + }, + gravity: 'northeast' + }, { + input: { + text: { + text: 'cool', + font: 'sans 30', + dpi, + rgba: true + } + }, + left: 30, + top: 250 + }]); + if (!sharp.versions.pango) { + 
return t.skip(); + } + const info = await text.toFile(output); + assert.strictEqual('png', info.format); + assert.strictEqual(4, info.channels); + assert.strictEqual(width, info.width); + assert.strictEqual(true, info.premultiplied); + const metadata = await sharp(output).metadata(); + assert.strictEqual('srgb', metadata.space); + assert.strictEqual('uchar', metadata.depth); + assert.strictEqual(true, metadata.hasAlpha); + }); + + it('bad text input', () => { + assert.throws(() => { + sharp({ + text: { + } + }); + }); + }); + + it('fontfile input', () => { + assert.doesNotThrow(() => { + sharp({ + text: { + text: 'text', + fontfile: 'UnknownFont.ttf' + } + }); + }); + }); + + it('bad font input', () => { + assert.throws(() => { + sharp({ + text: { + text: 'text', + font: 12 + } + }); + }); + }); + + it('bad fontfile input', () => { + assert.throws(() => { + sharp({ + text: { + text: 'text', + fontfile: true + } + }); + }); + }); + + it('invalid width', () => { + assert.throws( + () => sharp({ text: { text: 'text', width: 'bad' } }), + /Expected positive integer for text\.width but received bad of type string/ + ); + assert.throws( + () => sharp({ text: { text: 'text', width: 0.1 } }), + /Expected positive integer for text\.width but received 0.1 of type number/ + ); + assert.throws( + () => sharp({ text: { text: 'text', width: -1 } }), + /Expected positive integer for text\.width but received -1 of type number/ + ); + }); + + it('invalid height', () => { + assert.throws( + () => sharp({ text: { text: 'text', height: 'bad' } }), + /Expected positive integer for text\.height but received bad of type string/ + ); + assert.throws( + () => sharp({ text: { text: 'text', height: 0.1 } }), + /Expected positive integer for text\.height but received 0.1 of type number/ + ); + assert.throws( + () => sharp({ text: { text: 'text', height: -1 } }), + /Expected positive integer for text\.height but received -1 of type number/ + ); + }); + + it('bad align input', () => { + 
assert.throws(() => { + sharp({ + text: { + text: 'text', + align: 'unknown' + } + }); + }); + }); + + it('bad justify input', () => { + assert.throws(() => { + sharp({ + text: { + text: 'text', + justify: 'unknown' + } + }); + }); + }); + + it('invalid dpi', () => { + assert.throws( + () => sharp({ text: { text: 'text', dpi: 'bad' } }), + /Expected integer between 1 and 1000000 for text\.dpi but received bad of type string/ + ); + assert.throws( + () => sharp({ text: { text: 'text', dpi: 0.1 } }), + /Expected integer between 1 and 1000000 for text\.dpi but received 0.1 of type number/ + ); + assert.throws( + () => sharp({ text: { text: 'text', dpi: -1 } }), + /Expected integer between 1 and 1000000 for text\.dpi but received -1 of type number/ + ); + }); + + it('bad rgba input', () => { + assert.throws(() => { + sharp({ + text: { + text: 'text', + rgba: -10 + } + }); + }); + }); + + it('invalid spacing', () => { + assert.throws( + () => sharp({ text: { text: 'text', spacing: 'bad' } }), + /Expected integer between -1000000 and 1000000 for text\.spacing but received bad of type string/ + ); + assert.throws( + () => sharp({ text: { text: 'text', spacing: 0.1 } }), + /Expected integer between -1000000 and 1000000 for text\.spacing but received 0.1 of type number/ + ); + assert.throws( + () => sharp({ text: { text: 'text', spacing: -1000001 } }), + /Expected integer between -1000000 and 1000000 for text\.spacing but received -1000001 of type number/ + ); + }); + + it('only height or dpi not both', () => { + assert.throws(() => { + sharp({ + text: { + text: 'text', + height: 400, + dpi: 100 + } + }); + }); + }); + + it('valid wrap does not throw', () => { + assert.doesNotThrow(() => sharp({ text: { text: 'text', wrap: 'none' } })); + assert.doesNotThrow(() => sharp({ text: { text: 'text', wrap: 'word-char' } })); + }); + + it('invalid wrap throws', () => { + assert.throws( + () => sharp({ text: { text: 'text', wrap: 1 } }), + /Expected one of: word, char, word-char, none for 
text\.wrap but received 1 of type number/ + ); + assert.throws( + () => sharp({ text: { text: 'text', wrap: false } }), + /Expected one of: word, char, word-char, none for text\.wrap but received false of type boolean/ + ); + assert.throws( + () => sharp({ text: { text: 'text', wrap: 'invalid' } }), + /Expected one of: word, char, word-char, none for text\.wrap but received invalid of type string/ + ); + }); +}); diff --git a/test/unit/threshold.js b/test/unit/threshold.js index 5dad411ae..1ffb0f9cc 100644 --- a/test/unit/threshold.js +++ b/test/unit/threshold.js @@ -1,16 +1,20 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); const fixtures = require('../fixtures'); -describe('Threshold', function () { - it('threshold 1 jpeg', function (done) { +describe('Threshold', () => { + it('threshold 1 jpeg', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .threshold(1) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -19,11 +23,11 @@ describe('Threshold', function () { }); }); - it('threshold 40 jpeg', function (done) { + it('threshold 40 jpeg', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .threshold(40) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -32,11 +36,11 @@ describe('Threshold', function () { }); }); - it('threshold 128', function (done) { + it('threshold 128', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .threshold(128) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); 
assert.strictEqual(320, info.width); @@ -45,11 +49,11 @@ describe('Threshold', function () { }); }); - it('threshold true (=128)', function (done) { + it('threshold true (=128)', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .threshold(true) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -58,20 +62,20 @@ describe('Threshold', function () { }); }); - it('threshold false (=0)', function (done) { + it('threshold false (=0)', (_t, done) => { sharp(fixtures.inputJpg) .threshold(false) - .toBuffer(function (err, data, info) { + .toBuffer((err, data) => { if (err) throw err; fixtures.assertSimilar(fixtures.inputJpg, data, done); }); }); - it('threshold grayscale: true (=128)', function (done) { + it('threshold grayscale: true (=128)', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .threshold(128, { grayscale: true }) - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -80,11 +84,11 @@ describe('Threshold', function () { }); }); - it('threshold default jpeg', function (done) { + it('threshold default jpeg', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) .threshold() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -93,11 +97,11 @@ describe('Threshold', function () { }); }); - it('threshold default png transparency', function (done) { + it('threshold default png transparency', (_t, done) => { sharp(fixtures.inputPngWithTransparency) .resize(320, 240) .threshold() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); @@ -106,11 +110,11 @@ 
describe('Threshold', function () { }); }); - it('threshold default png alpha', function (done) { + it('threshold default png alpha', (_t, done) => { sharp(fixtures.inputPngWithGreyAlpha) .resize(320, 240) .threshold() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('png', info.format); assert.strictEqual(320, info.width); @@ -119,21 +123,21 @@ describe('Threshold', function () { }); }); - it('threshold default webp transparency', function (done) { + it('threshold default webp transparency', (_t, done) => { sharp(fixtures.inputWebPWithTransparency) .threshold() - .toBuffer(function (err, data, info) { + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('webp', info.format); fixtures.assertSimilar(fixtures.expected('threshold-128-transparency.webp'), data, done); }); }); - it('color threshold', function (done) { + it('color threshold', (_t, done) => { sharp(fixtures.inputJpg) .resize(320, 240) - .threshold(128, {'grayscale': false}) - .toBuffer(function (err, data, info) { + .threshold(128, { grayscale: false }) + .toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual('jpeg', info.format); assert.strictEqual(320, info.width); @@ -142,14 +146,14 @@ describe('Threshold', function () { }); }); - it('invalid threshold -1', function () { - assert.throws(function () { + it('invalid threshold -1', () => { + assert.throws(() => { sharp().threshold(-1); }); }); - it('invalid threshold 256', function () { - assert.throws(function () { + it('invalid threshold 256', () => { + assert.throws(() => { sharp().threshold(256); }); }); diff --git a/test/unit/tiff.js b/test/unit/tiff.js new file mode 100644 index 000000000..be5e5535f --- /dev/null +++ b/test/unit/tiff.js @@ -0,0 +1,553 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const fs = require('node:fs'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +const outputTiff = fixtures.path('output.tiff'); + +describe('TIFF', () => { + it('Load TIFF from Buffer', (_t, done) => { + const inputTiffBuffer = fs.readFileSync(fixtures.inputTiff); + sharp(inputTiffBuffer) + .resize(320, 240) + .jpeg() + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual(data.length, info.size); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + done(); + }); + }); + + it('Load multi-page TIFF from file', (_t, done) => { + sharp(fixtures.inputTiffMultipage) // defaults to page 0 + .jpeg() + .toBuffer((err, defaultData, defaultInfo) => { + if (err) throw err; + assert.strictEqual(true, defaultData.length > 0); + assert.strictEqual(defaultData.length, defaultInfo.size); + assert.strictEqual('jpeg', defaultInfo.format); + + sharp(fixtures.inputTiffMultipage, { page: 1 }) // 50%-scale copy of page 0 + .jpeg() + .toBuffer((err, scaledData, scaledInfo) => { + if (err) throw err; + assert.strictEqual(true, scaledData.length > 0); + assert.strictEqual(scaledData.length, scaledInfo.size); + assert.strictEqual('jpeg', scaledInfo.format); + assert.strictEqual(defaultInfo.width, scaledInfo.width * 2); + assert.strictEqual(defaultInfo.height, scaledInfo.height * 2); + done(); + }); + }); + }); + + it('Load multi-page TIFF from Buffer', (_t, done) => { + const inputTiffBuffer = fs.readFileSync(fixtures.inputTiffMultipage); + sharp(inputTiffBuffer) // defaults to page 0 + .jpeg() + .toBuffer((err, defaultData, defaultInfo) => { + if (err) throw err; + assert.strictEqual(true, defaultData.length > 0); + assert.strictEqual(defaultData.length, defaultInfo.size); + 
assert.strictEqual('jpeg', defaultInfo.format); + + sharp(inputTiffBuffer, { page: 1 }) // 50%-scale copy of page 0 + .jpeg() + .toBuffer((err, scaledData, scaledInfo) => { + if (err) throw err; + assert.strictEqual(true, scaledData.length > 0); + assert.strictEqual(scaledData.length, scaledInfo.size); + assert.strictEqual('jpeg', scaledInfo.format); + assert.strictEqual(defaultInfo.width, scaledInfo.width * 2); + assert.strictEqual(defaultInfo.height, scaledInfo.height * 2); + done(); + }); + }); + }); + + it('Save TIFF to Buffer', (_t, done) => { + sharp(fixtures.inputTiff) + .resize(320, 240) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual(true, data.length > 0); + assert.strictEqual(data.length, info.size); + assert.strictEqual('tiff', info.format); + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + done(); + }); + }); + + it('Increasing TIFF quality increases file size', () => + sharp(fixtures.inputJpgWithLandscapeExif1) + .resize(320, 240) + .tiff({ quality: 40 }) + .toBuffer() + .then(tiff40 => sharp(fixtures.inputJpgWithLandscapeExif1) + .resize(320, 240) + .tiff({ quality: 90 }) + .toBuffer() + .then(tiff90 => + assert.strictEqual(true, tiff40.length < tiff90.length) + ) + ) + ); + + it('Invalid TIFF quality throws error', () => { + assert.throws(() => { + sharp().tiff({ quality: 101 }); + }); + }); + + it('Missing TIFF quality does not throw error', () => { + assert.doesNotThrow(() => { + sharp().tiff(); + }); + }); + + it('Not squashing TIFF to a bit depth of 1 should not change the file size', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiff8BitDepth).size; + sharp(fixtures.inputTiff8BitDepth) + .toColourspace('b-w') // can only squash 1 band uchar images + .tiff({ + bitdepth: 8, + compression: 'none', + predictor: 'none' + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert.strictEqual(startSize, info.size); + 
fs.rm(outputTiff, done); + }); + }); + + it('Squashing TIFF to a bit depth of 1 should significantly reduce file size', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiff8BitDepth).size; + sharp(fixtures.inputTiff8BitDepth) + .toColourspace('b-w') // can only squash 1 band uchar images + .tiff({ + bitdepth: 1, + compression: 'none', + predictor: 'none' + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert(info.size < (startSize / 2)); + fs.rm(outputTiff, done); + }); + }); + + it('Invalid TIFF bitdepth value throws error', () => { + assert.throws(() => { + sharp().tiff({ bitdepth: 3 }); + }, /Error: Expected 1, 2, 4 or 8 for bitdepth but received 3 of type number/); + }); + + it('TIFF setting xres and yres on file', () => + sharp(fixtures.inputTiff) + .resize(8, 8) + .tiff({ + xres: 1000, + yres: 1000 + }) + .toFile(outputTiff) + .then(() => sharp(outputTiff) + .metadata() + .then(({ density }) => { + assert.strictEqual(25400, density); + return fs.promises.rm(outputTiff); + }) + ) + ); + + it('TIFF setting xres and yres on buffer', () => + sharp(fixtures.inputTiff) + .resize(8, 8) + .tiff({ + xres: 1000, + yres: 1000 + }) + .toBuffer() + .then(data => sharp(data) + .metadata() + .then(({ density }) => { + assert.strictEqual(25400, density); + }) + ) + ); + + it('TIFF imputes xres and yres from withMetadataDensity if not explicitly provided', async () => { + const data = await sharp(fixtures.inputTiff) + .resize(8, 8) + .tiff() + .withMetadata({ density: 600 }) + .toBuffer(); + const { density } = await sharp(data).metadata(); + assert.strictEqual(600, density); + }); + + it('TIFF uses xres and yres over withMetadataDensity if explicitly provided', async () => { + const data = await sharp(fixtures.inputTiff) + .resize(8, 8) + .tiff({ xres: 1000, yres: 1000 }) + .withMetadata({ density: 600 }) + .toBuffer(); + const { density } = await sharp(data).metadata(); + assert.strictEqual(25400, 
density); + }); + + it('TIFF invalid xres value should throw an error', () => { + assert.throws(() => { + sharp().tiff({ xres: '1000.0' }); + }); + }); + + it('TIFF invalid yres value should throw an error', () => { + assert.throws(() => { + sharp().tiff({ yres: '1000.0' }); + }); + }); + + it('TIFF lzw compression with horizontal predictor shrinks test file', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; + sharp(fixtures.inputTiffUncompressed) + .tiff({ + compression: 'lzw', + predictor: 'horizontal' + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert.strictEqual(3, info.channels); + assert(info.size < startSize); + fs.rm(outputTiff, done); + }); + }); + + it('TIFF LZW RGBA toFile', () => + sharp({ + create: { + width: 1, + height: 1, + channels: 4, + background: 'red' + } + }) + .tiff({ + compression: 'lzw' + }) + .toFile(outputTiff) + .then(info => { + assert.strictEqual(4, info.channels); + }) + ); + + it('TIFF LZW RGBA toBuffer', () => + sharp({ + create: { + width: 1, + height: 1, + channels: 4, + background: 'red' + } + }) + .tiff({ + compression: 'lzw' + }) + .toBuffer({ resolveWithObject: true }) + .then(({ info }) => { + assert.strictEqual(4, info.channels); + }) + ); + + it('TIFF ccittfax4 compression shrinks b-w test file', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiff).size; + sharp(fixtures.inputTiff) + .toColourspace('b-w') + .tiff({ + bitdepth: 1, + compression: 'ccittfax4' + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert(info.size < startSize); + fs.rm(outputTiff, done); + }); + }); + + it('TIFF resolutionUnit of inch (default)', async () => { + const data = await sharp({ create: { width: 8, height: 8, channels: 3, background: 'red' } }) + .tiff() + .toBuffer(); + const { resolutionUnit } = await sharp(data).metadata(); + assert.strictEqual(resolutionUnit, 
'inch'); + }); + + it('TIFF resolutionUnit of inch', async () => { + const data = await sharp({ create: { width: 8, height: 8, channels: 3, background: 'red' } }) + .tiff({ resolutionUnit: 'inch' }) + .toBuffer(); + const { resolutionUnit } = await sharp(data).metadata(); + assert.strictEqual(resolutionUnit, 'inch'); + }); + + it('TIFF resolutionUnit of cm', async () => { + const data = await sharp({ create: { width: 8, height: 8, channels: 3, background: 'red' } }) + .tiff({ resolutionUnit: 'cm' }) + .toBuffer(); + const { resolutionUnit } = await sharp(data).metadata(); + assert.strictEqual(resolutionUnit, 'cm'); + }); + + it('TIFF deflate compression with horizontal predictor shrinks test file', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; + sharp(fixtures.inputTiffUncompressed) + .tiff({ + compression: 'deflate', + predictor: 'horizontal' + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert(info.size < startSize); + fs.rm(outputTiff, done); + }); + }); + + it('TIFF deflate compression with float predictor shrinks test file', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; + sharp(fixtures.inputTiffUncompressed) + .tiff({ + compression: 'deflate', + predictor: 'float' + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert(startSize > info.size); + fs.rm(outputTiff, done); + }); + }); + + it('TIFF deflate compression without predictor shrinks test file', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; + sharp(fixtures.inputTiffUncompressed) + .tiff({ + compression: 'deflate', + predictor: 'none' + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert(info.size < startSize); + fs.rm(outputTiff, done); + }); + }); + + it('TIFF jpeg compression shrinks test file', 
(_t, done) => {
+    const startSize = fs.statSync(fixtures.inputTiffUncompressed).size;
+    sharp(fixtures.inputTiffUncompressed)
+      .tiff({
+        compression: 'jpeg'
+      })
+      .toFile(outputTiff, (err, info) => {
+        if (err) throw err;
+        assert.strictEqual('tiff', info.format);
+        assert(info.size < startSize);
+        fs.rm(outputTiff, done);
+      });
+  });
+
+  it('TIFF none compression does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().tiff({ compression: 'none' });
+    });
+  });
+
+  it('TIFF lzw compression does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().tiff({ compression: 'lzw' });
+    });
+  });
+
+  it('TIFF deflate compression does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().tiff({ compression: 'deflate' });
+    });
+  });
+
+  it('TIFF non-string compression option throws', () => {
+    assert.throws(() => {
+      sharp().tiff({ compression: 0 });
+    });
+  });
+
+  it('TIFF unknown compression option throws', () => {
+    assert.throws(() => {
+      sharp().tiff({ compression: 'a' });
+    });
+  });
+
+  it('TIFF bigtiff true value does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().tiff({ bigtiff: true });
+    });
+  });
+
+  it('Invalid TIFF bigtiff value throws error', () => {
+    assert.throws(() => {
+      sharp().tiff({ bigtiff: 'true' });
+    });
+  });
+
+  it('TIFF invalid predictor option throws', () => {
+    assert.throws(() => {
+      sharp().tiff({ predictor: 'a' });
+    });
+  });
+
+  it('TIFF invalid resolutionUnit option throws', () => {
+    assert.throws(() => {
+      sharp().tiff({ resolutionUnit: 'none' });
+    });
+  });
+
+  it('TIFF horizontal predictor does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().tiff({ predictor: 'horizontal' });
+    });
+  });
+
+  it('TIFF float predictor does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().tiff({ predictor: 'float' });
+    });
+  });
+
+  it('TIFF none predictor does not throw error', () => {
+    assert.doesNotThrow(() => {
+      sharp().tiff({ predictor: 'none' });
+
}); + }); + + it('TIFF tiled pyramid image without compression enlarges test file', (_t, done) => { + const startSize = fs.statSync(fixtures.inputTiffUncompressed).size; + sharp(fixtures.inputTiffUncompressed) + .tiff({ + compression: 'none', + pyramid: true, + tile: true, + tileHeight: 256, + tileWidth: 256 + }) + .toFile(outputTiff, (err, info) => { + if (err) throw err; + assert.strictEqual('tiff', info.format); + assert(info.size > startSize); + fs.rm(outputTiff, done); + }); + }); + + it('TIFF pyramid true value does not throw error', () => { + assert.doesNotThrow(() => { + sharp().tiff({ pyramid: true }); + }); + }); + + it('Invalid TIFF pyramid value throws error', () => { + assert.throws(() => { + sharp().tiff({ pyramid: 'true' }); + }); + }); + + it('TIFF miniswhite true value does not throw error', () => { + assert.doesNotThrow(() => { + sharp().tiff({ miniswhite: true }); + }); + }); + + it('Invalid TIFF miniswhite value throws error', () => { + assert.throws(() => { + sharp().tiff({ miniswhite: 'true' }); + }); + }); + + it('Invalid TIFF tile value throws error', () => { + assert.throws(() => { + sharp().tiff({ tile: 'true' }); + }); + }); + + it('TIFF tile true value does not throw error', () => { + assert.doesNotThrow(() => { + sharp().tiff({ tile: true }); + }); + }); + + it('Valid TIFF tileHeight value does not throw error', () => { + assert.doesNotThrow(() => { + sharp().tiff({ tileHeight: 512 }); + }); + }); + + it('Valid TIFF tileWidth value does not throw error', () => { + assert.doesNotThrow(() => { + sharp().tiff({ tileWidth: 512 }); + }); + }); + + it('Invalid TIFF tileHeight value throws error', () => { + assert.throws(() => { + sharp().tiff({ tileHeight: '256' }); + }); + }); + + it('Invalid TIFF tileWidth value throws error', () => { + assert.throws(() => { + sharp().tiff({ tileWidth: '256' }); + }); + }); + + it('Invalid TIFF tileHeight value throws error', () => { + assert.throws(() => { + sharp().tiff({ tileHeight: 0 }); + }); + }); + + 
it('Invalid TIFF tileWidth value throws error', () => { + assert.throws(() => { + sharp().tiff({ tileWidth: 0 }); + }); + }); + + it('TIFF file input with invalid page fails gracefully', (_t, done) => { + sharp(fixtures.inputTiffMultipage, { page: 2 }) + .toBuffer((err) => { + assert.strictEqual(true, !!err); + done(); + }); + }); + + it('TIFF buffer input with invalid page fails gracefully', (_t, done) => { + sharp(fs.readFileSync(fixtures.inputTiffMultipage), { page: 2 }) + .toBuffer((err) => { + assert.strictEqual(true, !!err); + done(); + }); + }); +}); diff --git a/test/unit/tile.js b/test/unit/tile.js index 7af2736e8..8d477ebfa 100644 --- a/test/unit/tile.js +++ b/test/unit/tile.js @@ -1,38 +1,39 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ -const fs = require('fs'); -const path = require('path'); -const assert = require('assert'); +const fs = require('node:fs'); +const path = require('node:path'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); -const eachLimit = require('async/eachLimit'); -const rimraf = require('rimraf'); -const unzip = require('unzip'); +const extractZip = require('extract-zip'); const sharp = require('../../'); const fixtures = require('../fixtures'); // Verifies all tiles in a given dz output directory are <= size -const assertDeepZoomTiles = function (directory, expectedSize, expectedLevels, done) { +const assertDeepZoomTiles = (directory, expectedSize, expectedLevels, done) => { // Get levels - const levels = fs.readdirSync(directory); + const dirents = fs.readdirSync(directory, { withFileTypes: true }); + const levels = dirents.filter(dirent => dirent.isDirectory()).map(dirent => dirent.name); assert.strictEqual(expectedLevels, levels.length); // Get tiles const tiles = []; - levels.forEach(function (level) { + levels.forEach((level) => { // Verify level directory name assert.strictEqual(true, /^[0-9]+$/.test(level)); - 
fs.readdirSync(path.join(directory, level)).forEach(function (tile) { + fs.readdirSync(path.join(directory, level)).forEach((tile) => { // Verify tile file name assert.strictEqual(true, /^[0-9]+_[0-9]+\.jpeg$/.test(tile)); tiles.push(path.join(directory, level, tile)); }); }); // Verify each tile is <= expectedSize - eachLimit(tiles, 8, function (tile, done) { - sharp(tile).metadata(function (err, metadata) { - if (err) { - done(err); - } else { + Promise.all(tiles.map((tile) => sharp(tile) + .metadata() + .then((metadata) => { assert.strictEqual('jpeg', metadata.format); assert.strictEqual('srgb', metadata.space); assert.strictEqual(3, metadata.channels); @@ -40,368 +41,933 @@ const assertDeepZoomTiles = function (directory, expectedSize, expectedLevels, d assert.strictEqual(false, metadata.hasAlpha); assert.strictEqual(true, metadata.width <= expectedSize); assert.strictEqual(true, metadata.height <= expectedSize); + }))) + .then(() => done()) + .catch(done); +}; + +const assertZoomifyTiles = (directory, expectedLevels, done) => { + fs.stat(path.join(directory, 'ImageProperties.xml'), (err, stat) => { + if (err) throw err; + assert.ok(stat.isFile()); + assert.ok(stat.size > 0); + + let maxTileLevel = -1; + fs.readdirSync(path.join(directory, 'TileGroup0')).forEach((tile) => { + // Verify tile file name + assert.ok(/^[0-9]+-[0-9]+-[0-9]+\.jpg$/.test(tile)); + const level = Number(tile.split('-')[0]); + maxTileLevel = Math.max(maxTileLevel, level); + }); + + assert.strictEqual(maxTileLevel + 1, expectedLevels); // add one to account for zero level tile + + done(); + }); +}; + +const assertGoogleTiles = (directory, expectedLevels, done) => { + // Get levels + const dirents = fs.readdirSync(directory, { withFileTypes: true }); + const levels = dirents.filter(dirent => dirent.isDirectory()).map(dirent => dirent.name); + assert.strictEqual(expectedLevels, levels.length); + + fs.stat(path.join(directory, 'blank.png'), (err, stat) => { + if (err) throw err; + 
assert.ok(stat.isFile()); + assert.ok(stat.size > 0); + + // Basic check to confirm lowest and highest level tiles exist + fs.stat(path.join(directory, '0', '0', '0.jpg'), (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.isFile()); + assert.strictEqual(true, stat.size > 0); + + fs.stat(path.join(directory, (expectedLevels - 1).toString(), '0', '0.jpg'), (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.isFile()); + assert.strictEqual(true, stat.size > 0); done(); - } + }); }); - }, done); + }); +}; + +// Verifies tiles at specified level in a given output directory are > size+overlap +const assertTileOverlap = (directory, tileSize, done) => { + // Get sorted levels + const dirents = fs.readdirSync(directory, { withFileTypes: true }); + const levels = dirents.filter(dirent => dirent.isDirectory()).map(dirent => dirent.name).sort((a, b) => a - b); + // Select the highest tile level + const highestLevel = levels[levels.length - 1]; + // Get sorted tiles from greatest level + const tiles = fs.readdirSync(path.join(directory, highestLevel)).sort(); + // Select a tile from the approximate center of the image + const squareTile = path.join(directory, highestLevel, tiles[Math.floor(tiles.length / 2)]); + + sharp(squareTile).metadata((err, metadata) => { + if (err) { + throw err; + } else { + // Tile with an overlap should be larger than original size + assert.strictEqual(true, metadata.width > tileSize); + assert.strictEqual(true, metadata.height > tileSize); + done(); + } + }); }; -describe('Tile', function () { - it('Valid size values pass', function () { - [1, 8192].forEach(function (size) { - assert.doesNotThrow(function () { +describe('Tile', () => { + it('Valid size values pass', () => { + [1, 8192].forEach((size) => { + assert.doesNotThrow(() => { sharp().tile({ - size: size + size }); }); }); }); - it('Invalid size values fail', function () { - ['zoinks', 1.1, -1, 0, 8193].forEach(function (size) { - assert.throws(function 
() { + it('Invalid size values fail', () => { + ['zoinks', 1.1, -1, 0, 8193].forEach((size) => { + assert.throws(() => { sharp().tile({ - size: size + size }); }); }); }); - it('Valid overlap values pass', function () { - [0, 8192].forEach(function (overlap) { - assert.doesNotThrow(function () { + it('Valid overlap values pass', () => { + [0, 8192].forEach((overlap) => { + assert.doesNotThrow(() => { sharp().tile({ size: 8192, - overlap: overlap + overlap }); }); }); }); - it('Invalid overlap values fail', function () { - ['zoinks', 1.1, -1, 8193].forEach(function (overlap) { - assert.throws(function () { + it('Invalid overlap values fail', () => { + ['zoinks', 1.1, -1, 8193].forEach((overlap) => { + assert.throws(() => { sharp().tile({ - overlap: overlap + overlap }); }); }); }); - it('Valid container values pass', function () { - ['fs', 'zip'].forEach(function (container) { - assert.doesNotThrow(function () { + it('Valid container values pass', () => { + ['fs', 'zip'].forEach((container) => { + assert.doesNotThrow(() => { sharp().tile({ - container: container + container }); }); }); }); - it('Invalid container values fail', function () { - ['zoinks', 1].forEach(function (container) { - assert.throws(function () { + it('Invalid container values fail', () => { + ['zoinks', 1].forEach((container) => { + assert.throws(() => { sharp().tile({ - container: container + container }); }); }); }); - it('Valid layout values pass', function () { - ['dz', 'google', 'zoomify'].forEach(function (layout) { - assert.doesNotThrow(function () { + it('Valid layout values pass', () => { + ['dz', 'google', 'zoomify'].forEach((layout) => { + assert.doesNotThrow(() => { sharp().tile({ - layout: layout + layout }); }); }); }); - it('Invalid layout values fail', function () { - ['zoinks', 1].forEach(function (layout) { - assert.throws(function () { + it('Invalid layout values fail', () => { + ['zoinks', 1].forEach((layout) => { + assert.throws(() => { sharp().tile({ - layout: layout + 
layout }); }); }); }); - it('Valid formats pass', function () { - ['jpeg', 'png', 'webp'].forEach(function (format) { - assert.doesNotThrow(function () { + it('Valid formats pass', () => { + ['jpeg', 'png', 'webp'].forEach((format) => { + assert.doesNotThrow(() => { sharp().toFormat(format).tile(); }); }); }); - it('Invalid formats fail', function () { - ['tiff', 'raw'].forEach(function (format) { - assert.throws(function () { + it('Invalid formats fail', () => { + ['tiff', 'raw'].forEach((format) => { + assert.throws(() => { sharp().toFormat(format).tile(); }); }); }); - it('Prevent larger overlap than default size', function () { - assert.throws(function () { - sharp().tile({overlap: 257}); + it('Valid depths pass', () => { + ['onepixel', 'onetile', 'one'].forEach((depth) => { + assert.doesNotThrow(() => sharp().tile({ depth })); + }); + }); + + it('Invalid depths fail', () => { + ['depth', 1].forEach((depth) => { + assert.throws( + () => sharp().tile({ depth }), + /Expected one of: onepixel, onetile, one for depth but received/ + ); + }); + }); + + it('Prevent larger overlap than default size', () => { + assert.throws(() => { + sharp().tile({ + overlap: 257 + }); }); }); - it('Prevent larger overlap than provided size', function () { - assert.throws(function () { - sharp().tile({size: 512, overlap: 513}); + it('Prevent larger overlap than provided size', () => { + assert.throws(() => { + sharp().tile({ + size: 512, + overlap: 513 + }); }); }); - it('Deep Zoom layout', function (done) { - const directory = fixtures.path('output.dzi_files'); - rimraf(directory, function () { - sharp(fixtures.inputJpg) - .toFile(fixtures.path('output.dzi'), function (err, info) { - if (err) throw err; - assert.strictEqual('dz', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual('undefined', typeof info.size); - assertDeepZoomTiles(directory, 256, 13, done); + it('Valid rotation 
angle values pass', () => { + [90, 270, -90].forEach((angle) => { + assert.doesNotThrow(() => { + sharp().tile({ + angle }); + }); }); }); - it('Deep Zoom layout with custom size+overlap', function (done) { - const directory = fixtures.path('output.512.dzi_files'); - rimraf(directory, function () { - sharp(fixtures.inputJpg) - .tile({ - size: 512, - overlap: 16 - }) - .toFile(fixtures.path('output.512.dzi'), function (err, info) { - if (err) throw err; - assert.strictEqual('dz', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual('undefined', typeof info.size); - assertDeepZoomTiles(directory, 512 + (2 * 16), 13, done); + it('Invalid rotation angle values fail', () => { + ['zoinks', 1.1, -1, 27].forEach((angle) => { + assert.throws(() => { + sharp().tile({ + angle }); + }); }); }); - it('Zoomify layout', function (done) { - const directory = fixtures.path('output.zoomify.dzi'); - rimraf(directory, function () { - sharp(fixtures.inputJpg) - .tile({ - layout: 'zoomify' - }) - .toFile(fixtures.path('output.zoomify.dzi'), function (err, info) { - if (err) throw err; - assert.strictEqual('dz', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual('number', typeof info.size); - fs.stat(path.join(directory, 'ImageProperties.xml'), function (err, stat) { - if (err) throw err; - assert.strictEqual(true, stat.isFile()); - assert.strictEqual(true, stat.size > 0); - done(); - }); + it('Valid skipBlanks threshold values pass', () => { + [-1, 0, 255, 65535].forEach((skipBlanksThreshold) => { + assert.doesNotThrow(() => { + sharp().tile({ + skipBlanks: skipBlanksThreshold }); + }); }); }); - it('Google layout', function (done) { - const directory = fixtures.path('output.google.dzi'); - rimraf(directory, function () { - sharp(fixtures.inputJpg) - .tile({ - layout: 'google' - }) 
-        .toFile(directory, function (err, info) {
-          if (err) throw err;
-          assert.strictEqual('dz', info.format);
-          assert.strictEqual(2725, info.width);
-          assert.strictEqual(2225, info.height);
-          assert.strictEqual(3, info.channels);
-          assert.strictEqual('number', typeof info.size);
-          fs.stat(path.join(directory, '0', '0', '0.jpg'), function (err, stat) {
-            if (err) throw err;
-            assert.strictEqual(true, stat.isFile());
-            assert.strictEqual(true, stat.size > 0);
-            done();
-          });
+  it('Invalid skipBlanks threshold values fail', () => {
+    ['zoinks', -2, 65536].forEach((skipBlanksThreshold) => {
+      assert.throws(() => {
+        sharp().tile({
+          skipBlanks: skipBlanksThreshold
+        });
+      });
+    });
+  });
+
+  it('Valid center parameter value passes', () => {
+    assert.doesNotThrow(
+      () => sharp().tile({ center: true })
+    );
+  });
+
+  it('Invalid centre parameter value fails', () => {
+    assert.throws(
+      () => sharp().tile({ centre: 'true' }),
+      /Expected boolean for tileCentre but received true of type string/
+    );
+  });
+
+  it('Valid id parameter value passes', () => {
+    assert.doesNotThrow(() => {
+      sharp().tile({
+        id: 'test'
+      });
+    });
+  });
+
+  it('Invalid id parameter value fails', () => {
+    assert.throws(() => {
+      sharp().tile({
+        id: true
+      });
     });
   });

-  it('Google layout with jpeg format', function (done) {
-    const directory = fixtures.path('output.jpg.google.dzi');
-    rimraf(directory, function () {
-      sharp(fixtures.inputJpg)
-        .jpeg({ quality: 1 })
-        .tile({
-          layout: 'google'
-        })
-        .toFile(directory, function (err, info) {
-          if (err) throw err;
-          assert.strictEqual('dz', info.format);
-          assert.strictEqual(2725, info.width);
-          assert.strictEqual(2225, info.height);
-          assert.strictEqual(3, info.channels);
-          assert.strictEqual('number', typeof info.size);
-          const sample = path.join(directory, '0', '0', '0.jpg');
-          sharp(sample).metadata(function (err, metadata) {
+  it('Valid basename parameter value passes', () => {
+    assert.doesNotThrow(
+      () => sharp().tile({ basename: 'pass' })
+
); + }); + + it('Invalid basename parameter value fails', () => { + assert.throws( + () => sharp().tile({ basename: true }), + /Expected string for basename but received/ + ); + }); + + if (sharp.format.dz.output.file) { + it('Deep Zoom layout', (_t, done) => { + const directory = fixtures.path('output.dzi_files'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .toFile(fixtures.path('output.dzi'), (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual('undefined', typeof info.size); + assertDeepZoomTiles(directory, 256, 13, done); + }); + }); + }); + + it('Deep Zoom layout with custom size+overlap', (_t, done) => { + const directory = fixtures.path('output.512.dzi_files'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + size: 512, + overlap: 16 + }) + .toFile(fixtures.path('output.512.dzi'), (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual('undefined', typeof info.size); + assertDeepZoomTiles(directory, 512 + (2 * 16), 13, () => { + assertTileOverlap(directory, 512, done); + }); + }); + }); + }); + + it('Deep Zoom layout with custom size+angle', (_t, done) => { + const directory = fixtures.path('output.512_90.dzi_files'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + size: 512, + angle: 90 + }) + .toFile(fixtures.path('output.512_90.dzi'), (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual('undefined', typeof info.size); + assertDeepZoomTiles(directory, 
512, 13, done);
+            // Verify tiles in the 10th level are rotated.
+            const tile = path.join(directory, '10', '0_1.jpeg');
+            // Verify that the width and height correspond to the rotated image:
+            // expected w=512 and h=170 for 0_1.jpeg. If an angle of 0 were
+            // supplied to tile(), the expected values would be w=170 and h=512
+            // for 1_0.jpeg.
+            sharp(tile).metadata((err, metadata) => {
+              if (err) {
+                throw err;
+              } else {
+                assert.strictEqual(true, metadata.width === 512);
+                assert.strictEqual(true, metadata.height === 170);
+              }
+            });
+          });
+      });
+    });
+
+    it('Deep Zoom layout with depth of one', (_t, done) => {
+      const directory = fixtures.path('output.512_depth_one.dzi_files');
+      fs.rm(directory, { recursive: true }, () => {
+        sharp(fixtures.inputJpg)
+          .tile({
+            size: 512,
+            depth: 'one'
+          })
+          .toFile(fixtures.path('output.512_depth_one.dzi'), (err) => {
+            if (err) throw err;
+            // Verify only one level is generated
+            assertDeepZoomTiles(directory, 512, 1, done);
+          });
+      });
+    });
+
+    it('Deep Zoom layout with depth of onepixel', (_t, done) => {
+      const directory = fixtures.path('output.512_depth_onepixel.dzi_files');
+      fs.rm(directory, { recursive: true }, () => {
+        sharp(fixtures.inputJpg)
+          .tile({
+            size: 512,
+            depth: 'onepixel'
+          })
+          .toFile(fixtures.path('output.512_depth_onepixel.dzi'), (err) => {
            if (err) throw err;
-            assert.strictEqual('jpeg', metadata.format);
-            assert.strictEqual('srgb', metadata.space);
-            assert.strictEqual(3, metadata.channels);
-            assert.strictEqual(false, metadata.hasProfile);
-            assert.strictEqual(false, metadata.hasAlpha);
-            assert.strictEqual(256, metadata.width);
-            assert.strictEqual(256, metadata.height);
-            fs.stat(sample, function (err, stat) {
+            // Verify levels are generated all the way down to a single pixel
+            assertDeepZoomTiles(directory, 512, 13, done);
+          });
+      });
+    });
+
+    it('Deep Zoom layout with depth of onetile', (_t, done) => {
+      const directory = fixtures.path('output.256_depth_onetile.dzi_files');
+      fs.rm(directory, { recursive: true }, () => {
+        sharp(fixtures.inputJpg)
+          .tile({
+            size: 256,
+            depth: 'onetile'
+          })
+          .toFile(fixtures.path('output.256_depth_onetile.dzi'), (err) => {
+            if (err) throw err;
+            // Verify levels are only generated down to a single tile
+            assertDeepZoomTiles(directory, 256, 5, done);
+          });
+      });
+    });
+
+    it('Deep Zoom layout with skipBlanks', (_t, done) => {
+      const directory = fixtures.path('output.256_skip_blanks.dzi_files');
+      fs.rm(directory, { recursive: true }, () => {
+        sharp(fixtures.inputJpgOverlayLayer2)
+          .tile({
+            size: 256,
+            skipBlanks: 0
+          })
+          .toFile(fixtures.path('output.256_skip_blanks.dzi'), (err) => {
+            if (err) throw err;
+            // Assert that 0_0.jpeg doesn't exist because it's a white tile
+            const whiteTilePath = path.join(directory, '11', '0_0.jpeg');
+            assert.strictEqual(fs.existsSync(whiteTilePath), false, 'Tile should not exist');
+            // Verify the tiles that remain at every level
+            assertDeepZoomTiles(directory, 256, 12, done);
+          });
+      });
+    });
+
+    it('Zoomify layout', (_t, done) => {
+      const directory = fixtures.path('output.zoomify.dzi');
+      fs.rm(directory, { recursive: true }, () => {
+        sharp(fixtures.inputJpg)
+          .tile({
+            layout: 'zoomify'
+          })
+          .toFile(fixtures.path('output.zoomify.dzi'), (err, info) => {
+            if (err) throw err;
+            assert.strictEqual('dz', info.format);
+            assert.strictEqual(2725, info.width);
+            assert.strictEqual(2225, info.height);
+            assert.strictEqual(3, info.channels);
+            assert.strictEqual(undefined, info.size);
+            fs.stat(path.join(directory, 'ImageProperties.xml'), (err, stat) => {
              if (err) throw err;
-            assert.strictEqual(true, stat.size < 2000);
+              assert.strictEqual(true, stat.isFile());
+              assert.strictEqual(true, stat.size > 0);
              done();
            });
          });
-    });
+      });
    });

-  it('Google layout with png format', function (done) {
-    const directory = fixtures.path('output.png.google.dzi');
-    rimraf(directory, function () {
-      sharp(fixtures.inputJpg)
-        .png({ compressionLevel: 1 })
-        .tile({
-          layout: 'google'
-        })
-        .toFile(directory, function (err, info) {
-          if 
(err) throw err; - assert.strictEqual('dz', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual('number', typeof info.size); - const sample = path.join(directory, '0', '0', '0.png'); - sharp(sample).metadata(function (err, metadata) { + it('Zoomify layout with depth one', (_t, done) => { + const directory = fixtures.path('output.zoomify.depth_one.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + size: 256, + layout: 'zoomify', + depth: 'one' + }) + .toFile(directory, (err, info) => { if (err) throw err; - assert.strictEqual('png', metadata.format); - assert.strictEqual('srgb', metadata.space); - assert.strictEqual(3, metadata.channels); - assert.strictEqual(false, metadata.hasProfile); - assert.strictEqual(false, metadata.hasAlpha); - assert.strictEqual(256, metadata.width); - assert.strictEqual(256, metadata.height); - fs.stat(sample, function (err, stat) { + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + assertZoomifyTiles(directory, 1, done); + }); + }); + }); + + it('Zoomify layout with depth onetile', (_t, done) => { + const directory = fixtures.path('output.zoomify.depth_onetile.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + size: 256, + layout: 'zoomify', + depth: 'onetile' + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + assertZoomifyTiles(directory, 5, done); + }); + }); + }); + + it('Zoomify layout with depth onepixel', (_t, done) => { + const directory = 
fixtures.path('output.zoomify.depth_onepixel.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + size: 256, + layout: 'zoomify', + depth: 'onepixel' + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + assertZoomifyTiles(directory, 13, done); + }); + }); + }); + + it('Zoomify layout with skip blanks', (_t, done) => { + const directory = fixtures.path('output.zoomify.skipBlanks.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpgOverlayLayer2) + .tile({ + size: 256, + layout: 'zoomify', + skipBlanks: 0 + }) + .toFile(directory, (err, info) => { + if (err) throw err; + // assert that 2-0-0.jpg doesn't exist because it's a white tile + const whiteTilePath = path.join(directory, 'TileGroup0', '2-0-0.jpg'); + assert.strictEqual(fs.existsSync(whiteTilePath), false, 'Tile should not exist'); + assert.strictEqual('dz', info.format); + assert.strictEqual(2048, info.width); + assert.strictEqual(1536, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + assertZoomifyTiles(directory, 4, done); + }); + }); + }); + + it('Google layout', (_t, done) => { + const directory = fixtures.path('output.google.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + layout: 'google' + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + fs.stat(path.join(directory, '0', '0', '0.jpg'), (err, stat) => { if (err) throw err; - assert.strictEqual(true, stat.size > 44000); + assert.strictEqual(true, stat.isFile()); 
+ assert.strictEqual(true, stat.size > 0); done(); }); }); - }); + }); }); - }); - it('Google layout with webp format', function (done) { - const directory = fixtures.path('output.webp.google.dzi'); - rimraf(directory, function () { - sharp(fixtures.inputJpg) - .webp({ quality: 1 }) - .tile({ - layout: 'google' - }) - .toFile(directory, function (err, info) { - if (err) throw err; - assert.strictEqual('dz', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual('number', typeof info.size); - const sample = path.join(directory, '0', '0', '0.webp'); - sharp(sample).metadata(function (err, metadata) { + it('Google layout with jpeg format', (_t, done) => { + const directory = fixtures.path('output.jpg.google.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .jpeg({ + quality: 1 + }) + .tile({ + layout: 'google' + }) + .toFile(directory, (err, info) => { if (err) throw err; - assert.strictEqual('webp', metadata.format); - assert.strictEqual('srgb', metadata.space); - assert.strictEqual(3, metadata.channels); - assert.strictEqual(false, metadata.hasProfile); - assert.strictEqual(false, metadata.hasAlpha); - assert.strictEqual(256, metadata.width); - assert.strictEqual(256, metadata.height); - fs.stat(sample, function (err, stat) { + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + const sample = path.join(directory, '0', '0', '0.jpg'); + sharp(sample).metadata((err, metadata) => { if (err) throw err; - assert.strictEqual(true, stat.size < 2000); - done(); + assert.strictEqual('jpeg', metadata.format); + assert.strictEqual('srgb', metadata.space); + assert.strictEqual(3, metadata.channels); + assert.strictEqual(false, metadata.hasProfile); + assert.strictEqual(false, 
metadata.hasAlpha); + assert.strictEqual(256, metadata.width); + assert.strictEqual(256, metadata.height); + fs.stat(sample, (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.size < 2000); + done(); + }); }); }); - }); + }); }); - }); - it('Write to ZIP container using file extension', function (done) { - const container = fixtures.path('output.dz.container.zip'); - const extractTo = fixtures.path('output.dz.container'); - const directory = path.join(extractTo, 'output.dz.container_files'); - rimraf(directory, function () { - sharp(fixtures.inputJpg) - .toFile(container, function (err, info) { - if (err) throw err; - assert.strictEqual('dz', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual('number', typeof info.size); - fs.stat(container, function (err, stat) { + it('Google layout with png format', (_t, done) => { + const directory = fixtures.path('output.png.google.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .png({ + compressionLevel: 0 + }) + .tile({ + layout: 'google' + }) + .toFile(directory, (err, info) => { if (err) throw err; - assert.strictEqual(true, stat.isFile()); - assert.strictEqual(true, stat.size > 0); - fs.createReadStream(container) - .pipe(unzip.Extract({path: path.dirname(extractTo)})) - .on('error', function (err) { throw err; }) - .on('close', function () { - assertDeepZoomTiles(directory, 256, 13, done); + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + const sample = path.join(directory, '0', '0', '0.png'); + sharp(sample).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual('png', metadata.format); + assert.strictEqual('srgb', metadata.space); + assert.strictEqual(3, metadata.channels); + 
assert.strictEqual(false, metadata.hasProfile); + assert.strictEqual(false, metadata.hasAlpha); + assert.strictEqual(256, metadata.width); + assert.strictEqual(256, metadata.height); + fs.stat(sample, (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.size > 44000); + done(); }); + }); }); - }); + }); }); - }); - it('Write to ZIP container using container tile option', function (done) { - const container = fixtures.path('output.dz.containeropt.zip'); - const extractTo = fixtures.path('output.dz.containeropt'); - const directory = path.join(extractTo, 'output.dz.containeropt_files'); - rimraf(directory, function () { - sharp(fixtures.inputJpg) - .tile({ - container: 'zip' - }) - .toFile(container, function (err, info) { - // Vips overrides .dzi extension to .zip used by container var below - if (err) throw err; - assert.strictEqual('dz', info.format); - assert.strictEqual(2725, info.width); - assert.strictEqual(2225, info.height); - assert.strictEqual(3, info.channels); - assert.strictEqual('number', typeof info.size); - fs.stat(container, function (err, stat) { + it('Google layout with webp format', (_t, done) => { + const directory = fixtures.path('output.webp.google.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .webp({ + quality: 1, + effort: 0 + }) + .tile({ + layout: 'google' + }) + .toFile(directory, (err, info) => { if (err) throw err; - assert.strictEqual(true, stat.isFile()); - assert.strictEqual(true, stat.size > 0); - fs.createReadStream(container) - .pipe(unzip.Extract({path: path.dirname(extractTo)})) - .on('error', function (err) { throw err; }) - .on('close', function () { - assertDeepZoomTiles(directory, 256, 13, done); + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + const sample = path.join(directory, '0', '0', '0.webp'); + 
sharp(sample).metadata((err, metadata) => { + if (err) throw err; + assert.strictEqual('webp', metadata.format); + assert.strictEqual('srgb', metadata.space); + assert.strictEqual(3, metadata.channels); + assert.strictEqual(false, metadata.hasProfile); + assert.strictEqual(false, metadata.hasAlpha); + assert.strictEqual(256, metadata.width); + assert.strictEqual(256, metadata.height); + fs.stat(sample, (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.size < 2000); + done(); }); + }); }); - }); + }); }); - }); + + it('Google layout with depth one', (_t, done) => { + const directory = fixtures.path('output.google_depth_one.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + layout: 'google', + depth: 'one', + size: 256 + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + assertGoogleTiles(directory, 1, done); + }); + }); + }); + + it('Google layout with depth onetile', (_t, done) => { + const directory = fixtures.path('output.google_depth_onetile.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + layout: 'google', + depth: 'onetile', + size: 256 + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + assertGoogleTiles(directory, 5, done); + }); + }); + }); + + it('Google layout with default skip Blanks', (_t, done) => { + const directory = fixtures.path('output.google_depth_skipBlanks.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputPng) + .tile({ + layout: 'google', + size: 256 + }) + .toFile(directory, 
(err, info) => { + if (err) throw err; + + const whiteTilePath = path.join(directory, '4', '8', '0.jpg'); + assert.strictEqual(fs.existsSync(whiteTilePath), false, 'Tile should not exist'); + + assert.strictEqual('dz', info.format); + assert.strictEqual(2809, info.width); + assert.strictEqual(2074, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + assertGoogleTiles(directory, 5, done); + }); + }); + }); + + it('Google layout with center image in tile', (_t, done) => { + const directory = fixtures.path('output.google_center.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + center: true, + layout: 'google' + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + fixtures.assertSimilar(fixtures.expected('tile_centered.jpg'), fs.readFileSync(path.join(directory, '0', '0', '0.jpg')), done); + }); + }); + }); + + it('Google layout with center image in tile centre', (_t, done) => { + const directory = fixtures.path('output.google_center.dzi'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + centre: true, + layout: 'google' + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + fixtures.assertSimilar(fixtures.expected('tile_centered.jpg'), fs.readFileSync(path.join(directory, '0', '0', '0.jpg')), done); + }); + }); + }); + + it('IIIFv2 layout', (_t, done) => { + const name = 'output.iiif.info'; + const directory = fixtures.path(name); + fs.rm(directory, { recursive: true }, () => { + const id = 
'https://sharp.test.com/iiif'; + sharp(fixtures.inputJpg) + .tile({ + layout: 'iiif', + id + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + const infoJson = require(path.join(directory, 'info.json')); + assert.strictEqual('http://iiif.io/api/image/2/context.json', infoJson['@context']); + assert.strictEqual(`${id}/${name}`, infoJson['@id']); + fs.stat(path.join(directory, '0,0,256,256', '256,', '0', 'default.jpg'), (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.isFile()); + assert.strictEqual(true, stat.size > 0); + done(); + }); + }); + }); + }); + + it('IIIFv3 layout', (_t, done) => { + const name = 'output.iiif3.info'; + const directory = fixtures.path(name); + fs.rm(directory, { recursive: true }, () => { + const id = 'https://sharp.test.com/iiif3'; + sharp(fixtures.inputJpg) + .tile({ + layout: 'iiif3', + id + }) + .toFile(directory, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual(undefined, info.size); + const infoJson = require(path.join(directory, 'info.json')); + assert.strictEqual('http://iiif.io/api/image/3/context.json', infoJson['@context']); + assert.strictEqual('ImageService3', infoJson.type); + assert.strictEqual(`${id}/${name}`, infoJson.id); + fs.stat(path.join(directory, '0,0,256,256', '256,256', '0', 'default.jpg'), (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.isFile()); + assert.strictEqual(true, stat.size > 0); + done(); + }); + }); + }); + }); + + it('Write to ZIP container using file extension', (_t, done) => { + const container = fixtures.path('output.dz.container.zip'); + const extractTo = 
fixtures.path('output.dz.container'); + const directory = path.join(extractTo, 'output.dz.container_files'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .toFile(container, (err, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual('number', typeof info.size); + fs.stat(container, (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.isFile()); + assert.strictEqual(true, stat.size > 0); + extractZip(container, { dir: extractTo }) + .then(() => { + assertDeepZoomTiles(directory, 256, 13, done); + }) + .catch(done); + }); + }); + }); + }); + + it('Write to ZIP container using container tile option', (_t, done) => { + const container = fixtures.path('output.dz.containeropt.zip'); + const extractTo = fixtures.path('output.dz.containeropt'); + const directory = path.join(extractTo, 'output.dz.containeropt_files'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ + container: 'zip' + }) + .toFile(container, (err, info) => { + // Vips overrides .dzi extension to .zip used by container var below + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual('number', typeof info.size); + fs.stat(container, (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.isFile()); + assert.strictEqual(true, stat.size > 0); + extractZip(container, { dir: extractTo }) + .then(() => { + assertDeepZoomTiles(directory, 256, 13, done); + }) + .catch(done); + }); + }); + }); + }); + + it('Write ZIP container to Buffer', (_t, done) => { + const container = fixtures.path('output.dz.tiles.zip'); + const extractTo = fixtures.path('output.dz.tiles'); + const directory = 
path.join(extractTo, 'output.dz.tiles_files'); + fs.rm(directory, { recursive: true }, () => { + sharp(fixtures.inputJpg) + .tile({ basename: 'output.dz.tiles' }) + .toBuffer((err, data, info) => { + if (err) throw err; + assert.strictEqual('dz', info.format); + assert.strictEqual(2725, info.width); + assert.strictEqual(2225, info.height); + assert.strictEqual(3, info.channels); + assert.strictEqual('number', typeof info.size); + fs.writeFileSync(container, data); + fs.stat(container, (err, stat) => { + if (err) throw err; + assert.strictEqual(true, stat.isFile()); + assert.strictEqual(true, stat.size > 0); + extractZip(container, { dir: extractTo }) + .then(() => { + assertDeepZoomTiles(directory, 256, 13, done); + }) + .catch(done); + }); + }); + }); + }); + } }); diff --git a/test/unit/timeout.js b/test/unit/timeout.js new file mode 100644 index 000000000..77aad8fa6 --- /dev/null +++ b/test/unit/timeout.js @@ -0,0 +1,30 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Timeout', () => { + it('Will timeout after 1s when performing slow blur operation', () => assert.rejects( + () => sharp(fixtures.inputJpg) + .blur(200) + .timeout({ seconds: 1 }) + .toBuffer(), + /timeout: [0-9]+% complete/ + )); + + it('invalid object', () => assert.throws( + () => sharp().timeout('fail'), + /Expected object for options but received fail of type string/ + )); + + it('invalid seconds', () => assert.throws( + () => sharp().timeout({ seconds: 'fail' }), + /Expected integer between 0 and 3600 for seconds but received fail of type string/ + )); +}); diff --git a/test/unit/tint.js b/test/unit/tint.js new file mode 100644 index 000000000..db3f4984f --- /dev/null +++ b/test/unit/tint.js @@ -0,0 +1,109 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +// Allow for small rounding differences between platforms +const maxDistance = 6; + +describe('Tint', () => { + it('tints rgb image red', (_t, done) => { + const output = fixtures.path('output.tint-red.jpg'); + sharp(fixtures.inputJpg) + .resize(320, 240) + .tint('#FF0000') + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual(true, info.size > 0); + fixtures.assertMaxColourDistance(output, fixtures.expected('tint-red.jpg'), maxDistance); + done(); + }); + }); + + it('tints rgb image green', (_t, done) => { + const output = fixtures.path('output.tint-green.jpg'); + sharp(fixtures.inputJpg) + .resize(320, 240) + .tint('#00FF00') + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual(true, info.size > 0); + fixtures.assertMaxColourDistance(output, fixtures.expected('tint-green.jpg'), maxDistance); + done(); + }); + }); + + it('tints rgb image blue', (_t, done) => { + const output = fixtures.path('output.tint-blue.jpg'); + sharp(fixtures.inputJpg) + .resize(320, 240) + .tint('#0000FF') + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual(true, info.size > 0); + fixtures.assertMaxColourDistance(output, fixtures.expected('tint-blue.jpg'), maxDistance); + done(); + }); + }); + + it('tints rgb image with sepia tone', (_t, done) => { + const output = fixtures.path('output.tint-sepia-hex.jpg'); + sharp(fixtures.inputJpg) + .resize(320, 240) + .tint('#704214') + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertMaxColourDistance(output, fixtures.expected('tint-sepia.jpg'), maxDistance); + done(); + }); + }); + + it('tints rgb image with sepia tone with rgb colour', (_t, done) => { + const output 
= fixtures.path('output.tint-sepia-rgb.jpg'); + sharp(fixtures.inputJpg) + .resize(320, 240) + .tint([112, 66, 20]) + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertMaxColourDistance(output, fixtures.expected('tint-sepia.jpg'), maxDistance); + done(); + }); + }); + + it('tints rgb image with alpha channel', (_t, done) => { + const output = fixtures.path('output.tint-alpha.png'); + sharp(fixtures.inputPngRGBWithAlpha) + .resize(320, 240) + .tint('#704214') + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual(320, info.width); + assert.strictEqual(240, info.height); + fixtures.assertMaxColourDistance(output, fixtures.expected('tint-alpha.png'), maxDistance); + done(); + }); + }); + + it('tints cmyk image red', (_t, done) => { + const output = fixtures.path('output.tint-cmyk.jpg'); + sharp(fixtures.inputJpgWithCmykProfile) + .resize(320, 240) + .tint('#FF0000') + .toFile(output, (err, info) => { + if (err) throw err; + assert.strictEqual(true, info.size > 0); + fixtures.assertMaxColourDistance(output, fixtures.expected('tint-cmyk.jpg'), maxDistance); + done(); + }); + }); +}); diff --git a/test/unit/toBuffer.js b/test/unit/toBuffer.js new file mode 100644 index 000000000..7e4b1d452 --- /dev/null +++ b/test/unit/toBuffer.js @@ -0,0 +1,28 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); + +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('toBuffer', () => { + it('reusing same sharp object does not reset previously passed parameters to toBuffer', async () => { + const image = sharp(fixtures.inputJpg); + const obj = await image.toBuffer({ resolveWithObject: true }); + assert.strictEqual(typeof obj, 'object'); + assert.strictEqual(typeof obj.info, 'object'); + assert.strictEqual(Buffer.isBuffer(obj.data), true); + const data = await image.toBuffer(); + assert.strictEqual(Buffer.isBuffer(data), true); + }); + + it('correctly process animated webp with height > 16383', async () => { + const data = await sharp(fixtures.inputWebPAnimatedBigHeight, { animated: true }) + .toBuffer(); + assert.strictEqual(Buffer.isBuffer(data), true); + }); +}); diff --git a/test/unit/toFormat.js b/test/unit/toFormat.js new file mode 100644 index 000000000..0974e4604 --- /dev/null +++ b/test/unit/toFormat.js @@ -0,0 +1,31 @@ +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const assert = require('node:assert'); +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('toFormat', () => { + it('accepts upper case characters as format parameter (string)', async () => { + const data = await sharp(fixtures.inputJpg) + .resize(8, 8) + .toFormat('PNG') + .toBuffer(); + + const { format } = await sharp(data).metadata(); + assert.strictEqual(format, 'png'); + }); + + it('accepts upper case characters as format parameter (object)', async () => { + const data = await sharp(fixtures.inputJpg) + .resize(8, 8) + .toFormat({ id: 'PNG' }) + .toBuffer(); + + const { format } = await sharp(data).metadata(); + assert.strictEqual(format, 'png'); + }); +}); diff --git a/test/unit/trim.js b/test/unit/trim.js index d18247f8c..99b14a8d9 100644 --- a/test/unit/trim.js +++ b/test/unit/trim.js @@ -1,47 +1,292 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+ SPDX-License-Identifier: Apache-2.0 +*/ -const assert = require('assert'); +const { describe, it } = require('node:test'); +const assert = require('node:assert'); const sharp = require('../../'); +const inRange = require('../../lib/is').inRange; const fixtures = require('../fixtures'); -describe('Trim borders', function () { - it('Threshold default', function (done) { - const expected = fixtures.expected('alpha-layer-1-fill-trim-resize.png'); - sharp(fixtures.inputPngOverlayLayer1) - .resize(450, 322) +describe('Trim borders', () => { + it('Skip shrink-on-load', (_t, done) => { + const expected = fixtures.expected('alpha-layer-2-trim-resize.jpg'); + sharp(fixtures.inputJpgOverlayLayer2) .trim() - .toBuffer(function (err, data, info) { + .resize({ + width: 300, + fastShrinkOnLoad: false + }) + .toBuffer((err, data, info) => { if (err) throw err; - assert.strictEqual('png', info.format); - assert.strictEqual(450, info.width); - assert.strictEqual(322, info.height); + assert.strictEqual('jpeg', info.format); + assert.strictEqual(300, info.width); + assert.strictEqual(true, inRange(info.trimOffsetLeft, -873, -870)); + assert.strictEqual(-554, info.trimOffsetTop); fixtures.assertSimilar(expected, data, done); }); }); - it('16-bit PNG with alpha channel', function (done) { + it('Single colour PNG where alpha channel provides the image', () => + sharp(fixtures.inputPngImageInAlpha) + .trim() + .toBuffer({ resolveWithObject: true }) + .then(({ data, info }) => { + assert.strictEqual(true, data.length > 0); + assert.strictEqual('png', info.format); + assert.strictEqual(916, info.width); + assert.strictEqual(137, info.height); + assert.strictEqual(4, info.channels); + assert.strictEqual(-6, info.trimOffsetLeft); + assert.strictEqual(-20, info.trimOffsetTop); + }) + ); + + it('16-bit PNG with alpha channel', (_t, done) => { sharp(fixtures.inputPngWithTransparency16bit) .resize(32, 32) - .trim(20) - .toBuffer(function (err, data, info) { + .trim({ + threshold: 20 + }) + 
.toBuffer((err, data, info) => { if (err) throw err; assert.strictEqual(true, data.length > 0); assert.strictEqual('png', info.format); assert.strictEqual(32, info.width); assert.strictEqual(32, info.height); assert.strictEqual(4, info.channels); + assert.strictEqual(-2, info.trimOffsetLeft); + assert.strictEqual(-2, info.trimOffsetTop); fixtures.assertSimilar(fixtures.expected('trim-16bit-rgba.png'), data, done); }); }); - describe('Invalid thresholds', function () { - [-1, 100, 'fail', {}].forEach(function (threshold) { - it(JSON.stringify(threshold), function () { - assert.throws(function () { - sharp().trim(threshold); + it('Attempt to trim 2x2 pixel image fails', (_t, done) => { + sharp({ + create: { + width: 2, + height: 2, + channels: 3, + background: 'red' + } + }) + .trim() + .toBuffer() + .then(() => { + done(new Error('Expected an error')); + }) + .catch(err => { + assert.strictEqual('Image to trim must be at least 3x3 pixels', err.message); + done(); + }) + .catch(done); + }); + + it('Should rotate before trim', () => + sharp({ + create: { + width: 20, + height: 30, + channels: 3, + background: 'white' + } + }) + .rotate(30) + .png() + .toBuffer() + .then(rotated30 => + sharp(rotated30) + .rotate(-30) + .trim({ + threshold: 128 + }) + .toBuffer({ resolveWithObject: true }) + .then(({ info }) => { + assert.strictEqual(20, info.width); + assert.strictEqual(31, info.height); + assert.strictEqual(-8, info.trimOffsetTop); + assert.strictEqual(-13, info.trimOffsetLeft); + }) + ) + ); + + it('Animated image rejects', () => + assert.rejects(() => sharp(fixtures.inputGifAnimated, { animated: true }) + .trim() + .toBuffer(), + /Trim is not supported for multi-page images/ + ) + ); + + it('Ensure trim uses bounding box of alpha and non-alpha channels', async () => { + const { info } = await sharp(fixtures.inputPngTrimIncludeAlpha) + .trim() + .toBuffer({ resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + 
assert.strictEqual(width, 179); + assert.strictEqual(height, 123); + assert.strictEqual(trimOffsetTop, -44); + assert.strictEqual(trimOffsetLeft, -13); + }); + + it('Ensure greyscale image can be trimmed', async () => { + const greyscale = await sharp({ + create: { + width: 16, + height: 8, + channels: 3, + background: 'silver' + } + }) + .extend({ left: 12, right: 24, background: 'gray' }) + .toColourspace('b-w') + .png({ compressionLevel: 0 }) + .toBuffer(); + + const { info } = await sharp(greyscale) + .trim() + .raw() + .toBuffer({ resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + assert.strictEqual(width, 16); + assert.strictEqual(height, 8); + assert.strictEqual(trimOffsetTop, 0); + assert.strictEqual(trimOffsetLeft, -12); + }); + + it('Ensure CMYK image can be trimmed', async () => { + const cmyk = await sharp({ + create: { + width: 16, + height: 8, + channels: 3, + background: 'red' + } + }) + .extend({ left: 12, right: 24, background: 'blue' }) + .toColourspace('cmyk') + .jpeg() + .toBuffer(); + + const { info } = await sharp(cmyk) + .trim() + .raw() + .toBuffer({ resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + assert.strictEqual(width, 16); + assert.strictEqual(height, 8); + assert.strictEqual(trimOffsetTop, 0); + assert.strictEqual(trimOffsetLeft, -12); + }); + + it('Ensure trim of image with all pixels same is no-op', async () => { + const { info } = await sharp({ + create: { + width: 5, + height: 5, + channels: 3, + background: 'red' + } + }) + .trim() + .toBuffer({ resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + assert.strictEqual(width, 5); + assert.strictEqual(height, 5); + assert.strictEqual(trimOffsetTop, 0); + assert.strictEqual(trimOffsetLeft, 0); + }); + + it('Works with line-art', async () => { + const { info } = await sharp(fixtures.inputJpgOverlayLayer2) + .trim({ lineArt: true }) + .toBuffer({ 
resolveWithObject: true }); + + assert.strictEqual(info.trimOffsetTop, -552); + }); + + describe('Invalid parameters', () => { + Object.entries({ + 'Invalid string': 'fail', + 'Invalid background option': { + background: 'fail' + }, + 'Negative threshold option': { + threshold: -1 + }, + 'Invalid lineArt': { + lineArt: 'fail' + } + }).forEach(([description, parameter]) => { + it(description, () => { + assert.throws(() => { + sharp().trim(parameter); }); }); }); }); + + describe('Specific background colour', () => { + it('Doesn\'t trim at all', async () => { + const { info } = await sharp(fixtures.inputPngTrimSpecificColour) + .trim({ + background: 'yellow' + }) + .toBuffer({ resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + assert.strictEqual(width, 900); + assert.strictEqual(height, 600); + assert.strictEqual(trimOffsetTop, 0); + assert.strictEqual(trimOffsetLeft, 0); + }); + + it('Only trims the bottom', async () => { + const { info } = await sharp(fixtures.inputPngTrimSpecificColour) + .trim({ + background: '#21468B' + }) + .toBuffer({ resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + assert.strictEqual(width, 900); + assert.strictEqual(height, 401); + assert.strictEqual(trimOffsetTop, 0); + assert.strictEqual(trimOffsetLeft, 0); + }); + + it('Only trims the bottom, in 16-bit', async () => { + const { info } = await sharp(fixtures.inputPngTrimSpecificColour16bit) + .trim({ + background: '#21468B' + }) + .toBuffer({ resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + assert.strictEqual(width, 900); + assert.strictEqual(height, 401); + assert.strictEqual(trimOffsetTop, 0); + assert.strictEqual(trimOffsetLeft, 0); + }); + + it('Only trims the bottom, including alpha', async () => { + const { info } = await sharp(fixtures.inputPngTrimSpecificColourIncludeAlpha) + .trim({ + background: '#21468B80' + }) + .toBuffer({ 
resolveWithObject: true }); + + const { width, height, trimOffsetTop, trimOffsetLeft } = info; + assert.strictEqual(width, 900); + assert.strictEqual(height, 401); + assert.strictEqual(trimOffsetTop, 0); + assert.strictEqual(trimOffsetLeft, 0); + }); + }); }); diff --git a/test/unit/unflatten.js b/test/unit/unflatten.js new file mode 100644 index 000000000..1db60af33 --- /dev/null +++ b/test/unit/unflatten.js @@ -0,0 +1,32 @@ +/*! + Copyright 2013 Lovell Fuller and others. + SPDX-License-Identifier: Apache-2.0 +*/ + +const { describe, it } = require('node:test'); +const sharp = require('../../'); +const fixtures = require('../fixtures'); + +describe('Unflatten', () => { + it('unflatten white background', (_t, done) => { + sharp(fixtures.inputPng).unflatten() + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('unflatten-white-transparent.png'), data, { threshold: 0 }, done); + }); + }); + it('unflatten transparent image', (_t, done) => { + sharp(fixtures.inputPngTrimSpecificColourIncludeAlpha).unflatten() + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('unflatten-flag-white-transparent.png'), data, { threshold: 0 }, done); + }); + }); + it('unflatten using threshold', (_t, done) => { + sharp(fixtures.inputPngPalette).unflatten().threshold(128, { grayscale: false }) + .toBuffer((err, data) => { + if (err) throw err; + fixtures.assertSimilar(fixtures.expected('unflatten-swiss.png'), data, { threshold: 1 }, done); + }); + }); +}); diff --git a/test/unit/util.js b/test/unit/util.js index 3d99a9c1a..e3676db6d 100644 --- a/test/unit/util.js +++ b/test/unit/util.js @@ -1,30 +1,38 @@ -'use strict'; +/*! + Copyright 2013 Lovell Fuller and others. 
+  SPDX-License-Identifier: Apache-2.0
+*/
 
-const assert = require('assert');
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+const semver = require('semver');
 const sharp = require('../../');
 
-const defaultConcurrency = sharp.concurrency();
-
-describe('Utilities', function () {
-  describe('Cache', function () {
-    it('Can be disabled', function () {
-      sharp.cache(false);
-      const cache = sharp.cache(false);
-      assert.strictEqual(cache.memory.current, 0);
-      assert.strictEqual(cache.memory.max, 0);
-      assert.strictEqual(typeof cache.memory.high, 'number');
-      assert.strictEqual(cache.files.current, 0);
-      assert.strictEqual(cache.files.max, 0);
-      assert.strictEqual(cache.items.current, 0);
-      assert.strictEqual(cache.items.max, 0);
+describe('Utilities', () => {
+  describe('Cache', () => {
+    it('Can be disabled', (_t, done) => {
+      const check = setInterval(() => {
+        const cache = sharp.cache(false);
+        const empty =
+          cache.memory.current +
+          cache.memory.max +
+          cache.files.current +
+          cache.files.max +
+          cache.items.current +
+          cache.items.max === 0;
+        if (empty) {
+          clearInterval(check);
+          done();
+        }
+      }, 2000);
     });
-    it('Can be enabled with defaults', function () {
+    it('Can be enabled with defaults', () => {
       const cache = sharp.cache(true);
       assert.strictEqual(cache.memory.max, 50);
       assert.strictEqual(cache.files.max, 20);
       assert.strictEqual(cache.items.max, 100);
     });
-    it('Can be set to zero', function () {
+    it('Can be set to zero', () => {
       const cache = sharp.cache({
         memory: 0,
         files: 0,
@@ -34,7 +42,7 @@ describe('Utilities', function () {
       assert.strictEqual(cache.files.max, 0);
       assert.strictEqual(cache.items.max, 0);
     });
-    it('Can be set to a maximum of 10MB, 100 files and 1000 items', function () {
+    it('Can be set to a maximum of 10MB, 100 files and 1000 items', () => {
       const cache = sharp.cache({
         memory: 10,
         files: 100,
@@ -44,7 +52,7 @@ describe('Utilities', function () {
       assert.strictEqual(cache.files.max, 100);
       assert.strictEqual(cache.items.max, 1000);
     });
-    it('Ignores invalid values', function () {
+    it('Ignores invalid values', () => {
       sharp.cache(true);
       const cache = sharp.cache('spoons');
       assert.strictEqual(cache.memory.max, 50);
@@ -53,55 +61,58 @@
     });
   });
 
-  describe('Concurrency', function () {
-    it('Can be set to use 16 threads', function () {
+  describe('Concurrency', () => {
+    it('Can be set to use 16 threads', () => {
       sharp.concurrency(16);
       assert.strictEqual(16, sharp.concurrency());
     });
-    it('Can be reset to default', function () {
+    it('Can be reset to default', () => {
       sharp.concurrency(0);
-      assert.strictEqual(defaultConcurrency, sharp.concurrency());
+      assert.strictEqual(true, sharp.concurrency() > 0);
     });
-    it('Ignores invalid values', function () {
-      sharp.concurrency(0);
+    it('Ignores invalid values', () => {
+      const defaultConcurrency = sharp.concurrency();
       sharp.concurrency('spoons');
       assert.strictEqual(defaultConcurrency, sharp.concurrency());
     });
   });
 
-  describe('Counters', function () {
-    it('Have zero value at rest', function () {
-      const counters = sharp.counters();
-      assert.strictEqual(0, counters.queue);
-      assert.strictEqual(0, counters.process);
+  describe('Counters', () => {
+    it('Have zero value at rest', (_t, done) => {
+      queueMicrotask(() => {
+        const counters = sharp.counters();
+        assert.strictEqual(0, counters.queue);
+        assert.strictEqual(0, counters.process);
+        done();
+      });
     });
   });
 
-  describe('SIMD', function () {
-    it('Can get current state', function () {
+  describe('SIMD', () => {
+    it('Can get current state', () => {
       const simd = sharp.simd();
       assert.strictEqual(typeof simd, 'boolean');
     });
-    it('Can disable', function () {
+    it('Can disable', () => {
       const simd = sharp.simd(false);
       assert.strictEqual(simd, false);
    });
-    it('Can attempt to enable', function () {
+    it('Can attempt to enable', () => {
       const simd = sharp.simd(true);
       assert.strictEqual(typeof simd, 'boolean');
     });
   });
 
-  describe('Format', function () {
-    it('Contains expected attributes', function () {
+  describe('Format', () => {
+    it('Contains expected attributes', () => {
       assert.strictEqual('object', typeof sharp.format);
-      Object.keys(sharp.format).forEach(function (format) {
+      Object.keys(sharp.format).forEach((format) => {
         assert.strictEqual(true, 'id' in sharp.format[format]);
         assert.strictEqual(format, sharp.format[format].id);
-        ['input', 'output'].forEach(function (direction) {
+        ['input', 'output'].forEach((direction) => {
           assert.strictEqual(true, direction in sharp.format[format]);
           assert.strictEqual('object', typeof sharp.format[format][direction]);
-          assert.strictEqual(3, Object.keys(sharp.format[format][direction]).length);
+          assert.strictEqual(true, [3, 4].includes(Object.keys(sharp.format[format][direction]).length));
           assert.strictEqual(true, 'file' in sharp.format[format][direction]);
           assert.strictEqual(true, 'buffer' in sharp.format[format][direction]);
           assert.strictEqual(true, 'stream' in sharp.format[format][direction]);
@@ -111,19 +122,70 @@ describe('Utilities', function () {
         });
       });
     });
-    it('Raw file=false, buffer=true, stream=true', function () {
-      ['input', 'output'].forEach(function (direction) {
+    it('Raw file=false, buffer=true, stream=true', () => {
+      ['input', 'output'].forEach((direction) => {
         assert.strictEqual(false, sharp.format.raw[direction].file);
         assert.strictEqual(true, sharp.format.raw[direction].buffer);
         assert.strictEqual(true, sharp.format.raw[direction].stream);
       });
     });
+    it('vips format supports filesystem only', () => {
+      ['input', 'output'].forEach((direction) => {
+        assert.strictEqual(true, sharp.format.vips[direction].file);
+        assert.strictEqual(false, sharp.format.vips[direction].buffer);
+        assert.strictEqual(false, sharp.format.vips[direction].stream);
+      });
+    });
+    it('input fileSuffix', () => {
+      assert.deepStrictEqual(['.jpg', '.jpeg', '.jpe', '.jfif'], sharp.format.jpeg.input.fileSuffix);
+    });
+    it('output alias', () => {
+      assert.deepStrictEqual(['jpe', 'jpg'], sharp.format.jpeg.output.alias);
+    });
   });
 
-  describe('Versions', function () {
-    it('Contains expected attributes', function () {
+  describe('Versions', () => {
+    it('Contains expected attributes', () => {
       assert.strictEqual('object', typeof sharp.versions);
-      assert.strictEqual('string', typeof sharp.versions.vips);
+      assert(semver.valid(sharp.versions.vips));
+      assert(semver.valid(sharp.versions.sharp));
+    });
+  });
+
+  describe('Block', () => {
+    it('Can block a named operation', () => {
+      sharp.block({ operation: ['test'] });
+    });
+    it('Can unblock a named operation', () => {
+      sharp.unblock({ operation: ['test'] });
+    });
+    it('Invalid block operation throws', () => {
+      assert.throws(() => sharp.block(1),
+        /Expected object for options but received 1 of type number/
+      );
+      assert.throws(() => sharp.block({}),
+        /Expected Array for operation but received undefined of type undefined/
+      );
+      assert.throws(() => sharp.block({ operation: 'fail' }),
+        /Expected Array for operation but received fail of type string/
+      );
+      assert.throws(() => sharp.block({ operation: ['maybe', false] }),
+        /Expected Array for operation but received maybe,false of type object/
+      );
+    });
+    it('Invalid unblock operation throws', () => {
+      assert.throws(() => sharp.unblock(1),
+        /Expected object for options but received 1 of type number/
+      );
+      assert.throws(() => sharp.unblock({}),
+        /Expected Array for operation but received undefined of type undefined/
+      );
+      assert.throws(() => sharp.unblock({ operation: 'fail' }),
+        /Expected Array for operation but received fail of type string/
+      );
+      assert.throws(() => sharp.unblock({ operation: ['maybe', false] }),
+        /Expected Array for operation but received maybe,false of type object/
+      );
     });
   });
 });
diff --git a/test/unit/webp.js b/test/unit/webp.js
new file mode 100644
index 000000000..22d37a522
--- /dev/null
+++ b/test/unit/webp.js
@@ -0,0 +1,320 @@
+/*!
+  Copyright 2013 Lovell Fuller and others.
+  SPDX-License-Identifier: Apache-2.0
+*/
+
+const fs = require('node:fs');
+const { describe, it } = require('node:test');
+const assert = require('node:assert');
+
+const sharp = require('../../');
+const fixtures = require('../fixtures');
+
+describe('WebP', () => {
+  it('WebP output', (_t, done) => {
+    sharp(fixtures.inputJpg)
+      .resize(320, 240)
+      .toFormat(sharp.format.webp)
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual('webp', info.format);
+        assert.strictEqual(320, info.width);
+        assert.strictEqual(240, info.height);
+        done();
+      });
+  });
+
+  it('Invalid WebP quality throws error', () => {
+    assert.throws(() => {
+      sharp().webp({ quality: 101 });
+    });
+  });
+
+  it('Invalid WebP alpha quality throws error', () => {
+    assert.throws(() => {
+      sharp().webp({ alphaQuality: 101 });
+    });
+  });
+
+  it('should work for webp alpha quality', (_t, done) => {
+    sharp(fixtures.inputPngAlphaPremultiplicationSmall)
+      .webp({ alphaQuality: 80, effort: 0 })
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual('webp', info.format);
+        fixtures.assertSimilar(fixtures.expected('webp-alpha-80.webp'), data, done);
+      });
+  });
+
+  it('should work for webp lossless', (_t, done) => {
+    sharp(fixtures.inputPngAlphaPremultiplicationSmall)
+      .webp({ lossless: true, effort: 0 })
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual('webp', info.format);
+        fixtures.assertSimilar(fixtures.expected('webp-lossless.webp'), data, done);
+      });
+  });
+
+  it('should work for webp near-lossless', (_t, done) => {
+    sharp(fixtures.inputPngAlphaPremultiplicationSmall)
+      .webp({ nearLossless: true, quality: 50, effort: 0 })
+      .toBuffer((err50, data50, info50) => {
+        if (err50) throw err50;
+        assert.strictEqual(true, data50.length > 0);
+        assert.strictEqual('webp', info50.format);
+        fixtures.assertSimilar(fixtures.expected('webp-near-lossless-50.webp'), data50, done);
+      });
+  });
+
+  it('should use near-lossless when both lossless and nearLossless are specified', (_t, done) => {
+    sharp(fixtures.inputPngAlphaPremultiplicationSmall)
+      .webp({ nearLossless: true, quality: 50, lossless: true, effort: 0 })
+      .toBuffer((err50, data50, info50) => {
+        if (err50) throw err50;
+        assert.strictEqual(true, data50.length > 0);
+        assert.strictEqual('webp', info50.format);
+        fixtures.assertSimilar(fixtures.expected('webp-near-lossless-50.webp'), data50, done);
+      });
+  });
+
+  it('should produce a larger file size using smartSubsample', () =>
+    sharp(fixtures.inputJpg)
+      .resize(320, 240)
+      .webp({ smartSubsample: false })
+      .toBuffer()
+      .then(withoutSmartSubsample =>
+        sharp(fixtures.inputJpg)
+          .resize(320, 240)
+          .webp({ smartSubsample: true })
+          .toBuffer()
+          .then(withSmartSubsample => {
+            assert.strictEqual(true, withSmartSubsample.length > withoutSmartSubsample.length);
+          })
+      )
+  );
+
+  it('invalid smartSubsample throws', () => {
+    assert.throws(() => {
+      sharp().webp({ smartSubsample: 1 });
+    });
+  });
+
+  it('can produce a different file size using smartDeblock', () =>
+    sharp(fixtures.inputPngOverlayLayer0)
+      .resize(320, 240)
+      .webp({ quality: 30, smartDeblock: false })
+      .toBuffer()
+      .then(withoutSmartDeblock =>
+        sharp(fixtures.inputPngOverlayLayer0)
+          .resize(320, 240)
+          .webp({ quality: 30, smartDeblock: true })
+          .toBuffer()
+          .then(withSmartDeblock => {
+            assert.strictEqual(true, withSmartDeblock.length !== withoutSmartDeblock.length);
+          })
+      )
+  );
+
+  it('invalid smartDeblock throws', () => {
+    assert.throws(
+      () => sharp().webp({ smartDeblock: 1 }),
+      /Expected boolean for webpSmartDeblock but received 1 of type number/
+    );
+  });
+
+  it('should produce a different file size with specific preset', () =>
+    sharp(fixtures.inputJpg)
+      .resize(320, 240)
+      .webp({ preset: 'default' })
+      .toBuffer()
+      .then(presetDefault =>
+        sharp(fixtures.inputJpg)
+          .resize(320, 240)
+          .webp({ preset: 'picture' })
+          .toBuffer()
+          .then(presetPicture => {
+            assert.notStrictEqual(presetDefault.length, presetPicture.length);
+          })
+      )
+  );
+
+  it('invalid preset throws', () => {
+    assert.throws(
+      () => sharp().webp({ preset: 'fail' }),
+      /Expected one of: default, photo, picture, drawing, icon, text for preset but received fail of type string/
+    );
+  });
+
+  it('should produce a smaller file size with increased effort', () =>
+    sharp(fixtures.inputJpg)
+      .resize(320, 240)
+      .webp()
+      .toBuffer()
+      .then(effort4 =>
+        sharp(fixtures.inputJpg)
+          .resize(320, 240)
+          .webp({ effort: 6 })
+          .toBuffer()
+          .then(effort6 => {
+            assert.strictEqual(true, effort4.length > effort6.length);
+          })
+      )
+  );
+
+  it('should produce different file size with/out shrink-on-load', async () => {
+    const [shrunk, resized] = await Promise.all([
+      sharp(fixtures.inputWebP).resize({ width: 16 }).toBuffer(),
+      sharp(fixtures.inputWebP).resize({ width: 16, fastShrinkOnLoad: false, kernel: 'nearest' }).toBuffer()
+    ]);
+    assert.notStrictEqual(shrunk.length, resized.length);
+  });
+
+  it('invalid effort throws', () => {
+    assert.throws(() => {
+      sharp().webp({ effort: true });
+    });
+  });
+
+  it('out of range effort throws', () => {
+    assert.throws(() => {
+      sharp().webp({ effort: -1 });
+    });
+  });
+
+  it('should set effort to 0', () => {
+    const effort = sharp().webp({ effort: 0 }).options.webpEffort;
+
+    assert.strictEqual(effort, 0);
+  });
+
+  it('valid minSize', () => {
+    assert.doesNotThrow(() => sharp().webp({ minSize: true }));
+  });
+
+  it('invalid minSize throws', () => {
+    assert.throws(
+      () => sharp().webp({ minSize: 1 }),
+      /Expected boolean for webpMinSize but received 1 of type number/
+    );
+  });
+
+  it('valid mixed', () => {
+    assert.doesNotThrow(() => sharp().webp({ mixed: true }));
+  });
+
+  it('invalid mixed throws', () => {
+    assert.throws(
+      () => sharp().webp({ mixed: 'fail' }),
+      /Expected boolean for webpMixed but received fail of type string/
+    );
+  });
+
+  it('invalid loop throws', () => {
+    assert.throws(() => {
+      sharp().webp({ loop: -1 });
+    });
+
+    assert.throws(() => {
+      sharp().webp({ loop: 65536 });
+    });
+  });
+
+  it('invalid delay throws', () => {
+    assert.throws(() => {
+      sharp().webp({ delay: -1 });
+    });
+
+    assert.throws(() => {
+      sharp().webp({ delay: [65536] });
+    });
+  });
+
+  it('should repeat a single delay for all frames', async () => {
+    const updated = await sharp(fixtures.inputWebPAnimated, { pages: -1 })
+      .webp({ delay: 100 })
+      .toBuffer()
+      .then(data => sharp(data, { pages: -1 }).metadata());
+
+    assert.deepStrictEqual(updated.delay, Array(updated.pages).fill(100));
+  });
+
+  it('should limit animation loop', async () => {
+    const updated = await sharp(fixtures.inputWebPAnimated, { pages: -1 })
+      .webp({ loop: 3 })
+      .toBuffer()
+      .then(data => sharp(data, { pages: -1 }).metadata());
+
+    assert.strictEqual(updated.loop, 3);
+  });
+
+  it('should change delay between frames', async () => {
+    const original = await sharp(fixtures.inputWebPAnimated, { pages: -1 }).metadata();
+
+    const expectedDelay = [...Array(original.pages).fill(40)];
+    const updated = await sharp(fixtures.inputWebPAnimated, { pages: -1 })
+      .webp({ delay: expectedDelay })
+      .toBuffer()
+      .then(data => sharp(data, { pages: -1 }).metadata());
+
+    assert.deepStrictEqual(updated.delay, expectedDelay);
+  });
+
+  it('should preserve delay between frames', async () => {
+    const updated = await sharp(fixtures.inputWebPAnimated, { pages: -1 })
+      .webp()
+      .toBuffer()
+      .then(data => sharp(data, { pages: -1 }).metadata());
+
+    assert.deepStrictEqual(updated.delay, [120, 120, 90, 120, 120, 90, 120, 90, 30]);
+  });
+
+  it('should work with streams when only animated is set', (_t, done) => {
+    fs.createReadStream(fixtures.inputWebPAnimated)
+      .pipe(sharp({ animated: true }))
+      .webp({ lossless: true, effort: 0 })
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual('webp', info.format);
+        fixtures.assertSimilar(fixtures.inputWebPAnimated, data, done);
+      });
+  });
+
+  it('should work with streams when only pages is set', (_t, done) => {
+    fs.createReadStream(fixtures.inputWebPAnimated)
+      .pipe(sharp({ pages: -1 }))
+      .webp({ lossless: true, effort: 0 })
+      .toBuffer((err, data, info) => {
+        if (err) throw err;
+        assert.strictEqual(true, data.length > 0);
+        assert.strictEqual('webp', info.format);
+        fixtures.assertSimilar(fixtures.inputWebPAnimated, data, done);
+      });
+  });
+
+  it('should resize animated image to page height', async () => {
+    const updated = await sharp(fixtures.inputWebPAnimated, { pages: -1 })
+      .resize({ height: 570 })
+      .webp({ effort: 0 })
+      .toBuffer()
+      .then(data => sharp(data, { pages: -1 }).metadata());
+
+    assert.strictEqual(updated.height, 570 * 9);
+    assert.strictEqual(updated.pageHeight, 570);
+  });
+
+  it('should take page parameter into account when animated is set', async () => {
+    const updated = await sharp(fixtures.inputWebPAnimated, { animated: true, page: 2 })
+      .resize({ height: 570 })
+      .webp({ effort: 0 })
+      .toBuffer()
+      .then(data => sharp(data, { pages: -1 }).metadata());
+
+    assert.strictEqual(updated.height, 570 * 7);
+    assert.strictEqual(updated.pageHeight, 570);
+  });
+});