Native Node.js bindings for liblzma — XZ/LZMA2 compression with browser support via WebAssembly.
- Quick Start
- What's New
- Browser Usage
- CLI Tool (nxz)
- Ecosystem Packages
- API Reference
- Advanced Configuration
- Benchmark
- Installation
- Testing
- Migration Guide (v1 → v2)
- Contributing
- Troubleshooting
- Bugs
- Acknowledgements
- License
XZ is a container for compressed archives. It offers one of the best compression ratios available, with a good balance between compression time and decompression speed/memory.
Only LZMA2 is supported for compression output, but the library can open and read any LZMA1- or LZMA2-compressed file.
npm install node-liblzma

import { xzAsync, unxzAsync, createXz, createUnxz } from 'node-liblzma';
// Simple: Compress a buffer
const compressed = await xzAsync(Buffer.from('Hello, World!'));
const decompressed = await unxzAsync(compressed);
// Streaming: Compress a file
import { createReadStream, createWriteStream } from 'fs';
createReadStream('input.txt')
.pipe(createXz())
.pipe(createWriteStream('output.xz'));
// With progress monitoring
const compressor = createXz();
compressor.on('progress', ({ bytesRead, bytesWritten }) => {
console.log(`${bytesRead} bytes in → ${bytesWritten} bytes out`);
});

Promise style (with .then())
import { xzAsync, unxzAsync } from 'node-liblzma';
xzAsync(Buffer.from('Hello, World!'))
.then(compressed => {
console.log('Compressed size:', compressed.length);
return unxzAsync(compressed);
})
.then(decompressed => {
console.log('Decompressed:', decompressed.toString());
})
.catch(err => {
console.error('Compression failed:', err);
});

Callback style (Node.js traditional)
import { xz, unxz } from 'node-liblzma';
xz(Buffer.from('Hello, World!'), (err, compressed) => {
if (err) throw err;
unxz(compressed, (err, decompressed) => {
if (err) throw err;
console.log('Decompressed:', decompressed.toString());
});
});

📖 Full API documentation: oorabona.github.io/node-liblzma
Live Demo — Try XZ compression in your browser.
- Browser/WASM support: Full XZ compression and decompression via WebAssembly
- Same API as Node.js (xzAsync, unxzAsync, createXz, createUnxz)
- WASM binary: ~52KB gzipped (under 100KB budget)
- Web Streams API for streaming compression/decompression
- Zero-config inline mode: import from 'node-liblzma/inline'
- CLI tool (nxz): Portable xz-like command line tool
- Full xz compatibility: -z, -d, -l, -k, -f, -c, -o, -v, -q
- Compression presets 0-9 with extreme mode (-e)
- tar.xz archive support: create, list, extract (compatible with system tar)
- Progress display, stdin/stdout piping, benchmarking (-B)
- tar-xz package: Create/extract .tar.xz archives — Node.js streaming + browser WASM
- Progress events: Monitor compression/decompression in real-time
- XZ Utils 5.8.x: Updated to latest stable version
- 519 tests: Comprehensive test suite with 100% code coverage
- Full TypeScript migration: Complete rewrite from CoffeeScript
- Promise-based APIs: xzAsync() and unxzAsync()
- Modern tooling: Vitest, Biome, pnpm, pre-commit hooks
- Node.js >= 16 required (updated from >= 12)
Legacy (N-API migration)
In earlier versions, N-API replaced nan as the binding layer, providing a stable ABI for native modules.
Tested on: Linux x64, macOS (x64/arm64), Raspberry Pi 2/3/4, Windows.
Prebuilt binaries are bundled for: Windows x64, Linux x64, macOS x64/arm64.
| Flag | Description | Default | Values |
|---|---|---|---|
| USE_GLOBAL | Use system liblzma library | yes (no on Windows) | yes, no |
| RUNTIME_LINK | Static or shared linking | shared | static, shared |
| ENABLE_THREAD_SUPPORT | Enable thread support | yes | yes, no |
If no prebuilt binary matches your platform, node-gyp will compile from source automatically.
Live Demo — Try XZ compression in your browser right now.
node-liblzma v3.0.0+ supports XZ compression in the browser via WebAssembly. The same API works in both Node.js and browsers — bundlers (Vite, Webpack, esbuild) automatically resolve the WASM-backed implementation.
// Bundlers auto-resolve to WASM in browser, native in Node.js
import { xzAsync, unxzAsync, isXZ } from 'node-liblzma';
// Compress
const compressed = await xzAsync('Hello, browser!');
// Decompress
const original = await unxzAsync(compressed);
// Check if data is XZ-compressed
if (isXZ(someBuffer)) {
const data = await unxzAsync(someBuffer);
}

import { createXz, createUnxz } from 'node-liblzma';
// Compress a fetch response
const response = await fetch('/large-file.bin');
const compressed = response.body.pipeThrough(createXz({ preset: 6 }));
// Decompress
const decompressed = compressed.pipeThrough(createUnxz());

| Import | When to use |
|---|---|
| node-liblzma | Standard — bundler resolves to WASM (browser) or native (Node.js) |
| node-liblzma/wasm | Explicit WASM usage in Node.js (no native addon needed) |
| node-liblzma/inline | Zero-config — WASM embedded as base64 (no external file to serve) |
// Explicit WASM (works in Node.js too, no native build required)
import { xzAsync } from 'node-liblzma/wasm';
// Inline mode (larger bundle, but no WASM file to configure)
import { ensureInlineInit, xzAsync } from 'node-liblzma/inline';
await ensureInlineInit(); // Decodes embedded base64 WASM
const compressed = await xzAsync(data);

- No sync APIs: xzSync()/unxzSync() throw LZMAError in browsers
- Presets 0-6 only: Presets 7-9 require more memory than WASM's 256MB limit (see the sketch below)
- No filesystem: xzFile()/unxzFile() are not available
- No Node Streams: Use createXz()/createUnxz() (Web TransformStream) instead of Xz/Unxz classes
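Since presets 7-9 are unavailable under WASM, code shared between Node.js and browsers should cap the preset before calling the API. A minimal sketch, assuming a simple environment check; the compressPortable helper is hypothetical and not part of the library:

```typescript
import { xzAsync } from 'node-liblzma';

// Hypothetical helper: clamp the preset so the same call works under WASM
// (presets 0-6 only) as well as with the native Node.js addon.
const inBrowser = typeof window !== 'undefined';

export async function compressPortable(input: string, preset = 9) {
  const effectivePreset = inBrowser ? Math.min(preset, 6) : preset;
  return xzAsync(input, { preset: effectivePreset });
}
```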
| Component | Raw | Gzipped |
|---|---|---|
| liblzma.wasm | ~107KB | ~52KB |
| Glue code (liblzma.js) | ~6KB | ~2KB |
| Total | ~113KB | ~54KB |
For detailed browser setup instructions, see docs/BROWSER.md.
This package includes nxz, a portable xz-like CLI tool that works on any platform with Node.js.
# Global installation (recommended for CLI usage)
npm install -g node-liblzma
# Then use directly
nxz --help

# Compress a file (creates file.txt.xz, deletes original)
nxz file.txt
# Decompress (auto-detected from .xz extension)
nxz file.txt.xz
# Keep original file (-k)
nxz -k file.txt
# Maximum compression (-9) with extreme mode (-e)
nxz -9e large-file.bin
# Compress to stdout (-c) for piping
nxz -c file.txt > file.txt.xz
# List archive info (-l / -lv for verbose)
nxz -l file.txt.xz
# Benchmark native vs WASM performance (-B)
nxz -B file.txt

nxz can create, list, and extract .tar.xz archives — auto-detected from file extension:
# Create a tar.xz archive from files/directories
nxz -T src/ lib/ README.md -o project.tar.xz
# List archive contents
nxz -Tl project.tar.xz
# Extract archive to a directory
nxz -Td project.tar.xz -o output/
# Extract with path stripping (like tar --strip-components)
nxz -Td project.tar.xz --strip=1 -o output/

Archives created by nxz are fully compatible with system tar -xJf.
| Option | Long | Description |
|---|---|---|
| -z | --compress | Force compression mode |
| -d | --decompress | Force decompression mode |
| -l | --list | List archive information |
| -T | --tar | Treat file as tar.xz archive (auto-detected for .tar.xz/.txz) |
| -B | --benchmark | Benchmark native vs WASM performance |
| -k | --keep | Keep original file (don't delete) |
| -f | --force | Overwrite existing output file |
| -c | --stdout | Write to stdout, keep original file |
| -o | --output=FILE | Write output to specified file (or directory for tar extract) |
| -v | --verbose | Show progress for large files |
| -q | --quiet | Suppress warning messages |
| -0..-9 | | Compression level (default: 6) |
| -e | --extreme | Extreme compression (slower) |
| --strip=N | | Strip N leading path components on tar extract |
| -h | --help | Show help |
| -V | --version | Show version |
One-shot usage (without global install)
# Standalone nxz-cli package (recommended — smaller, faster install)
npx nxz-cli --help
pnpm dlx nxz-cli --help
# Or via the full node-liblzma package
npx --package node-liblzma nxz --help
pnpm dlx --package node-liblzma nxz --help

| Code | Meaning |
|---|---|
| 0 | Success |
| 1 | Error (file not found, format error, etc.) |
| 130 | Interrupted (SIGINT/Ctrl+C) |
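If you drive nxz from a Node.js script, the exit codes above can be checked programmatically. A minimal sketch using child_process; the file name is illustrative:

```typescript
import { spawnSync } from 'child_process';

// Run nxz and map its documented exit codes.
const result = spawnSync('nxz', ['-k', 'input.txt'], { stdio: 'inherit' });

if (result.status === 0) {
  console.log('Compressed successfully');
} else if (result.status === 130) {
  console.warn('Interrupted (SIGINT)');
} else {
  console.error(`nxz failed with exit code ${result.status}`);
}
```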
node-liblzma powers a family of focused packages:
| Package | Description | Install |
|---|---|---|
| node-liblzma | Core XZ library — Node.js native + browser WASM | npm i node-liblzma |
| tar-xz | Create/extract .tar.xz archives — Node.js + browser | npm i tar-xz |
| nxz-cli | Standalone CLI — npx nxz-cli file.txt | npx nxz-cli |
Live Demo — Create and extract tar.xz archives in your browser.
A library for working with .tar.xz archives, with dual Node.js (streaming) and browser (buffer-based) APIs. It fills the gap left by node-tar, which does not support .tar.xz.
// Node.js — streaming API
import { create, extract, list } from 'tar-xz';
await create({ file: 'archive.tar.xz', cwd: './src', files: ['index.ts', 'utils.ts'] });
const entries = await list({ file: 'archive.tar.xz' });
await extract({ file: 'archive.tar.xz', cwd: './output' });
// Browser — buffer-based API
import { createTarXz, extractTarXz, listTarXz } from 'tar-xz';
const archive = await createTarXz({ files: [{ name: 'hello.txt', content: data }] });
const files = await extractTarXz(archive);
const entries = await listTarXz(archive);

A lightweight wrapper package for running nxz without installing the full node-liblzma:
# No install needed
npx nxz-cli file.txt # compress
npx nxz-cli -d file.txt.xz # decompress
npx nxz-cli -T src/ -o app.tar.xz # create tar.xz archive
# Or install globally
npm install -g nxz-cli

The API mirrors Node.js Zlib for easy adoption:
// CommonJS
var lzma = require('node-liblzma');
// TypeScript / ES6 modules
import * as lzma from 'node-liblzma';

| Zlib | node-liblzma | Arguments |
|---|---|---|
| createGzip | createXz | ([options]) |
| createGunzip | createUnxz | ([options]) |
| gzip | xz | (buf, [options], callback) |
| gunzip | unxz | (buf, [options], callback) |
| gzipSync | xzSync | (buf, [options]) |
| gunzipSync | unxzSync | (buf, [options]) |
| - | xzAsync | (buf, [options]) → Promise<Buffer> |
| - | unxzAsync | (buf, [options]) → Promise<Buffer> |
| - | xzFile | (input, output, [options]) → Promise<void> |
| - | unxzFile | (input, output, [options]) → Promise<void> |
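Because the factories mirror zlib's streaming shape, porting an existing gzip pipeline is usually a one-line change. A sketch of the swap; file names are illustrative:

```typescript
import { createReadStream, createWriteStream } from 'fs';
import { pipeline } from 'stream/promises';
// import { createGzip } from 'zlib';      // before
import { createXz } from 'node-liblzma';   // after

// Same pipeline shape as zlib: only the transform factory changes.
await pipeline(
  createReadStream('data.json'),
  createXz(),                              // was: createGzip()
  createWriteStream('data.json.xz'),
);
```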
| Attribute | Type | Description | Values |
|---|---|---|---|
| check | number | Integrity check | check.NONE, check.CRC32, check.CRC64, check.SHA256 |
| preset | number | Compression level (0-9) | preset.DEFAULT (6), preset.EXTREME |
| mode | number | Compression mode | mode.FAST, mode.NORMAL |
| threads | number | Thread count | 0 = auto (all cores), 1 = single-threaded, N = N threads |
| filters | array | Filter chain | filter.LZMA2, filter.X86, filter.ARM, etc. |
| chunkSize | number | Processing chunk size | Default: 64KB |
For further information, see the XZ SDK documentation.
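A sketch combining several of these options: a CRC64 integrity check, the default preset, an explicit LZMA2 filter chain, and a larger chunk size. The constant namespaces follow the table above; tune filter parameters per the XZ SDK documentation.

```typescript
import * as lzma from 'node-liblzma';

// Option names and constant namespaces follow the table above; the
// particular values are illustrative.
const compressor = lzma.createXz({
  check: lzma.check.CRC64,      // stronger integrity check
  preset: lzma.preset.DEFAULT,  // level 6
  filters: [lzma.filter.LZMA2], // explicit filter chain
  chunkSize: 256 * 1024,        // 256KB processing chunks
});
```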
Multi-threaded compression is available when built with ENABLE_THREAD_SUPPORT=yes (default).
| Value | Behavior |
|---|---|
| 0 | Auto-detect: use all available CPU cores |
| 1 | Single-threaded (default) |
| N | Use exactly N threads |
import { createXz, hasThreads } from 'node-liblzma';
if (hasThreads()) {
const compressor = createXz({ threads: 0 }); // auto-detect
}

Note: Threads only apply to compression, not decompression.
Track compression and decompression progress in real-time:
import { createXz, createUnxz } from 'node-liblzma';
const compressor = createXz({ preset: 6 });
compressor.on('progress', ({ bytesRead, bytesWritten }) => {
const ratio = bytesWritten / bytesRead;
console.log(`Progress: ${bytesRead} in, ${bytesWritten} out (ratio: ${ratio.toFixed(2)})`);
});
inputStream.pipe(compressor).pipe(outputStream);

Progress events fire after each chunk is processed. They work with streams, not the buffer APIs.
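When the total input size is known up front, the byte counters translate directly into a percentage. A minimal sketch; the file name is illustrative:

```typescript
import { createReadStream, createWriteStream, statSync } from 'fs';
import { createXz } from 'node-liblzma';

// Turn the progress counters into a percentage using the known file size.
const input = 'input.bin';
const totalBytes = statSync(input).size;

const compressor = createXz();
compressor.on('progress', ({ bytesRead, bytesWritten }) => {
  const percent = ((bytesRead / totalBytes) * 100).toFixed(1);
  console.log(`${percent}% read, ${bytesWritten} bytes written`);
});

createReadStream(input).pipe(compressor).pipe(createWriteStream(`${input}.xz`));
```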
For production environments with high concurrency needs:
import { LZMAPool } from 'node-liblzma';
const pool = new LZMAPool(10); // Max 10 concurrent operations
pool.on('metrics', (metrics) => {
console.log(`Active: ${metrics.active}, Queued: ${metrics.queued}`);
});
const compressed = await pool.compress(buffer);
const decompressed = await pool.decompress(compressed);

import { xzFile, unxzFile } from 'node-liblzma';
await xzFile('input.txt', 'output.txt.xz');
await unxzFile('output.txt.xz', 'restored.txt');
// With options
await xzFile('large-file.bin', 'compressed.xz', { preset: 9, threads: 4 });

Files larger than 512MB are handled automatically via streaming, keeping the memory footprint low.
Typed error classes for precise error handling:
import {
xzAsync, LZMAError, LZMAMemoryError, LZMADataError, LZMAFormatError
} from 'node-liblzma';
try {
const compressed = await xzAsync(buffer);
} catch (error) {
if (error instanceof LZMAMemoryError) {
console.error('Out of memory:', error.message);
} else if (error instanceof LZMADataError) {
console.error('Corrupt data:', error.message);
} else if (error instanceof LZMAFormatError) {
console.error('Invalid format:', error.message);
}
}

Available error classes: LZMAError (base), LZMAMemoryError, LZMAMemoryLimitError, LZMAFormatError, LZMAOptionsError, LZMADataError, LZMABufferError, LZMAProgrammingError.
const stream = createXz({
preset: lzma.preset.DEFAULT,
chunkSize: 256 * 1024 // 256KB chunks (default: 64KB)
});

| File Size | Recommended chunkSize |
|---|---|
| < 1MB | 64KB (default) |
| 1-10MB | 128-256KB |
| > 10MB | 512KB-1MB |
Maximum buffer size: 512MB per operation (security limit). For larger files, use streaming APIs.
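One way to respect that limit without special-casing call sites is to pick the API based on input size. A sketch assuming local files; the threshold constant and helper name are illustrative, not part of the library:

```typescript
import { createReadStream, createWriteStream } from 'fs';
import { readFile, writeFile, stat } from 'fs/promises';
import { createXz, xzAsync } from 'node-liblzma';

const BUFFER_LIMIT = 512 * 1024 * 1024; // 512MB buffer-API ceiling

async function compressAnySize(input: string, output: string): Promise<void> {
  const { size } = await stat(input);
  if (size < BUFFER_LIMIT) {
    // Small enough for the in-memory buffer API.
    await writeFile(output, await xzAsync(await readFile(input)));
    return;
  }
  // Too large for a single buffer: stream instead, memory stays bounded.
  await new Promise<void>((resolve, reject) => {
    createReadStream(input)
      .pipe(createXz())
      .pipe(createWriteStream(output))
      .on('finish', resolve)
      .on('error', reject);
  });
}
```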
The low-level native callback follows an errno-style contract matching liblzma behavior:
- Signature: (errno: number, availInAfter: number, availOutAfter: number)
- Success: errno is LZMA_OK or LZMA_STREAM_END
- Error: any other errno value
High-level APIs remain ergonomic — Promise functions resolve to Buffer or reject with Error, stream users listen to error events.
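In practice, stream consumers attach a standard error listener and can still discriminate on the typed error classes. A minimal sketch; file names are illustrative:

```typescript
import { createReadStream, createWriteStream } from 'fs';
import { createUnxz, LZMADataError } from 'node-liblzma';

// Stream users see failures via the usual 'error' event, not errno codes.
const decompressor = createUnxz();
decompressor.on('error', (err) => {
  if (err instanceof LZMADataError) {
    console.error('Input is not valid XZ data:', err.message);
  } else {
    console.error('Decompression failed:', err);
  }
});

createReadStream('archive.xz').pipe(decompressor).pipe(createWriteStream('archive'));
```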
All three backends use the same liblzma library and produce identical compression ratios:
System xz (fastest) > nxz native (C++ addon, ~1-2x slower) > nxz WASM (Emscripten, ~2-5x slower on decompression, ~1x on compression of large files)
| Backend | Compress | Decompress | Size | Environment |
|---|---|---|---|---|
| System xz 5.8 | 81 ms | 4 ms | 76.7 KB | C binary |
| nxz native | 90 ms | 3.4 ms | 76.7 KB | Node.js + C++ addon |
| nxz WASM | 86 ms | 7.9 ms | 76.7 KB | Node.js + Emscripten |
| Data | Compress | Decompress | Notes |
|---|---|---|---|
| 1 KB text | WASM 2.8x slower | WASM 4.9x slower | Startup overhead dominates |
| 135 KB binary | ~1:1 | WASM 2x slower | Compression near-parity |
| 246 KB source | ~1:1 | WASM 2.3x slower | Realistic workload |
| 1 MB random | ~1:1 | WASM 1.6x slower | Gap narrows with size |
Running benchmarks
# Compare nxz (native) vs system xz across file sizes
./scripts/benchmark.sh
./scripts/benchmark.sh -s 1,50,200 -p 6,9 # custom sizes/presets
./scripts/benchmark.sh -o csv > results.csv # export as CSV/JSON
# Compare native addon vs WASM backend
nxz --benchmark file.txt
nxz -B -3 large-file.bin # with preset 3

| Scenario | Recommended |
|---|---|
| Browser | WASM (only option) |
| Node.js, performance-critical | Native addon |
| Node.js, no C++ toolchain available | WASM (node-liblzma/wasm) |
| Cross-platform scripts | nxz CLI |
| Batch processing many files | System xz |
| CI/CD with Node.js already installed | nxz CLI |
npm install node-liblzma
# or
pnpm add node-liblzma

If prebuilt binaries don't match your platform, install system development libraries:
# Debian/Ubuntu
sudo apt-get install liblzma-dev
# macOS
brew install xz
# Windows (automatic download and build)
npm install node-liblzma --build-from-source

# Force rebuild with default options
npm install node-liblzma --build-from-source
# Disable thread support
ENABLE_THREAD_SUPPORT=no npm install node-liblzma --build-from-source

Custom XZ installation
If you compiled XZ outside of node-liblzma:
export CPATH=$HOME/path/to/headers
export LIBRARY_PATH=$HOME/path/to/lib
export LD_LIBRARY_PATH=$HOME/path/to/lib:$LD_LIBRARY_PATH
npm install

pnpm test # Run all 519 tests
pnpm test:watch # Watch mode
pnpm test:coverage # Coverage report
pnpm type-check # TypeScript type checking

Tests use Vitest with 100% code coverage across statements, branches, functions, and lines.
Breaking changes and new features
- Node.js >= 16 required (was >= 12)
- ESM module format (import instead of require)
- TypeScript source (CoffeeScript removed)
// Promise-based APIs (recommended)
const compressed = await xzAsync(buffer);
// Typed error classes
import { LZMAMemoryError, LZMADataError } from 'node-liblzma';
// Concurrency control
const pool = new LZMAPool(10);
const results = await Promise.all(files.map(f => pool.compress(f)));
// File helpers
await xzFile('input.txt', 'output.txt.xz');

- Linter: Biome (replaces ESLint + Prettier)
- Tests: Vitest (replaces Mocha)
- Package Manager: pnpm recommended
We welcome contributions! See the full contributing guidelines below.
git clone https://github.com/oorabona/node-liblzma.git
cd node-liblzma
pnpm install
pnpm build
pnpm test

pnpm test # Run tests
pnpm test:watch # Watch mode
pnpm test:coverage # Coverage report
pnpm check # Lint + format check (Biome)
pnpm check:write # Auto-fix lint/format
pnpm type-check # TypeScript types

We follow Conventional Commits:
feat: add LZMAPool for concurrency control
fix: resolve memory leak in FunctionReference
docs: add migration guide for v2.0
- Fork and create a feature branch (feat/, fix/, refactor/, docs/)
- Add tests for new functionality (100% coverage required)
- Run pnpm check:write && pnpm type-check && pnpm test
- Commit with conventional commits and push
- CI checks run automatically on PR
Build issues
"Cannot find liblzma library" — Install system dev package:
sudo apt-get install liblzma-dev # Debian/Ubuntu
brew install xz # macOS

"node-gyp rebuild failed" — Install build tools:
sudo apt-get install build-essential python3 # Linux
xcode-select --install # macOS
npm install --global windows-build-tools # Windows

"Prebuilt binary not found" — Build from source:
npm install node-liblzma --build-from-source

Runtime issues
LZMAMemoryError — Input too large for buffer API. Use streaming:
createReadStream('large.bin').pipe(createXz()).pipe(createWriteStream('large.xz'));

LZMADataError — File is not XZ-compressed or is corrupted. Verify with file compressed.xz or xz -t compressed.xz.
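A programmatic alternative is to check the XZ magic bytes with isXZ before decompressing; the file name is illustrative:

```typescript
import { readFile } from 'fs/promises';
import { isXZ, unxzAsync } from 'node-liblzma';

// Reject non-XZ input early instead of waiting for an LZMADataError.
const data = await readFile('compressed.xz');
if (!isXZ(data)) {
  throw new Error('compressed.xz is not an XZ file');
}
const restored = await unxzAsync(data);
```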
Slow on multi-core — Enable threads: createXz({ threads: 0 }) (auto-detect cores).
High memory with concurrency — Use LZMAPool to limit simultaneous operations.
Windows-specific
Build fails — Install Visual Studio Build Tools and set Python:
npm install --global windows-build-tools
npm config set python python3

Path issues — Use path.join() instead of hardcoded separators.
- tar-xz — Create/extract tar.xz archives (powered by node-liblzma)
- nxz-cli — Standalone CLI for XZ compression
- lzma-purejs — Pure JavaScript LZMA implementation
- node-xz — Node binding of XZ library
- lzma-native — Complete XZ library bindings
If you find one, feel free to open a new issue! PRs are welcome as well :)
If you compile with thread support enabled, you may see warnings about -Wmissing-field-initializers.
This is expected and does not prevent threading from working.
Kudos to addaleax for helping out with C++ stuff!
This software is released under LGPL-3.0+.