Asset processing pipeline that runs entirely in the browser. Build pipelines with the node-based editor, then run them in your web app with the pipemagic runtime. Supports AI models via WebGPU — no server required.
Node Editor · Example App · npm Package
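The runtime ships as the `pipemagic` package on npm (see the package table below), so consumers install it with their usual package manager before importing it, e.g.:

```shell
npm install pipemagic
```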
```ts
import { PipeMagic } from "pipemagic";

const pm = new PipeMagic();
const result = await pm.run(
  pipeline, // create these with the Node Editor
  imageFile,
);
// result.blob → output PNG
```

- Input — Resize and fit source images
- Remove BG — AI background removal using RMBG-1.4 via Transformers.js (WebGPU / WASM)
- Normalize — Auto-crop to content bounding box, center on square canvas with padding
- Outline — Configurable outline via Jump Flooding Algorithm on WebGPU, with canvas fallback
- Estimate Depth — Monocular depth estimation using Depth Anything V2 via Transformers.js (WebGPU / WASM). Fast (~25 MB) and Quality (~40 MB) modes
- Face Parse — Face segmentation into 19 classes (skin, eyes, hair, etc.) using face-parsing via Transformers.js (WebGPU / WASM). Outputs a color-coded segmentation map
- Upscale 2x — Super-resolution with WebSR (Anime4K CNN models)
- Output — Encode to PNG / JPEG / WebP
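The Outline node's Jump Flooding Algorithm runs as a WebGPU compute pass in practice, but the core idea fits in a few lines on the CPU. The sketch below is illustrative only, not the runtime's implementation: the `jumpFlood` helper, the grid size, and the outline threshold are invented for the example. JFA propagates each cell's nearest seed in passes with halving step sizes, yielding a distance field that can be thresholded into an outline ring around the opaque pixels.

```typescript
// CPU sketch of jump flooding (hypothetical helper, not the pipemagic API).
// Seeds are the "opaque" pixels; afterwards each cell knows its nearest seed.
type Seed = { x: number; y: number } | null;

function jumpFlood(
  width: number,
  height: number,
  isSeed: (x: number, y: number) => boolean,
): number[] {
  let nearest: Seed[] = new Array(width * height).fill(null);
  // Initialize: each seed cell points to itself.
  for (let y = 0; y < height; y++)
    for (let x = 0; x < width; x++)
      if (isSeed(x, y)) nearest[y * width + x] = { x, y };

  // Passes with step sizes N, N/2, ..., 1; each cell samples 9 neighbors.
  for (
    let step = 1 << Math.ceil(Math.log2(Math.max(width, height)));
    step >= 1;
    step >>= 1
  ) {
    const next = nearest.slice();
    for (let y = 0; y < height; y++) {
      for (let x = 0; x < width; x++) {
        let best = next[y * width + x];
        for (const dy of [-step, 0, step]) {
          for (const dx of [-step, 0, step]) {
            const nx = x + dx, ny = y + dy;
            if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
            const cand = nearest[ny * width + nx];
            if (!cand) continue;
            const dBest = best ? (best.x - x) ** 2 + (best.y - y) ** 2 : Infinity;
            const dCand = (cand.x - x) ** 2 + (cand.y - y) ** 2;
            if (dCand < dBest) best = cand;
          }
        }
        next[y * width + x] = best;
      }
    }
    nearest = next;
  }
  // Euclidean distance to the nearest seed, per cell.
  return nearest.map((s, i) => {
    const x = i % width, y = Math.floor(i / width);
    return s ? Math.hypot(s.x - x, s.y - y) : Infinity;
  });
}

// Outline ring: cells near a seed, excluding the seeds themselves.
const W = 8, H = 8;
const isSeed = (x: number, y: number) => x >= 3 && x <= 4 && y >= 3 && y <= 4;
const dist = jumpFlood(W, H, isSeed);
const outline = dist.map((d) => d > 0 && d <= 1.5);
```

On the GPU this same neighbor-sampling loop becomes one compute dispatch per step, which is why JFA produces a full distance field in O(log n) passes.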
| Package | Description |
|---|---|
| `packages/runtime` | Standalone pipeline runtime, published as `pipemagic` on npm. Framework-agnostic — use it in any web app to run image processing pipelines. |
| `packages/example` | Minimal demo app (live). Vanilla Vite + TypeScript, ~180 lines. Drop in an image and it runs the full sticker pipeline. |
| `app/` | The main PipeMagic editor UI — a Nuxt 3 / Vue 3 app with Vue Flow, Pinia, and Tailwind CSS. Imports the runtime from `pipemagic`. |
| `shared/types/` | TypeScript type definitions shared between the editor and the runtime. |
```sh
yarn install
yarn dev
```

Open http://localhost:3003.
To run the example app:
```sh
yarn build:runtime
yarn dev:example
```

Open http://localhost:3005.
MIT
