Use Cases
Explore real-world workflows you can ship today—powered by xorq’s flexible, engine-agnostic design.



30+ ML integrations
"The Xorq framework greatly simplified our ML pipelines, improving performance 10x and reducing the compute and storage resources required to run them."

Daniel Ashy
Chief Technology Officer, Yendo
Get started with xorq
Start on your own, or request a free build.
Install xorq
Spin up your first xorq engine in minutes—locally or in the cloud.
Install with pip
pip install xorq
Or use nix instead
nix run github:xorq-labs/xorq
Request a demo
Not sure where to start? We’ll build your first xorq pipeline—free. Tailored just to your stack, your use case, and your goals.
Start with a template
Start with a pre-built pipeline tailored to real-world ML tasks—modify it, run it, and make it yours in minutes.
More ways to use xorq
Users often build custom machinery for common ML inference tasks such as online inference and batch scoring. To address the batch-scoring use case, we are previewing a new built-in UDF for XGBoost models in xorq that can score data row-wise via a DataFrame API. Moreover, with xorq's multi-engine support, data practitioners can connect to SQL sources without ever leaving their DataFrame API, working from a familiar, high-level, pandas-like DSL.
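As an illustration, the snippet below sketches the kind of row-wise scoring such a UDF wraps, using plain pandas and XGBoost. The column names, toy data, and model settings are made up for the example, and the xorq-side registration of the UDF is omitted since the feature is still in preview.

import pandas as pd
import xgboost as xgb

# Toy training data; a stand-in for a model you have already fit and stored.
train = pd.DataFrame({
    "age": [25, 40, 33, 58],
    "balance": [1200.0, 300.0, 950.0, 40.0],
    "label": [0, 1, 0, 1],
})
model = xgb.XGBClassifier(n_estimators=10, max_depth=2)
model.fit(train[["age", "balance"]], train["label"])

def score_batch(df: pd.DataFrame) -> pd.Series:
    # Score one batch of rows; this is the body a pandas-style UDF would run per batch.
    return pd.Series(model.predict_proba(df[["age", "balance"]])[:, 1], index=df.index)

# Batch scoring over new rows, applied the same way the UDF would apply it.
new_rows = pd.DataFrame({"age": [29, 61], "balance": [700.0, 15.0]})
print(score_batch(new_rows))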
Deploy an ML pipeline across DuckDB for training and Spark for large-scale scoring—without rewriting a single line.
From dev to prod,
without any rewrites.
Better ML pipelines.
Launched anywhere.
Try xorq today, or request a walkthrough.
