
Enterprise Batch Processing Powered by Rust

Spring Batch patterns you know. Rust performance you need.

The Problem

Batch jobs fail silently. Memory grows until the process crashes. Errors get swallowed. Retries are manual. Monitoring is an afterthought. Rolling your own framework means reinventing the same hard parts every time.

The Solution

Spring Batch RS brings proven patterns from the Java ecosystem — chunk-oriented processing, skip & retry policies, job lifecycle tracking — rewritten in Rust for memory safety and predictable performance. No GC. No surprises.
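The chunk-oriented pattern at the heart of these frameworks is easy to sketch in plain Rust. This is an illustrative sketch, not the crate's API: read items one at a time, buffer them into fixed-size chunks, and write each chunk in a single operation so memory stays bounded no matter how large the input is.

```rust
// Illustrative sketch of chunk-oriented processing (not the crate's API):
// read items one by one, but write them in fixed-size chunks.
fn run_chunked<I, O>(
    mut read: impl FnMut() -> Option<I>,
    process: impl Fn(I) -> O,
    mut write: impl FnMut(&[O]),
    chunk_size: usize,
) {
    let mut buffer = Vec::with_capacity(chunk_size);
    while let Some(item) = read() {
        buffer.push(process(item));
        if buffer.len() == chunk_size {
            write(&buffer); // one I/O call per chunk, not per item
            buffer.clear();
        }
    }
    if !buffer.is_empty() {
        write(&buffer); // flush the final partial chunk
    }
}
```

Peak memory is proportional to `chunk_size`, not to the total number of records, which is why the pattern scales to inputs far larger than RAM.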

RUNTIME

No GC Pauses

Rust’s ownership model means no garbage collector to interrupt your batch jobs at runtime. Your throughput is consistent — no pause-the-world surprises at 3 AM.

SAFETY

Memory Safety

Buffer overflows and null pointer errors are compile-time failures, not production incidents. The compiler catches entire classes of bugs before they reach your data.

ABSTRACTIONS

Zero-Cost Abstractions

The chunk-oriented pipeline, builder patterns, and trait-based extensibility add no overhead beyond what you explicitly write. High-level code, systems-level performance.
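The "zero cost" claim rests on Rust generics: when a pipeline is generic over a processor type, the compiler monomorphizes a specialized copy per concrete type, so the trait call is a direct, inlinable function call. A minimal sketch of that style of trait-based extensibility (names here are illustrative, not the crate's API):

```rust
// Illustrative sketch (not the crate's API): a processor trait used
// through generics. Each concrete processor gets its own monomorphized
// copy of `process_all`, so there is no vtable and no allocation.
trait ItemProcessor<I, O> {
    fn process(&self, item: I) -> O;
}

struct Uppercase;

impl ItemProcessor<String, String> for Uppercase {
    fn process(&self, item: String) -> String {
        item.to_uppercase()
    }
}

// Generic over P: the compiler emits code as if Uppercase::process
// were called directly.
fn process_all<I, O, P: ItemProcessor<I, O>>(p: &P, items: Vec<I>) -> Vec<O> {
    items.into_iter().map(|i| p.process(i)).collect()
}
```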

TYPE SAFETY

Type-Safe Pipelines

Your reader, processor, and writer types must match at compile time. Connect a CsvReader<Product> to a JsonWriter<Order> and it simply won’t compile.
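The mechanism behind this is ordinary generic unification: if the step is generic over a single item type shared by reader and writer, a mismatch is a type error. A minimal sketch under that assumption (these `Reader`/`Writer`/`connect` names are hypothetical, not the crate's API):

```rust
// Illustrative sketch (not the crate's API) of why mismatched pipelines
// fail to compile: one type parameter `T` is shared by both ends.
struct Reader<T>(Vec<T>);
struct Writer<T>(Vec<T>);

fn connect<T>(reader: Reader<T>, mut writer: Writer<T>) -> Writer<T> {
    writer.0.extend(reader.0);
    writer
}

#[allow(dead_code)]
struct Product { id: u32 }
#[allow(dead_code)]
struct Order { total: f64 }

// connect(Reader::<Product>(vec![]), Writer::<Order>(vec![]));
// ^ rejected at compile time: expected `Writer<Product>`, found `Writer<Order>`
```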

Works With Your Stack

Databases
PostgreSQL · MySQL · SQLite · MongoDB · SeaORM
Formats
CSV · JSON · XML
Utilities
Tokio Async · Fault Tolerance · ZIP · FTP · Fake Data
LIVE EXAMPLE

CSV → JSON Pipeline

Transform data in just a few lines of elegant, type-safe code

use spring_batch_rs::prelude::*;

// 1. Define your data shape — Rust enforces type safety at compile time
#[derive(Deserialize, Serialize)]
struct Product {
    id: u32,
    name: String,
    price: f64,
}

// 2. Configure source and destination using builder patterns
let reader = CsvItemReaderBuilder::<Product>::new()
    .from_path("products.csv")
    .has_headers(true)
    .build();

// Passthrough processor — forwards items unchanged (swap in your own logic here)
let processor = PassThroughProcessor::<Product>::new();

let writer = JsonItemWriterBuilder::<Product>::new()
    .from_path("products.json")
    .build();

// 3. Build the pipeline
// chunk(100)     → accumulate 100 items, then write once (balances I/O vs memory)
// skip_limit(10) → tolerate up to 10 bad records before the job fails
let step = StepBuilder::new("convert-products")
    .reader(&reader)
    .processor(&processor)
    .writer(&writer)
    .chunk(100)
    .skip_limit(10)
    .build();

// 4. Run — the job tracks status, counts, and errors automatically
JobBuilder::new().start(&step).build().run();
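The `skip_limit(10)` semantics above can be sketched in plain Rust. This is an illustrative model, not the crate's implementation: failed records are skipped and counted until the limit is exceeded, at which point the whole job aborts.

```rust
// Illustrative sketch (not the crate's API) of skip-limit semantics:
// tolerate up to `skip_limit` bad records, then fail the whole job.
fn run_with_skips<I, O, E>(
    items: Vec<I>,
    process: impl Fn(I) -> Result<O, E>,
    skip_limit: usize,
) -> Result<(Vec<O>, usize), E> {
    let mut out = Vec::new();
    let mut skipped = 0;
    for item in items {
        match process(item) {
            Ok(o) => out.push(o),
            Err(e) => {
                skipped += 1;
                if skipped > skip_limit {
                    return Err(e); // limit exceeded: abort the job
                }
            }
        }
    }
    Ok((out, skipped)) // job succeeds; skips are counted and reported
}
```

This is why a handful of malformed CSV rows produces a successful job with a skip count, while a systematically broken input fails fast.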
spring-batch-rs/
$ cargo new my-batch-app
$ cd my-batch-app
$ cargo add spring-batch-rs --features full

Added spring-batch-rs v0.3.0

Ready to explore the Dockit experience?

Discover tips, resources, and guidance to get the most out of our documentation.

Get Started · View Docs