
Enterprise Batch Processing Powered by Rust

Spring Batch patterns you know. Rust performance you need. No GC. No surprises.

Performance · reproducible benchmark

4.5× faster than Spring Batch Java

10 million financial transactions · CSV → PostgreSQL → XML · same chunk size, same connection pool

Total pipeline: 42 s · 4.5× faster (vs 187 s)
Peak memory: 62 MB · 30× less (vs 1 840 MB)
Cold start: <10 ms · 320× faster (vs 3.2 s)
Spring Batch RS (Rust) vs. Spring Batch (Java)
→ Methodology, full numbers & how to reproduce
LIVE EXAMPLE

CSV → JSON Pipeline

Type-safe, fault-tolerant, zero boilerplate

use spring_batch_rs::prelude::*;

// 1. Define your data shape — Rust enforces type safety at compile time
#[derive(Deserialize, Serialize)]
struct Product {
    id: u32,
    name: String,
    price: f64,
}

// 2. Configure source and destination using builder patterns
let reader = CsvItemReaderBuilder::<Product>::new()
    .from_path("products.csv")
    .has_headers(true)
    .build();

// Passthrough processor — forwards items unchanged (swap in your own logic here)
let processor = PassThroughProcessor::<Product>::new();

let writer = JsonItemWriterBuilder::<Product>::new()
    .from_path("products.json")
    .build();

// 3. Build the pipeline
// chunk(100) → accumulate 100 items, then write once (balances I/O vs memory)
// skip_limit(10) → tolerate up to 10 bad records before the job fails
let step = StepBuilder::new("convert-products")
    .reader(&reader)
    .processor(&processor)
    .writer(&writer)
    .chunk(100)
    .skip_limit(10)
    .build();

// 4. Run — the job tracks status, counts, and errors automatically
JobBuilder::new().start(&step).build().run();
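To make the chunk(100) and skip_limit(10) behavior concrete, here is a minimal, self-contained sketch of chunk-oriented processing with a skip limit. It is plain Rust with no dependency on spring_batch_rs, and the names (run_chunked, the Vec-of-Vecs "writes") are illustrative only, not the library's internals:

```rust
// Illustrative sketch: parse items one by one, skip failures until the
// skip limit is exceeded, and flush successfully processed items in
// batches of `chunk_size` (one "write" per full chunk, plus a final
// partial chunk at the end).
fn run_chunked(
    raw_items: &[&str],
    chunk_size: usize,
    skip_limit: usize,
) -> Result<(Vec<Vec<i64>>, usize), String> {
    let mut writes: Vec<Vec<i64>> = Vec::new(); // each inner Vec = one flush
    let mut chunk: Vec<i64> = Vec::new();
    let mut skipped = 0;

    for raw in raw_items {
        match raw.parse::<i64>() {
            Ok(v) => chunk.push(v),
            Err(_) => {
                skipped += 1;
                if skipped > skip_limit {
                    // Too many bad records: the whole job fails
                    return Err(format!("skip limit of {skip_limit} exceeded"));
                }
            }
        }
        if chunk.len() == chunk_size {
            // Flush: one write call per full chunk, amortizing I/O cost
            writes.push(std::mem::take(&mut chunk));
        }
    }
    if !chunk.is_empty() {
        writes.push(chunk); // final partial chunk
    }
    Ok((writes, skipped))
}

fn main() {
    let input = ["1", "2", "oops", "3", "4", "5"];
    let (writes, skipped) = run_chunked(&input, 2, 1).unwrap();
    // 5 good items in chunks of 2 → 3 writes; 1 bad record skipped
    println!("writes = {writes:?}, skipped = {skipped}");
}
```

A larger chunk means fewer write calls but more items buffered in memory, which is the I/O-vs-memory trade-off the chunk(100) comment above refers to.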