
Getting Started

Welcome to Spring Batch RS! This guide will walk you through creating your first batch processing application in Rust.

Before you begin, ensure you have:

  • Rust 1.70+ installed (Install Rust)
  • Basic familiarity with Rust programming
  • A text editor or IDE (VS Code with rust-analyzer recommended)
Create a new Cargo project:

Terminal window
cargo new my-batch-app
cd my-batch-app

Add Spring Batch RS to your Cargo.toml:

Cargo.toml
[dependencies]
spring-batch-rs = { version = "0.3", features = ["csv", "json"] }
serde = { version = "1.0", features = ["derive"] }

Create a simple CSV to JSON converter:

src/main.rs
use spring_batch_rs::{
    core::{job::JobBuilder, step::StepBuilder, item::PassThroughProcessor},
    item::{csv::CsvItemReaderBuilder, json::JsonItemWriterBuilder},
    BatchError,
};
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Deserialize, Serialize)]
struct Product {
    id: u32,
    name: String,
    price: f64,
    category: String,
}

fn main() -> Result<(), BatchError> {
    // Sample CSV data
    let csv_data = r#"id,name,price,category
1,Laptop,999.99,Electronics
2,Coffee Mug,12.99,Kitchen
3,Notebook,5.99,Office
4,Wireless Mouse,29.99,Electronics"#;

    // Create CSV reader
    let reader = CsvItemReaderBuilder::<Product>::new()
        .has_headers(true)
        .from_reader(csv_data.as_bytes());

    // Create JSON writer
    let writer = JsonItemWriterBuilder::<Product>::new()
        .pretty_formatter(true)
        .from_path("products.json");

    // Create processor (pass-through in this case)
    let processor = PassThroughProcessor::<Product>::new();

    // Build the step
    let step = StepBuilder::new("csv-to-json-step")
        .chunk(10)
        .reader(&reader)
        .processor(&processor)
        .writer(&writer)
        .build();

    // Build and run the job
    let job = JobBuilder::new()
        .start(&step)
        .build();

    // Execute the job
    let result = job.run()?;
    println!("✅ Job completed successfully!");
    println!("📊 Processed {} steps", result.get_step_executions().len());

    Ok(())
}
Run the application:

Terminal window
cargo run

You should see:

✅ Job completed successfully!
📊 Processed 1 steps

And a products.json file with your converted data!

Jobs

A Job is the top-level container for your entire batch process. It’s composed of one or more Steps that execute sequentially.

let job = JobBuilder::new()
    .start(&step1) // First step
    .next(&step2)  // Second step (optional)
    .next(&step3)  // Third step (optional)
    .build();

Steps

A Step represents an independent phase of processing. There are two types: chunk-oriented steps and tasklet steps (single-task steps, covered under Processing Models below). Chunk-oriented steps process large datasets in configurable chunks using the read-process-write pattern:

let step = StepBuilder::new("process-data")
    .chunk(100)            // Process 100 items at a time
    .reader(&reader)       // Read data source
    .processor(&processor) // Transform items
    .writer(&writer)       // Write results
    .build();

Item Readers

An ItemReader retrieves input data one item at a time from various sources.

use spring_batch_rs::item::csv::CsvItemReaderBuilder;

let reader = CsvItemReaderBuilder::<Product>::new()
    .has_headers(true)
    .delimiter(b',')
    .from_path("products.csv")?;

Item Processors

An ItemProcessor applies business logic to transform or filter items.

use spring_batch_rs::core::item::ItemProcessor;

struct PriceDiscountProcessor {
    discount_rate: f64,
}

impl ItemProcessor<Product, Product> for PriceDiscountProcessor {
    fn process(&self, item: Product) -> Result<Option<Product>, BatchError> {
        let mut product = item;

        // Apply discount
        product.price *= 1.0 - self.discount_rate;

        // Filter out items below minimum price
        if product.price < 5.0 {
            return Ok(None); // Skip this item
        }

        Ok(Some(product))
    }
}

// Usage
let processor = PriceDiscountProcessor { discount_rate: 0.15 };

Item Writers

An ItemWriter outputs processed items to various destinations.

use spring_batch_rs::item::json::JsonItemWriterBuilder;

// Replace MyType with your actual output type
let writer = JsonItemWriterBuilder::<MyType>::new()
    .pretty_formatter(true)
    .from_path("output.json");

Complete Example: Data Processing Pipeline


Let’s build a more sophisticated batch job that:

  1. Reads products from CSV
  2. Applies business rules and transformations
  3. Filters invalid items
  4. Writes to JSON
src/main.rs
use spring_batch_rs::{
    core::{job::JobBuilder, step::StepBuilder, item::ItemProcessor},
    item::{csv::CsvItemReaderBuilder, json::JsonItemWriterBuilder},
    BatchError,
};
use serde::{Deserialize, Serialize};

#[derive(Deserialize, Clone)]
struct RawProduct {
    id: u32,
    name: String,
    price: f64,
    category: String,
    stock: i32,
}

#[derive(Serialize)]
struct ProcessedProduct {
    id: u32,
    name: String,
    final_price: f64,
    category: String,
    stock_status: String,
    price_tier: String,
}

struct ProductProcessor;

impl ItemProcessor<RawProduct, ProcessedProduct> for ProductProcessor {
    fn process(&self, item: RawProduct) -> Result<Option<ProcessedProduct>, BatchError> {
        // Validate price
        if item.price <= 0.0 {
            return Ok(None); // Skip invalid items
        }

        // Apply category-specific discount
        let discount = match item.category.as_str() {
            "Electronics" => 0.15,
            "Kitchen" => 0.10,
            "Office" => 0.05,
            _ => 0.0,
        };
        let final_price = item.price * (1.0 - discount);

        // Determine stock status
        let stock_status = if item.stock == 0 {
            "Out of Stock"
        } else if item.stock < 10 {
            "Low Stock"
        } else {
            "In Stock"
        }
        .to_string();

        // Classify price tier
        let price_tier = if final_price < 20.0 {
            "Budget"
        } else if final_price < 100.0 {
            "Mid-Range"
        } else {
            "Premium"
        }
        .to_string();

        Ok(Some(ProcessedProduct {
            id: item.id,
            name: item.name,
            final_price,
            category: item.category,
            stock_status,
            price_tier,
        }))
    }
}

fn main() -> Result<(), BatchError> {
    // In-memory sample data
    let csv_data = r#"id,name,price,category,stock
1,Laptop,999.99,Electronics,25
2,Coffee Mug,12.99,Kitchen,150
3,Notebook,5.99,Office,8
4,Wireless Mouse,29.99,Electronics,0
5,Desk Lamp,45.00,Office,50"#;

    let reader = CsvItemReaderBuilder::<RawProduct>::new()
        .has_headers(true)
        .from_reader(csv_data.as_bytes());

    let writer = JsonItemWriterBuilder::<ProcessedProduct>::new()
        .pretty_formatter(true)
        .from_path("processed_products.json");

    let processor = ProductProcessor;

    let step = StepBuilder::new("process-products")
        .chunk(10)
        .reader(&reader)
        .processor(&processor)
        .writer(&writer)
        .build();

    let job = JobBuilder::new()
        .start(&step)
        .build();

    job.run()?;

    println!("✅ Processing complete! Check processed_products.json");
    Ok(())
}

Error Handling

Spring Batch RS provides robust error handling with configurable skip limits:

let step = StepBuilder::new("fault-tolerant-step")
    .chunk(100)
    .reader(&reader)
    .processor(&processor)
    .writer(&writer)
    .skip_limit(10) // Allow up to 10 errors before failing
    .build();

Best Practices

Start Small

Begin with small chunk sizes (10-100) and adjust based on your data and memory constraints

Validate Early

Perform validation in your processor to catch errors before writing
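A minimal sketch of such a validating processor. It uses local stand-in definitions for `ItemProcessor` and `BatchError` so the logic reads in isolation; in a real job you would import these from `spring_batch_rs` instead, and the actual crate signatures may differ:

```rust
// Local stand-ins for the spring_batch_rs trait and error type, so this
// sketch compiles on its own. In a real job, import them from the crate.
#[derive(Debug)]
pub struct BatchError(pub String);

pub trait ItemProcessor<I, O> {
    fn process(&self, item: I) -> Result<Option<O>, BatchError>;
}

#[derive(Debug, Clone)]
pub struct Product {
    pub id: u32,
    pub name: String,
    pub price: f64,
}

/// Validates items before they ever reach the writer: a missing name is a
/// hard error, while a non-positive price silently drops the item.
pub struct ValidatingProcessor;

impl ItemProcessor<Product, Product> for ValidatingProcessor {
    fn process(&self, item: Product) -> Result<Option<Product>, BatchError> {
        if item.name.trim().is_empty() {
            // Hard failure: surfaces as an error (and counts against any skip limit).
            return Err(BatchError(format!("product {} has no name", item.id)));
        }
        if item.price <= 0.0 {
            // Soft failure: drop the item without failing the chunk.
            return Ok(None);
        }
        Ok(Some(item))
    }
}
```

Returning `Ok(None)` versus `Err(...)` lets you distinguish expected filtering from genuine data errors.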

Use Skip Limits Wisely

Set appropriate skip limits based on acceptable data quality thresholds

Log Errors

Implement custom error logging to track skipped items for later review
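One way to do this is a decorator that wraps any processor and records each skip or failure before passing the result through. This is a sketch with a hypothetical `LoggingProcessor` name and local stand-ins for the crate's `ItemProcessor` trait and `BatchError` type, so it compiles on its own:

```rust
use std::cell::RefCell;

// Local stand-ins for the spring_batch_rs trait and error type, so this
// sketch compiles on its own. In a real job, import them from the crate.
#[derive(Debug)]
pub struct BatchError(pub String);

pub trait ItemProcessor<I, O> {
    fn process(&self, item: I) -> Result<Option<O>, BatchError>;
}

/// Hypothetical decorator: wraps any processor and records a line for every
/// skipped or failed item, so skips can be reviewed after the run.
pub struct LoggingProcessor<P> {
    inner: P,
    pub log: RefCell<Vec<String>>,
}

impl<P> LoggingProcessor<P> {
    pub fn new(inner: P) -> Self {
        Self { inner, log: RefCell::new(Vec::new()) }
    }
}

impl<I: std::fmt::Debug, O, P: ItemProcessor<I, O>> ItemProcessor<I, O> for LoggingProcessor<P> {
    fn process(&self, item: I) -> Result<Option<O>, BatchError> {
        // Snapshot the item before handing ownership to the inner processor.
        let snapshot = format!("{item:?}");
        match self.inner.process(item) {
            Ok(None) => {
                self.log.borrow_mut().push(format!("skipped: {snapshot}"));
                Ok(None)
            }
            Err(err) => {
                self.log.borrow_mut().push(format!("error on {snapshot}: {err:?}"));
                Err(err)
            }
            ok => ok,
        }
    }
}

/// Example inner processor: doubles positive numbers, skips zero.
pub struct Doubler;

impl ItemProcessor<u32, u32> for Doubler {
    fn process(&self, item: u32) -> Result<Option<u32>, BatchError> {
        if item == 0 { Ok(None) } else { Ok(Some(item * 2)) }
    }
}
```

In production you would likely write to a proper logger or a dead-letter file instead of an in-memory `Vec`, but the wrapping pattern is the same.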

Next Steps

Now that you understand the basics, explore more advanced features:

Processing Models

Learn about chunk-oriented vs tasklet processing patterns


Item Readers & Writers

Explore all available data sources and destinations


Happy batch processing! 🚀