Overview

Spring Batch RS provides a rich set of item readers and writers for various data sources and formats. All readers and writers use the builder pattern for easy, type-safe configuration.

Enable only the features you need to keep your dependencies minimal:

| Feature | Description | Status |
| --- | --- | --- |
| `csv` | CSV file reading and writing | ✅ Stable |
| `json` | JSON file reading and writing | ✅ Stable |
| `xml` | XML file reading and writing | ✅ Stable |
| `mongodb` | MongoDB database integration | ✅ Stable |
| `rdbc-postgres` | PostgreSQL via RDBC | ✅ Stable |
| `rdbc-mysql` | MySQL/MariaDB via RDBC | ✅ Stable |
| `rdbc-sqlite` | SQLite via RDBC | ✅ Stable |
| `orm` | SeaORM integration | ✅ Stable |
| `fake` | Mock data generation | ✅ Stable |
| `logger` | Debug logging writer | ✅ Stable |
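For example, a minimal `Cargo.toml` sketch that pulls in only CSV and JSON support. This assumes the crates.io package name is `spring-batch-rs` and that the features above are additive Cargo features; the version number is illustrative, so check crates.io for the current release:

```toml
[dependencies]
serde = { version = "1", features = ["derive"] }
# Version is illustrative; use the latest spring-batch-rs release.
spring-batch-rs = { version = "0.3", default-features = false, features = ["csv", "json"] }
```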

CSV

Read and write CSV files with configurable delimiters, headers, and quoting

JSON

Handle JSON arrays and objects with pretty printing support

XML

Process XML documents with custom root and item elements

RDBC (Generic SQL)

Direct database access for PostgreSQL, MySQL, and SQLite using SQLx

Fake Data Generator

Generate realistic mock data for testing

Reading a CSV file with headers into a typed struct:

```rust
use serde::Deserialize;
use spring_batch_rs::item::csv::CsvItemReaderBuilder;

#[derive(Deserialize)]
struct Product {
    id: u32,
    name: String,
    price: f64,
}

let reader = CsvItemReaderBuilder::<Product>::new()
    .has_headers(true)
    .delimiter(b',')
    .from_path("products.csv")?;
```
Writing items to a JSON file with pretty printing:

```rust
use spring_batch_rs::item::json::JsonItemWriterBuilder;

let writer = JsonItemWriterBuilder::<MyType>::new()
    .pretty_formatter(true)
    .from_path("output.json")?;
```
Reading from a SQL database with a custom row mapper:

```rust
use spring_batch_rs::item::rdbc::rdbc_reader::{RdbcItemReaderBuilder, RdbcRowMapper};
use sqlx::{AnyPool, Row};

struct ProductRowMapper;

impl RdbcRowMapper<Product> for ProductRowMapper {
    fn map_row(&self, row: &sqlx::any::AnyRow) -> Product {
        Product {
            id: row.get("id"),
            name: row.get("name"),
            price: row.get("price"),
        }
    }
}

let pool = AnyPool::connect("postgresql://user:pass@localhost/db").await?;

let reader = RdbcItemReaderBuilder::new()
    .pool(&pool)
    .query("SELECT id, name, price FROM products")
    .page_size(1000)
    .row_mapper(&ProductRowMapper)
    .build();
```

You can create custom implementations by implementing the respective traits:

```rust
use spring_batch_rs::core::item::ItemReader;
use spring_batch_rs::BatchError;

struct MyCustomReader {
    data: Vec<String>,
    index: usize,
}

impl ItemReader<String> for MyCustomReader {
    fn read(&mut self) -> Result<Option<String>, BatchError> {
        if self.index < self.data.len() {
            let item = self.data[self.index].clone();
            self.index += 1;
            Ok(Some(item))
        } else {
            Ok(None) // No more items
        }
    }
}
```
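A quick way to exercise such a reader is to drain it in a loop until it returns `Ok(None)`. The sketch below mirrors the `ItemReader` trait and error type with local stand-ins so it compiles on its own; in real code you would import them from `spring_batch_rs` instead:

```rust
// Local stand-ins for spring_batch_rs's ItemReader trait and BatchError,
// so this sketch is self-contained.
#[derive(Debug)]
struct BatchError;

trait ItemReader<T> {
    fn read(&mut self) -> Result<Option<T>, BatchError>;
}

struct MyCustomReader {
    data: Vec<String>,
    index: usize,
}

impl ItemReader<String> for MyCustomReader {
    fn read(&mut self) -> Result<Option<String>, BatchError> {
        if self.index < self.data.len() {
            let item = self.data[self.index].clone();
            self.index += 1;
            Ok(Some(item))
        } else {
            Ok(None) // Exhausted: end of input
        }
    }
}

// Drain the reader until it signals end of input.
fn drain(reader: &mut MyCustomReader) -> Result<Vec<String>, BatchError> {
    let mut items = Vec::new();
    while let Some(item) = reader.read()? {
        items.push(item);
    }
    Ok(items)
}

fn main() {
    let mut reader = MyCustomReader {
        data: vec!["a".to_string(), "b".to_string()],
        index: 0,
    };
    let items = drain(&mut reader).unwrap();
    assert_eq!(items, vec!["a".to_string(), "b".to_string()]);
}
```

This is also the contract a chunk-oriented step relies on: items are pulled one at a time, and `Ok(None)` ends the step's read phase.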

Choose appropriate chunk sizes based on your data and memory:

```rust
// Small chunks (10-50): memory-constrained environments
.chunk(25)

// Medium chunks (100-500): balanced performance
.chunk(250)

// Large chunks (1000+): high-throughput scenarios
.chunk(1000)
```

For database readers, configure page sizes to optimize memory:

```rust
let reader = RdbcItemReaderBuilder::new()
    .page_size(500) // Fetch 500 records at a time
    .build();
```

File-based readers and writers use buffered I/O by default for optimal performance.
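The same pattern applies if you write files yourself: wrapping the handle in std's `BufWriter` batches many small writes into fewer syscalls. A std-only sketch, independent of the crate:

```rust
use std::fs::File;
use std::io::{BufWriter, Write};

fn main() -> std::io::Result<()> {
    let file = File::create("output.txt")?;
    // BufWriter accumulates small writes in memory and hands them
    // to the OS in larger chunks.
    let mut writer = BufWriter::new(file);
    for i in 0..1000 {
        writeln!(writer, "line {i}")?;
    }
    writer.flush()?; // Make sure buffered bytes reach the file.
    Ok(())
}
```

Flushing explicitly (rather than relying on drop) lets you handle write errors instead of silently losing them.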

Explore the detailed documentation for each reader and writer type.