New spring-batch-s3 module #182

Open · wants to merge 10 commits into main

27 changes: 27 additions & 0 deletions .github/workflows/spring-batch-s3.yml
@@ -0,0 +1,27 @@
name: Spring Batch S3

on:
  pull_request:
    paths:
      - 'spring-batch-s3/**'
  push:
    paths:
      - 'spring-batch-s3/**'

env:
  MAVEN_ARGS: -B -V -ntp -e -Djansi.passthrough=true -Dstyle.color=always

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Set up Java
        uses: actions/setup-java@v4
        with:
          distribution: 'temurin'
          java-version: '21'
          cache: 'maven'
      - name: Build with Maven
        run: ./mvnw $MAVEN_ARGS verify javadoc:javadoc
        working-directory: spring-batch-s3
2 changes: 2 additions & 0 deletions README.md
@@ -14,6 +14,7 @@ This project is part of the [Spring organization][] on GitHub.
| [`spring-batch-geode`](spring-batch-geode) | Support for [Apache Geode] | TBA | [![Maven Central](https://img.shields.io/maven-central/v/org.springframework.batch.extensions/spring-batch-geode?label)](https://central.sonatype.com/artifact/org.springframework.batch.extensions/spring-batch-geode) | [![Spring Batch Geode](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-geode.yml/badge.svg)](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-geode.yml?query=branch%3Amain) |
| [`spring-batch-neo4j`](spring-batch-neo4j) | Support for [Neo4j] | [@michael-simons](https://github.com/michael-simons) | [![Maven Central](https://img.shields.io/maven-central/v/org.springframework.batch.extensions/spring-batch-neo4j?label)](https://central.sonatype.com/artifact/org.springframework.batch.extensions/spring-batch-neo4j) | [![Spring Batch Neo4j](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-neo4j.yml/badge.svg)](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-neo4j.yml?query=branch%3Amain) |
| [`spring-batch-notion`](spring-batch-notion) | Support for [Notion] | [@scordio](https://github.com/scordio) | [![Maven Central](https://img.shields.io/maven-central/v/org.springframework.batch.extensions/spring-batch-notion?label)](https://central.sonatype.com/artifact/org.springframework.batch.extensions/spring-batch-notion) | [![Spring Batch Notion](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-notion.yml/badge.svg?branch=main)](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-notion.yml?query=branch%3Amain) |
| [`spring-batch-s3`](spring-batch-s3) | Support for [Amazon S3] | [@andreacioni](https://github.com/andreacioni) | [![Maven Central](https://img.shields.io/maven-central/v/org.springframework.batch.extensions/spring-batch-s3?label)](https://central.sonatype.com/artifact/org.springframework.batch.extensions/spring-batch-s3) | [![Spring Batch S3](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-s3.yml/badge.svg?branch=main)](https://github.com/spring-projects/spring-batch-extensions/actions/workflows/spring-batch-s3.yml?query=branch%3Amain) |

## Getting support

@@ -101,6 +102,7 @@ noted differently for individual extension Modules, but this should be the rare

**We look forward to your contributions!!**

[Amazon S3]: https://aws.amazon.com/s3/
[Apache Geode]: https://geode.apache.org
[Apache License]: https://www.apache.org/licenses/LICENSE-2.0
[Contributor Guidelines]: CONTRIBUTING.md
13 changes: 13 additions & 0 deletions spring-batch-s3/.editorconfig
@@ -0,0 +1,13 @@
root = true

# Unix-style newlines with a newline ending every file
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true

# Maven POM code convention
[pom.xml]
indent_size = 2
indent_style = space
max_line_length = 205
32 changes: 32 additions & 0 deletions spring-batch-s3/.gitignore
@@ -0,0 +1,32 @@
### Maven ###
target/
.mvn/wrapper/maven-wrapper.jar
.flattened-pom.xml

### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache

### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr

### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/
!**/src/main/**/build/
!**/src/test/**/build/

### VS Code ###
.vscode/
19 changes: 19 additions & 0 deletions spring-batch-s3/.mvn/wrapper/maven-wrapper.properties
@@ -0,0 +1,19 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
wrapperVersion=3.3.2
distributionType=only-script
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.8/apache-maven-3.9.8-bin.zip
157 changes: 157 additions & 0 deletions spring-batch-s3/README.adoc
@@ -0,0 +1,157 @@
= spring-batch-s3
:toc:
:icons: font
:source-highlighter: highlightjs

https://spring.io/projects/spring-batch[Spring Batch] extension for https://aws.amazon.com/it/s3/[Amazon S3], which contains `S3ItemReader` and `S3ItemWriter` implementations for reading from and writing to S3 buckets, including support for multipart uploads. Other S3-compatible technologies (such as Google Cloud Storage, MinIO, etc.) may work, but they are not guaranteed to.

*Note*: these components are based on the *AWS SDK v2*.

== Installation

To use the `spring-batch-s3` extension, you need to add the following dependency to your Maven or Gradle project:

=== Maven

[source,xml]
----
<dependency>
<groupId>org.springframework.batch.extensions</groupId>
<artifactId>spring-batch-s3</artifactId>
<version>${spring-batch-extensions.version}</version>
</dependency>
----

=== Gradle

[source,groovy]
----
implementation 'org.springframework.batch.extensions:spring-batch-s3:${springBatchExtensionsVersion}'
----

== Known limitations

* The `S3ItemReader` and `S3ItemWriter` are designed to work with the synchronous AWS S3 client (`S3Client`). They do not support the asynchronous client (`S3AsyncClient`) at this time.

== Pre-requisites

To set up these components, you need to provide some additional beans in your Spring Batch configuration:

* An `S3Client` bean to interact with AWS S3.
* If you want to use the `S3ItemReader`: an instance of `S3Deserializer` for the data you want to read.
* If you want to use the `S3ItemWriter`: an instance of `S3Serializer` for the data you want to write.

This project provides two example implementations, one for `S3Serializer` and one for `S3Deserializer`:

* `S3StringSerializer`: takes a `String` as input and writes it to S3 as a UTF-8 encoded byte array. The write functions add a line termination character at the end of each string.
* `S3StringDeserializer`: takes a UTF-8 encoded byte array from S3 and converts it to a `String`. The implementation of this deserializer is *stateful* because lines may arrive in different chunks.

More details can be found in the Javadoc of these classes.

=== Configuration of the `S3Client`

To use the `S3ItemReader` and `S3ItemWriter`, you need to configure the AWS S3 client by providing an `S3Client` bean. Check out the https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/http-configuration-apache.html[AWS SDK for Java] documentation for more details on how to configure it.
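
For reference, a minimal `S3Client` bean might look like the sketch below. The region and the use of the default credentials provider chain are assumptions; adjust them to your environment.

[source,java]
----
@Bean
S3Client s3Client() {
    // Classes come from the AWS SDK v2: software.amazon.awssdk.services.s3.S3Client,
    // software.amazon.awssdk.regions.Region, software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider
    return S3Client.builder()
            .region(Region.EU_WEST_1) // assumption: use the region of your bucket
            .credentialsProvider(DefaultCredentialsProvider.create())
            .build();
}
----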

=== Configure `S3Serializer`

`S3StringSerializer` is a simple implementation of `S3Serializer` that takes a `String` as input and writes it to S3 as a UTF-8 encoded byte array. You are encouraged to implement your own serializer if you need to handle different data types or formats.

==== Java Config

[source,java]
----
@Bean
S3Serializer<String> s3Serializer() {
    return new S3StringSerializer();
}
----

=== Configure `S3Deserializer`

Similarly, `S3StringDeserializer` is a simple implementation of `S3Deserializer` that takes a UTF-8 encoded byte array from S3 and converts it to a `String`. You can implement your own deserializer if you need to handle different data types or formats.

If you don't want to implement your own deserializer, check out the "Alternative reader" section below.

==== Java Config

[source,java]
----
@Bean
S3Deserializer<String> s3Deserializer() {
    return new S3StringDeserializer();
}
----

== Configuration of `S3ItemReader`

Given the `S3Client` and `S3Deserializer` beans, you can now configure the `S3ItemReader`.

=== Java Config

To configure the `S3ItemReader`, you need to set up the AWS S3 client and specify the bucket and object key from which you want to read data.

[source,java]
----
@Bean
ItemReader<String> downloadItemReader() throws Exception {
    return new S3ItemReader.Builder<String>()
            .s3Client(s3Client())
            .bucketName("bucket_name")
            .objectKey("object_key")
            .deserializer(s3Deserializer())
            .bufferSize((int) DataSize.ofMegabytes(1).toBytes()) // Default is 128 bytes
            .build();
}
----

There is also an additional option, `bufferSize`, which is the size of the buffer used to read data from S3. The default value is 128 bytes, but you can tune it to balance read performance against memory consumption. A good value for this parameter is the average length of the lines in your file.

=== Alternative reader

Instead of `S3ItemReader`, you can also use `FlatFileItemReader` with an `InputStreamResource` to read files from S3.
To do so, this package exposes an `S3InputStream` that can be wrapped in an `InputStreamResource` for that purpose. Below is an example:

[source,java]
----
@Bean
ItemReader<String> itemReader() throws Exception {
    final var inputStreamResource = new InputStreamResource(
            new S3InputStream(s3Client(), "bucket_name", "object_key"));

    return new FlatFileItemReaderBuilder<String>()
            .name("itemReader")
            .resource(inputStreamResource)
            .lineMapper(new PassThroughLineMapper())
            .build();
}
----

== Configuration of `S3ItemWriter`

Given the `S3Client` and `S3Serializer` beans, you can now configure the `S3ItemWriter`.

=== Java Config

To configure the `S3ItemWriter`, you need to set up the AWS S3 client and specify the bucket and object key to which you want to write data.

[source,java]
----
@Bean
ItemWriter<String> uploadItemWriter() throws IOException {
    return new S3ItemWriter.Builder<String>()
            .s3Client(s3Client())
            .bucketName("bucket_name")
            .objectKey("object_key")
            .multipartUpload(true) // Default is false
            .partSize((int) DataSize.ofMegabytes(10).toBytes()) // Default is 5 MB
            .contentType("text/csv") // Default is application/octet-stream
            .serializer(s3Serializer())
            .build();
}
----

There are several additional options you can set for the `S3ItemWriter`:

* `multipartUpload`: If set to `true`, the writer will use multipart upload for large files. The default is `false`.
* `partSize`: The size of each part in a multipart upload. The default is 5 MB.
* `contentType`: The content type of the uploaded file. The default is `application/octet-stream`.
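
As a usage sketch (not part of this module), the reader and writer can be plugged into a regular chunk-oriented step. The `jobRepository` and `transactionManager` beans, the step name, and the chunk size below are assumptions to adapt to your application:

[source,java]
----
@Bean
Step copyS3ObjectStep(JobRepository jobRepository, PlatformTransactionManager transactionManager,
        ItemReader<String> downloadItemReader, ItemWriter<String> uploadItemWriter) {
    // Standard Spring Batch 5 chunk-oriented step: items read from one S3 object
    // are serialized and uploaded to another in chunks of 100 items.
    return new StepBuilder("copyS3ObjectStep", jobRepository)
            .<String, String>chunk(100, transactionManager)
            .reader(downloadItemReader)
            .writer(uploadItemWriter)
            .build();
}
----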