# What’s New in Spring Batch 4.3

## [](#whatsNew)What’s New in Spring Batch 4.3

This release comes with a number of new features, performance improvements,
dependency updates and API deprecations. This section describes the most
important changes. For a complete list of changes, please refer to the [release notes](https://github.com/spring-projects/spring-batch/releases/tag/4.3.0).

### [](#newFeatures)New features

#### [](#new-synchronized-itemstreamwriter)New synchronized ItemStreamWriter

Similar to the `SynchronizedItemStreamReader`, this release introduces a `SynchronizedItemStreamWriter`. This feature is useful in multi-threaded steps
where concurrent threads need to be synchronized so that they do not override each other’s writes.
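
A minimal configuration sketch, assuming a `FlatFileItemWriter<Person>` delegate bean (the delegate and the `Person` type are illustrative, not part of the release notes):

```
@Bean
public SynchronizedItemStreamWriter<Person> itemWriter(FlatFileItemWriter<Person> delegate) {
	// Wrap the delegate so that concurrent threads in a multi-threaded step
	// do not interleave or override each other's writes.
	return new SynchronizedItemStreamWriterBuilder<Person>()
			.delegate(delegate)
			.build();
}
```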

#### [](#new-jpaqueryprovider-for-named-queries)New JpaQueryProvider for named queries

This release introduces a new `JpaNamedQueryProvider` next to the `JpaNativeQueryProvider` to ease the configuration of JPA named queries when
using the `JpaPagingItemReader`:

```
JpaPagingItemReader<Foo> reader = new JpaPagingItemReaderBuilder<Foo>()
   .name("fooReader")
   .queryProvider(new JpaNamedQueryProvider("allFoos", Foo.class))
   // set other properties on the reader
   .build();
```
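
For reference, the `allFoos` named query used above would typically be declared on the entity itself. A minimal sketch (the `Foo` entity and its query are illustrative assumptions, not part of the release notes):

```
@Entity
@NamedQuery(name = "allFoos", query = "select f from Foo f")
public class Foo {
	// id and other fields omitted for brevity
}
```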

#### [](#new-jpacursoritemreader-implementation)New JpaCursorItemReader Implementation

JPA 2.2 added the ability to stream results as a cursor instead of only paging.
This release introduces a new JPA item reader that uses this feature to
stream results in a cursor-based fashion similar to the `JdbcCursorItemReader` and `HibernateCursorItemReader`.
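
For illustration, such a reader could be configured as follows (the `Foo` entity, the query string, and the `entityManagerFactory` bean are assumptions for this sketch):

```
@Bean
public JpaCursorItemReader<Foo> itemReader(EntityManagerFactory entityManagerFactory) {
	// Results are streamed through a cursor rather than fetched page by page.
	return new JpaCursorItemReaderBuilder<Foo>()
			.name("fooReader")
			.entityManagerFactory(entityManagerFactory)
			.queryString("select f from Foo f")
			.build();
}
```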

#### [](#new-jobparametersincrementer-implementation)New JobParametersIncrementer implementation

Similar to the `RunIdIncrementer`, this release adds a new `JobParametersIncrementer` that is based on a `DataFieldMaxValueIncrementer` from Spring Framework.
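
A minimal sketch of how the new `DataFieldMaxValueJobParametersIncrementer` could be wired, assuming a PostgreSQL database with a sequence named `job_run_id_seq` (the sequence name and data source are illustrative):

```
@Bean
public JobParametersIncrementer jobParametersIncrementer(DataSource dataSource) {
	// The next "run.id" job parameter is taken from a database sequence
	// instead of being derived from the last execution, as RunIdIncrementer does.
	return new DataFieldMaxValueJobParametersIncrementer(
			new PostgresSequenceMaxValueIncrementer(dataSource, "job_run_id_seq"));
}
```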

#### [](#graalvm-support)GraalVM Support

This release adds initial support to run Spring Batch applications on GraalVM.
The support is still experimental and will be improved in future releases.

#### [](#java-records-support)Java records Support

This release adds support for using Java records as items in chunk-oriented steps.
The newly added `RecordFieldSetMapper` supports data mapping from flat files to
Java records, as shown in the following example:

```
@Bean
public FlatFileItemReader<Person> itemReader() {
	return new FlatFileItemReaderBuilder<Person>()
			.name("personReader")
			.resource(new FileSystemResource("persons.csv"))
			.delimited()
			.names("id", "name")
			.fieldSetMapper(new RecordFieldSetMapper<>(Person.class))
			.build();
}
```

In this example, the `Person` type is a Java record defined as follows:

```
public record Person(int id, String name) { }
```

The `FlatFileItemReader` uses the new `RecordFieldSetMapper` to map data from
the `persons.csv` file to records of type `Person`.

### [](#performanceImprovements)Performance improvements

#### [](#use-bulk-writes-in-repositoryitemwriter)Use bulk writes in RepositoryItemWriter

Up to version 4.2, using `CrudRepository#saveAll` in the `RepositoryItemWriter` required extending the writer and overriding `write(List)`.

In this release, the `RepositoryItemWriter` has been updated to use `CrudRepository#saveAll` by default.
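
With that default, a plain builder configuration is enough; for example (the `PersonRepository` is a hypothetical Spring Data repository, not part of the release notes):

```
@Bean
public RepositoryItemWriter<Person> itemWriter(PersonRepository repository) {
	// With no methodName configured, each chunk is persisted through
	// CrudRepository#saveAll instead of save() being called item by item.
	return new RepositoryItemWriterBuilder<Person>()
			.repository(repository)
			.build();
}
```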

#### [](#use-bulk-writes-in-mongoitemwriter)Use bulk writes in MongoItemWriter

The `MongoItemWriter` used `MongoOperations#save()` in a for loop
to save items to the database. In this release, this writer has been
updated to use `org.springframework.data.mongodb.core.BulkOperations` instead.

#### [](#job-startrestart-time-improvement)Job start/restart time improvement

The implementation of `JobRepository#getStepExecutionCount()` used to load
all job executions and step executions in memory to compute the count on the framework
side. In this release, the implementation has been changed to issue a single
SQL count query to the database in order to count step executions.

### [](#dependencyUpdates)Dependency updates

This release updates dependent Spring projects to the following versions:

* Spring Framework 5.3

* Spring Data 2020.0

* Spring Integration 5.4

* Spring AMQP 2.3

* Spring for Apache Kafka 2.6

* Micrometer 1.5

### [](#deprecation)Deprecations

#### [](#apiDeprecation)API deprecation

The following is a list of APIs that have been deprecated in this release:

* `org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean`

* `org.springframework.batch.core.explore.support.MapJobExplorerFactoryBean`

* `org.springframework.batch.core.repository.dao.MapJobInstanceDao`

* `org.springframework.batch.core.repository.dao.MapJobExecutionDao`

* `org.springframework.batch.core.repository.dao.MapStepExecutionDao`

* `org.springframework.batch.core.repository.dao.MapExecutionContextDao`

* `org.springframework.batch.item.data.AbstractNeo4jItemReader`

* `org.springframework.batch.item.file.transform.Alignment`

* `org.springframework.batch.item.xml.StaxUtils`

* `org.springframework.batch.core.launch.support.ScheduledJobParametersFactory`

* `org.springframework.batch.item.file.MultiResourceItemReader#getCurrentResource()`

* `org.springframework.batch.core.JobExecution#stop()`

Suggested replacements can be found in the Javadoc of each deprecated API.

#### [](#sqlfireDeprecation)SQLFire support deprecation

SQLFire has been in [EOL](https://www.vmware.com/latam/products/pivotal-sqlfire.html) since November 1st, 2014. This release deprecates support for using SQLFire
as a job repository and schedules it for removal in version 5.0.