Note: generated documentation is available on the generated documentation page.
Each time an implementation/reference needs to be resolved, this API is used. The default implementation follows the same rules as the resolution of ref attributes in the batch XML file (meaning you can use fully qualified names, or CDI names if you are in a CDI container…).
Readers, writers and processors always have a shortname, which only works with the BatchEE implementation. To use them with another JBatch implementation, use the fully qualified name.
Allows chaining multiple javax.batch.api.chunk.ItemProcessor implementations through a single processor. The (n+1)-th processor processes the value returned by the n-th processor.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="org.apache.batchee.extras.chain.ChainProcessor">
      <properties>
        <property name="chain" value="ref1,ref2,ref3"/>
      </properties>
    </processor>
    <writer ref="..." />
  </chunk>
</step>
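The composition performed by the chain can be sketched in plain Java. The ItemProcessor interface below is redeclared locally only to keep the snippet self-contained (it mirrors javax.batch.api.chunk.ItemProcessor), and the null short-circuit is an assumption about how filtered items would be handled, not a statement about ChainProcessor's exact implementation:

```java
// Sketch of chain composition: each processor receives the value
// returned by the previous one, as described above.
public class ChainSketch {
    // Mirrors javax.batch.api.chunk.ItemProcessor, redeclared
    // here only so the example compiles on its own.
    interface ItemProcessor {
        Object processItem(Object item) throws Exception;
    }

    // Fold the item through the chain. The null check is an assumed
    // convention (a null return filters the item out in JBatch chunks).
    static Object process(Object item, ItemProcessor... chain) throws Exception {
        Object current = item;
        for (ItemProcessor processor : chain) {
            if (current == null) {
                break; // item was filtered by a previous processor
            }
            current = processor.processItem(current);
        }
        return current;
    }

    public static void main(String[] args) throws Exception {
        ItemProcessor upperCase = item -> item.toString().toUpperCase();
        ItemProcessor suffix = item -> item + "!";
        System.out.println(process("hello", upperCase, suffix)); // HELLO!
    }
}
```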
Note: org.apache.batchee.extras.chain.ChainBatchlet does the same for javax.batch.api.Batchlet.
Shortname: chainProcessor
A reader that reads a file line by line. By default each line is returned as a java.lang.String. To return another object, override the protected Object preReturn(String line, long lineNumber) method:
public class MyFlatReader extends FlatFileItemReader {
    @Override
    protected Object preReturn(String line, long lineNumber) {
        return new Person(line);
    }
}
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.extras.flat.FlatFileItemReader">
      <properties>
        <property name="input" value="#{jobParameters['input']}" />
      </properties>
    </reader>
    <processor ref="..." />
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: flatReader
A writer that writes one item per line. By default toString() is called on each item; to change this, override the protected String preWrite(Object object) method:
public class MyFlatWriter extends FlatFileItemWriter {
    @Override
    protected String preWrite(final Object object) {
        final Person person = (Person) object;
        return person.getName() + "," + person.getAge();
    }
}
Sample:
<step id="step1">
  <chunk>
    <reader ref="..."/>
    <processor ref="..." />
    <writer ref="org.apache.batchee.extras.flat.FlatFileItemWriter">
      <properties>
        <property name="output" value="#{jobParameters['output']}"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Shortname: flatWriter
A simple Batchlet to execute SQL.
Sample:
<step id="step1">
  <batchlet ref="jdbcBatchlet">
    <properties>
      <property name="sql" value="delete from Person p where p.name = 'forbidden'" />
      <!-- connection info -->
      <property name="driver" value="org.apache.derby.jdbc.EmbeddedDriver" />
      <property name="url" value="jdbc:derby:memory:jdbcbatchlet;create=true" />
      <property name="user" value="app" />
      <property name="password" value="app" />
    </properties>
  </batchlet>
</step>
Configuration:
Shortname: jdbcBatchlet
This reader executes a query repeatedly as long as it returns items.
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.extras.jdbc.JdbcReader">
      <properties>
        <property name="mapper" value="org.apache.batchee.extras.JdbcReaderTest$SimpleMapper" />
        <property name="query" value="select * from FOO where name like 't%'" />
        <property name="driver" value="org.apache.derby.jdbc.EmbeddedDriver" />
        <property name="url" value="jdbc:derby:memory:jdbcreader;create=true" />
        <property name="user" value="app" />
        <property name="password" value="app" />
      </properties>
    </reader>
    <processor ref="..." />
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Here is a sample record mapper deleting items once read (Note: you probably don’t want to do this, or at least not without a managed datasource):
public class SimplePersonMapper implements RecordMapper {
    @Override
    public Object map(final ResultSet resultSet) throws SQLException {
        final String name = resultSet.getString("name"); // extract some fields to create an object
        resultSet.deleteRow();
        return new Person(name);
    }
}
Shortname: jdbcReader
A writer storing items in a database.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..."/>
    <processor ref="..." />
    <writer ref="org.apache.batchee.extras.jdbc.JdbcWriter">
      <properties>
        <property name="mapper" value="org.apache.batchee.extras.JdbcWriterTest$SimpleMapper" />
        <property name="sql" value="insert into FOO (name) values(?)" />
        <property name="driver" value="org.apache.derby.jdbc.EmbeddedDriver" />
        <property name="url" value="jdbc:derby:memory:jdbcwriter;create=true" />
        <property name="user" value="app" />
        <property name="password" value="app" />
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Here is a sample object mapper:
public class SimpleMapper implements ObjectMapper {
    @Override
    public void map(final Object item, final PreparedStatement statement) throws SQLException {
        statement.setString(1, item.toString()); // 1 because our insert statement uses values(?)
    }
}
Shortname: jdbcWriter
Reads items from a JPA query.
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.extras.jpa.JpaItemReader">
      <properties>
        <property name="entityManagerProvider" value="org.apache.batchee.extras.util.MyProvider" />
        <property name="query" value="select e from Person e" />
      </properties>
    </reader>
    <processor ref="..." />
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: jpaReader
Writes items through the JPA API.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="..." />
    <writer ref="org.apache.batchee.extras.jpa.JpaItemWriter">
      <properties>
        <property name="entityManagerProvider" value="org.apache.batchee.extras.util.MyProvider" />
        <property name="jpaTransaction" value="true" />
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Shortname: jpaWriter
A writer doing nothing (a writer is mandatory in a chunk, so this one can stand in when you don’t need one).
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="..." />
    <writer ref="org.apache.batchee.extras.noop.NoopItemWriter" />
  </chunk>
</step>
Shortname: noopWriter
Abstract classes allowing you to work with typed items instead of the Object type used by the raw JBatch API.
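The idea behind such typed wrappers can be illustrated as follows. All names here are hypothetical (this is not BatchEE's actual API): a generic adapter does the unchecked cast once, so subclasses deal with real types. The ItemProcessor interface is redeclared locally only to keep the snippet self-contained; it mirrors javax.batch.api.chunk.ItemProcessor:

```java
// Illustration of the typed-wrapper pattern: centralize the Object casts
// in one generic base class so concrete classes use real types.
public class TypedSketch {
    // Mirrors javax.batch.api.chunk.ItemProcessor (redeclared to stay self-contained).
    interface ItemProcessor {
        Object processItem(Object item) throws Exception;
    }

    // Hypothetical typed adapter: the single unchecked cast lives here.
    static abstract class TypedProcessor<I, O> implements ItemProcessor {
        @SuppressWarnings("unchecked")
        @Override
        public final Object processItem(final Object item) throws Exception {
            return doProcess((I) item);
        }

        protected abstract O doProcess(I item) throws Exception;
    }

    // A subclass now receives and returns typed items directly.
    static class LengthProcessor extends TypedProcessor<String, Integer> {
        @Override
        protected Integer doProcess(final String item) {
            return item.length();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(new LengthProcessor().processItem("batchee")); // 7
    }
}
```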
A reader using the StAX API to read an XML file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.extras.stax.StaxItemReader">
      <properties>
        <property name="input" value="#{jobParameters['input']}"/>
        <property name="marshallingClasses" value="org.apache.batchee.extras.StaxItemReaderTest$Bar"/>
        <property name="tag" value="bar"/>
      </properties>
    </reader>
    <processor ref="..." />
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: staxReader
A writer using the StAX API to write an XML file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="..." />
    <writer ref="org.apache.batchee.extras.stax.StaxItemWriter">
      <properties>
        <property name="output" value="#{jobParameters['output']}"/>
        <property name="marshallingClasses" value="org.apache.batchee.extras.StaxItemWriterTest$Foo"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Shortname: staxWriter
A reader using BeanIO.
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.beanio.BeanIOReader">
      <properties>
        <property name="file" value="#{jobParameters['input']}"/>
        <property name="streamName" value="readerCSV"/>
        <property name="configuration" value="beanio.xml"/>
      </properties>
    </reader>
    <processor ref="..." />
    <writer ref="..." />
  </chunk>
</step>
Here is the associated beanio.xml:
<beanio xmlns="http://www.beanio.org/2012/03"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.beanio.org/2012/03 http://www.beanio.org/2012/03/mapping.xsd">
  <stream name="readerCSV" format="csv">
    <record name="record1" class="org.apache.batchee.beanio.bean.Record">
      <field name="field1"/>
      <field name="field2"/>
    </record>
  </stream>
</beanio>
Configuration:
Shortname: beanIOReader
A writer using BeanIO.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="..." />
    <writer ref="org.apache.batchee.beanio.BeanIOWriter">
      <properties>
        <property name="file" value="#{jobParameters['output']}"/>
        <property name="streamName" value="writerCSV"/>
        <property name="configuration" value="beanio.xml"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Shortname: beanIOWriter
A processor reusing Camel logic.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="org.apache.batchee.camel.CamelItemProcessor">
      <properties>
        <property name="endpoint" value="direct:processor"/>
      </properties>
    </processor>
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: camelProcessor
Same as the previous one but with a chain of endpoints.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="org.apache.batchee.camel.CamelChainItemProcessor">
      <properties>
        <property name="chain" value="test:foo?value=first,test:bar?value=second"/>
      </properties>
    </processor>
    <writer ref="..." />
  </chunk>
</step>
Configuration: mainly the chain configuration, except that the “chain” value is a list of endpoints.
Shortname: camelChainProcessor
A reader using Camel consumers.
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.camel.CamelItemReader">
      <properties>
        <property name="endpoint" value="direct:reader"/>
      </properties>
    </reader>
    <processor ref="..." />
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: camelReader
A writer using a Camel producer.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="..." />
    <writer ref="org.apache.batchee.camel.CamelItemWriter">
      <properties>
        <property name="endpoint" value="direct:writer"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Shortname: camelWriter
batchee-camel includes a Camel component. Here is its format:
jbatch:name[?synchronous=xxx]
where name is the batch name. By default the endpoint is not synchronous, but synchronous behavior can be forced (by polling) using the synchronous attribute. Its value is the polling period and needs to be > 0 to be active.
After this endpoint (even in asynchronous mode) the exchange will get the headers:
Note: if you set the JBatchExecutionId header before this endpoint, you can use ?restart=true, ?stop=true or ?abandon=true to restart/stop/abandon the job instead of starting it.
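A few example URIs may make the format more concrete (the job name my-job and the polling period are illustrative, not taken from the documentation above):

```
jbatch:my-job                   start the "my-job" batch and continue the route immediately
jbatch:my-job?synchronous=1000  start "my-job" and poll its status until it ends
jbatch:my-job?restart=true      restart the execution whose id is in the JBatchExecutionId header
```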
A reader delegating to a Groovy script.
Sample:
<step id="step1">
  <chunk>
    <reader ref="groovyReader">
      <properties>
        <property name="scriptPath" value="target/work/reader.groovy"/>
      </properties>
    </reader>
    <processor ref="..." />
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: groovyReader
A processor delegating to a Groovy script.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="groovyProcessor">
      <properties>
        <property name="scriptPath" value="/groovy/processor.groovy"/>
      </properties>
    </processor>
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: groovyProcessor
A writer delegating to a Groovy script.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="..." />
    <writer ref="groovyWriter">
      <properties>
        <property name="scriptPath" value="/groovy/writer.groovy"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Shortname: groovyWriter
A batchlet delegating to a Groovy script.
Sample:
<step id="step1">
  <batchlet ref="groovyBatchlet">
    <properties>
      <property name="scriptPath" value="/groovy/batchlet.groovy"/>
    </properties>
  </batchlet>
</step>
Configuration:
Shortname: groovyBatchlet
A simple processor validating items using Bean Validation.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="beanValidationProcessor" />
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: beanValidationProcessor
Use JSefa to read a CSV file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="jsefaCsvReader">
      <properties>
        <property name="file" value="#{jobParameters['input']}"/>
        <property name="objectTypes" value="org.superbiz.Record"/>
      </properties>
    </reader>
    <writer ref="..." />
  </chunk>
</step>
Configuration (except for file, see org.jsefa.csv.config.CsvConfiguration for details):
Shortname: jsefaCsvReader
Use JSefa to write a CSV file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <writer ref="jsefaCsvWriter">
      <properties>
        <property name="file" value="#{jobParameters['output']}"/>
        <property name="objectTypes" value="org.superbiz.Record"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration (except for file and encoding, see org.jsefa.csv.config.CsvConfiguration for details):
Shortname: jsefaCsvWriter
Use JSefa to read a FLR file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="jsefaFlrReader">
      <properties>
        <property name="file" value="#{jobParameters['input']}"/>
        <property name="objectTypes" value="org.superbiz.Record"/>
      </properties>
    </reader>
    <writer ref="..." />
  </chunk>
</step>
Configuration (except for file, see org.jsefa.flr.config.FlrConfiguration for details):
Shortname: jsefaFlrReader
Use JSefa to write a FLR file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <writer ref="jsefaFlrWriter">
      <properties>
        <property name="file" value="#{jobParameters['output']}"/>
        <property name="objectTypes" value="org.superbiz.Record"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration (except for file, see org.jsefa.flr.config.FlrConfiguration for details):
Shortname: jsefaFlrWriter
Use JSefa to read an XML file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="jsefaXmlReader">
      <properties>
        <property name="file" value="#{jobParameters['input']}"/>
        <property name="objectTypes" value="org.apache.batchee.jsefa.bean.Record"/>
      </properties>
    </reader>
    <processor ref="org.apache.batchee.jsefa.JSefaXmlReaderTest$StoreItems" />
    <writer ref="noopWriter" />
  </chunk>
</step>
Configuration (except for file, see org.jsefa.xml.config.XmlConfiguration for details):
Shortname: jsefaXmlReader
Use JSefa to write an XML file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.jsefa.JSefaXmlWriterTest$TwoItemsReader" />
    <writer ref="jsefaXmlWriter">
      <properties>
        <property name="file" value="#{jobParameters['output']}"/>
        <property name="objectTypes" value="org.apache.batchee.jsefa.bean.Record"/>
      </properties>
    </writer>
  </chunk>
</step>
Configuration (except for file, see org.jsefa.xml.config.XmlConfiguration for details):
Shortname: jsefaXmlWriter
Use Jackson to read a JSON file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="jacksonJSonReader">
      <properties>
        <property name="type" value="..."/>
        <property name="file" value="work/jackson-input.json"/>
      </properties>
    </reader>
    <writer ref="org.apache.batchee.jackson.JacksonJsonReaderTest$Writer" />
  </chunk>
</step>
Configuration:
Shortname: jacksonJSonReader
Use Jackson to write a JSON file.
Sample:
<step id="step1">
  <chunk>
    <reader ref="org.apache.batchee.jackson.JacksonJSonWriterTest$Reader" />
    <writer ref="jacksonJSonWriter">
      <properties>
        <property name="file" value="target/work/jackson-field-output.json"/>
        <property name="fieldNameGeneratorClass" value="default"/> <!-- item1, item2, ... -->
      </properties>
    </writer>
  </chunk>
</step>
Configuration:
Shortname: jacksonJSonWriter
A simple item processor mapping the input to another type using the ModelMapper library. To customize the ModelMapper instance, override the newMapper() method.
Sample:
<step id="step1">
  <chunk>
    <reader ref="..." />
    <processor ref="org.apache.batchee.modelmapper.ModelMapperItemProcessor">
      <properties>
        <property name="destinationType" value="...." />
      </properties>
    </processor>
    <writer ref="..." />
  </chunk>
</step>
Configuration:
Shortname: modelMapperProcessor
A batchlet acquiring a Hazelcast lock.
Sample:
<step id="lock" next="check-lock">
  <batchlet ref="hazelcastLock">
    <properties>
      <property name="instanceName" value="batchee-test"/>
      <property name="lockName" value="batchee-lock"/>
    </properties>
  </batchlet>
</step>
Configuration:
Shortname: hazelcastLock
A batchlet releasing a Hazelcast lock.
Sample:
<step id="unlock" next="check-unlock">
  <batchlet ref="hazelcastUnlock">
    <properties>
      <property name="instanceName" value="batchee-test"/>
      <property name="lockName" value="batchee-lock"/>
    </properties>
  </batchlet>
</step>
Configuration:
Shortname: hazelcastUnlock
@org.apache.batchee.cdi.scope.JobScoped allows you to define a bean scoped to a job execution. @org.apache.batchee.cdi.scope.StepScoped allows you to define a bean scoped to a step execution.
To activate these scopes you need to define 3 listeners:

* org.apache.batchee.cdi.listener.BeforeJobScopeListener
* org.apache.batchee.cdi.listener.AfterJobScopeListener
* org.apache.batchee.cdi.listener.AfterStepScopeListener
If your implementation supports ordering of listeners, use it to ensure the Before* listeners are executed first and the After* listeners last. This lets you use these scopes in your own listeners. The *JobScopeListener classes are javax.batch.api.listener.JobListener implementations and the AfterStepScopeListener is a javax.batch.api.listener.StepListener.
NB: these listeners are @Named, so you can use their CDI name to reference them (this is not mandatory).
If the implementation doesn’t provide any ordering of the listeners, be aware these scopes will only work in steps.
For BatchEE you can add them in batchee.properties this way:
org.apache.batchee.job.listeners.before = beforeJobScopeListener
org.apache.batchee.job.listeners.after = afterJobScopeListener
org.apache.batchee.step.listeners.after = afterStepScopeListener