At the moment, one of my job definition files looks like the one below. The problem is that I would like to run many identical chunks in parallel (parallel queue processing), and as things stand I would have to duplicate, for example, 20 identical flows that differ only in their id. Is it possible to do this programmatically, or is there some property that would help?
Switching to Spring Batch is also not a problem, if it offers such a possibility.
<?xml version="1.0" encoding="UTF-8"?>
<job id="socialJob" xmlns="http://xmlns.jcp.org/xml/ns/javaee"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/jobXML_1_0.xsd"
     version="1.0">
<step id="prepare" next="mySplit">
<batchlet ref="myPreProcessor" />
</step>
<split id="mySplit">
<flow id="s1">
<step id="myStep1">
<chunk item-count="1">
<reader ref="myReader" />
<processor ref="myProcessor" />
<writer ref="socialWriter" />
<skippable-exception-classes>
<include class="java.lang.Exception" />
</skippable-exception-classes>
</chunk>
</step>
</flow>
<flow id="s2">
<step id="myStep2">
<chunk item-count="1">
<reader ref="myReader" />
<processor ref="myProcessor" />
<writer ref="socialWriter" />
<skippable-exception-classes>
<include class="java.lang.Exception" />
</skippable-exception-classes>
</chunk>
</step>
</flow>
</split>
</job>
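For what it's worth, below is a minimal sketch of what I imagine a partitioned step might look like under JSR 352, using the <partition>/<plan> elements instead of 20 copied flows. The step id "myPartitionedStep" and the partition/thread count of 20 are my own assumptions; the reader, processor, and writer refs are the same as above. I don't know whether this is the intended way to fan out identical chunks, so treat it as a sketch rather than a working configuration:

<step id="myPartitionedStep">
    <chunk item-count="1">
        <reader ref="myReader" />
        <processor ref="myProcessor" />
        <writer ref="socialWriter" />
        <skippable-exception-classes>
            <include class="java.lang.Exception" />
        </skippable-exception-classes>
    </chunk>
    <!-- assumed: run 20 identical instances of this chunk in parallel -->
    <partition>
        <plan partitions="20" threads="20" />
    </partition>
</step>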