Multirow task execution

  • 4 February 2019
  • 6 replies
Userlevel 6
Badge +6
When executing a task for a selection of rows, the user interface first executes the context procedure for every single row to check if the user is allowed to execute the task for that row, and then executes the task for each permitted row. This can be quite time consuming, especially for web clients where the progress is also reported back to the client.

To speed up multirow task execution, a new feature has been developed that executes a task once for the entire selection of rows.

To enable multirow execution for a task:
  1. Add an XML-typed parameter to the task
  2. Make sure Popup for each row is switched off
  3. Set the MultiselectParameterID extended property with the name of the parameter

With multirow execution enabled for a task, when the user interface executes the task, the specified XML parameter is filled with the parameter values for all selected rows, even if only one row is selected.

Example XML parameter value:
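A value for the XML parameter would be shaped roughly like this (a sketch; param1 and param2 are illustrative parameter names matching the queries in this post):

```xml
<rows>
  <row>
    <param1>1</param1>
    <param2>abc</param2>
  </row>
  <row>
    <param1>2</param1>
    <param2>def</param2>
  </row>
</rows>
```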

Be aware that both the default and the context logic are executed only once and not for every row. The parameter values for all rows are included in the XML, even if the context logic would have disabled the execution of the task for a specific row. The user interface also does not remove duplicate parameter rows when composing the XML, so you'll need to take care of that yourself.

In the task logic, use the XML parameter value instead of the regular parameters. For example, on SQL Server:

select distinct
row.value('./param1[1]', 'int'),
row.value('./param2[1]', 'varchar(10)')
from @xml.nodes('/rows/row') as t2(row)

And on IBM i (DB2 for i):

select distinct r.param1, r.param2
from xmltable('$xml/rows/row' passing v_xml as "xml"
columns
param1 integer path 'param1[1]',
param2 varchar(10) path 'param2[1]'
) as r;
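If the task logic needs to process the rows one at a time rather than set-based, one option (a sketch for SQL Server, reusing the same @xml parameter and element names as above) is to shred the XML into a table variable first:

```sql
-- Sketch: shred the XML once into a table variable (names are illustrative)
declare @rows table (param1 int, param2 varchar(10))

insert into @rows (param1, param2)
select distinct
    row.value('./param1[1]', 'int'),
    row.value('./param2[1]', 'varchar(10)')
from @xml.nodes('/rows/row') as t2(row)

-- @rows can now be processed row by row, e.g. with a cursor or a while loop
```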

For more information, see Microsoft's documentation on querying XML data on SQL Server and IBM's documentation on SQL XML programming for IBM i.

⚠️ In the future, this feature will be configurable through the metamodel and will probably use table-valued parameters instead of XML. We'll be able to automatically upgrade the model, but you'll need to change the code yourself!

6 replies

Userlevel 6
Badge +2
This feature works very well in conjunction with drag-drop. When you drag multiple rows, you can have a multirow task execute once with all the dragged rows.
Userlevel 5
Badge +3
Sounds great. But I can't quite see yet whether this would solve our issue, where the process in question contains an HTTP connector. The API we're talking to only allows a single item to be requested at a time, which means the HTTP connector needs to be run for every item in the selection, which (if I read correctly) is still not possible.
Userlevel 6
Badge +6
Hi Pim, good question.

A possible solution would be to create a process flow in which you recursively store the values of the first row in process flow parameters, delete the row from the XML, and then call the HTTP connector using these parameters.

The SQL code for the task or process procedure to assign the parameters would look something like this:
-- Store the values of the first row
select
@param1 = row.value('./param1[1]', 'int'),
@param2 = row.value('./param2[1]', 'varchar(10)')
from @xml_input.nodes('/rows/row[1]') as t2(row)

-- Delete the first row
set @xml_input.modify('delete /rows/row[1]')

The HTTP connector should be called while the parameters have a value.
Also make sure you use a different name for the XML input parameter of the task than the MultiselectParameterID.
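To drive the loop condition in the process flow, the procedure could also check explicitly whether any rows are left in the XML (a sketch; @xml_input is the name used above, @rows_left is an illustrative name):

```sql
-- 1 while the XML still contains rows, 0 when it is empty;
-- the process flow can loop back to the HTTP connector while this is 1
declare @rows_left bit = @xml_input.exist('/rows/row')
```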
Userlevel 5
Badge +3
I don't think looping in process flows ever even crossed my mind. This could be a very nice solution.
One thing though: I would personally prefer keeping a pointer (int) to the current row, as this would keep the input intact.

I had to fiddle around a bit, but here's an example of using a pointer:
Userlevel 5
Badge +3
declare @pointer int = 1
declare @row_count int = @xml.value('count(/rows/row)', 'int')

while @pointer <= @row_count
begin
select row.value('(./param1[1])', 'int')
, row.value('(./param2[1])', 'varchar(10)')
from @xml.nodes('(/rows/row[sql:variable("@pointer")])') as t2(row)

set @pointer += 1
end
Userlevel 4
Badge +10
I'm trying something similar with the copy file process action to copy a batch of files by obtaining "from file" and "to file" values from a select query. I put all those rows in a table variable and then loop through these rows one by one. But this is the part that eludes me:

The HTTP connector should be called while the parameters have a value.

What happens now is that the loop finishes and ends up outputting a single set of null values. How can I insert a SQL "while" loop into a process flow so that it outputs parameters set by set into the process variables?