Multirow task execution

Userlevel 6
Badge +8

When executing a task for a selection of rows, the user interface first runs the context procedure for every single row to check whether the user is allowed to execute the task for that row, and then executes the task for each permitted row. This can be quite time-consuming, especially for web clients, where progress is also reported back to the client.

To speed this up, a new feature has been developed that executes a task only once for an entire selection of rows.

To enable multirow execution for a task:

  1. Add an XML-typed parameter to the task
  2. Make sure Popup for each row is switched off
  3. Set the MultiselectParameterID extended property to the name of the parameter


With multirow execution enabled for a task, when the user interface executes the task, the specified XML parameter is filled with the parameter values of all selected rows, even if only one row is selected.

Example XML parameter value:

<rows>
  <row>
    <param1>1</param1>
    <param2>a</param2>
  </row>
  <row>
    <param1>3</param1>
    <param2>b</param2>
  </row>
</rows>

Be aware that both the default and the context logic are executed only once and not for every row. The parameter values for all rows are included in the XML, even if the context logic would have disabled the execution of the task for a specific row. The user interface also does not remove duplicate parameter rows when composing the XML, so you'll need to take care of that yourself.
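Since the XML can contain duplicates as well as rows the context logic would have blocked, both can be handled inside the task logic itself. A minimal T-SQL sketch; the table and the status condition in the subquery are hypothetical, so substitute your own context-logic condition:

code:
-- Deduplicate and re-filter inside the task: the UI ships every
-- selected row, regardless of the context logic's outcome per row.
select distinct
    row.value('./param1[1]', 'int')         as param1,
    row.value('./param2[1]', 'varchar(10)') as param2
from @xml.nodes('/rows/row') as t2(row)
where row.value('./param1[1]', 'int') in
      (select t.id
       from   my_table t          -- hypothetical table
       where  t.status = 'open')  -- hypothetical context-logic condition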

In the task logic, use the XML parameter value instead of the regular parameters, for example:

T-SQL

select distinct
    row.value('./param1[1]', 'int')         as param1,
    row.value('./param2[1]', 'varchar(10)') as param2
from @xml.nodes('/rows/row') as t2(row)


IBM i SQL

select distinct
    r.param1,
    r.param2
from xmltable('$xml/rows/row' passing v_xml as "xml"
    columns
        param1 integer     path './param1[1]',
        param2 varchar(10) path './param2[1]'
) as r;

For more information on working with XML data on SQL Server, see the Microsoft documentation on the xml data type and its methods. More information on SQL/XML programming for IBM i can be found in the IBM i documentation.


⚠️ In the future this feature will be configurable through the metamodel and will probably use table valued parameters instead of XML. We'll be able to automatically upgrade the model, but you'll need to change the code yourself!


12 replies

Userlevel 6
Badge +4
This feature works very well in conjunction with drag and drop. When you drag multiple rows, you can have a multirow task execute once with all the dragged rows.
Userlevel 5
Badge +3
Sounds great, but I can't quite see yet whether this would solve our issue, where the process in question contains an HTTP connector. The API we're talking to only allows a single item to be requested each time, which means the HTTP connector needs to be run for every item in the selection. If I read correctly, that is still not possible.
Userlevel 6
Badge +8
Hi Pim, good question.

A possible solution would be to create a process flow in which you recursively store the values of the first row in process flow parameters, delete the row from the XML, and then call the HTTP connector using these parameters.

The SQL code for the task or process procedure to assign the parameters would look something like this:
code:
-- Store the values of the first row
select
    @param1 = row.value('./param1[1]', 'int'),
    @param2 = row.value('./param2[1]', 'varchar(10)')
from @xml_input.nodes('/rows/row[1]') as t2(row)

-- Delete the first row
set @xml_input.modify('delete /rows/row[1]')

The HTTP connector should be called as long as the parameters have a value.
Also make sure you use a different name for the XML input parameter of the task than the MultiselectParameterID.
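To make the loop terminate cleanly, the remaining row count can drive the decision. A minimal sketch, assuming the XML parameter is named @xml_input as above:

code:
-- Count the rows still left in the XML after each delete;
-- route the process flow back to the HTTP connector while any remain.
declare @rows_left int = @xml_input.value('count(/rows/row)', 'int')

-- Use @rows_left > 0 (or check that @param1 is not null) as the
-- condition in the process flow to continue or end the loop.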
Userlevel 5
Badge +3
I don't think looping in processflows ever even crossed my mind. This could be a very nice solution.
One thing, though: I would personally prefer keeping a pointer (int) to the current row, as this keeps the input intact.

I had to fiddle around a bit, but here's an example of using a pointer:
code:
declare @pointer int = 1
declare @row_count int = @xml.value('count(/rows/row)', 'int')

while @pointer <= @row_count
begin
    select row.value('(./param1[1])', 'int')
         , row.value('(./param2[1])', 'varchar(10)')
    from @xml.nodes('(/rows/row[sql:variable("@pointer")])') as t2(row)

    set @pointer += 1
end
Userlevel 4
Badge +10
I'm trying something similar with the copy file process action to copy a batch of files by obtaining "from file" and "to file" values from a select query. I put all those rows in a table variable and then loop through these rows one by one. But this is the part that eludes me:

The HTTP connector should be called while the parameters have a value.


What happens now is that the loop finishes and ends up outputting a single set of null values. How can I insert a SQL while loop into a process flow so that it outputs the parameters set by set into the process variables?
Userlevel 6
Badge +14

Does this feature also work for reports? 

I need to make a report with an output based on selected rows in a grid.

 

Does multirow task execution still work in version 2020.1?

Userlevel 4
Badge +3

Does this feature also work for reports?

 

Reports currently don't support this. We should think about this in a design and investigate whether report implementations are able to process this kind of structured data.

I'd suggest parking this question in a separate post/idea to think about alternatives and to investigate the need for this.

Link to the parked idea:

https://community.thinkwisesoftware.com/ideas-18/multirow-report-execution-1048#post3001

Userlevel 2
Badge +2


⚠️ In the future this feature will be configurable through the metamodel and will probably use table valued parameters instead of XML. We'll be able to automatically upgrade the model, but you'll need to change the code yourself!

 

Our codebase is gradually getting more and more cluttered with the originally suggested solution. Can you give us an update on when you expect the future to arrive on this matter?

Userlevel 6
Badge +8

@Ricky This is on our backlog, but we have not yet planned a date for this.

It is, however, trivial to locate and rewrite the select (and delete) from the XML to use a table valued parameter. And from a functional point of view, it doesn't make any difference.

Userlevel 4
Badge +11


⚠️ In the future this feature will be configurable through the metamodel and will probably use table valued parameters instead of XML. We'll be able to automatically upgrade the model, but you'll need to change the code yourself!

 

Our codebase is gradually getting more and more cluttered with the originally suggested solution. Can you give us an update on when you expect the future to arrive on this matter?

What we've currently done is use the same structure everywhere: a table variable that should be replaceable with the eventual solution. We use @multi_row as the input parameter everywhere and @record as the actual table.

On each template where we apply it (almost all tasks), we do something like this:

declare @record table (project_id id not null
                     , part_id    id not null)

insert into @record (project_id
                   , part_id)
select x.value('(project_id)[1]', 'id')
     , x.value('(part_id)[1]', 'id')
from @multi_row.nodes('rows/row') as t(x)

I hope that by doing so, we will only need to remove the code above and can then use the @record table.

 

@Ricky This is on our backlog, but we have not yet planned a date for this.

It is, however, trivial to locate and rewrite the select (and delete) from the XML to use a table valued parameter. And from a functional point of view, it doesn't make any difference.

And how to deal with unit tests on multiple rows? Currently I have a long input string (<rows><row><item>1</item></row><row><item>2</item></row></rows>), but that would have to be converted to some other kind of data structure that allows multiple rows at once? That seems less trivial than find and replace.

Userlevel 6
Badge +8

Hi Ricky,

I hope that by doing so, we will only need to remove the code above and can then use the @record table.

That is indeed how it will need to be modified. 

 

And how to deal with unit tests on multi rows?

We will automatically convert these input strings to the required data structure.
