Solved

Records in report queue are not being processed

  • 7 February 2019
  • 5 replies
  • 222 views

Userlevel 3
Badge +11
We have a Reporting Service installed and configured to look at multiple instances/databases (development and test). All records in the rapport_queue table are processed for the first connection, but when a record is added in the second application, it is not processed by the Reporting Service. After a restart of the service, all records with report_status = 0 are processed correctly, but new records added after the restart are again not processed.
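
To illustrate, a query along these lines (a minimal sketch, assuming the rapport_queue table and report_status column named above, with 0 meaning "not yet processed") is what we run on each database to see how many records remain unprocessed:

code:
-- Rough diagnostic: count queue records per status on each database.
-- Assumes the rapport_queue table and report_status column described above,
-- where report_status = 0 means the record has not been processed yet.
select report_status, count(*) as record_count
from   rapport_queue
group by report_status
order by report_status;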

We have uninstalled the Reporting Service (3.00) and installed the latest version (3.20), but the result is the same.

What could be the problem?

Config:

connectionString="
Data Source=CFS-TWDB01\THINKWISE;
Initial Catalog=CF_FINAS;
Integrated Security=true;
User Id=;
Password=;
Persist Security Info=True"
providerName="System.Data.SqlClient"/>

connectionString="
Data Source=CFS-TWDB01\TEST;
Initial Catalog=CF_FINAS;
Integrated Security=true;
User Id=;
Password=;
Persist Security Info=True"
providerName="System.Data.SqlClient"/>

Best answer by Tim Waalewijn 13 February 2019, 14:59


5 replies

Userlevel 5
Badge +5
I usually have just one connection per reporting service instance running. Have you tried removing the first connection (temporarily) in order to see if it picks up the data from the second, 'test'?

Also, is the service user authorised to access the database mentioned in the second connection?
Userlevel 5
Badge +5
@Hugo Nienhuis I wouldn't know. Perhaps someone with more knowledge about the inner workings of the reporting service can chime in?

Looking a bit further at my current setup:
I'd expect the logs to be created, provided the reporting service user has the right authorisation to write to the folder the log files are written to. I think the location of the log file can be configured in NLog.config.
Also, the Event Viewer should be able to provide additional information if the Reporting Service stops.
Userlevel 2
Badge

As @Robert Jan de Nie pointed out, the default configuration of the NLog.config that is installed with the service is set up to log to a file named "thinkwise.log" in the directory the service was installed to.

The target for this rule can be found inside the NLog.config on line 30 and should look like this by default:

code:
target name="logfile" xsi:type="File" layout="${longdate}|${uppercase:${level}}|${connectionName}|${reportid}|${message}" fileName="${basedir}/thinkwise.log" keepFileOpen="true" autoFlush="true"

Simply change the value of the fileName attribute to modify where the log will be written to.

In addition, the default configuration only logs Warnings, Errors and Fatal errors.
Should you need more information (such as when a polling attempt to either connection was made), you can change the minlevel attribute of the rule on line 41 to Info, like so:

code:
logger name="*" minlevel="Info" writeTo="logfile"

As for the original question, polling multiple databases from one service installation is indeed supported as long as the name given to each connectionString is unique.
The following configuration should work in your case:

code:
<connectionStrings>
  <clear />
  <add name="Production"
       connectionString="
         Data Source=CFS-TWDB01\THINKWISE;
         Initial Catalog=CF_FINAS;
         Integrated Security=True;
         Persist Security Info=True"
       providerName="System.Data.SqlClient" />
  <add name="Test"
       connectionString="
         Data Source=CFS-TWDB01\TEST;
         Initial Catalog=CF_FINAS;
         Integrated Security=True;
         Persist Security Info=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>

Do note, however, that when using multiple connections they all share the same level of concurrency: if two connections each queue 10 reports at the same time, then with the default ConcurrencyLevel of 5 those 20 reports will still be generated 5 at a time.

Also note that some report types (SSRS/CR etc.) handle concurrent processing better than others, so raising the ConcurrencyLevel value on line 7 of the config file might not speed up report generation as much as you'd expect.

@Hugo Nienhuis You also mentioned that the service "stops".
Is this related to the earlier issue you've reported in TCP where the service completely stops after processing a number of reports?

If so, and for the sake of context, I'll reiterate my assumed cause of the problem here.

The log you've posted in said issue looked like this:

code:
2018-10-19 11:58:59.3054|FATAL|[PRODUCTIE_SERVER]|1179|Something went wrong while processing this report: Het laden van het rapport is mislukt.
Inner exception: Het laden van het rapport is mislukt.
2018-10-19 11:58:59.3054|FATAL|[PRODUCTIE_SERVER]|1178|Something went wrong while processing this report: Het laden van het rapport is mislukt.
Inner exception: Het laden van het rapport is mislukt.
2018-10-19 11:58:59.3224|ERROR|[PRODUCTIE_SERVER]|1179|An error occured while updating the status of the report: Attachment file \\\sync\herhaalde_bon_pdf\47164.pdf is invalid.
2018-10-19 11:58:59.3224|ERROR|[PRODUCTIE_SERVER]|1178|An error occured while updating the status of the report: Attachment file \\\sync\herhaalde_bon_pdf\47164.pdf is invalid.
2018-10-19 11:59:09.2776|FATAL|[PRODUCTIE_SERVER]||An unrecoverable error has occured on this connection, aborting polling.
2018-10-19 11:59:09.2776|FATAL|||All polling connections have been aborted, stopping application/service.

What I can deduce from this log is that the service was generating reports for a connectionString named [PRODUCTIE_SERVER] and that, at the time, it was generating two reports with IDs 1178 and 1179.

Evidently both reports failed to generate properly, giving the message "Het laden van het rapport is mislukt.", or in English: "The loading of the report has failed." I'd guess this means that something is wrong in the report file itself.

Next, both reports indicate "An error occured while updating the status of the report".
This means that the service called the stored procedure it uses to update the report queue, and that something went wrong inside the implementation of that procedure.

The name of this procedure is "rpt_set_report" by default but can be changed in the service's config file on line 11:

code:
add key="UpdateReportQueueStatus" value="rpt_set_report"

The reporting service does not allow implementations of this procedure to ever throw an error.
This is because updating the status of the report in the queue is considered a critical part of the entire service connection.

In the design of the reporting service we've opted to close the polling connection (in this case to [PRODUCTIE_SERVER]) when such an error occurs, to prevent the service from processing the same reports over and over without anyone noticing: if an error occurs during the status update, we cannot be sure the status was actually updated.

Of course, the implementation of this procedure could also silently fail to remove the report from the queue without raising an error, but there is no reliable way to detect that on the service side.

Continuing on, the actual error that occurred inside the update procedure for both reports is: "Attachment file \\\sync\herhaalde_bon_pdf\47164.pdf is invalid."

Since you are using SQL Server, a Google search for this error gives a number of results about the "sp_send_dbmail" stored procedure.
This makes me guess that the status update procedure implementation is also used to mail the generated report somewhere.

Given that both report records 1178 and 1179 failed to load, and thus never generated the actual report, I'll assume that "\\\sync\herhaalde_bon_pdf\47164.pdf" was the location the report was supposed to be saved to.
If that is the case, it makes sense for "sp_send_dbmail" to complain that the attachment it expects couldn't be found and is therefore invalid.

I'd suggest asking the developer who made the reporting service implementation for your application to check whether the "rpt_set_report" procedure looks at the status parameter value before attempting to mail the report.

Perhaps something like this:

code:
-- Update the status of the report in the queue.
-- This is just an example.
update report_service_queue
set    report_status  = @status,
       report_message = @message
where  rpt_id = @id;

-- @status is 2 if the report row was processed without errors.
-- @status is 3 if an error occurred while processing the row.
if @status = 2
begin
    -- @status is 2, so the report should be available as an attachment now.
    -- exec sp_send_dbmail here.
end
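
And in case the mail step itself can ever fail, a minimal sketch of what that last block could look like with the sp_send_dbmail call wrapped in a TRY/CATCH, so the procedure never throws back to the service; the profile name, recipient, subject and attachment path below are placeholders, not values from your environment:

code:
if @status = 2
begin
    begin try
        -- @status is 2, so the generated report file should exist on disk by now.
        -- Profile name, recipient, subject and attachment path are placeholders.
        exec msdb.dbo.sp_send_dbmail
            @profile_name     = 'ReportingProfile',
            @recipients       = 'finance@example.com',
            @subject          = 'Generated report',
            @file_attachments = '\\server\sync\herhaalde_bon_pdf\47164.pdf';
    end try
    begin catch
        -- Record the mail error on the queue row instead of rethrowing it,
        -- so the status update procedure itself never raises an error.
        update report_service_queue
        set    report_message = error_message()
        where  rpt_id = @id;
    end catch
end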

Sorry for the wall of text but I hope this information gave more insight into why your reporting service installation eventually stops completely.
Userlevel 3
Badge +11
I usually have just one connection per reporting service instance running. Have you tried removing the first connection (temporarily) in order to see if it picks up the data from the second, 'test'?

Also, is the service user authorised to access the database mentioned in the second connection?


I tried that, and it shows that after processing the records the Reporting Service stops. I didn't find a log file (thinkwise.log) in the directory. Are any additional steps necessary after installing the Reporting Service to create the log?
Userlevel 3
Badge +11
@Tim Waalewijn thanks for your wall of text 🙂. It was very helpful; so far we have been able to activate the log and find the causes of the fatal error that occurred.

Basically it was caused by an incorrect path to an HTML file. I don't think it is desirable that the Reporting Service behaves like this: in the first batch the report is finished and results in a document, but afterwards the service stops due to a fatal error, so the second report/batch of reports is never processed.

But that aside, it works for us now.

I will look into the reporting service on the production server again, but it seems the problem has somehow been solved. We haven't had any similar problems since October/November 2018.
