I'm trying to let the end user import ‘large’ sets of data (CSV/Excel) into the database via the Web GUI. It's not as large as in this (https://community.thinkwisesoftware.com/development-13/best-practice-for-inserting-large-amounts-of-rows-into-one-table-837) thread, but large enough to make the web interface time out. The problem then is that the import is only partially done: for example, it may have imported 3k of 50k records. The end user doesn't know what to do: which part was imported? And how to fix it?
I'm trying to speed this up, but I can't really increase its performance. When I look at the debugger in the Windows GUI, I see a lot of database calls per row. For each row the default and layout concepts are triggered, and each record is inserted individually. Imagine you want to insert 50k records: even if each cycle takes only 5 ms, multiplied by 50k that is 250 seconds. From a user's point of view this is absolutely unacceptable (besides, it will probably crash the web interface).
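To make the arithmetic explicit, here is the back-of-envelope calculation (the 5 ms per row is the estimate from the debugger session above, not a measured constant):

```python
# Back-of-envelope: per-row cost multiplied over the whole import.
rows = 50_000
per_row_seconds = 0.005  # ~5 ms of defaults/layout logic plus a single-row insert

total_seconds = rows * per_row_seconds
print(total_seconds)  # 250.0 -> over four minutes for one import
```

Even a seemingly small per-row overhead dominates once it is paid 50k times, which is why shaving milliseconds off each cycle barely helps compared to removing the per-row roundtrips altogether.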
I have thought of disabling the logic concepts, but the problem still remains. And disabling these concepts introduces other problems, so it isn't really a good option.
I'm not sure if this counts as a feature request, but this part (the Import) could be improved a lot. When I import the same 50k set via SqlBulkCopy.WriteToServer, it takes only two seconds, because that operation is set-based and avoids unnecessary roundtrips to the server. My suggestion would be to have a “Bulk” Excel/CSV import that ignores the concepts and inserts set-based rather than loop-based.
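To illustrate the loop-based vs. set-based difference outside the platform, here is a minimal Python sketch using an in-memory SQLite database. It is only an analogy: `executemany` plays the role of a set-based API like SqlBulkCopy.WriteToServer, and SQLite has no network roundtrips, so the gap is far smaller than against a real SQL Server; the table and column names are made up for the demo.

```python
import sqlite3
import time

def insert_per_row(conn, rows):
    # Loop-based: one INSERT statement per record, roughly what the
    # GUI import does today (minus the per-row logic concepts).
    cur = conn.cursor()
    for r in rows:
        cur.execute("INSERT INTO t(a, b) VALUES (?, ?)", r)
    conn.commit()

def insert_set_based(conn, rows):
    # Set-based: hand the whole batch to the driver in one call,
    # analogous in spirit to SqlBulkCopy.WriteToServer.
    cur = conn.cursor()
    cur.executemany("INSERT INTO t(a, b) VALUES (?, ?)", rows)
    conn.commit()

rows = [(i, f"row {i}") for i in range(50_000)]

for fn in (insert_per_row, insert_set_based):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t(a INTEGER, b TEXT)")
    start = time.perf_counter()
    fn(conn, rows)
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    print(f"{fn.__name__}: {count} rows in {elapsed:.3f}s")
    conn.close()
```

Both variants end up with the same 50k rows; the difference is purely in how many statement executions (and, on a real server, network roundtrips) are needed to get there.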
Are there any other possibilities within the Thinkwise platform to increase the performance of the Import feature, specifically in the web GUI?
Best answer by Vincent Doppenberg
I think that both asynchronous imports and transactional imports are nice feature requests. An idea has already been posted for asynchronous imports, you can vote on it here. I recommend posting an idea for transactional imports, if you would like to see this feature.
While asynchrony will solve the timeout issue that you are facing, I don't think it's a complete solution to the problem, which is that imports are simply quite slow when the number of records is very high (i.e. >10k). I think there's a trade-off here between applying application logic to each record and importing records very quickly. I don't think either option is a one-size-fits-all solution, so it might be worthwhile to submit an idea to support both and make it configurable.
By the way, it is already possible to avoid the timeout. You can set the executionTimeout option in the web.config of the Web GUI. It is not ideal, since this is an application-wide setting and it won't prevent the Web GUI from being blocked for a long time, but it should allow the import to finish.
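For reference, in a standard ASP.NET web.config this setting lives on the httpRuntime element under system.web (the exact structure of the Thinkwise Web GUI's web.config may differ, and the 3600 is just an example value):

```xml
<configuration>
  <system.web>
    <!-- Timeout in seconds; 3600 allows imports to run for up to an hour. -->
    <!-- Note: ASP.NET only enforces executionTimeout when debug="false". -->
    <httpRuntime executionTimeout="3600" />
  </system.web>
</configuration>
```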