I'm trying to let the end user import ‘large’ sets of data (CSV/Excel) into the database via the Web GUI. It's not as large as in this thread (https://community.thinkwisesoftware.com/development-13/best-practice-for-inserting-large-amounts-of-rows-into-one-table-837), but large enough to make the web interface time out. The problem is that the import is then only partially done: it could have imported, for example, 3k out of 50k records. The end user doesn't know what to do: which part was imported, and how do they fix it?
I'm trying to speed this up, but I can't really increase its performance. When I look at the debugger in the Windows GUI, I see a lot of database calls per row: for each row the default and layout concepts are triggered, and each record is inserted individually. Imagine you want to insert 50k records: even if each cycle takes only 5 ms, multiplied by 50k that is 250 seconds. From a user's point of view this is absolutely unacceptable (besides, it will probably crash the web interface).
I have thought of disabling the logic concepts, but the problem still remains. Besides, disabling these concepts introduces other problems, so it's not really a good idea.
I'm not sure if this is a feature request, but this part (the Import) could be improved a lot. By comparison, importing the same 50k set via SqlBulkCopy.WriteToServer takes only two seconds, because that operation is set-based and doesn't need unnecessary round trips to the server. My suggestion would be to have a “Bulk” Excel/CSV import that ignores the concepts and inserts set-based instead of loop-based.
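For reference, this is roughly the set-based variant I compared against. It's just a minimal sketch: the connection string, destination table name and column layout are placeholders for my actual import target, and it assumes the CSV/Excel rows have already been parsed into a DataTable.

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

static void BulkInsert(DataTable rows, string connectionString)
{
    using var connection = new SqlConnection(connectionString);
    connection.Open();

    using var bulkCopy = new SqlBulkCopy(connection)
    {
        DestinationTableName = "dbo.import_target", // placeholder table name
        BatchSize = 5000,       // send rows in batches instead of one by one
        BulkCopyTimeout = 120   // seconds; generous for ~50k rows
    };

    // Map source columns to destination columns by name.
    foreach (DataColumn column in rows.Columns)
        bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);

    // One set-based operation instead of 50k individual inserts/round trips.
    bulkCopy.WriteToServer(rows);
}
```

This obviously bypasses all application logic, which is exactly the trade-off I'm asking about.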
Are there any other possibilities within the Thinkwise platform to increase the performance of the Import feature, specifically in the web GUI?