I'm trying to move our product to the cloud using this blog post: https://community.thinkwisesoftware.com/blogs-21/to-the-cloud-leveraging-microsoft-azure-paas-155. I'm facing some difficulties with the differences between an on-premise environment (SQL Server and a file system) and Azure (Azure SQL Database and Azure File Storage). To start with, there is the difference in handling files.



In the on-premise setting I'm able to upload files and read them using SQL Server features like xp_DirTree for reading a directory, and openrowset / bulk insert for reading the contents of a file. In an Azure environment this isn't possible.
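For reference, the on-premise pattern looks roughly like this (the UNC path is only a placeholder):

code:
-- list the files in a directory (depth 1, files included)
exec master.sys.xp_dirtree '\\fileserver\imports', 1, 1;

-- read the full contents of a file as a single value
select BulkColumn
from openrowset(bulk '\\fileserver\imports\data.txt', single_clob) as f;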



Let's say the steps for the end user are currently as follows:


  1. The user starts a task where they can upload a file using a 'File upload' control. This upload is bound to an Azure File Storage location, so it uploads immediately.
  2. The user executes the task: the SQL procedure of the task is called to process the uploaded file. For example, the user uploads a .txt file with some data to be parsed, and the result is a table record that is added (see the sketch below).
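On the database side, step 2 could look roughly like the sketch below once the file content reaches the procedure as a parameter; the parameter, table and column names are made up for illustration:

code:
-- task procedure; @file_data is filled by the 'File upload' control
create procedure task_import_file
  @file_data varbinary(max)
as
begin
  -- the cast assumes a plain ANSI text file; carriage returns are stripped as well
  declare @text varchar(max) =
    replace(cast(@file_data as varchar(max)), char(13), '');

  -- add one record per non-empty line of the uploaded .txt
  insert into import_record (line_content)
  select value
  from string_split(@text, char(10))
  where value <> '';
end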

What is the best approach in Thinkwise for processing files via an Azure SQL Database? Is the Azure File Storage really necessary, or could it be skipped, as it's only an input parameter for the task? Or is it possible to access the Azure File Storage (not Blob!) via a SQL command? Or can I use a process flow to add one or more files as parameters to a SQL stored procedure?



Another task a user can do in the product is generating files (export) and saving them on a storage location. In the on-premise situation this would be a network drive. In the cloud environment you don't have a network drive, but I think that could or should be replaced by Azure File Storage, because an Azure file share can be mounted by the end user as a regular network drive, so they (or another piece of software) can process the generated files.



Same question here: is it possible to create a file via a SQL stored procedure and store it on Azure File Storage? Or is something possible built into a process flow, saving the results of a stored procedure to a file?



Please note we're using the web client here, as with the new universal client it will probably be web-only.
If you only want to use the file content in the task's business logic then you can use database storage instead of Azure storage. The contents of the file will then be available through the configured storage parameter.



Regarding the second question: apparently it is possible to access Azure Blob (and File?) storage using T-SQL bulk insert and openrowset commands from Azure SQL Database. This does, however, require some configuration on the database side.
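For Azure Blob storage, that configuration consists of a database scoped credential and an external data source. In the sketch below, all names and the SAS secret are placeholders:

code:
-- one-time setup on the Azure SQL Database
create master key encryption by password = '<strong password>';

create database scoped credential blob_sas
with identity = 'SHARED ACCESS SIGNATURE',
     secret = '<SAS token, without the leading question mark>';

create external data source import_source
with (type = BLOB_STORAGE,
      location = 'https://myaccount.blob.core.windows.net/imports',
      credential = blob_sas);

-- afterwards, a file in that container can be read like this
select BulkColumn
from openrowset(bulk 'data.txt', data_source = 'import_source', single_clob) as f;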



An alternative would be to use a process flow to upload the file. We don't provide a dedicated process flow connector for Azure File Storage yet, but it should be possible using the regular HTTP connector.




I've managed to solve the 'upload' by using database storage, so a file is no longer read from SQL Server. I've also changed a multi-file read to a single-file read: for the time being, the users have to zip their files. Using an HTTP connector and a web service, the contents are unzipped and returned as XML so they can be processed further. From an end-user perspective it would be friendlier if they could select multiple files and upload them. That is also mentioned here: https://community.thinkwisesoftware.com/ideas-18/upload-and-process-multiple-files-drag-drop-547
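For completeness: once the web service has returned the unzipped contents as XML, shredding it into rows can be done in plain T-SQL. The element names below are made up:

code:
declare @xml xml = N'<files>
  <file name="a.txt">first file contents</file>
  <file name="b.txt">second file contents</file>
</files>';

-- one row per unzipped file, ready for further processing
select f.x.value('@name', 'nvarchar(260)') as file_name
     , f.x.value('.', 'nvarchar(max)')     as file_content
from @xml.nodes('/files/file') as f(x);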



Regarding the Azure File Storage, maybe we should skip this initially. I was thinking of a simple 'download', just like downloading a new version of the SF in TCP. I've looked around in the docs, but cannot find how to set this up. How is this done? With a process flow?
Hi René,



There is no Download file process action yet, so you would have to show a task popup with a file control to download the file.




I've managed to create a 'download' experience in a regular form: click the file (from database storage) and open it, so that's achieved 🙂.



This is a sample hard-coded XML file that can be downloaded. The final export will be a zip file, generated by a process flow that sends the content to be zipped, as XML, to a self-built API, which wraps it up into a zip file. We also do something similar in the other direction: an API unzips an upload and sends it back as XML for further processing.
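A rough sketch of how the XML sent to that zip API could be assembled on the database side (the element names are assumptions):

code:
-- build one XML document with the rows to be zipped by the API
declare @export_xml xml =
  (select p.project_id
   from project p
   for xml path('project'), root('export'), type);

select @export_xml as zip_request_body;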



I also tried to use a task, but unfortunately that doesn't work. In the web client I receive the error message shown below. In the Windows GUI I also get an error message. I think they occur because the file storage type is database storage. As far as I can see, the code should be OK.



Message on web:

code:
The requested file could not be found or is not accessible.
System.Exception

Message on Windows: (screenshot)

View used to simulate file download:



code:
-- view [project_export]
select p.project_id
     , 'export.xml'                      as file_name  -- expression columns need a name in a view
     , cast('export' as varbinary(max))  as file_data
from project p
