Solved

Uploading large files to Azure Files using staging with Indicium

  • 22 May 2023


Hi,

I am trying to upload a (potentially large) document using staging within Indicium. The document itself uses a storage location based on Azure Files. When I use the straightforward HTTP POST API .. /<app>/document:

{
    "document_type_id": 1,
    "document_date": "2023-05-18T00:00:00Z",
    "name": "Test upload document 1",
    "storage_location": {
        "FileName": "test_upload_doc-1.pdf",
        "File": "JVBERi0...”
    }
}

this works fine. The file is base64-encoded and will, as far as I know, be posted as one chunk.
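For reference, building that single-request body can be sketched as follows (a minimal sketch: the field names and example values are taken from the POST body above; adjust them to your own document model):

```python
import base64
import json

def build_document_payload(file_path: str, file_name: str) -> str:
    """Build the JSON body for the single-request upload shown above.

    The field names (document_type_id, storage_location, FileName, File)
    and example values mirror this post; substitute your own.
    """
    with open(file_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    payload = {
        "document_type_id": 1,
        "document_date": "2023-05-18T00:00:00Z",
        "name": "Test upload document 1",
        "storage_location": {
            "FileName": file_name,
            # the whole file as one base64 string, posted in one chunk
            "File": encoded,
        },
    }
    return json.dumps(payload)
```

This is exactly why the approach struggles with large files: the entire file has to be read, base64-encoded (roughly a 33% size increase), and sent in a single request body.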

For large documents, however, this approach is too limited. I have read the posts about staging and think this is the way to go for large documents. I have implemented a setup that uses the following steps:

  1. Create the stage for a new document: HTTP POST .. /<app>/document/stage_add
  2. Patch some columns: HTTP PATCH .. /<app>/staged_document(<stage_id>)
    1. { "name": "test_upload_doc-1.pdf" }
  3. Try to upload the document
  4. Commit the stage: HTTP POST .. /<app>/staged_document(<stage_id>)/commit
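The steps above can be sketched as a request sequence (nothing is sent here; base_url and the stage_id placeholder are assumptions — substitute your own Indicium application URL and the stage_id returned by stage_add):

```python
import json

def staging_requests(base_url: str, stage_id: str):
    """Return the staged-upload sequence from the steps above as
    (method, url, body) tuples. Step 3, the binary upload itself,
    is the open question of this post and is left out here."""
    return [
        ("POST", f"{base_url}/document/stage_add", None),
        ("PATCH", f"{base_url}/staged_document({stage_id})",
         json.dumps({"name": "test_upload_doc-1.pdf"})),
        # step 3: upload the document (unclear, see below)
        ("POST", f"{base_url}/staged_document({stage_id})/commit", None),
    ]
```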

So steps 1, 2 and 4 are executed without a problem. It is unclear to me, however, how to upload the document (step 3). I have looked at the other posts with an example using Insomnia, but it is not clear to me what happens there. The example shows an HTTP POST to .. /upload_photo. When I try an HTTP POST to .. /staged_document(<stage_id>)/upload_photo with an image file as a binary upload, I get an Invalid OData URL. message. An HTTP PATCH is, in my opinion, not possible because there is no column to upload the binary to.

I am using Postman, but I also downloaded Insomnia because Postman doesn't seem to support HTTP/2 requests as shown in the previous example post. Insomnia with HTTP/2 produces the same result, however. I didn't find information on upload_photo or upload_document and how they can be used.

In the HTTP POST of the binary there doesn’t seem to be a relation to the column that references the storage location (in my example storage_location which implements a domain with varchar(1000) for the filename and an upload control pointing to the Azure Files storage definition).

My question is how this upload can be achieved using staging. I hope someone can help...


Best answer by Robert Wijn, 23 May 2023, 09:58


This topic has been closed for comments

2 replies


Hello Robert,

The problem with your upload request is the upload_photo part in the URL. As you speculated yourself here:

In the HTTP POST of the binary there doesn’t seem to be a relation to the column that references the storage location (in my example storage_location which implements a domain with varchar(1000) for the filename and an upload control pointing to the Azure Files storage definition).

There must be a relation between the upload request and the column that references the storage location. In your case it should be upload_name. The upload request points to the column by name, with an upload_ prefix.
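A minimal sketch of that upload request, built with Python's standard library (the request is constructed but not sent; base_url and the content type are assumptions, and authentication headers are omitted):

```python
import urllib.request

def build_upload_request(base_url: str, stage_id: str,
                         column: str, data: bytes) -> urllib.request.Request:
    """Build (but do not send) the step-3 binary upload request.

    The endpoint is the name of the column carrying the upload
    control, with an upload_ prefix (upload_name in this thread).
    The body is the raw file bytes, not base64.
    """
    url = f"{base_url}/staged_document({stage_id})/upload_{column}"
    return urllib.request.Request(
        url,
        data=data,
        method="POST",
        headers={"Content-Type": "application/octet-stream"},
    )
```

After this POST succeeds, committing the stage (step 4) finalizes the document.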

I hope this helps.

Userlevel 1
Badge +1

Hello Vincent,

Thanks for the clarification. Uploading the document now works as expected.