Solved

Reduce the number of deployment packages

  • 31 October 2019
  • 2 replies
  • 101 views

Userlevel 5
Badge +15

I'm trying to reduce the number of deployment packages that need to be executed.

Currently we're developing in several branches, which results in a lot of project versions. We don't want to deploy every version to the customer (the changes are often too minor), but only when we think it's needed.

Using the deployment functionality, a package can be created. Unfortunately, the generated upgrade script is based on the directly preceding version, not on an earlier one. In other words: we've created V1.00, V1.10 and V1.20. We want to skip deploying V1.10, but it does contain database updates that are necessary.

How can I create a deployment package in the Software Factory that allows upgrading from V1.00 to V1.20, including the V1.10 updates?

The documentation of the manifest file (https://office.thinkwisesoftware.com/docs/docs/deployment/deployer.html#manifest) suggests this should be possible, and as far as I can see it is already used for the SF and IAM, by having multiple "supportedVersions" entries in the manifest.json.

When I generate a deployment package for “1.20”, this is what I get:


{
  "schema": 2,
  "products": [
    {
      "type": "Application",
      "projectId": "SANDBOX",
      "version": "1.20",
      "metaVersion": "2019.2",
      "projectFolder": "",
      "dependencies": [
        "CompatibilityLevel110"
      ],
      "packages": [
        {
          "type": "Install",
          "path": "Install",
          "defaultDatabaseName": "SANDBOX"
        },
        {
          "type": "Upgrade",
          "path": "Upgrade",
          "supportedVersions": [
            {
              "version": "1.10",
              "upgradesTo": "1.20",
              "files": [
                "1.20\\020_Upgrade.sql",
                "1.20\\040_Constraints.sql",
                "1.20\\050_Indexes.sql"
              ]
            }
          ]
        }
      ]
    }
  ]
}

But what I want is the same file, with the following lines

"version": "1.10",
"upgradesTo": "1.20",

changed (and the update scripts adjusted) to:

"version": "1.00",
"upgradesTo": "1.20",

or

"supportedVersions": [
{
"version": "1.00",
"upgradesTo": "1.10",
"files": [
"1.10\\020_Upgrade.sql"
]
},
{
"version": "1.10",
"upgradesTo": "1.20",
"files": [
"1.20\\020_Upgrade.sql",
"1.20\\040_Constraints.sql",
"1.20\\050_Indexes.sql"
]
}
]

So the questions are:

  • Can I generate multiple "supportedVersions" entries via the SF?

    or

  • Can I have a single supportedVersions entry with "version" 1.00 and "upgradesTo" 1.20 via the SF?

Of course, this could be done manually, but if it could be done via the SF that would be better.

Best answer by Ester 4 November 2019, 12:14

2 replies

Userlevel 3
Badge +5

Hello René,

The answer to the first question is no. You can only generate a package from the SF from one version to another.

Once you have these versions, you could combine them yourself to create one deployer package with multiple “supportedVersions”:

  • Only use the folders “Install”, “MetaModel” and “Resources” from the latest package.
  • Use the subfolders in the Upgrade folder from all of your versions.
  • Add all the supportedVersions parts to the manifest of the latest package; a rough scripted sketch of this merge follows the example below.
    • When you use smart upgrades, keep all scripts.
    • When you use full upgrades (a 2020.1 feature, sorry for now), only the upgrade scripts will be enough.

So, this one is correct:

"supportedVersions": [
{
"version": "1.00",
"upgradesTo": "1.10",
"files": [
"1.10\\020_Upgrade.sql"
]
},
{
"version": "1.10",
"upgradesTo": "1.20",
"files": [
"1.20\\020_Upgrade.sql",
"1.20\\040_Constraints.sql",
"1.20\\050_Indexes.sql"
]
}
]
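
For what it's worth, a minimal sketch of such a merge, assuming the extracted packages sit side by side in one folder per version (the folder names and the script itself are illustrative, not an official Thinkwise tool):

import json
from pathlib import Path

# Hypothetical layout: one extracted deployment package per version,
# e.g. packages/1.10 and packages/1.20, each containing a manifest.json.
package_dirs = [Path("packages/1.10"), Path("packages/1.20")]

manifests = []
for package_dir in package_dirs:
    with open(package_dir / "manifest.json", encoding="utf-8") as f:
        manifests.append(json.load(f))

# Keep the manifest of the latest package as the base; its Install,
# MetaModel and Resources folders are the ones to ship.
combined = manifests[-1]

# Collect the supportedVersions entries of the Upgrade part of every version.
all_supported = []
for manifest in manifests:
    for product in manifest["products"]:
        for package in product["packages"]:
            if package["type"] == "Upgrade":
                all_supported.extend(package.get("supportedVersions", []))

# Put the combined list into the Upgrade part of the latest manifest.
for product in combined["products"]:
    for package in product["packages"]:
        if package["type"] == "Upgrade":
            package["supportedVersions"] = all_supported

with open("combined_manifest.json", "w", encoding="utf-8") as f:
    json.dump(combined, f, indent=2)

The Upgrade subfolders of every version (1.10, 1.20, ...) still have to be copied into the Upgrade folder of the combined package, since the file paths in supportedVersions point to them.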

 

For everyone else: if you would use this too, please vote. When the topic gets a lot of votes, we can give it priority to automate this process and support deployment packages for multiple versions.

Userlevel 5
Badge +15

Thanks, I think I'll find a way to manage it in its current form.

This is also a good remark:

Only use the folders “Install”, “MetaModel” and “Resources” from the latest package.

By not applying the MetaModel scripts of each version (both 1.10 and 1.20 in the given example), the time needed to apply the upgrade decreases a lot.

We currently read all manifest.json files in a given deployment folder. Together this results in a sequence of upgrade ‘steps’ / manifest files to be executed.
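
To illustrate (a simplified sketch, not our actual tooling; the folder layout and names are assumptions): chaining the "version" and "upgradesTo" fields of all manifest.json files gives the order in which the steps have to be executed.

import json
from pathlib import Path

def upgrade_chain(deployment_folder, current_version):
    """Chain the supportedVersions entries of all manifest.json files
    into an ordered list of upgrade steps (from_version, to_version)."""
    steps = {}  # from_version -> to_version
    for manifest_path in Path(deployment_folder).rglob("manifest.json"):
        manifest = json.loads(manifest_path.read_text(encoding="utf-8"))
        for product in manifest["products"]:
            for package in product["packages"]:
                for supported in package.get("supportedVersions", []):
                    steps[supported["version"]] = supported["upgradesTo"]

    # Walk the chain starting from the customer's current version.
    chain = []
    version = current_version
    while version in steps:
        chain.append((version, steps[version]))
        version = steps[version]
    return chain

# e.g. upgrade_chain("deployments", "1.00") -> [("1.00", "1.10"), ("1.10", "1.20")]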

For now that will work, but I don't know exactly what the consequences will be in the future: applying certain database updates can be very heavy due to schema changes, which can lead to a full table copy or re-applying indexes. If, for example, a table copy occurs in the upgrade from 1.00 to 1.10 and again from 1.10 to 1.20, it could have been prevented by combining the upgrades. Please note I have not yet run into this situation.
