Month: March 2020

Encrypt Acquired FTP File Dynamically Before Storing In Storage Account

I encountered a requirement to acquire some unencrypted files from an on-premises FTP server and place them in a target Azure Storage Account after first encrypting them. The goal is not to have these files in Azure unless they are encrypted.

This could be achieved in a number of ways, but this post is about using Logic Apps to do it.

Pre-Requisites

The following are assumed to be in place already:

  • An accessible FTP server where the source file is hosted
  • An accessible Storage Account where the encrypted file will be written
  • An accessible Key Vault containing an encryption key
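
If these aren't already in place, a rough Az PowerShell sketch of creating the Key Vault, key and Storage Account might look like the following (the resource group name, storage account name and location are placeholders; the FTP server is assumed to already exist on-premises):

# Placeholder names - adjust for your environment
$rg       = 'my-rg'
$location = 'northeurope'

# Target Storage Account for the encrypted file
New-AzStorageAccount -ResourceGroupName $rg -Name 'mystorageacct' `
    -Location $location -SkuName Standard_LRS -Kind StorageV2

# Key Vault plus a software-protected key to encrypt with
New-AzKeyVault -ResourceGroupName $rg -VaultName 'akv-dev' -Location $location
Add-AzKeyVaultKey -VaultName 'akv-dev' -Name 'akv-dev-testkey' -Destination Software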

Create Logic App

I’m going to use Azure Portal to do this.

First, type "Logic App" in the search bar on the dashboard/home screen and click on Logic App to bring up the Logic Apps screen.

You can see I have already had a go at this, but we'll create a new Logic App for the purposes of this post. Hit Add and fill out the details for a new Logic App:

I’ve used an existing resource group but you could create one if needed. I called the Logic App “encrypt-ftp-to-storageaccount”. Hit Review + Create and it will validate before offering the create screen:

Hit Create and the deployment will execute until completion:

Now hit Go to resource and it will bring up the initial screen for the new Logic App…

Scroll down and click on “Blank Logic App”:

Notice that because I’ve already been playing in this area my Recent selections include some of the components we’re going to use here.

We need some kind of simple trigger to kick this off, so let's just use a Schedule with a once-a-day recurrence. Click on Schedule and then Recurrence, and set the Interval to 1 and the Frequency to Day:

Now add a new step – we want to acquire the file from the FTP server.

Click on New Step.

Click on FTP if it is in your recent list, or search for it. Then choose Get File Content (or Get File Content With Path if you need to specify a path). You then get the action in the designer with a number of fields to fill out:

Give the connection a name (e.g. "ftp source") and set the FTP server address, username, password and port number. I've not tried changing any of the other settings, but some of them may be worth reviewing for your own environment. Click Create to create this action.

Now you can select a file – this could be made dynamic, but for this exercise I'm just going to point it at a fixed file (robots.txt):

Now add another step by clicking “+ New Step”:

Choose Azure Key Vault from the Recent (if available) or by searching and then select “Encrypt data with key”. This creates the step:

Depending on whether you are already connected, it may try to use an existing connection or ask you to specify the connection to the required Key Vault. In my case above I already have a connection, but I'll change it to show what needs to be set. Hitting Change Connection brings up this:

Choose Add New and it shows:

Set the Vault name to the name of the Key Vault where the encryption key is held (in my case akv-dev) and click Sign In:

Azure brings up the usual credentials access dialog to allow you to connect.

Once connected you get the dialog box to select the Key:

I choose my Key (akv-dev-testkey) and set the Raw Data to the Dynamic Content value of File Content.
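
Under the hood, the "Encrypt data with key" action calls the Key Vault encrypt operation for that key. Purely as an illustration of what the connector is doing (not its actual implementation), a roughly equivalent call from Az PowerShell against akv-dev/akv-dev-testkey might look like this:

# Illustration only: the Key Vault "encrypt" operation via its REST API.
# Assumes you are signed in with Connect-AzAccount and have access to the key.
$vault = 'akv-dev'
$key   = 'akv-dev-testkey'

# Base64url-encode the plaintext (here just a stand-in for the file content)
$plainBytes = [Text.Encoding]::UTF8.GetBytes('file content goes here')
$plainB64u  = [Convert]::ToBase64String($plainBytes).TrimEnd('=').Replace('+','-').Replace('/','_')

# Get a token for Key Vault and call the encrypt endpoint (latest key version)
$token = (Get-AzAccessToken -ResourceUrl 'https://vault.azure.net').Token
$body  = @{ alg = 'RSA-OAEP'; value = $plainB64u } | ConvertTo-Json
$resp  = Invoke-RestMethod -Method Post `
    -Uri "https://$vault.vault.azure.net/keys/$key/encrypt?api-version=7.0" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' -Body $body

# $resp.value is the base64url-encoded ciphertext - analogous to the
# "encryptedData" output the Logic App passes to the next step
$resp.value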

Now click on “+ New Step” again to add the write out of the data to the Storage Account.

Choose Azure Blob Storage and Create Blob:

I set the Connection Name and choose the Storage Account (oramossadls2) and then hit Create.

This creates the Create Blob step. We can specify the folder path on the target Storage Account where we want to create the file and the target file name (robots.txt). We then want to set the Blob Content to the Dynamic Content of the encrypted data, but notice that the Dynamic Content list doesn't show it. It does, however, show the message "We can't find any outputs to match this input format. Select See more to see all outputs from previous actions":

Click on "See more" and it will show "encryptedData" as an option:

Choose “encryptedData” so that the Create blob dialog looks like:

Save the Logic App and Run it.

The Logic App runs and the output looks like this:

If we look in the Storage Account we can see the file robots.txt:

And if I open the file in an editor it looks like this:

Hope this helps.

Configure Linked Templates Use for Azure Data Factory with Azure DevOps Deployment

I’ve finally worked out how to do this so I thought I’d write a post on it since I can’t find any single resource that accurately covers it all – my apologies, in advance, if someone has already done this.

I've been using Azure Data Factory v2 for quite a while now and have it integrated with Azure DevOps for CI/CD between environments. I follow the standard approach, which is documented here, so I won't repeat it.

I'll assume that you have a Git-enabled source Data Factory and a non-Git-enabled target Data Factory, and that your main code branch is "master" and the publish branch is "adf_publish".

As the documentation there says:

“If you’ve configured Git, the linked templates are generated and saved alongside the full Resource Manager templates in the adf_publish branch in a new folder called linkedTemplates”

…that happens when you publish the master branch from Azure Data Factory.

We can see the non-linked template files (ARMTemplateForFactory.json, ARMTemplateParametersForFactory.json) and the linked template files (ArmTemplate_master.json, ArmTemplateParameters_master.json and ArmTemplate_0.json) in the picture below:

Note – this is a very small demonstration factory and there is only one linked template file (ArmTemplate_0.json) – as the factory grows in size additional, consecutively numbered, files will appear.

The question then is: how do you get Azure DevOps to use those linked template files instead of the non-linked ones sitting in the root directory of the adf_publish branch?

Supposedly you can just follow this link, but unfortunately that document is a little out of date and no longer being updated. It covers the deployment of a VNET with a Network Security Group and makes no mention of Azure Data Factory, but it still provides some useful pointers.

The document correctly points out that in order for the linked templates to be deployed they need to be accessible to Azure Resource Manager and the easiest way of doing that is by having the files in an Azure Storage Account – that article illustrates the use of “Storage (general purpose v1)” but I used a Gen 2 ADLS Storage Account instead and that worked fine.

My Gen 2 Storage Account “oramossadls2” looks like this:


I then created a Shared Access Signature for the oramossadls2 Storage Account and copied the SAS token, which I then put into a secret called StorageSASToken in an Azure Key Vault called akv-dev:

I created an Access Policy on this Key Vault to allow the Azure DevOps Service Principal to be able to read the Secret:

On my Gen 2 ADLS Storage Account I then created a Container called demo:

From the container properties the URL looks like:

https://oramossadls2.blob.core.windows.net/demo 

In Azure Storage Explorer, I grant access to the container to the Service Principal of my Azure DevOps site in order that it can access the files:
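
For reference, a rough Az PowerShell equivalent of the manual setup above might look like this (the resource group name, SAS permissions, expiry and the DevOps service principal id are placeholders to adjust for your own environment):

# Context for the oramossadls2 Storage Account
$ctx = (Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'oramossadls2').Context

# Container to hold the ARM/linked template files
New-AzStorageContainer -Name 'demo' -Context $ctx

# Account-level SAS token covering the Blob service
$sas = New-AzStorageAccountSASToken -Context $ctx -Service Blob `
    -ResourceType Service,Container,Object -Permission 'rl' `
    -ExpiryTime (Get-Date).AddMonths(6)

# Store the SAS token as the StorageSASToken secret in akv-dev
Set-AzKeyVaultSecret -VaultName 'akv-dev' -Name 'StorageSASToken' `
    -SecretValue (ConvertTo-SecureString $sas -AsPlainText -Force)

# Allow the Azure DevOps service principal to read secrets from the vault
Set-AzKeyVaultAccessPolicy -VaultName 'akv-dev' `
    -ServicePrincipalName '<devops-service-principal-app-id>' `
    -PermissionsToSecrets get,list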

In Azure DevOps I then created a Variable Group called akv-dev which brings in the StorageSASToken Secret from the akv-dev Azure Key Vault:

I created a second Variable Group called “Production-Static” in which I created some more variables for use later on:

Following this article from Kamil Nowinski I have a Build Pipeline “ADF-CI” in Azure DevOps which stores the ARM template files as artifacts ready for use on a Release.

Now for the bit that took me a while to work out…the Release Pipeline.

My release pipeline has two artifacts – the latest Pipeline Build (_ADF-CI) and the code from the "master" branch – and a single stage with five tasks:

The Production-Static and akv-dev Variable Groups are attached to the Pipeline:

The five tasks are:

The first step copies the files from the latest Build, attached as an artifact (_ADF-CI) to this Pipeline, to the ADLS Gen2 Storage Account that I created earlier:

The value of StorageAccountName from the “Production-Static” Variable Group is “oramossadls2” and blobContainerName is “demo”.

The YAML for step 1 is:

steps:
- task: AzureFileCopy@3
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_ADF-CI'
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    Destination: AzureBlob
    storage: '$(StorageAccountName)'
    ContainerName: '$(blobContainerName)'
    BlobPrefix: adf

When this step eventually runs the Storage Account will look like this:


The second step stops the ADF triggers – if you don't stop active triggers, the deployment can fail. I use a PowerShell script to do this:

The YAML for step 2 looks like this:

steps:
- task: AzurePowerShell@4
  displayName: 'Stop ADF Triggers'
  inputs:
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_ADF/powershell/SetADFTriggersState.ps1'
    ScriptArguments: '-DataFactoryName $(DataFactoryName) -DataFactoryResourceGroupName $(DataFactoryResourceGroupName) -State "Stop" -ReleaseIdentifier $(Release.Artifacts._ADF.BuildId)'
    azurePowerShellVersion: LatestVersion
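
The contents of SetADFTriggersState.ps1 aren't shown in this post, but a minimal sketch of the idea, using the Az.DataFactory cmdlets and with the "StartPriorEnabled" bookkeeping simplified, might look like this:

# Minimal sketch of a SetADFTriggersState.ps1-style script. The real script
# presumably also records which triggers were running before the deployment
# (the "StartPriorEnabled" behaviour) - simplified here.
param(
    [string]$DataFactoryName,
    [string]$DataFactoryResourceGroupName,
    [ValidateSet('Stop','StartPriorEnabled')][string]$State,
    [string]$ReleaseIdentifier
)

$triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $DataFactoryResourceGroupName `
                                       -DataFactoryName $DataFactoryName

foreach ($trigger in $triggers) {
    if ($State -eq 'Stop') {
        # Stop anything currently started so the ARM deployment doesn't fail
        if ($trigger.RuntimeState -eq 'Started') {
            Stop-AzDataFactoryV2Trigger -ResourceGroupName $DataFactoryResourceGroupName `
                -DataFactoryName $DataFactoryName -Name $trigger.Name -Force
        }
    }
    else {
        # Restart triggers after deployment (the real script would restart only
        # those that were enabled beforehand)
        Start-AzDataFactoryV2Trigger -ResourceGroupName $DataFactoryResourceGroupName `
            -DataFactoryName $DataFactoryName -Name $trigger.Name -Force
    }
}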

The third step actually deploys the template to the target environment (in this case oramoss-prod Data Factory resource group “adf-prod-rg”):

It took a while to work this out. First I tried using a Template Location of "Linked Artifact", which allows you to just select the template/parameter file from the attached artifacts, but that doesn't work because "nested templates ALWAYS have to be deployed from url", according to this. So we have to set Template Location to "URL of the file". We then have to specify the template and template parameter file links as URLs consisting of the Primary Blob Service Endpoint, the container, the Blob Prefix, the folder hierarchy, the filename and the Storage SAS token, i.e.

Template File:

https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplate_master.json$(StorageSASToken)

Parameters File:

https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplateParameters_master.json$(StorageSASToken)

Note – we are getting the Storage SAS Token from the attached Variable group akv-dev.

We then have to override some parameters:

-factoryName "oramoss-prod" -containerUri $(AzureBlobStorageURL)/$(blobContainerName)/adf/drop/linkedTemplates -containerSasToken $(StorageSASToken)

The factoryName needs to be overridden because we are moving the code from one Data Factory to the next.

In this article, it suggests that the parameter for the URI of the template files is called "templateBaseUrl", but this appears to have changed to "containerUri"; we set it to the Primary Blob Service Endpoint, container name and directory where the template files are held.

The article also suggests the Storage SAS token parameter is called "SASToken", but it now appears to be "containerSasToken".

The YAML for step 3 looks like:

steps:
- task: AzureResourceManagerTemplateDeployment@3
  displayName: 'Deploy ADF ARM Template'
  inputs:
    azureResourceManagerConnection: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    subscriptionId: 'xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
    resourceGroupName: 'adf-prod-rg'
    location: 'North Europe'
    templateLocation: 'URL of the file'
    csmFileLink: 'https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplate_master.json$(StorageSASToken)'
    csmParametersFileLink: 'https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplateParameters_master.json$(StorageSASToken)'
    overrideParameters: '-factoryName "oramoss-prod" -containerUri $(AzureBlobStorageURL)/$(blobContainerName)/adf/drop/linkedTemplates -containerSasToken $(StorageSASToken)'
    deploymentName: 'oramoss-prod-deploy'

The fourth step removes orphaned resources. Because we push the ARM template to oramoss-prod incrementally, an element dropped from the oramoss-dev Data Factory would not automatically be removed from the oramoss-prod Data Factory, i.e. the element would be orphaned in oramoss-prod. This step removes any such elements it finds using a PowerShell script.

The YAML for step 4 looks like:

steps:
- task: AzurePowerShell@4
  displayName: 'Remove Orphans'
  inputs:
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_ADF/powershell/RemoveOrphanedADFResources.ps1'
    ScriptArguments: '-DataFactoryName $(DataFactoryName) -DataFactoryResourceGroupName $(DataFactoryResourceGroupName) -armTemplate $(System.DefaultWorkingDirectory)/_ADF-CI/drop/ARMTemplateForFactory.json'
    azurePowerShellVersion: LatestVersion
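
Again, RemoveOrphanedADFResources.ps1 itself isn't reproduced here, but the general idea can be sketched as below (pipelines only, for brevity; the real script would apply the same pattern to datasets, linked services, triggers and data flows):

# Minimal sketch: remove ADF pipelines that are no longer in the ARM template
param(
    [string]$DataFactoryName,
    [string]$DataFactoryResourceGroupName,
    [string]$armTemplate
)

# Pipeline names defined in the template (resource names look like
# "[concat(parameters('factoryName'), '/MyPipeline')]")
$template = Get-Content -Raw -Path $armTemplate | ConvertFrom-Json
$templatePipelines = $template.resources |
    Where-Object { $_.type -like '*pipelines' } |
    ForEach-Object { ($_.name -split "'/")[-1].TrimEnd("')]") }

# Anything deployed in the target factory but absent from the template is orphaned
Get-AzDataFactoryV2Pipeline -ResourceGroupName $DataFactoryResourceGroupName `
                            -DataFactoryName $DataFactoryName |
    Where-Object { $templatePipelines -notcontains $_.Name } |
    ForEach-Object {
        Remove-AzDataFactoryV2Pipeline -ResourceGroupName $DataFactoryResourceGroupName `
            -DataFactoryName $DataFactoryName -Name $_.Name -Force
    }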

The last step runs the same script as the second step but with State set to “StartPriorEnabled” instead of “Stop”.

The YAML for step 5 looks like:

steps:
- task: AzurePowerShell@4
  displayName: 'Start ADF Triggers'
  inputs:
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_ADF/powershell/SetADFTriggersState.ps1'
    ScriptArguments: '-DataFactoryName $(DataFactoryName) -DataFactoryResourceGroupName $(DataFactoryResourceGroupName) -State "StartPriorEnabled" -ReleaseIdentifier $(Release.Artifacts._ADF.BuildId)'
    azurePowerShellVersion: LatestVersion

That's it. Save the Release Pipeline and run it, and the output should look similar to this:
