Chaminda's DevOps Journey with MSFT

Controlling Pull Request Source Branches for Given Target Branch in Azure Git Repos

Generally, you can create a pull request from any given source branch to any given target branch in Azure Git repos. You can set up a pull request validation build which evaluates the repo state by performing a pre-merge in the build server, ensuring the repo still builds after the pull request is merged to the target branch. Branch policies can then be set so that the pull request cannot be merged to a given target branch if the validation build fails or if the approvers do not approve it. However, you might want to follow a branching strategy which requires you to control from which branches a pull request can be created and merged to a given target branch. The purpose of controlling the pull request source and target is to assure your team is adhering to a defined branching and merge strategy. Let’s look at using a task in the pull request validation build to ensure the request is coming from an allowed/valid branch, and to fail the build if it is coming from an invalid branch.


Let’s take the example branching strategy shown below and assume pull requests (PRs) should only be made following the rules specified below.

  • Any feature branch can only create a PR targeting the dev branch.
  • Any branch can be merged with a PR coming from a hotfix branch.
  • The qa branch can be merged with a PR coming from the dev branch or any hotfix branch only.
  • The master branch can be merged with a PR coming from the qa branch, any hotfix branch, or any release branch only.
  • Any release branch can be merged with a PR from any hotfix branch.

Let’s implement a PR source branch validation mechanism for each target branch, using the pull request validation build which is used in the target branch policies.
For the above conditions we can define the branch validation pattern as a PowerShell hashtable as shown below.
@{"dev" = "feature/*;hotfix/*" ; "qa" = "dev;hotfix/*" ; "master" = "qa;hotfix/*;release/*" ; "release/*" = "hotfix/*"}
Each hashtable entry has the target branch defined to the left of the = mark, and the right side of the = mark contains the possible source branch patterns, each separated by a semicolon. For example, "dev" = "feature/*;hotfix/*" defines feature/*;hotfix/* (any feature or any hotfix) branch as the allowed sources for the dev branch.
You can use the PowerShell script made available here in a PowerShell task and create a task group so that it can be used in any PR validation build. Let’s understand the actions performed in the script.
The script performs the steps described below.
Read the pull request source and target branch details from the build variables and print them in the build log. Read the branch control pattern variable content as a hashtable.

Strip the refs/heads portion from the source and target branches of the PR and create source and target branch filters.

Append /* to the source and target branch filters where required. For example, feature/somefeature becomes feature/*. Print the branch filters in the build log.


Retrieve the possible source branch patterns for the PR’s target branch from the branch control patterns, and print the possible source branch pattern in the build log.

Output an error and fail the build if no possible source branch pattern is found for the PR’s target branch. If the possible source branch pattern contains the current source branch of the PR, allow the build to proceed, as the PR is valid considering its source and target branches. If the source branch is not available in the possible branch patterns, fail the build, preventing the PR from merging to the target branch.
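Below is a minimal sketch of such a validation script, assuming the predefined System.PullRequest source/target branch build variables and a build variable named BranchControlPattern holding the hashtable; the actual script linked above may differ in detail.

# Minimal sketch of the PR source branch validation (variable names are assumptions).
$source = $env:SYSTEM_PULLREQUEST_SOURCEBRANCH -replace '^refs/heads/', ''
$target = $env:SYSTEM_PULLREQUEST_TARGETBRANCH -replace '^refs/heads/', ''
Write-Host "PR source: $source, target: $target"

# Read the branch control pattern build variable content as a hashtable.
$patterns = Invoke-Expression $env:BRANCHCONTROLPATTERN

# Generalize multi-segment branch names, e.g. feature/somefeature -> feature/*
$sourceFilter = if ($source.Contains('/')) { ($source -split '/')[0] + '/*' } else { $source }
$targetFilter = if ($target.Contains('/')) { ($target -split '/')[0] + '/*' } else { $target }

# Look up the allowed source patterns for the PR's target branch.
$allowed = $patterns[$target]
if (-not $allowed) { $allowed = $patterns[$targetFilter] }
if (-not $allowed) {
    Write-Host "##vso[task.logissue type=error]No source branch pattern found for target '$target'."
    exit 1
}

# Pass only if the source branch filter is in the allowed pattern list.
if (($allowed -split ';') -contains $sourceFilter) {
    Write-Host "PR from '$source' to '$target' adheres to the branching strategy."
} else {
    Write-Host "##vso[task.logissue type=error]PR from '$source' to '$target' is not allowed."
    exit 1
}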


In a build definition the above described script can be added as a task (you can create a task group and use it in multiple build definitions). Make sure to add the custom control condition below so that this step only gets executed in pull request validations.
and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))


Add a variable named BranchControlPattern to the build definition, or via a variable group, with the desired pattern hashtable.


Set the build definition as a pull request validation build for each of the branches using branch policy settings.


Once you make a pull request to a branch from an invalid source, the validation step will prevent the build from passing, blocking the pull request from merging. For example, an attempt of a pull request from a feature branch to qa is blocked as below.


A pull request from a feature branch to the dev branch is allowed to be merged as it adheres to the branch pattern.


Fixing Azure DevOps Xamarin Build error XA5300: The Java SDK Directory could not be found

When you try to build Xamarin Android projects with the Azure Pipelines hosted agent Windows 2019 with VS2019, you might encounter the error “XA5300: The Java SDK Directory could not be found”, if you are using a Visual Studio build task or MSBuild task to build a solution which includes Android projects. This is caused by the inability to find the Java SDK on the hosted agent. Let’s see how we can get this issue fixed.
C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Xamarin\Android\Xamarin.Android.Common.targets(721,2): error XA5300: The Java SDK Directory could not be found. Please set via /p:JavaSdkDirectory. [d:\a\1\s\Host\xxxxxxxxxxxxxxxxxxxxxxxxxxx]

"d:\a\1\s\xxxxxxxxxx.sln" (default target) (1) ->
"d:\a\1\s\Host\xxxxxxxxxxxxx.Android.csproj" (default target) (14) ->
   C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Xamarin\Android\Xamarin.Android.Common.targets(721,2): error XA5300: The Java SDK Directory could not be found. Please set via /p:JavaSdkDirectory.

To fix the issue, pass /p:JavaSdkDirectory="$(JAVA_HOME_8_X64)" in the MSBuild arguments of the Visual Studio build task.
This allows MSBuild to find the Java SDK and successfully build your Android project.
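If your build is defined in YAML, the argument can be passed as in the sketch below; the VSBuild task usage is an assumption, and the solution path is a placeholder.

# Sketch: passing the Java SDK directory to the Visual Studio build task.
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    msbuildArgs: '/p:JavaSdkDirectory="$(JAVA_HOME_8_X64)"'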

Resolving 'NoneType' object has no attribute 'azure_services' While Setting a Network Rule to Azure Key Vault

As infrastructure automation is a vital aspect of DevOps, you might want to set up an Azure Key Vault to use a subnet in a vnet programmatically. For this you can use the Azure Command Line Interface (CLI), and you may run into the error 'NoneType' object has no attribute 'azure_services'. Let’s have a look at how to get this issue resolved.

We can create an Azure Key Vault using the CLI command below.
az keyvault create -n keyvaultName -g resourceGroupName -l azureDataCenterRegion `
             --sku keyvaultSKU
Assuming you already have a resource group, and a vnet and subnet set up in the resource group, we can follow the step below to add the network rule.
az keyvault network-rule add -n keyvaultName -g resourceGroupName --subnet vnetDefaultSubnetName --vnet-name vnetName
However, you may run into the issue below if you have not set the Key Vault to use selected networks in the Firewall and Network rules.
ERROR: 'NoneType' object has no attribute 'azure_services'
This GitHub issue discusses the error in detail.
If you are using the AzureRM PowerShell module, however, this issue does not occur. While creating the Key Vault or updating it with the CLI, you can set the two parameters --default-action to Deny and --bypass to an allowed value (AzureServices or None), to set the Key Vault to Selected networks in its firewall and network rules.
az keyvault update -n keyvaultName -g resourceGroupName `
             --default-action Deny --bypass AzureServices
After this, the network rule gets applied to the Key Vault successfully.

Customizing Columns in Your Azure DevOps Sprint/Task Boards

The Kanban flow of Azure Boards for user stories, features, and other top-level backlog boards has been available in Azure DevOps for a long time. However, it was not possible to change the sprint board columns in Azure DevOps without introducing new states to the Task work item, or to any other work item using the sprint board. This long-awaited community request has now been completed and is available in Azure DevOps Services. The Azure DevOps on-premises server does not have this feature yet; we can hope it will be added eventually. Let’s look at how we can customize the columns in the sprint boards with this new feature.
In your sprint boards you can now find a Column Options button.


Once you click Column Options a side pane will open allowing you to customize the columns of the Sprint/Task board.

1 – You can click the + Add Column button to add a new column.
2 – You can drag and drop to reorder columns, except the first and the last columns.
3 – You can remove a column, except the first and the last, using the delete button of each column.
4 – You can change the name of a column. It is possible to change the name of even the first and last columns.
5 – You can select an available state for a column. It is not possible to change the state of the first or the last column.
6 – The Save button can be clicked to apply the changes to the sprint/task board.

Build .NET Core 3.0 Projects with Azure DevOps Pipeline Using Hosted Agents

.NET Core 3.0 is still in preview 7, and some of you may have already started developing projects with it. Implementing CI/CD is important to any project, regardless of whether you are using bleeding edge technology or not. It is possible to set up your own build server with the required preview components and set up builds. However, if you can utilize the hosted agents in Azure DevOps Services, that removes the burden of maintaining your own build server. Hence, let’s look at the steps required to build a .NET Core 3.0 application with hosted Windows 2019 agents.
1. The first step is setting up a global.json file with the content below, in the same folder where the Visual Studio solution file is located in your repo. It will allow MSBuild to use the preview .NET Core version in the build step.
{
  "sdk": {
    "version": "3.0.100-preview"
  }
}
2. If you want to dynamically create this file in the build pipeline (to prevent this temporarily needed file from being added to the repo), you can use a PowerShell task with an inline script in your build, such as below.
$globaljson = '{"sdk": {"version": "3.0.100-preview"}}';
$globaljson | out-file $(globaljsonPath) -Encoding UTF8

3. Then you need to add the Use .NET Core step to the build, where you can define which version of the .NET Core preview should be downloaded to the build server. The version number specifications can be found here.
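In a YAML pipeline the equivalent step would look something like the sketch below; the version values are examples and should follow the specification page mentioned above.

# Sketch: acquire a .NET Core 3.0 preview SDK on the agent (version values are examples).
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '3.0.x'
    includePreviewVersions: true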
Then you can use the Visual Studio build step to build your solution containing the .NET Core 3.0 application.
Using the dotnet build task is possible as well, and it lets you skip steps 1 and 2 of adding a global.json. But if you want to build the entire solution with the Visual Studio build step, you have to follow the steps described above.

Creating an Azure Web App Supporting .NET Core 3 – IaC with Azure Pipelines


We have discussed how to build a .NET Core 3 web application in the previous post. In order to deploy a .NET Core 3.0 web app to an Azure Web App, you need to install the .NET Core 3.0 extension to the Azure Web App. You can easily add .NET Core 3.0 to a web app via the Azure portal. However, if you are really into automating your infrastructure as code (IaC), you may want all these steps automated and executed via a deployment pipeline. Let’s look at a script which uses Azure CLI and Azure PowerShell to fully automate the creation of a .NET Core 3.0 enabled Azure Web App.
You can download the full script from here. Let’s understand each part of the script.


The Parameters


Since this script uses Azure CLI and AzureRM PowerShell, you need to supply Azure service principal details to it. Let’s discuss how to create this service principal later in this post. Parameters starting with $azure refer to service principal details. The script begins by creating a resource group in Azure in a given datacenter region; because of this, the service principal you create must have the contributor role at subscription level. For the $core3Extention parameter you can supply either AspNetCoreRuntime.3.0.x64 or AspNetCoreRuntime.3.0.x86 depending on your application’s needs. If you are using the free pricing tier of Azure Web App, you can only use AspNetCoreRuntime.3.0.x86 with Core 3.0 and Blazor.
A try-catch block is used to catch any exception, output the error messages, and fail the pipeline in case of an error.
Login


The first step is logging in to Azure using the service principal.
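A sketch of this step is below; the parameter names are assumptions, and the downloadable script may name them differently.

# Sketch: sign in to Azure with the service principal (parameter names are assumptions).
az login --service-principal -u $azureApplicationId -p $azureApplicationSecret --tenant $azureTenantId
az account set --subscription $azureSubscriptionId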

Resource Group


The script checks if the resource group exists in the subscription and creates one if it does not.
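Something like the following, as a sketch with assumed variable names:

# Sketch: create the resource group only if it does not already exist.
if ((az group exists -n $resourceGroupName) -eq 'false') {
    az group create -n $resourceGroupName -l $region | Out-Null
}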

App Service Plan


Next, the script checks if the app service plan exists. If it exists, the pricing tier can be updated; if not, it creates an app service plan in the resource group.
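A sketch of this check-and-create logic, with assumed variable names:

# Sketch: create or update the app service plan.
$plan = az appservice plan show -n $appServicePlanName -g $resourceGroupName | ConvertFrom-Json
if (-not $plan) {
    az appservice plan create -n $appServicePlanName -g $resourceGroupName --sku $pricingTier
} else {
    az appservice plan update -n $appServicePlanName -g $resourceGroupName --sku $pricingTier
}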

Web App


As the next step, a web app with the given name is created if it does not already exist, and the pricing plan is assigned to the web app.
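For example, as a sketch with placeholder names:

# Sketch: create the web app on the plan if it is not already there.
$app = az webapp show -n $webAppName -g $resourceGroupName | ConvertFrom-Json
if (-not $app) {
    az webapp create -n $webAppName -g $resourceGroupName -p $appServicePlanName
}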

.NET Core 3.0 Extension


The script then checks whether the .NET Core 3.0 extension is added to the site, and adds it if it is not already there.
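One way to add a site extension from the CLI is via the generic ARM resource command, sketched below; this approach is an assumption, and the actual script may use Azure PowerShell for this step instead.

# Sketch: install the ASP.NET Core 3.0 site extension via the ARM resource API.
az resource create -g $resourceGroupName --resource-type 'Microsoft.Web/sites/siteextensions' `
    --name $core3Extention --parent "sites/$webAppName" --properties '{}'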

Slots


If you supply the slot names in an array, the script will create the slots for the Azure Web App and add the .NET Core 3.0 extension to each slot. Slots are optional, and you can skip this parameter in the script. If you are using slots, the pricing plan SKU should support slots.

Exceptions


Any exception encountered during script execution is rethrown so the pipeline fails.

Using in Pipeline
To use this script in Azure Pipelines, you need to create a service principal in Azure. You can execute the Azure CLI command below to create a service principal with the contributor role for the Azure subscription.
az ad sp create-for-rbac -n InfraDeploySpn --role contributor --scopes /subscriptions/subscriptionid
Execute the command in Azure Cloud Shell and copy the service principal details to Azure Pipelines variables.


Then you can use a PowerShell task like below to execute the script to create the Azure Web App with .NET Core 3.0 support.

Once executed, you can see the web app gets added with the Core 3.0 extension.

Setting Up Az Module to Write IaC for Azure

Microsoft has deprecated the AzureRM PowerShell module and introduced the new Az PowerShell module for Azure. Bug fixes and support will be available for the AzureRM module until December 2020. However, it is better to start migrating any AzureRM-based PowerShell scripts which you have written to achieve Infrastructure as Code (IaC) for Azure. For this purpose, you can set up the Az module on your machine while allowing AzureRM commands to still work using aliases. Let’s look at the steps of setting up the Azure Az module on a machine where AzureRM is already set up.
As the first step, we need to check the version of AzureRM installed on the machine. For this, execute the command below.
Get-InstalledModule -Name AzureRM -AllVersions

Then check if AzureRM was installed via MSI by looking for it in Settings > Apps in Windows 10. On Windows 8 or below, look in Control Panel > Uninstall a program.


If the AzureRM MSI is found installed, uninstall it via Uninstall programs, or from Settings > Apps in Windows 10. Below are the locations for each Windows version to look for the AzureRM MSI.


· Windows 10: Start > Settings > Apps
· Windows 7 and Windows 8: Start > Control Panel > Programs > Uninstall a program


As shown in the above image, if the AzureRM MSI is not found, the assumption is that you used Install-Module -Name AzureRM to install the AzureRM module. In that case, you can uninstall the AzureRM module by executing the command below.
Uninstall-Module -Name AzureRM -AllVersions -Force


Before installing the Az module, check the version of PowerShell on your machine; it should be 5.1 or higher. To check the version of PowerShell, execute the command below. If the PowerShell version is not 5.1 or higher, update it following the instructions here.
$PSVersionTable.PSVersion
Then check if .NET Framework 4.7.2 or higher is available on your machine. To determine the version, use the instructions here, or run the command below, which returns True when 4.7.2 or later is present. If .NET 4.7.2 or later is not available on your machine, install it.

(Get-ItemPropertyValue -Path 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -Name Release) -ge 461808


To install the Az Module execute the below command.
Install-Module -Name Az -AllowClobber -Scope CurrentUser
Or scope it to all users using the command below. However, it is recommended to use CurrentUser, as other users of the machine may not prefer to use the Az module yet.
Install-Module -Name Az -AllowClobber -Scope AllUsers


You may get an untrusted repository prompt:
Untrusted repository
You are installing the modules from an untrusted repository. If you trust this repository, change
its InstallationPolicy value by running the Set-PSRepository cmdlet.
Are you sure you want to install the modules from 'PSGallery'?
[Y] Yes [A] Yes to All [N] No [L] No to All [S] Suspend [?] Help (default is "N"):


Select Yes to All and continue. Then close all PowerShell windows and open a new PowerShell window. We can enable AzureRM commands to be used with the Az module, preventing AzureRM-based scripts from failing. For this, run the command below. If you have not closed all PowerShell windows and opened a new one, the command below will complain that both AzureRM and Az modules are detected.
Enable-AzureRmAlias -Scope CurrentUser

Or to enable it for local machine.
Enable-AzureRmAlias -Scope LocalMachine

Available scopes are Local, Process, CurrentUser and LocalMachine.
However, if you run the command below, you will still see some AzureRM modules on the machine.
Get-InstalledModule


To remove all AzureRM-based modules, execute the Az command below.
Uninstall-AzureRM

Now you can see all AzureRM modules are uninstalled.

But you can still use AzureRM commands in your scripts, since Enable-AzureRmAlias was executed previously.

Once you migrate all your scripts to use the new commands in the Az module, you can disable the AzureRM alias commands by executing the command below. Available scopes are Local, Process, CurrentUser and LocalMachine.
Disable-AzureRmAlias -scope currentuser

Using VS Code Extension for Azure Pipeline–Part1

Visual Studio Code is becoming a popular code editing tool on all platforms. Recently, VS Code gained an Azure Pipelines extension which can generate YAML-based pipeline code. The extension assists in writing YAML pipeline code by providing IntelliSense. Let’s look at how we can get started with the Azure Pipelines extension to generate a YAML-based pipeline.
Getting the GitHub Repo Ready
As the first step, create a public repo in GitHub (you can do this with private repos as well).

Then we need to create a personal access token to access this public repository. For that, go to your account settings.

In the settings click Developer to get the developer settings.

In the developer settings, click Personal Access Tokens and generate a token.

Set the scope to public repos (since we have created a public repo) and generate the token.

Copy and keep the token in a safe place.

Get the VS Code Ready with Azure Pipelines Extension
Open VS Code and click on the Extensions icon, or press Ctrl+Shift+X. Then search for Azure Pipelines and install the extension.

Now you have to clone the GitHub repository. Copy the clone URL from GitHub and press Ctrl+Shift+P to launch the command palette of VS Code. Then type Git: Clone and provide the clone URL of the GitHub repo. Clone the GitHub repo to a local folder.
Generating Code
Install the .NET Core SDK 2.2 on your machine if you have not already done so. You can download the .NET Core SDK from https://dotnet.microsoft.com/download.
Open the terminal in VS Code and type dotnet new webapp to create an ASP.NET Core web application.

Then commit and push the code to GitHub using VS Code.
Creating Azure Web App
In Azure portal create a new resource group and create a new web app in the resource group with .NET Core 2.2 as the runtime.
Generating the YAML Pipeline with Azure Pipelines Extension
In the file explorer of VS Code, you can right-click and select Configure Pipeline to start generating the YAML pipeline.

Or you can press Ctrl+Shift+P to launch the command palette. Then search for Azure Pipelines, select the command Azure Pipelines: Configure Pipeline, and press Enter.

You have to select the template for pipeline as the next step.

Once you start configuring the pipeline, you will be prompted to log in to your Azure account in VS Code.

Sign in with your Azure account credentials. Then you will be prompted for GitHub authorization. Provide the personal access token we generated earlier with GitHub and press Enter.

Then you will be prompted to select an Azure subscription from your account. Select the subscription where you have created the new web application.

Next you have to select the web app.

Then the generator will create an Azure DevOps account.

Then it will connect your Azure subscription with the pipeline.

The YAML file for pipeline will be created in the repo and you will be requested to commit and push it.

It is a basic YAML template which archives the project content and deploys it to the Azure web app.
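The generated file is roughly along the lines of the sketch below; the exact tasks and inputs may differ by extension version, and the service connection and app name placeholders are assumptions.

# Rough sketch of the generated pipeline (task inputs are assumptions).
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: '$(Build.SourcesDirectory)'
      includeRootFolder: false
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
  - task: AzureWebApp@1
    inputs:
      azureSubscription: '<service-connection-name>'
      appName: '<web-app-name>'
      package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'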

Let’s improve this YAML in the next post. For now, just commit and push, which will initiate the configuration of the pipeline in Azure DevOps and then execute it to deploy.

Once completed, you can see a notification in VS Code which lets you browse the pipeline.

The pipeline is executed, and the Azure web app is deployed with the contents of the project.

However, if you check the web app, you will find it is not showing the .NET Core sample web app. The generated YAML pipeline has not compiled and published the web app; it has just deployed the web project files as they are. You can inspect the deployed files in Azure Kudu.

In the next post let’s try to explore how to develop the YAML pipeline with VS Code to properly build and deploy the .NET Core web app.

Resolving Azure DevOps Build Error “data at the root level is invalid” in dotnet test

When you try to execute dotnet test with test assemblies in Azure DevOps builds using Visual Studio 2019, you may run into the error “data at the root level is invalid”. Let’s look at how to solve this issue and get the tests executed in Azure DevOps build pipelines.
The error
error MSB4025: The project file could not be loaded. Data at the root level is invalid.


How to resolve
Instead of pointing dotnet test at the test .dll files, use the .csproj files. This makes the test run work.
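For example, with hypothetical project paths:

# Failing: running dotnet test against the compiled test assembly.
dotnet test MyApp.Tests/bin/Debug/netcoreapp2.2/MyApp.Tests.dll

# Working: running dotnet test against the test project file instead.
dotnet test MyApp.Tests/MyApp.Tests.csproj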

The test executes fine after this change.

Import BitBucket Cloud Repo to Azure Git Repo

Azure DevOps, being a comprehensive application lifecycle management tool, provides Azure Git Repos as a distributed version control system which integrates easily with Azure Boards to track requirements, and with Azure Pipelines and Azure Test Plans to implement build and deployment automation as well as test management and test automation. If you are already a BitBucket cloud git repo user, you may want to move your repos in BitBucket cloud to Azure Git Repos to get the advantages it offers, with a totally integrated set of Azure DevOps features. Let’s look at the steps required to import a BitBucket cloud repo with history to Azure Git Repos.


First, go to BitBucket cloud and, in your profile settings, set up an app password.


Provide the repository read permission scope and create the app password with a meaningful label.

Copy and save the app password, as you will not be able to see it once you close the prompt.

Go to the BitBucket repository you want to clone, click Clone, and copy the HTTPS clone URL.

Then go to the Azure DevOps team project and click on Import repository in the Repos section.

In the popup, provide the clone URL of the BitBucket repo and a new name for the target repo. Select Requires authentication and provide the BitBucket username (the clone URL contains it) and the app password you generated. Then click Import.

Import will start and you will see the progress.

Once completed you will see the repo imported with history and all branches.

Rollup Columns in Azure DevOps Boards

Azure Boards in Azure DevOps facilitates project planning and management with agility. The rich feature set and extensibility of Azure Boards, with project-level as well as process template-level customizations, help a lot to manage projects effectively. The new rollup columns feature helps you visualize progress of project work based on various criteria, using the default fields as well as custom fields. Let’s have a quick look at the options available with the new rollup columns feature in Azure Boards.
To add rollup columns in Azure Boards, navigate to the Backlog section and select a backlog level, such as the User Stories backlog. Then click on Column Options.

In the column options pane that opens, click on Add rollup column. You can select a rollup column option from the quick list of items available. There are two types of rollup column quick adds: progress bar and total number. Let’s select Progress by task.

It will take a couple of seconds to minutes, depending on your backlog size, for the rollup column to appear in the backlog. A rollup column based on tasks shows story progress as a percentage of completed tasks.

You can even define custom rollup columns with various options such as shown below.

Try different rollup columns and utilize this newly added feature in Azure Boards to enable enhanced visualization of work progress in your teams.

Resolving “ERROR: There was a conflict. The remote server returned an error: (403) Forbidden.” While Creating Function App in Azure in IaC

You can use Infrastructure as Code (IaC) to create resources in Azure and use that in Azure DevOps pipelines. However, if you are using IaC to create an Azure function where the storage account of the function is added to a virtual network (vnet) in Azure, you may run into the issue ERROR: There was a conflict. The remote server returned an error: (403) Forbidden. To fix this issue, you can set the storage account to allow any network while deploying the function app, and then reapply the restriction on storage access. Let’s see the cause of the error and how to resolve it, as the error message is really misleading.


The issue

When you have a storage account in a vnet in Azure, it does not allow you to create a function app utilizing that storage account via the Azure portal, or via a command line such as the CLI that can be used to write IaC. But the error message is misleading and does not help to understand the real issue. There is no real conflict here; rather, it is the accessibility of the storage account that causes the error.

How to resolve

To resolve the issue, you can simply allow the storage account to be used from all networks, using the Azure CLI command shown below, or go to the portal and set the firewall settings of the storage account to allow all networks.


az storage account update -n 'storageaccountname' -g 'resourcegroupname' --default-action 'Allow'
Once the deployment is completed, you can apply the firewall rules back using the CLI command below.
az storage account update -n 'storageaccountname' -g 'resourcegroupname' --default-action 'Deny'
Since we have not removed the association, the storage account’s link with the relevant vnet remains intact once we reapply the network restrictions.
Add the Azure function to the vnet, in the same subnet which is associated with the storage account. The function will be ready and able to access the storage account without any issues.
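Putting it together, the flow looks roughly like the sketch below; all names are placeholders, and the function app creation arguments depend on your runtime.

# Sketch of the full sequence.
az storage account update -n storageaccountname -g resourcegroupname --default-action Allow
az functionapp create -n functionappname -g resourcegroupname --storage-account storageaccountname --plan appserviceplanname
az storage account update -n storageaccountname -g resourcegroupname --default-action Deny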

Getting Started with Secure DevOps Kit for Azure

Security is an important aspect of any software development project, and DevOps is an essential part of software development. The security of software development, delivery, and deployment can now be enhanced with the Secure DevOps Kit for Azure. This security kit comes with PowerShell utilities as well as CI/CD extensions, etc., for securing your software projects. Let’s explore the Secure DevOps Kit with a couple of posts. In this first post, let’s look at an overview as well as how to get started with Azure subscription health scans using the PowerShell utilities.


Secure DevOps Kit for Azure

The following is a basic-level overview of the Secure DevOps Kit for Azure.

Source: Microsoft Documentation

Subscription Security: A set of tools to enable creating and managing a secure Azure cloud environment.
Secure Development: Using Security IntelliSense and Security Verification Tests (SVTs) to identify vulnerabilities during development.
Security for CI/CD: Using CI/CD extensions for Azure DevOps to ensure the security of the committed and deployed code.
Continuous Assurance: Checking the “drift” from the secure configuration using tools, to ensure your system stays secure in the cloud environment continuously.
Alerting and Monitoring: Using analytics and monitoring tools with DevOps to assure the system is kept at the expected security level.
Cloud Risk Governance: Using telemetry to come up with governance policies for a secure system.

Let’s explore each of these tools in detail in the upcoming posts. As a first step, let us do a basic-level security scan of an Azure subscription using the Secure DevOps Kit for Azure.

Setup Secure DevOps Kit for Azure

There are two prerequisites to setup Secure DevOps Kit for Azure.

1. PowerShell 5.0 or higher
2. Windows OS

To install the Secure DevOps Kit for Azure, execute the command below in an administrative PowerShell window. It is advisable to have the Az module installed as well, even though it is not mandatory.

Install-Module -Name AzSK -Scope CurrentUser -AllowClobber -Force

If you want, you can use AllUsers for the scope instead of CurrentUser. If you have already used Login-AzAccount and logged onto your Azure subscriptions, make sure to execute Logout-AzAccount to ensure the installation proceeds.


After installation, execute the command below to check that the AzSK module is installed.
Get-InstalledModule


Execute Health Scan of an Azure Subscription

To scan the health of your Azure subscription, execute the command below, providing your Azure subscription id.

Get-AzSKSubscriptionSecurityStatus -SubscriptionId yourazuresubscriptionid

However, you might run into the issue below when executing the command.

InvalidArgument: Please provide a valid tenant or a valid subscription.
Note: If you are using Privileged Identity Management (PIM), make sure you have activated your access.


To resolve this issue, execute Login-AzAccount and log on to your Azure subscription account when prompted. Then execute the scan command again, and the scan of the subscription starts.

Once execution completes, you can see that an analysis log is created as a CSV file. You can read the log and identify any vulnerabilities in your Azure subscription.

We have explored the health scan of an Azure subscription, with a basic overview of the Secure DevOps Kit for Azure, in this post. In the coming posts, let’s further explore using the kit to ensure the security of cloud deployments of software.

Define Variables Dynamically and Use Them in Subsequent Steps in Azure DevOps Pipelines

You normally define variables in Azure DevOps pipelines in the pipeline definition or using variable groups. However, there might be situations where, in a given step in your pipeline, you want to set a variable with a value from an external source. You do not have to define such variables in your release definition or in a variable group added to the definition. You can obtain the value from the external source and dynamically define the variable, so that it can be used within the agent job. Let’s look at the steps to achieve this.


Create and apply value for variable dynamically

The two variables defined in the following PowerShell script, var and var2, are dynamically created and assigned the values chaminda and chandrasekara respectively.
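A minimal sketch of such a script, using the logging command syntax shown further below:

# Create pipeline variables at runtime via the task.setvariable logging command.
Write-Host "##vso[task.setvariable variable=var]chaminda"
Write-Host "##vso[task.setvariable variable=var2]chandrasekara"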


This kind of scenario can be used to obtain variables via Octopus Deploy variable sets attached to an Octopus Deploy project, to reuse them in an Azure DevOps pipeline. An example is below, and the full post on this usage can be found here.
Write-Host ("##vso[task.setvariable variable=" + $octopusVariable.Name + "]" + $variableValue)


Use the dynamic variables in other tasks/steps

The variables can be used normally, just like the other variables defined in variable groups or in the definition itself.
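For example, a subsequent inline PowerShell script task in the same job could read them like this (a sketch):

# Consume the dynamically created variables in a later step; $(var) is expanded by the pipeline.
Write-Host "Full name: $(var) $(var2)"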


The variables work effectively, and the values are available in the dynamically created variables.

Simple and Effective Branching and Deployment Strategy

While using Git repos, there are many standard branching strategies teams follow. In this post, let’s discuss a branching and deployment strategy that is simple to understand and execute, and which supports most of the requirements a software development and delivery team has. The following implementation considers Azure Git Repos and Azure Pipelines in Azure DevOps. However, you might be able to implement the same with other Git repos and CI/CD tools.
Let’s have a look at a pictorial representation of the proposed strategy first.


Here we talk about three branch types.

  • Master – The stable master branch, which we never deploy from. After each release is made to production, we merge the changes of that release from the version branch to master via a pull request.
  • Version – When starting development for each client release, a version branch is created from master, say version/0.0.1.
  • Feature – When starting work on each new feature or bug fix, a new branch should be created from the version branch (the version which is to be released). It could be a version that is already in production, if the bug is a hotfix to production.

Build

  • CI Build – A build set up to build, run unit tests, and create a deployable package on every push to a master, version, or feature branch.
  • PR Build – The same CI build, applied via version branch protection policy so that any incoming pull request can be merged only if the build succeeds with the current version branch plus the incoming changes.

Environments

  • Dev – Developers’ test environment, where any feature build, version build, or PR build can be deployed on demand.
  • ATST – Automated test target environment, which is deployed with each PR build or version build upon completion of the build. The test automations are then executed against this environment. On success, it triggers the QA deployment and waits for approval.
  • QA – The QA team takes in deployments from PR builds or version builds by approving the release. PR builds only allow pull requests to be tested in the QA environment; a PR build cannot be deployed beyond QA, as its output is not built from source code that exists in a stable branch. Once the QA team is happy with the pull request functionality, they merge the PR to the version branch. A version branch release is also tested in QA after the automated test environment, and with approval can be deployed to Demo or Staging (STG/UAT).
  • Demo – Only a QA-approved version branch build can be deployed to the Demo environment.
  • STG (UAT) – Only a QA-approved version branch build can be deployed to the STG environment, where end users do final tests before allowing it to be deployed to production.
  • PROD – The production environment, which can only be deployed with approval, after the STG environment.

In the next post, let’s try to explore how to implement above strategy with Azure Git repos, Builds and Release pipelines.


Implementing Simple and Effective Branching and Deployment Strategy with Azure DevOps

In the previous post we discussed a simple and effective branching and deployment strategy as a concept. Let’s now have a look at the key implementation considerations of the proposed strategy with Azure Git repos and Azure Pipelines.
Let’s have a look at a pictorial representation of the proposed strategy first.

In the previous post we identified each of the above branches, builds, and environments and their usage in theory. Let’s now see their implementation concerns one by one.
Once a repo is created you will get a master branch in Azure Git repos. Then you can submit the initial code structure to the master branch and create a branch with the targeted version, say 0.0.1. You should use the version/<version number> format so that all version branches go into a folder named version. When work on the next version starts, you can create another version/<version number> branch.

Then you can implement one build for each relevant microservice (if you are implementing microservices; or else one build for your project) in Azure Pipelines, which builds, executes unit tests, and then packages and pushes artifacts to Azure DevOps Artifacts. The build should be set to trigger for the master branch, any feature branch, or any version branch, as shown below. For microservice builds, you can use path filters to build only when files change in a given microservice. For this you have two options: using a git repo for each microservice, or using folders within the same repo. When folders within a repo are used, path filters help to limit build triggering to the relevant folder’s content changes.


Then the build can be set to protect the required version branch using branch policies. Path filters can be applied in the build policy so that it validates incoming changes to given paths. Excluding paths helps to build only the incoming PRs for a given path.



Release pipelines can be set up to allow triggering for the version branches, targeting the automated test environment as the first environment. You can set the release paths so that the release pipeline flow mentioned above can be achieved.

In the above, the TEST – V environment allows a release from a version branch to be deployed automatically to the test environment (where automated tests are executed) once a new build from the version branch is available. This is achieved by setting the release to trigger automatically when a new build is available from a version/* branch.

Then a filter is applied in TEST – V environment to make sure it gets deployed only with the code from version/* branch.

Then the QA team will accept it into the QA environment on demand. This is handled through a pre-deployment approval and setting it to deploy after the test environment.



Pull request triggers are enabled in the release pipeline, allowing a release to be triggered based on an incoming pull request validation build success.

The TEST – PR environment is the same test environment where the automated tests are executed against the deployment. It is set to trigger on pull request build availability, and branch filters make sure no version/* or feature/* build is deployed to the TEST – PR environment.

Again, the QA team will only accept a deployment from a pull request validation into the QA environment on demand, and only if it succeeds in the test environment. The PR cannot be deployed beyond the QA environment due to the way the pipeline is set up. This is required to prevent a PR validation output (which does not come from a stable branch) from reaching any environment beyond QA, as the package of a PR comes from a non-existing branch (the PR build output contains the current version/* branch code merged with the incoming pull request changes, existing in the build server but not yet in the target branch).

A successfully verified build (created from a version/* branch) which is deployed to QA – V can be approved by the QA team, and is then eligible to be deployed to demo or staging (pre-production), with approval from authorized team members.

Production can only be deployed on demand, with approval, and after the staging environment. The build must be from a version/* branch, since that is the only build that can reach the staging environment.

The development environment is set to be deployed on demand without any branch filters, so it allows a build from version/*, feature/*, or a PR build output to be deployed for developer testing.

With this setup in Azure build and release pipelines, and with Azure Git repos branching and branch policy settings, we can achieve the simple and effective branching strategy we discussed in the previous post, with Azure DevOps.

Building Code from Multiple Repos with YAML Pipelines

You could only build code from a single repo using classic build pipelines in Azure DevOps. Now, with multiple repo support added to YAML build pipelines (pipeline as code), you can decide which repos you want to check out and build in a single pipeline build sequence. Let’s see the steps in detail.

First, to understand how it works, add a couple of repos to a team project and add some code to each repo. In this example, you can find three repos defined in the team project MultiRepoBuild, namely MultiRepoBuild, multiRepo1 and multirepo2.


In each repo, a .NET Core Web App is added.

In one of the repos, you can define the YAML pipeline.

As the build pipeline code is added to the MultiRepoBuild repo, we need to refer to the other two repos in the YAML definition. This can be done with the syntax shown below.
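Something along these lines, as a sketch using the repo names from this example:

# Declare the other repos in the same team project as repository resources.
resources:
  repositories:
    - repository: multiRepo1
      type: git
      name: MultiRepoBuild/multiRepo1
    - repository: multirepo2
      type: git
      name: MultiRepoBuild/multirepo2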

Then, you can define the checkout steps. checkout: self checks out the MultiRepoBuild repo, as it is the repo containing the YAML pipeline code. The other two repos are checked out by specifying their names.
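For example:

steps:
  # 'self' is the repo hosting this YAML file; the others are the declared resources.
  - checkout: self
  - checkout: multiRepo1
  - checkout: multirepo2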

You can add a step to view the content of the build sources directory, which will show the three repos available in the build sources directory as three folders.


Then we can use build steps to build the projects from each repo, as in the sketch below.
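For instance, a sketch of one such step; the project path pattern is an assumption based on the checkout folder layout.

  - task: DotNetCoreCLI@2
    displayName: 'Build multiRepo1 web app'
    inputs:
      command: 'build'
      projects: '$(Build.SourcesDirectory)/multiRepo1/**/*.csproj'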

The full pipeline code is shown below.

The pipeline executes and builds code from multiple repos.

Replacing iOS .plist Array Properties in Azure Release Pipelines

.plist files are used by iOS applications to keep configuration properties, and these files support arrays as configuration values. In the deployment process of Azure DevOps pipelines, you may want to update these .plist files to have values relevant to the target deployment environment. Let’s see the usage of the PlistBuddy utility to update array items in .plist files, as plutil (http://scriptingosx.com/2016/11/editing-property-lists/) is unable to apply the changes.
plutil can insert arrays as described here: http://scriptingosx.com/2016/11/editing-property-lists/. However, it fails to replace values in arrays in .plist files.
For example, assume the .plist array element below.
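A hypothetical element of this shape (reconstructed from the error below; your Info.plist will differ):

<key>CFBundleURLTypes</key>
<array>
  <dict>
    <key>CFBundleURLSchemes</key>
    <array>
      <string>msal701274c3-f6c1-4e9f-a2f0-3d30e709ab5a</string>
    </array>
  </dict>
</array>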

While trying to update arrays with plutil, it always crashes with array out of bounds errors.
Info.plist: Could not modify plist, error: Failed to insert value [ "msal701274c3-f6c1-4e9f-a2f0-3d30e709ab5a"] at key path CFBundleURLTypes.0.CFBundleURLSchemes with error -[__NSCFConstantString characterAtIndex:]: Range or index out of bounds
However, we can use a PlistBuddy command, as shown below, to update such array items without any problems in Azure Pipelines.
/usr/libexec/PlistBuddy -c "Set :CFBundleURLTypes:0:CFBundleURLSchemes:0 $(authmsal)" $(ipaExtractPath)/Payload/$(iOSApp)/Info.plist

Resolving Issues with | Symbol in PowerShell While Creating Node Web App in Azure CLI

You can use the Azure Command Line Interface with PowerShell to create infrastructure as code (IaC) scripts for implementing deployment of resources on Azure platform services. When you try to create a web app in Azure with the runtime specified via Azure CLI in PowerShell, the | symbol used in the runtime specification causes issues, as PowerShell interprets it as a pipe operation. Let’s look at the issue and how to get it resolved in PowerShell and Azure CLI based IaC scripts.
The Issue
When the script below is executed with $nodeRuntime set to node|6.12, it fails with the issue shown in the image below.
$webAppPricePlan = az appservice plan show -n $azureFunctionsAppPlanName -g $sharedResourceGroupName | ConvertFrom-Json
Write-Host ("Web App " + $name + " is not found. Creating it...")
az webapp create -n $name -g $resourceGroupName -p ($webAppPricePlan.id) -r $nodeRuntime
'6.12' is not recognized as an internal or external command, operable program or batch file.

How to Resolve
To resolve this issue, we can use --% to escape the | symbol. However, to run this in a PowerShell script with variables, we have to compose the command as an evaluated expression string and then execute it via Invoke-Expression. If the script is executed with hardcoded values instead of variables, you do not have to use Invoke-Expression.
$exp = 'az webapp create -n ' + $name + ' -g ' + $resourceGroupName + ' -p ' + ($webAppPricePlan.id) + ' --% -r ' + '"' + $nodeRuntime + '"'
Invoke-Expression($exp)

Once this is resolved the script can be used in Azure Pipelines to create node web app resources in Azure.

Training Machine Learning (ML) Model with Azure Pipeline and Output ML Model as Deployable Artifact

Training a machine learning model requires a wide range of quality data, so that the ML model is trained in a way that provides accurate predictions. An Azure build pipeline can run the Python tests written to validate data quality and then train a model with uploaded data in an Azure ML workspace. The post “Setup MLOPS workspace using Azure DevOps pipeline” clearly explains how to set up an Azure ML workspace in a new resource group dynamically with the Azure CLI ML command extension. That post, as well as this post on training a model with Azure Pipelines, uses the open source ML repo (https://github.com/SaschaDittmann/MLOps-Lab.git) by Sascha Dittmann, with data and code to train a model.


Prerequisites

Set up an Azure Git repo by cloning the repo from https://github.com/SaschaDittmann/MLOps-Lab.git.

All prerequisite steps, such as the Install Python 3.6 step, adding the Azure CLI ML extension, creating a compute cluster, and uploading the model data, need to be done as explained in the post “Setup MLOPS workspace using Azure DevOps pipeline”. Then, following the same post, create the Azure ML workspace. That post also explains how to upload the data in your cloned repo’s data folder to the workspace. Uploading the data is required to train the model with it.

But before uploading data you should ideally run some tests. In order to execute Python-based tests, you first need to set up the Azure Pipelines agent with the required Python requirements. These requirements are specified in the repo’s setup folder, in the install_requirements.sh script and the accompanying text file. However, if your project has more requirements you can define them as per your needs.


Here this repo uses the scikit-learn library and some other dependencies.

A step in the build pipeline can be set up to execute install_requirements.sh as a bash script task (this model training uses the Azure Pipelines hosted Linux (Ubuntu 16.04) agent).

Then you can execute the tests written in Python using a command like the one below, which will run the Python tests you have written to check your data quality and publish the results. The sample test is available in the repo under tests/unit as unit tests in data_test.py.
pytest tests/unit/data_test.py --doctest-modules --junitxml=junit/test-results.xml --cov=data_test --cov-report=xml --cov-report=html

Then you can publish the test results so that they become available in the build as test results.


Then, as explained in the post “Setup MLOPS workspace using Azure DevOps pipeline”, you can add the Azure ML extension to the agent, create the ML workspace, and upload data from the repo’s data folder.

Training the model

Once that is done, we can get started training the model. First, create two folders to keep metadata and model files on the pipeline agent, by executing the command below in a bash step.
mkdir metadata && mkdir models


Then you can train the model in the ML workspace by executing the command below. The Python code to train the model, train_diabetes.py, is available in the repo. This uses the compute cluster created (in the post “Setup MLOPS workspace using Azure DevOps pipeline”) with Azure CLI ML commands. The dependencies to train the model are defined in the conda_dependencies.yml file in the repo. Make sure you set the correct resource group and Azure ML workspace names you have created.
az ml run submit-script --resource-group rg-mldemodev01 --workspace-name mlw-demodev01 --experiment-name diabetes_sklearn --ct cmbamlcompute01 --conda-dependencies conda_dependencies.yml --run-configuration-name train_diabetes --output-metadata-file ../metadata/run.json train_diabetes.py
 
Once the model training is completed, you can register the model in the ML workspace in Azure. While registering the model you can output the model .pkl file, which can be used as a build artifact to deploy the trained model in different ML workspaces.
az ml model register --resource-group rg-mldemodev01 --workspace-name mlw-demodev01 --name diabetes_model --run-metadata-file metadata/run.json --asset-path outputs/models/sklearn_diabetes_model.pkl --description "Linear model using diabetes dataset" --tag "data"="diabetes" --tag "model"="regression" --model-framework ScikitLearn --output-metadata-file metadata/model.json

Then you can download the trained and registered model, to publish it as an artifact in the build pipeline.
az ml model download --resource-group rg-mldemodev01 --workspace-name mlw-demodev01 --model-id $(jq -r .modelId metadata/model.json) --target-dir ./models --overwrite


Then you need to copy all files necessary for deploying the ML model in a different Azure workspace into the artifact staging directory, from the repo and the downloaded files, to make sure they are published as build output. The copy patterns below can be used.

**/metadata/*
**/models/*
**/deployment/*
**/setup/*
**/tests/integration/*


The next step is to publish the artifact staging directory contents as build output.

Once the build executes, it will run the tests on the data and publish the test results.

Then the data is uploaded, and the model is trained, registered, downloaded, and published along with the repo contents required for deployment to target Azure ML workspaces. We will discuss in the next post how to use the build artifacts and deploy the ML model to a different Azure ML workspace.