Channel: Chaminda's DevOps Journey with MSFT

Getting Content of a File in VSTS Git Repo


GitHub has an easy way to get the raw content of a file: clicking the Raw button for any code file redirects to a URL starting with ‘https://raw.githubusercontent.com/’. For example, the PowerShell script here can be viewed as raw content, or retrieved programmatically with PowerShell using Invoke-WebRequest and the URL https://raw.githubusercontent.com/chamindac/VSTS_PS_Utils/master/CreateAzureWebApp/CreateAzureWebApp.ps1. Let’s look at the possibility of retrieving the raw content of a file in a VSTS Git repo via the VSTS REST API.
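The transformation from a github.com file URL to its raw counterpart is mechanical (drop the /blob/ segment and switch the host), so it can be derived programmatically. A minimal Python sketch for illustration, using the example repo above:

```python
def github_blob_to_raw(url: str) -> str:
    """Convert a github.com blob URL to its raw.githubusercontent.com form.

    e.g. https://github.com/user/repo/blob/branch/path ->
         https://raw.githubusercontent.com/user/repo/branch/path
    """
    prefix = "https://github.com/"
    if not url.startswith(prefix):
        raise ValueError("not a github.com URL")
    # user/repo/blob/branch/path -> drop the 'blob' segment
    user, repo, blob, rest = url[len(prefix):].split("/", 3)
    if blob != "blob":
        raise ValueError("expected a /blob/ URL")
    return f"https://raw.githubusercontent.com/{user}/{repo}/{rest}"

raw = github_blob_to_raw(
    "https://github.com/chamindac/VSTS_PS_Utils/blob/master/"
    "CreateAzureWebApp/CreateAzureWebApp.ps1")
print(raw)
```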

You can use the ‘Git Items Get’ REST API call to retrieve the content of a file in VSTS Git. Let’s try this with an example. Below is a simple PowerShell script that retrieves the content of a given file in VSTS. The Personal Access Token (PAT) needs the Code (read) scope for this script to work.

$vstsAccountName = 'YourVstsAccountName'
$vstsTeamProjectName = 'YourTeamProject'
$vstsPAT = 'YourPAT'
$vstsBaseUrl = 'https://' + $vstsAccountName + '.visualstudio.com'

$FileRepo = 'GitRepoName'
$FileRepoBranch = 'GitBranchName'
$FilePath = 'File Path' # example: src/test text/ProjectDependency.txt

$User=""

# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $User, $vstsPAT)))
$vstsAuthHeader = @{Authorization = ("Basic {0}" -f $base64AuthInfo)}

$Uri = $vstsBaseUrl+ '/' + $vstsTeamProjectName + '/_apis/git/repositories/' + $FileRepo + '/items?path=' + $FilePath + '&$format=json&includeContent=true&versionDescriptor.version=' + $FileRepoBranch + '&versionDescriptor.versionType=branch&api-version=4.1'

$File = Invoke-RestMethod -Method Get -ContentType application/json -Uri $Uri -Headers $vstsAuthHeader

Write-Host $File.content


With the above script, the content of a file in VSTS, such as the one shown below, can be retrieved as raw content.


The URI parameter includeContent must be set to true to get the content. If you do not specify the URI parameter $format with the value json, the file content is returned directly into the $File variable, and you do not need $File.content to read it. However, when you specify $format=json you get the file content along with additional information such as the commitId.

In the above script, versionDescriptor.versionType is set to branch and versionDescriptor.version defines the branch of the Git repo that contains the file.
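The URI construction and Basic auth header from the PowerShell script above can also be sketched in Python for illustration; the account, project, repo and PAT values here are placeholders:

```python
import base64
from urllib.parse import urlencode

def build_items_request(account, project, repo, file_path, branch, pat):
    """Build the URL and auth header for the Git Items Get call,
    mirroring the PowerShell script above."""
    query = urlencode({
        "path": file_path,
        "$format": "json",          # include commitId etc. with the content
        "includeContent": "true",   # required to get the file content
        "versionDescriptor.version": branch,
        "versionDescriptor.versionType": "branch",
        "api-version": "4.1",
    })
    url = (f"https://{account}.visualstudio.com/{project}"
           f"/_apis/git/repositories/{repo}/items?{query}")
    # The PAT goes in the password part of a Basic auth header; user is empty
    token = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
    return url, {"Authorization": f"Basic {token}"}

url, headers = build_items_request(
    "myaccount", "MyProject", "MyRepo", "src/readme.md", "master", "xxxx")
print(url)
```

The URL and header pair can then be passed to any HTTP client, just as the script passes them to Invoke-RestMethod.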


Using Unified Agent for Executing Automated Tests Without Installing Visual Studio in Test Client–Azure DevOps/VSTS


As described in the post “Running UI Tests with Unified Agent”, you can set up the unified agent (the same agent used for build/deployment in Azure DevOps/VSTS) in an agent pool or in a deployment group to execute functional UI tests. The unified agent is really useful as it does not require setting up a separate test agent with the WinRM-based Deploy Test Agent task, which is required if the Run Functional Tests task is used for automated test execution. With the unified agent you can use the Visual Studio Test task, which is capable of running Selenium-based UI tests as well, in addition to Coded UI tests. There used to be a dependency on having Visual Studio installed on the test client machine to get the Visual Studio Test task working, as the test platform is packaged with Visual Studio. However, thanks to the NuGet package “Microsoft Test Platform”, you no longer need to install full Visual Studio on your test client machine to execute automated tests with the unified agent using the Visual Studio Test task. Let’s look at how to use “Microsoft Test Platform” and the Visual Studio Test task on a test client using Azure DevOps Release Management.


As the first step you need to add the “Visual Studio Test Platform Installer” task to your build or release pipeline, to acquire “Microsoft Test Platform” and set it up on the test client machine. This task must appear before the Visual Studio Test task.

For the package feed you can use a custom feed or a network share if internet connectivity is unavailable to the test client machine; the downloaded “Microsoft Test Platform” NuGet package must be added to that custom feed or network share. If internet connectivity is available to the test client, you can use the default official NuGet feed. For the “Microsoft Test Platform” version you can set a specific version, the latest stable, or the latest including pre-release.

Next you have to add a Visual Studio Test task to the pipeline. You can specify a test assembly or assembly pattern, or use a test plan and suites, for test execution. The most important field to set is Test Platform Version, which must be set to “Installed by Tools Installer” to use the “Microsoft Test Platform”.

With the right test assembly or test plan settings in place, you will be able to execute functional UI tests with unified agents without having to install Visual Studio on the test client machine.

Compared to the Deploy Test Agent task, which takes a long time to set up, the “Visual Studio Test Platform Installer” task runs in a few seconds, letting you execute the mandatory functional UI tests even for each and every deployment to a Dev Integration environment, which can be configured to be deployed and tested on each code commit. The true spirit of DevOps with Azure DevOps!

Making a Task Group Parameter Optional – Azure DevOps Pipelines


Task groups are really useful to share common actions with multiple build or release pipelines in Azure DevOps (VSTS). You can group multiple tasks and create a task group from a build or release definition. These task groups can then be utilized in other build or release definitions in a given team project. Parameters in a task group help to pass values from the build or release definition to the tasks in the task group. You can add a default value for these parameters, or provide the value from the build or release definition that uses the task group. Making a task group parameter optional is not straightforward, and you need a workaround to get it working. Let’s look at how we can make a parameter optional for a task group in Azure DevOps pipelines.


Below is a task group created with a PowerShell task accepting two parameters.

$SqlDBServer and $SqlDBName are used as local variables, and $(SqlDatabaseServer) and $(SqlDatabaseName) are the input parameters. Note the single quotes, which are important if you have PowerShell scripts as your task group steps. This makes two parameters available for the task group. You can provide a description and a default value for each parameter. However, the default value is not utilized in this sample, as we are trying to make the parameters optional.

If you try to use this task group in a build or a release definition, you will see it demands both parameters as required. As a side note, you can see the descriptions added to the parameters are visible in the release or build definition.

How can we make one of these parameters optional? There is no out-of-the-box option available in Azure DevOps for this as of now. You have to export the task group and manually edit the JSON file to get it done. First, export and save the JSON file.

You can then open the downloaded JSON file with Visual Studio or Visual Studio Code. The opened file is a single line of code, which is hard to understand and modify.

Press Ctrl+K Ctrl+D in Visual Studio or Shift+Alt+F in Visual Studio Code to format the JSON file (or any code file, for that matter). After formatting you can see there is a section named inputs, with each of the parameters available there. You can set required to false for any input parameter in the JSON file and save it.
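The edit can also be scripted. A hedged Python sketch, assuming a simplified stand-in for the exported task group JSON (the real export has many more fields, but the inputs section has this shape):

```python
import json

# Simplified stand-in for an exported task group; the real export
# contains many more fields, but the "inputs" section looks like this.
exported = json.dumps({
    "name": "My Task Group",
    "inputs": [
        {"name": "SqlDatabaseServer", "required": True},
        {"name": "SqlDatabaseName", "required": True},
    ],
})

task_group = json.loads(exported)
for inp in task_group["inputs"]:
    if inp["name"] == "SqlDatabaseName":
        inp["required"] = False  # make this parameter optional

print(json.dumps(task_group, indent=2))  # formatted, ready to re-import
```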

Then you have to import the JSON file in task groups. Unfortunately, this will only make a copy of the task group; you cannot overwrite the existing task group.

Browse for the saved file and import it as a new task group.


Rename the imported task group to your preferred name (you cannot overwrite the existing task group) and save it.

Go to the build or release definition and add the new task group (you may have to do a full refresh of the build/release definition page with Ctrl+F5 for the task group changes to become visible). You will see that the optional parameter of the task group no longer demands a value. Not a neat solution, but it works!


Build and Deploy SSIS with Azure DevOps Pipelines


SQL Server Integration Services (SSIS) projects can be created to perform ETL (Extract, Transform and Load) operations. As implementing Continuous Delivery is becoming a mandatory aspect of any type of software project, it is vital for SSIS projects to be able to implement CI/CD. With the availability of the extension “SSIS Build & Deploy” in the Marketplace for Azure DevOps, CI/CD for SSIS has become straightforward to implement. Let’s look at a sample to understand how to get CI/CD implemented for an SSIS project with Azure DevOps.

You can create SSIS projects in Visual Studio and define project-level parameters as shown below (more information on creating SSIS projects with VS 2017 can be found here).

A simple sample, such as copying data from one database table to another, can be implemented as a trial. To build and deploy an SSIS project with Azure Pipelines, first get the extension “SSIS Build & Deploy” installed in Azure DevOps. Two important tasks get added with this extension: one for building SSIS projects and another for deploying them.

Building SSIS project with Azure DevOps

Create a new build definition and add the SSIS Build task that is added with the extension “SSIS Build & Deploy”. You should provide the path to the SSIS project instead of the Visual Studio solution. The command line switch should be Build, and you have to specify the Visual Studio version to use.

Once built, you can copy the *.ispac file (the deployable SSIS project) for publishing as build output.

Then create a release definition using the SSIS build as the artifact source. Define the required variables for the parameters in the SSIS project, as well as variables to keep the target database server information.

In the release environment, add the SSIS Deploy task that is installed with the extension “SSIS Build & Deploy”. Provide the path to the *.ispac file to deploy.

The parameters defined in the SSIS project can be replaced with the variable values defined in the release definition. You have to specify the parameter names and variables in JSON format, as in the example below.

{
  "parameters": [
    {
      "Name": "SourceDatabase",
      "Value": "$(Param.SourceDatabase)"
    },
    {
      "Name": "SourceSQLUser",
      "Value": "$(Param.SourceSQLUser)"
    },
    {
      "Name": "SourceSQLUserPwd",
      "Value": "$(Param.SourceSQLUserPwd)"
    },
    {
      "Name": "SourceSQLServer",
      "Value": "$(Param.SourceSQLServer)"
    },
    {
      "Name": "TargetDatabase",
      "Value": "$(Param.TargetDatabase)"
    },
    {
      "Name": "TargetSQLUser",
      "Value": "$(Param.TargetSQLUser)"
    },
    {
      "Name": "TargetSQLUserPwd",
      "Value": "$(Param.TargetSQLUserPwd)"
    },
    {
      "Name": "TargetSQLServer",
      "Value": "$(Param.TargetSQLServer)"
    }
  ]
}

Deployment of SSIS packages is possible only with integrated Windows authentication, and the user running the build/release agent should have sufficient permissions in SQL Server to do the SSIS project deployment. The deploy task currently has an issue, and the only way around it is to specify the connection string as a custom connection string. The issue

[error]Task_InternalError Exception calling ".ctor" with "1" argument(s): "Property Login was not set."

is discussed here. You can use the connection string below, with your preferred variable name for the SQL Server, to avoid the issue.

Data Source=$(SSIS.SQLServer);Initial Catalog=master;Integrated Security=SSPI;


With these settings you will be able to get the SSIS project deployed successfully to the target server.

Parameter values set in the project will be replaced with the variable values defined in the release definition (notice that TargetDatabase was set to SampleDB in the SSIS project, but it got replaced with the value TargetDB defined in the variable “Param.TargetDatabase” in the release definition).

Azure DevOps Default Enabled Alert/Notification - Which Can be a Nightmare for an Admin


As an admin of an Azure DevOps account, you may be creating sample environments to simulate production behavior before applying any process changes to a team project. In this case your choice would be to create a new team project and try out a simulation first. However, you might want to do the simulation with close-to-production data, so you will try to find a mechanism to make a copy of work items etc. from your production team project to a simulation team project. How to do this with the various options available is not the discussion today (we can discuss that in another post); this post is to show you an alert that must be turned off before you do any sort of bulk work item data duplication, to avoid spam emails to many users of your Azure DevOps account. A lesson learnt the hard way is shared with you, to prevent you from falling into the same pitfall.


The purpose, as explained, was to move the work items from a production team project to a simulation team project. The choice was to create a new Azure DevOps account backed by the organization AAD (Azure Active Directory), as the need was to share the simulation team project with the organization users. You must be wondering: why a new account? Can’t it be done with a new team project in the same account, using the out-of-the-box bulk copy of work items, which is the simplest way to do this? Unfortunately, you cannot create a new team project when you do not have collection admin access to your production Azure DevOps, and that was the only reason for going ahead with a new account. However, the spam email issue discussed above will be there regardless of whether it is a new account or not, unless you turn off this alert, which is enabled by default.

Being a person who checks things before doing them, I did verify the notification settings of the new team project created in the new Azure DevOps account, and no alerts related to work items were configured. So the expectation, looking at the notification page at the team project level, was that any manipulation of work items would not result in notifications.

I executed the bulk work item transfer, and here came the unexpected. Several end users started complaining about multiple notifications of work item assignments to them, thinking it was the production environment. All aspects of the simulation team project, including the name, iterations, area paths etc., were created exactly similar to production. So, no wonder the spam emails misled the end users, and some even believed the work item data in production had been altered. Not a good impact at all. The damage was already done, as the bulk work item move was finished by the time the complaints about the spam emails started to rise.

So where on earth is this setting for sending email notifications on work item updates? It should be shown in the team project notification settings page, in my opinion.

It is available in the account notification settings, and by default there is an alert when a work item assignment changes. This was triggering emails while the bulk copy action was creating new work items in the simulation team project. It would have been nice to show these account-level settings as inherited at the team project level in the notification settings page of the team project. But it is what it is: if you are ever planning, as an account admin, to do a bulk copy of work items and don’t want to create panic among your end users by sending them hundreds of emails on work item assignments, turn this alert off before you do the bulk copy. A lesson I learnt the hard way.

Resigning and Deploying Xamarin iOS Apps with Azure DevOps


Creating iOS mobile applications with Xamarin is a good option for developers who are familiar with the Visual Studio family of products. Any project you create nowadays demands CI/CD implementation, as it is the first step towards enabling DevOps. Azure DevOps comes with a feature-rich set of tasks for building iOS apps. Tasks that allow re-signing of a packaged ipa (the iOS deployment file) are available as Marketplace tasks. App Center allows you to do mobile application testing in a much simpler way. Let’s look at the steps required to resign and deploy an iOS package to App Center using Azure DevOps pipelines.


Building iOS Xamarin projects to create the deployable ipa file can be done with the Xamarin.iOS build task.

A built ipa can be packaged into a NuGet package and added to an Azure DevOps NuGet feed or another NuGet feed. Or you can publish the ipa as a build artifact.

Before deploying the iOS application to the target testing environment (or even to the App Store for production use), you have to update the configurations of the ipa package content and resign the package with the required keys. This is a fairly easy task in Azure DevOps, as it offers hosted macOS agents. The first step required is to retrieve the package containing the ipa from the Azure DevOps NuGet feed (if the ipa was published as a build artifact, this step is not required). The Download Package task automatically extracts the package to the default working folder (you can set the extract path using the Destination Directory parameter).

Then you can copy the ipa to a temporary location (the root of the working folder in this example) using a copy task.

ipa files can be extracted using the unzip command in the macOS terminal. It is advisable to extract to a folder, named ipa in this example.

unzip youipaname.ipa -d $(System.DefaultWorkingDirectory)/ipa

Then you can use a replace tokens task to replace configuration file variable tokens with release definition variable values. For this to work, your build should transform the variable values into tokens as described in this post. After changing config values you have to repack the ipa. For this you can use the zip command, but make sure to execute the command from within the ipa content root folder to ensure it is packaged as an ipa with the correct structure. The first command deletes the existing ipa in the working directory and the second one creates the ipa (notice the . at the end of the second command).

rm $(System.DefaultWorkingDirectory)/youipaname.ipa

zip -r $(System.DefaultWorkingDirectory)/youipaname.ipa .
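The unpack/repack round trip can be sketched with Python's zipfile module to show why packing from inside the content root matters; the file names here are made up for illustration:

```python
import os
import tempfile
import zipfile

def repack(src_dir, out_path):
    """Zip the contents of src_dir (not the directory itself), so the
    archive root matches the original ipa layout."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # arcname relative to src_dir keeps Payload/ at the archive root
                zf.write(full, os.path.relpath(full, src_dir))

with tempfile.TemporaryDirectory() as work:
    # simulate extracted ipa content: ipa/Payload/config.json
    payload = os.path.join(work, "ipa", "Payload")
    os.makedirs(payload)
    with open(os.path.join(payload, "config.json"), "w") as f:
        f.write('{"env": "test"}')
    repack(os.path.join(work, "ipa"), os.path.join(work, "app.ipa"))
    with zipfile.ZipFile(os.path.join(work, "app.ipa")) as zf:
        names = zf.namelist()
print(names)
```

Zipping the directory itself instead of its contents would nest everything under an extra folder, which is the same mistake the "run zip from inside the ipa folder" advice guards against.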

Once the ipa is repacked, it should be resigned using the Ipa Resign task available with the Apple App Store extension in the Marketplace.

The resigned ipa can be deployed to App Center using the App Center deployment task. The app slug should be orgname/appname in App Center. The destination id should be the App Center distribution group id.

Inspecting Most Frequently Changed Files in Azure Git Repos


When analyzing an application with its telemetry data, we look at usage of the application and can identify which parts of the system are often used. Wouldn’t it be nice to compare that with the most frequently updated areas of the system? Let’s look at how we can get this information via the Azure DevOps REST API.


Whether you use one Azure Git repo or multiple Azure Git repos within a single team project in Azure DevOps, it is useful to extract the frequently developed or modified areas of the system. This can be easily achieved using the script I made available on GitHub. Currently the script only supports Azure DevOps Git repos (support for TFVC (Team Foundation Version Control) will be added soon).

The script can be executed as shown below.

.\GetMostFrequentlyModifiedFiles.ps1 -token 'yourPAT' -fromDate '5/02/2018' -collectionUri 'https://dev.azure.com/yourAccount' -teamProjectName 'yourteamProject' -repoName @('repo1*', '*corereop*') -branchNameFilter @('master*','develop*')

Parameters for the script

  • token: Personal Access Token of Azure DevOps User.
  • fromDate: The date from which to consider modifications to the system. Should be supplied in US short date format. This gives the opportunity to obtain data from a given date to the current day.
  • collectionUri: Your Azure DevOps account URL.
  • teamProjectName: Name of the Azure DevOps team project.
  • repoName: You can provide * as an array item to consider all repos. Or you can provide multiple filters in an array to filter repo names. Patterns of reponamepart*, *reponamepart or *reponamepart* supported.
  • branchNameFilter: You can provide * as an array item to consider all branches. Or you can provide multiple filters in an array to filter branch names. Patterns of branchnamepart*, *branchnamepart or *branchnamepart* supported.
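The repoName and branchNameFilter patterns described above behave like simple wildcards. A Python sketch of the matching semantics (illustrative only, not the actual script's code):

```python
from fnmatch import fnmatch

def matches_any(name, patterns):
    """True if name matches at least one wildcard pattern,
    e.g. 'repo1*', '*core*' or '*' for everything."""
    return any(fnmatch(name, p) for p in patterns)

# made-up repo names to demonstrate the filters
repos = ["repo1-api", "corerepo", "tools"]
selected = [r for r in repos if matches_any(r, ["repo1*", "*core*"])]
print(selected)  # ['repo1-api', 'corerepo']
```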

Script execution process in brief

  • Retrieve all repos and filter for repo name pattern filter and process only relevant repos.
  • Retrieve all branches in a given repo and apply filter and identify branches to include in the report.
  • Get all commits from the given date and process each commit.
  • In each commit, find the files changed, count each file change, and add it to an in-memory file change count table.
  • Sort the data after processing all repos and branches and export data as a csv file.

Once the script has executed, a csv file with the most frequently modified files for the selected repos and branches will be exported. Data will be sorted from the most frequently modified to the least modified files. The file paths help you identify the areas getting the most changes in the system. You can use the data to create graphical representations in Excel by summarizing the data by file path patterns (this is possible by writing Excel formulas to strip the file path, dropping the file name or path portions).
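The per-file counting and sorted CSV export described above boil down to a tally across commits. A minimal Python sketch with made-up commit data:

```python
import csv
from collections import Counter
from io import StringIO

# Made-up stand-in for the per-commit change lists the script retrieves
commits = [
    ["src/app/main.cs", "src/app/util.cs"],
    ["src/app/main.cs"],
    ["src/web/site.css", "src/app/main.cs"],
]

# Tally every changed path across all commits
counts = Counter(path for changed in commits for path in changed)

# Export sorted from most to least frequently modified, like the script
out = StringIO()
writer = csv.writer(out)
writer.writerow(["FilePath", "ChangeCount"])
for path, n in counts.most_common():
    writer.writerow([path, n])
print(out.getvalue())
```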

Custom Control Conditions in Azure DevOps Pipeline Steps


Azure DevOps pipelines support default conditions allowing you to execute a step “Only when all previous tasks have succeeded”, “Even if a previous task has failed, unless the build was canceled”, “Even if a previous task has failed, even if the build was canceled” or “Only when a previous task has failed”. These conditions facilitate execution of steps in a build or release pipeline to cater to different scenarios. However, to support scenarios not covered by the default conditions, you can implement custom control conditions in Azure DevOps pipeline steps. There is a good explanation of custom conditions in this article. Let’s explore two advanced scenarios that can be handled with custom conditions which are not explained in the article.

You may want to avoid executing a step if the branch does not start with a given branch name pattern. For example, if you want to skip branches starting with features/, you can add the custom condition shown below. Note the usage of “not” and “startsWith” in combination to achieve the required condition.

and (succeeded(), not(startsWith(variables['Build.SourceBranch'], 'refs/heads/features/')))

Next let’s look at how to write a complex condition. The specification below says the step should be executed when all previous steps succeeded, AND when it is NOT a pull request build, AND the branch is either “master” or “develop” or starts with “version/”. So the first two conditions must be true, and the branch name condition is true when any one of the three branch conditions is true. Note the usage of () and the functions “and”, “or”, “ne”, “eq” and “startsWith”. Further note that the branch names in the condition below are used with Bitbucket Cloud as the source control repo.

and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'), or(eq(variables['Build.SourceBranch'], 'master'), eq(variables['Build.SourceBranch'], 'develop'), startsWith(variables['Build.SourceBranch'], 'version/')))
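As a further illustration, an assumed scenario following the same syntax: a step that should run only for scheduled builds of the master branch (here with an Azure Repos-style refs/heads/ branch name, unlike the Bitbucket names above) could use a condition such as:

```
and(succeeded(), eq(variables['Build.Reason'], 'Schedule'), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
```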


Choosing Right Size for Windows 10 VM in Azure to Run Docker


The post “Setup Windows 10 VM in Azure to Develop with Docker CE” has a detailed walkthrough of setting up Docker CE in an Azure VM running Windows 10. While trying out the same, I encountered an issue starting Docker, with the error message “Failed to start the virtual machine 'MobyLinuxVM' because one of the Hyper-V components is not running. 'MobyLinuxVM' failed to start.” It is interesting to inspect and find a fix for this issue.


The full error message was as shown below.

Failed to start the virtual machine 'MobyLinuxVM' because one of the Hyper-V components is not running.

'MobyLinuxVM' failed to start. (Virtual machine ID xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx)

The Virtual Machine Management Service failed to start the virtual machine 'MobyLinuxVM' because one of the Hyper-V components is not running (Virtual machine ID 8CDE6654-DC88-4810-AB14-C1779F05B167).
at Start-MobyLinuxVM, <No file>: line 300
at <ScriptBlock>, <No file>: line 395
    at Docker.Core.Pipe.NamedPipeClient.Send(String action, Object[] parameters) in C:\gopath\src\github.com\docker\pinata\win\src\Docker.Core\pipe\NamedPipeClient.cs:line 36
    at Docker.Actions.DoStart(SynchronizationContext syncCtx, Boolean showWelcomeWindow, Boolean executeAfterStartCleanup) in C:\gopath\src\github.com\docker\pinata\win\src\Docker.Windows\Actions.cs:line 77
    at Docker.Actions.<>c__DisplayClass16_0.<Start>b__0() in C:\gopath\src\github.com\docker\pinata\win\src\Docker.Windows\Actions.cs:line 61
    at Docker.WPF.TaskQueue.<>c__DisplayClass19_0.<.ctor>b__1() in C:\gopath\src\github.com\docker\pinata\win\src\Docker.WPF\TaskQueue.cs:line 59

Inspecting the log provides only the above error details.

Hyper-V is enabled in the machine as per the instructions in “Setup Windows 10 VM in Azure to Develop with Docker CE”.

The GitHub issue discussed here helped to give an idea of the cause, and when checking whether CPU virtualization was enabled in Task Manager, I noticed it was not.

The guess was that a wrong size selected for the VM might have caused the issue. I checked the VM sizes with nested virtualization enabled for Azure here. The selected VM size did not support nested virtualization.

I changed the size to a v3 size, which supports nested virtualization.

Yay! Now virtualization is enabled.

Docker is running fine.


Steps to View Dashboard of Azure Kubernetes Cluster When RBAC Enabled


Azure Kubernetes Service lets you host your containerized applications in Kubernetes without having to worry about setting up and maintaining the Kubernetes cluster infrastructure. The Dashboard in Kubernetes helps you monitor the status of deployed services, as well as deployment states, the health of the cluster etc. Let’s look at the steps required to access the Kubernetes Dashboard once you have the Azure Kubernetes Service up and running with Role-Based Access Control (RBAC) enabled.

In the Azure portal, instructions to view the Kubernetes dashboard are available in the service overview page with the link “View Kubernetes Dashboard”.

As the first step you need to install Azure CLI version 2.0.27 or later on your machine. You can follow the instructions here to install the Azure CLI. Then you can use az login to log on to your Azure subscription account.

Then you need to install kubectl by executing the command below.

az aks install-cli

As the next step you need to retrieve the credentials for the cluster. For this you can execute the command below.

az aks get-credentials --resource-group resourcegroupname --name azurekubernetesservicename

Then you can launch the dashboard by executing the command below.

az aks browse --resource-group resourcegroupname --name azurekubernetesservicename

However, when RBAC is enabled, you will get error messages similar to the below in the Kubernetes dashboard.

configmaps is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list configmaps in the namespace "default"
persistentvolumeclaims is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list persistentvolumeclaims in the namespace "default"
secrets is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list secrets in the namespace "default"
services is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list services in the namespace "default"
ingresses.extensions is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list ingresses.extensions in the namespace "default"
daemonsets.apps is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list daemonsets.apps in the namespace "default"
pods is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list pods in the namespace "default"
events is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list events in the namespace "default"
deployments.apps is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list deployments.apps in the namespace "default"
replicasets.apps is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list replicasets.apps in the namespace "default"
jobs.batch is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list jobs.batch in the namespace "default"
cronjobs.batch is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list cronjobs.batch in the namespace "default"
replicationcontrollers is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list replicationcontrollers in the namespace "default"
statefulsets.apps is forbidden: User "system:serviceaccount:kube-system:kubernetes-dashboard" cannot list statefulsets.apps in the namespace "default"

To fix this issue you need to grant the dashboard service account cluster admin access using the command below.

kubectl create clusterrolebinding kubernetes-dashboard -n kube-system --clusterrole=cluster-admin --serviceaccount=kube-system:kubernetes-dashboard


Then you need to execute the get-credentials command again and launch the dashboard with the two commands below.

az aks get-credentials --resource-group resourcegroupname --name azurekubernetesservicename

az aks browse --resource-group resourcegroupname --name azurekubernetesservicename

With this, the dashboard will be up and running without issues.

Draft Pull Request in Azure Git Repos


Pull Requests are the controlled way to bring changes into your stable branches in Azure Git repos; for that matter, all Git providers support Pull Requests. In Azure DevOps you can now create a Pull Request as a draft, giving developers the ability to get it ready while preventing it from being completed accidentally by the reviewers. Let’s look at the simple steps involved in creating a Pull Request as a draft in Azure Repos.

When you are creating a Pull Request in an Azure Git repo, click the down arrow on the Create button and click Create as draft.

Such a draft Pull Request will be marked as Draft near its ID.

These Pull Requests cannot be completed until they are published. There is an option to abandon the draft Pull Request if required.

However, you can still share this Pull Request and perform a few additional operations on it, as shown below.

Adding work items, more commits or reviewers is possible while in draft mode. While in draft mode it is also possible to review and add comments on the changes, as in a normal Pull Request.

Once you publish the Pull Request it can be completed or abandoned. You can see the Pull Request is marked as Active near its ID once it is published. It is possible to change an Active Pull Request back to draft mode by clicking Mark as draft, as shown below.

Updating Test Case Work Item Tags and Description based on Test Automation Code Test Category Attributes and Summary Descriptions


A test automation development team has come up with a requirement to keep track of test automation code attributes and documentation descriptions in the relevant Test Case work items in Azure DevOps. The summary documentation comment added to each test method in the code should be captured and updated in the Test Case work item description. Any test category attribute values should be applied as tags to the Test Case work item. Let’s look at how we can automate this requirement using a little bit of PowerShell in combination with Azure DevOps build pipelines.

The Requirement

  • Capture the summary description comment in the code for each test method in the related Test Case work item in Azure DevOps.
  • Add the TestCategory attribute values as tags to the Test Case work item in Azure DevOps.

The Solution

  • To keep the link between the Test Case work item and the test method in code, a new TestProperty named TestcaseID is introduced. The developer of the test code has to add a TestProperty attribute to each test method, providing TestcaseID as the key and the ID of the Test Case work item in Azure DevOps as the value.
  • When an assembly is compiled it does not contain the documentation portion of the code, such as the summary description of a method. To generate the documentation alongside the assembly as an XML documentation file, you need to enable it in the project property pages. To do this, right click the project in Visual Studio Solution Explorer and click Properties. Open the Build tab and, for each build configuration, check the XML documentation file option. With this setting enabled, the XML documentation containing the summary descriptions is generated in the CI build as well.
  • The PowerShell script made available here should be used in the Continuous Integration (CI) build of the test code to update the Test Case work item with the summary description and to add/manage tags based on the TestCategory attributes. Let’s analyze what the script does.
    • First it loads the test automation assembly via reflection, using the name pattern provided, and inspects each class and its methods to locate test methods. Test methods are identified by the presence of the TestMethod attribute. We have to use reflection here because the XML documentation we generate does not include any information about the attributes used on methods.
    • Then it checks whether a TestcaseID property has been defined; if not, it outputs a warning to the build log. If an ID is specified but that Test Case work item is not available in Azure DevOps, an error fails the build. This allows the developers of test code to add the TestcaseID when they are ready with the Test Case work item in Azure DevOps, without having to modify the entire code base to include IDs at once.
    • The attributes on each test method are evaluated to identify the TestCategory attribute values. The values are prefixed with a predefined tag prefix to prevent overwriting any manually added tags in the Test Case work item. The script only updates the tags if the attributes have changed.
    • Using the assembly name, the XML documentation file is located by the script. The documentation for a given method is retrieved and the summary description from the XML is compared with the Test Case description; if the description has changed, it is updated in the Test Case work item.
  • You have to introduce below set of variables to the build pipeline to support the script.
    • RestApiVersion – Azure DevOps REST API version to use
    • TestAssemblyFilePattern – Name pattern for test assembly
    • TestAssemblyPath – Path of the test assembly and dependencies in the build server
    • TestCategoryTagPrefix – Prefix to use in tags for TestCategory attribute values
  • Enable the script to use the OAuth token in the build, as it is required to access the Azure DevOps REST API in the script to retrieve and update Test Case work items.
  • In the build step make sure to add the below MSBuild arguments to enable generating XML documentation and to copy the output of the build to the $(Build.ArtifactStagingDirectory) folder.

/p:OutDir="$(Build.ArtifactStagingDirectory)" /p:GenerateDocumentation=true 

  • Add the PowerShell script from here to your source code repository. Then use it after the build step in the build pipeline as a PowerShell script task.
  • With this setup, when you check in/commit test automation code to the repo, the build executes and the relevant Test Case work items get updated automatically. A detailed log is available in the build pipeline.
  • In the Test Case work items you can see the changes getting applied via the build.
  • You can find sample code and the PowerShell script in GitHub here.
  • The sample Azure DevOps build can be found in the public Azure DevOps team project.
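As a rough shell illustration of the XML documentation lookup described above (the real script does this in PowerShell; the file, member name, and summary text below are all hypothetical stand-ins):

```shell
# Create a tiny Visual Studio style XML documentation file (hypothetical content).
cat > TestLib.xml <<'EOF'
<?xml version="1.0"?>
<doc>
  <members>
    <member name="M:TestLib.LoginTests.CanLogIn">
      <summary>
        Verifies a registered user can log in.
      </summary>
    </member>
  </members>
</doc>
EOF

# Print the summary text for the given member, stripping tags and whitespace.
sed -n '/name="M:TestLib.LoginTests.CanLogIn"/,/<\/member>/p' TestLib.xml |
  sed -n '/<summary>/,/<\/summary>/p' |
  sed -E -e 's|</?summary>||g' -e 's/^[[:space:]]*//' -e '/^$/d'
```

This prints the summary line for the member, which the script would then compare against the Test Case work item description.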

Deploying ASP.NET Core App to Azure Kubernetes Services (AKS)–Setting Up Azure DevOps Pipeline Manually–Step By Step Guide–Part 1


In the post “Deploy ASP.NET Core App to AKS with Azure DevOps” a quick start guide was given on setting up an ASP.NET Core app with Azure DevOps Projects. The application code, Azure Container Registry, Azure Kubernetes Service (AKS) cluster, as well as build and deployment pipelines were auto generated with a few clicks in the Azure portal. To do the deployment to AKS, Helm packaging was used in the Azure DevOps Projects auto generated sample app. In this post, let’s look at a much simpler implementation to get the ASP.NET Core app built and deployed to AKS via Azure Pipelines.

As the first step, create an ASP.NET Core app in Visual Studio. When you hit F5 in Visual Studio, a browser should launch and load the app. To deploy the ASP.NET Core app to AKS it needs to be containerized. For this purpose you should have Docker for Windows installed on your machine, if you are developing your ASP.NET Core app using Windows. When setting up Docker for Windows make sure to set it up for Linux containers instead of Windows containers, as we are going to deploy the app to AKS as a Linux container.

Right click your ASP.NET Core app project in Solution Explorer and click Add –> Docker Support in the context menu. Select Linux to add Docker support to the ASP.NET Core app, as we are going to run it as a Linux container in AKS.

This will add a Dockerfile to your ASP.NET Core app. The Dockerfile specifies obtaining the .NET Core 2.1 SDK image and copying the source code of the app; it then builds and publishes the app. As the next step, it uses the .NET Core 2.1 ASP.NET Core runtime image, copies the published files, and prepares the container image for app execution. This basic Dockerfile is clearly explained in the post here.

When you run the web app in Visual Studio it builds the Docker image as per the instructions in the Dockerfile, deploys the container to Docker for Windows, and runs the container. Now you should push the code to an Azure DevOps Git repo in an Azure DevOps team project.

The next step is to get the Docker image built with an Azure build pipeline. Before doing that we need to set up the container registry to which we are going to push the built Docker image. We are going to use an Azure Container Registry for this purpose; creating an Azure Container Registry is explained in the post here.

Create a new build pipeline and set it to use the Git repo to which you have pushed the Docker-supported ASP.NET Core app. Select the Hosted Ubuntu agent pool as the build agent pool.

Add a Docker command task to the build pipeline and set the command to Build. For the container registry type select Azure Container Registry. Click on the Manage link near Azure subscription and add a service connection to the Azure resource group which you created for the Azure Container Registry. This will list the container registry name in the Azure Container Registry drop-down of the Docker task. Select the container registry that you created earlier.


Set the command to build to build the Docker image. Select the Dockerfile in the repo which was created with Visual Studio. For the image name, provide a name and use the build ID as the tag; the image name should be in the following format. Make sure to set the build context to the web app folder, as that is the path the Dockerfile treats as the context.

<ImageName>:<tag>

Example – aksdemo:$(Build.BuildId)
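The two tasks effectively work with a fully qualified image reference; a quick sketch of how the pieces combine (registry, image name, and build ID here are stand-in values):

```shell
# Hypothetical values; in the pipeline the tag comes from $(Build.BuildId).
ACR_LOGIN_SERVER=myregistry.azurecr.io
IMAGE_NAME=aksdemo
BUILD_ID=1234

echo "$IMAGE_NAME:$BUILD_ID"                     # name:tag given to the build task
echo "$ACR_LOGIN_SERVER/$IMAGE_NAME:$BUILD_ID"   # fully qualified reference pushed to ACR
```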

Add another Docker task to the pipeline and set it to use Push as the command. Select the Azure subscription service connection used in the previous step and the container registry.

Set the image name to the exact image name set up in the Docker build task. It is advisable to use a build variable to store the image name with the tag format, to avoid accidental mistyping of names and for improved maintainability of the build pipeline.

With these two tasks the build pipeline is ready to build the Docker image and push it to the Azure Container Registry. You can further improve the build pipeline to use variables for common values used in multiple tasks and enable continuous integration to build on each commit and push an image to the registry. Once you queue a build, it will create the Docker image and push it to the Azure Container Registry.

Notice the build ID of the build; in Azure Container Registry, under Repositories, you can find the Docker image created with the name you provided and tagged with the build ID. Even though this post uses the build ID as the image tag, you can define your own tag for the image, perhaps based on a repo tag, build version, or any other formula of your preference.

This concludes part one of getting an ASP.NET Core app built as a Docker image and deploying it to AKS via Azure DevOps. We have created the Docker image and now have it in the Azure Container Registry, using an Azure DevOps build pipeline created with two simple steps: build the Docker image and push it to ACR. In the next post, we will see how to set up deployment of the Docker image in Azure Container Registry (ACR) to an AKS cluster, using simple steps defined in an Azure release pipeline.



Deploying ASP.NET Core App to Azure Kubernetes Services (AKS)–Setting Up Azure DevOps Pipeline Manually–Step By Step Guide–Part 2


In part 1 of this post, enabling Docker support for an ASP.NET Core app and building and pushing the Docker image to Azure Container Registry using an Azure DevOps build pipeline was described. The image is tagged with the build ID and pushed to the Azure Container Registry so that it can later be deployed to a container orchestrator to run the container. Helm is used to get the deployment done to AKS via Azure DevOps when creating an ASP.NET Core app, container registry, and AKS, then getting it deployed automatically with a few clicks using Azure DevOps Projects, as described in the post “Deploy ASP.NET Core App to AKS with Azure DevOps Project”. Let’s look at getting the container image in Azure Container Registry deployed to AKS with three simple steps, without using Helm, with Azure Pipelines.

As the first step we need an Azure Kubernetes Service cluster created in Azure, as described in the post here. Then create a release pipeline in Azure DevOps. Select the build created in Part 1, which builds and pushes the Docker image, as an artifact to enable triggering the release when a new build is available.

Enable the continuous deployment trigger (this is not mandatory; you can opt to create the release manually on demand).

To deploy to AKS, as the first step you can create a namespace in the AKS cluster. Namespaces in Kubernetes are described here. For this purpose we create a YAML file; in the deployment pipeline we can then use it in the Kubernetes deployment task to get the namespace created in AKS. Following should be the content of the YAML file. It uses apiVersion v1 and the kind is Namespace. In the metadata section you specify the namespace name. Here we use __ as prefix and suffix around KubNamespace to create a token in the YAML file. This token can be replaced at deployment time, which allows the same YAML file to be used when deploying to multiple environments, such as Dev –> QA –> Staging –> Production in a deployment pipeline.

apiVersion: v1
kind: Namespace
metadata:
  name: __KubNamespace__

We need another YAML file to describe how to get the Docker image deployed in the AKS cluster and how to run the ASP.NET Core app in the container, deployed as a service in Kubernetes. The first part of this YAML file defines the deployment. You can see in the metadata of the deployment section that the app name is tokenized so that it can be replaced at deployment time; AppName is used as the container name as well. The container image location is tokenized so that at deployment time the Azure Container Registry login server, the container image name, and the tag (build version) can be replaced. The container port is hardcoded here to 80, as that is the port used in the Dockerfile to expose the ASP.NET Core app in Part 1; this port can also be tokenized if required. The SecretName token is used so that pulling the container image from the Azure Container Registry (ACR) into Kubernetes (AKS) can be authenticated and authorized; if this is not provided, Kubernetes will fail to pull the image from ACR with an access denied error. More details on setting this are described in the deployment step later in this post. The second section specifies how to run the container in AKS. The AppName token is used as the name of the service, and the type is set to use the LoadBalancer in AKS. The AppName token is provided as the app selector, and port 80 is hardcoded to run the service, but it too can be tokenized as mentioned earlier.

apiVersion: apps/v1beta1
kind: Deployment
metadata:
  name: __AppName__
spec:
  template:
    metadata:
      labels:
        app: __AppName__
    spec:
      containers:
      - name: __AppName__
        image: __ACR__/__ImageName__:__buildversion__
        ports:
        - containerPort: 80
      imagePullSecrets:
      - name: __SecretName__
---
apiVersion: v1
kind: Service
metadata:
  name: __AppName__
spec:
  type: LoadBalancer
  ports:
  - port: 80
  selector:
    app: __AppName__

Add both of the YAML files to the code repo you used in Part 1.

Then in the release pipeline add the repo as an artifact source so that the YAML files can be used in the release steps of the pipeline.



Add the variables as shown below to the release pipeline. The app name can be different from the container image name. The secret name can be any value of your preference; it is used to store the service principal based ACR access details in the Kubernetes cluster, enabling it to obtain images from ACR for deployment. The image name should be the same image name we used in Part 1. Since we linked the build from Part 1 as the primary artifact of the release pipeline, we can use the Build.BuildId predefined variable to get the tag of the Docker image. Provide a name for the Kubernetes namespace in the KubNamespace variable.

The ACR variable refers to the Azure Container Registry login server, as described in the deployment section of the YAML above. It can be obtained from the Overview tab of the Azure Container Registry in the Azure portal.

Create a release stage and provide a name, for example Dev. Add an agent phase and select the Hosted VS2017 pool to run the deployment.

Add the Replace Tokens step that comes with the marketplace extension here. Set the artifact path from the repo as the root directory and the target files as *.yml to replace tokens in the YAML files described earlier. The prefix and suffix should be set as __. This makes the YAML files get updated with the values defined in the release pipeline, as the first step of the release.
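The Replace Tokens step is essentially a search and replace over the YAML files; a minimal shell sketch of the same idea (the variable value here is hypothetical):

```shell
# A tokenized fragment like the namespace YAML in this post.
cat > namespace.yml <<'EOF'
apiVersion: v1
kind: Namespace
metadata:
  name: __KubNamespace__
EOF

# Substitute each __Token__ with the release variable value.
KubNamespace=aksdemo-dev
sed "s/__KubNamespace__/$KubNamespace/g" namespace.yml > namespace.resolved.yml
cat namespace.resolved.yml
```

The resolved file is what the Kubernetes tasks later in the pipeline consume, which is what lets one YAML file serve multiple environments.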

As the next step add a Deploy to Kubernetes task that comes with the extension available in the marketplace, then click the Manage link near Azure subscription to add a service connection as described in Part 1 and in this article. Connect the resource group in which you created the AKS cluster in the service connection. Then select it for the Azure subscription in the Deploy to Kubernetes step as shown below. Once the subscription is selected you can select the resource group and the AKS cluster in the step. Make sure to keep the namespace empty in this step, which is used to create the Kubernetes namespace if it does not exist in the AKS cluster. The namespace name is found via the YAML file as described below, and providing a non-existing namespace here would cause the deployment to fail, as the task tries to use the namespace, if provided, to execute the action.

Scroll down in the Deploy to Kubernetes step used for creating the namespace. In the Commands section, set apply as the command. Select Use configuration files and, for the configuration file, provide the artifact path of the YAML file we added to the repo for namespace deployment; the namespace token in the YAML file has already been replaced with the variable value by the previous Replace Tokens step. Note that we do not have to provide secret information here, as we are not yet deploying an image to AKS. We do not have to update any other section of this step as we only create the namespace in it.

Add another Deploy to Kubernetes step, which is used to deploy the image to AKS. Select the AKS cluster Azure subscription service connection, the resource group, and the AKS cluster. Make sure to set the namespace variable in the Namespace field this time, as we are deploying the app into the namespace created in the previous step.

In the Commands section set apply as the command and select the Use configuration files option. For the configuration file, pick the deployment YAML file we created earlier; the tokens in this YAML are replaced with the variable values we set in the pipeline, such as the build ID for the Docker image tag, the Docker image name, the app name, and so on. Expand the Secrets section and select the type of secret as Docker Registry. Then for the container registry type select Azure Container Registry. Select the service connection added in Part 1, which connects to the ACR we created in Part 1 to store the Docker image, and then select the Azure Container Registry holding the Docker image from the drop-down. For the secret name provide the variable we set earlier. Select the Force update secret option, which keeps the secret updated in the Kubernetes cluster, allowing it to access the selected ACR containing the Docker image to obtain it for deployment.

With this you can save the release pipeline and create a new release.

You can view the Kubernetes dashboard as described here; in it you can see the newly created namespace deployed with the app. In the Services section of the dashboard you can find the external endpoint of the app deployed to the cluster.

When you click on it you can view the ASP.NET Core app running in the AKS cluster.

With the above steps we have completed a simple, clear-to-manage implementation of deployment to AKS via Azure DevOps. However, Helm based deployment to AKS using Azure Pipelines is more enterprise ready, with much more flexibility as well as robustness for implementing deployments targeting AKS. We can explore more on that in coming posts.

Pull Request Report for Azure Git Repos

When you have several repos in your team project and you want a report of pending, or even completed, pull requests, you can use the widget available here in your dashboards. Additionally, a pull request count widget is also available here, and an individual repo based pull request dashboard widget is available in Azure DevOps. However, if you want a custom report, you can use the REST API and create your own.
The script made available here can be used to obtain an HTML report on pull requests. You can pass the status of the pull requests and obtain the pull request list with details such as a link to each pull request as well as approver details. To execute the script you can use the below syntax.
.\FindPullRequests.ps1 -token 'PAT' -collectionUri 'https://dev.azure.com/OrgName'
-teamProjectName 'teamprojectname' -prStatuses @('active','completed')
You can obtain the report for more than one status by passing several, or pass a single status such as active to obtain active pull request details.
The script obtains all pull requests for a given status and then evaluates each reviewer's vote to determine the review actions performed.
Age is calculated considering weekdays, and the requests are sorted in descending order of age for each status. For the weekday calculation, the function available here is used in the script. The report is generated as an HTML file in the script location.
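The weekday-based age idea can be sketched in shell (illustrative only; the actual report script does this in PowerShell, and GNU date is assumed here):

```shell
# Count the weekdays between two dates, exclusive of the end date.
weekdays_between() {
  local d="$1" end="$2" count=0
  while [ "$d" != "$end" ]; do
    dow=$(date -d "$d" +%u)            # 1 = Monday .. 7 = Sunday
    if [ "$dow" -le 5 ]; then count=$((count + 1)); fi
    d=$(date -d "$d + 1 day" +%Y-%m-%d)
  done
  echo "$count"
}

weekdays_between 2019-01-07 2019-01-14   # prints 5 (Mon 7th through Fri 11th)
```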

Copying a Build Definition to Another Team Project in a Different Azure DevOps Organization

You can easily clone a build definition within the same team project, which is useful when you have similar types of applications to build. However, if you want to clone a build definition to another team project, or to a different Azure DevOps organization, you cannot use clone, as it always creates the copy in the same team project. For task groups you can use export and import to get them copied to a different team project, even in a different Azure DevOps organization. Let’s look at how to export and import a build definition from one team project into a team project in a different Azure DevOps organization.

Prerequisites
  • If any variable groups are defined and used in the source build, make sure to create variable groups with the same names in the target team project.
  • If any task groups are used, export and import them to the new team project. There may also be extensions you need to install for the target team project if it is in a different Azure DevOps organization.
How to import and export build definition
With the above prerequisites fulfilled, you can use Export in the Builds tab to export a definition as a JSON file.
Then go to the new team project and click New and then Import. If you do not have any build definitions you will not see the view shown below; create a dummy build definition temporarily in that case.
After importing the build definition you may have to fix the following items.
  • Agent pool – you have to select the relevant pools.
  • Fix source location and authorize to get source code.
  • Fix any service connections used in tasks.
Then if you try to save, or save and queue, the build you will run into the below error.
The request specifies project ID <target project guid> but the supplied pipeline specifies project ID <source project guid>.


To fix this, open the exported JSON file in a text editor, search for the source project GUID, and replace it with the target project GUID. You do not have to fix any other part of this file, such as URLs referring to the source team project and Azure DevOps organization.
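That single replacement can also be scripted; a sketch with made-up GUIDs and file names:

```shell
SRC_GUID=11111111-1111-1111-1111-111111111111   # source project GUID (hypothetical)
DST_GUID=22222222-2222-2222-2222-222222222222   # target project GUID (hypothetical)

# A stand-in for the exported definition file.
printf '{ "project": { "id": "%s" } }\n' "$SRC_GUID" > exported-build.json

# Swap every occurrence of the source GUID for the target one.
sed "s/$SRC_GUID/$DST_GUID/g" exported-build.json > import-ready.json
cat import-ready.json
```

The resulting file can then be imported without hitting the project ID mismatch error.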
However, if you want to find the project GUIDs before running into the error, you can do so with a REST API call. Replace your Azure DevOps organization name and team project name in the below URL and run it in a browser to get the project ID. This way you can avoid the project ID error when saving a build definition imported from a different team project.
https://dev.azure.com/yourorgname/_apis/projects/teamprojectname?api-version=5.0

Getting Started with Azure DevOps Command Line

Working with Azure related resources from the command line can be done easily using the Azure CLI. Now there is an extension to the Azure CLI allowing you to work with Azure DevOps. Let’s have a quick look at how to set it up and use it for a few operations.


Prerequisite
You must have Azure CLI version 2.0.49 or later installed on your machine. If it is not installed, follow the instructions here to get it installed. You can verify the version of Azure CLI installed by executing the below command.
az --version

Then execute below command to add Azure DevOps extension to CLI.
az extension add -n azure-devops
Once the extension is added, you can execute az --version again and the installed extensions will also be listed.


Using Azure DevOps Command Line
Login using below command.
az login
Then you can execute Azure DevOps commands providing the organization name and the team project name. For example to list all repos in a team project you can execute below command.
az repos list --org 'https://dev.azure.com/yourorg' -p 'yourproject'

You can set a default organization and project so that execution of commands to be more seamless.
az devops configure -d organization=https://dev.azure.com/yourorg project=yourproject
Then you can execute repos list command without org or project arguments.
az repos list

To get help on the commands you can add -h. For example see below commands help.

  • az devops -h
  • az devops configure -h
  • az repos list -h
For further information you can refer to the documentation here.

Setting a Release Variable to Make It Available to Other Stages in Azure DevOps Release Pipeline

You can easily set a variable in a build or release pipeline by executing “##vso[task.setvariable variable=variablename;]variablevalue”. This sets the variable for the scope of a given stage in a release pipeline. However, it does not allow you to set a variable in the scope of the release. That means if you want to set a variable value in one stage and then use it in subsequent stages, it is not possible with ##vso[task.setvariable. Let’s look at achieving this using the Azure DevOps REST API.
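For reference, the stage-scoped logging command is just a formatted line written to standard output by a script step; for example (variable name and value here are hypothetical):

```shell
# The agent parses this line and sets the variable for subsequent tasks
# in the same stage only.
echo "##vso[task.setvariable variable=BuildConfig]Release"
```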
The script available here can be used for this purpose. A slightly modified version of the script, supporting release pipelines and allowing any variable name and value to be passed for updating a given variable, can be found here.
Steps of the Script
1. Take in two parameters – VariableName and VariableValue.
2. Create a header to access the REST API with System.AccessToken. You have to enable Allow scripts to access the OAuth token in each deployment or agent phase in which you intend to use this script.
3. Get the current release using the ReleaseID.
4. Update the variable value.
5. Update the release using the REST API.

You can use this script and execute it with a PowerShell task in your pipeline, or you can set up a task group so that you can reuse it as a task in your release pipelines.
In the above example the Run Inline PowerShell task, available with the Inline PowerShell extension, is used. Notice the variable name and variable value are surrounded with ' when passing parameters, to avoid argument errors you may otherwise encounter, especially for the variable value.
-VariableName '$(VariableName)' -VariableValue '$(VariableValue)'
The task group can be used as shown below to set a variable value so that it is available to other stages in the release pipeline.

Azure DevOps Service Connection for an Azure Subscription in Another Azure AD

Adding an Azure subscription to Azure DevOps as a service connection is really simple when the account you use for Azure DevOps is associated with your Azure subscription. However, this may not always be the case, and you may want to deploy to resources in an Azure subscription which is not related to your Azure DevOps organization. Let’s see how to create a service connection in such a situation to utilize it in a deployment pipeline.

1. The solution for allowing access is to create a service principal and allow it to contribute to the required resources in the foreign Azure subscription. Using the foreign Azure subscription (which could be your client's subscription) credentials, log on to the Azure portal and open the Cloud Shell, or use az login in PowerShell on a machine where the Azure CLI is installed.
2. List the Azure subscriptions in the account using the below command and take note of the required subscription ID and the name of the subscription.
az account list
3. To test the connectivity, let’s create a resource group in Azure.
az group create -n 'azdo-cnect-blog-rg' -l centralus --subscription "subscriptionid"
4. To create the service principal in Azure scoped to the resource group you can use the below command.
az ad sp create-for-rbac -n "MyApp" --role contributor --scopes /subscriptions/{SubID}/resourceGroups/{ResourceGroup1} /subscriptions/{SubID}/resourceGroups/{ResourceGroup2}

az ad sp create-for-rbac -n "AzDO-chamindac-free-testP" --role contributor --scopes /subscriptions/subscriptionid/resourceGroups/azdo-cnect-blog-rg

Save the output from the service principal creation, as this information is required for making the service connection from Azure DevOps.
5. If you inspect the resource group role assignments you can see the Contributor permission assigned to the service principal.
6. Let’s create the service connection by selecting Azure Resource Manager service connection.
7. Click on the use the full version of the service connection dialog link.
8. Provide a connection name of your preference. The subscription ID and name of the foreign (client) Azure subscription should be added to the relevant fields. The Service principal client ID is the appId in the output shown when you created the service principal with the Azure CLI; the password shown is the value to enter as the Service principal key, and the tenant ID shown is the Tenant ID. Verify the service connection and it should connect. As we created the service principal allowing contribution to a single resource group, deployments are possible only to that resource group. You can create a service principal with a different scope, such as multiple resource groups or subscription level scope.
9. Let’s test the connection by trying to create an Azure web app via release management in Azure DevOps using the service connection. Create a release definition and add an Azure Resource Group Deployment task. Select the service connection as the Azure subscription, and you will be able to select the resource group which has Contributor permissions for the service principal. Select the action Create or update resource group. The location parameter is ignored as the resource group already exists.
10. In the template section of the task provide the Azure Resource Manager (ARM) quick start sample link for a web app, available at https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/101-webapp-basic-windows/azuredeploy.json. You can find information on this ARM template here. As specified in the documentation we have to provide webAppName, sku (stock keeping unit, the pricing tier), and location for the web app.
11. Set up variables and values for the three parameters mentioned above in the release definition.
12. Create a release and you will see it completes successfully.
13. The new web app is created in the resource group.
14. If you test the web app you will see it is ready, and you can add further steps in release management to deploy your application to the Azure web app.
This post has taken you through the steps required to create a service principal in Azure and a service connection in Azure DevOps to do deployments to Azure resources in a client/foreign Azure subscription which is not connected to the user of the Azure DevOps organization.

Join a Personal Azure DevOps Organization to a Company Domain

You might have started using Azure DevOps with Microsoft accounts. Later, you may have established a company and want the Microsoft account based Azure DevOps organization to work with your company Active Directory users. Let’s look at the steps required to join a personal, Microsoft account based Azure DevOps organization to a company domain.


Connecting to Company Azure AD
Step 1: Have your company on-premises Active Directory (AD) domain integrated with Azure Active Directory. How to sync on-premises AD with Azure AD is described in the links below.
https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/identity/azure-ad
https://www.ecanarys.com/Blogs/ArticleID/234/How-to-Sync-On-premise-AD-with-Windows-Azure-AD-using-Azure-AD-Connect-tool
Step 2: Log on to your Azure DevOps organization with the organization owner’s Microsoft account. Go to Organization settings –> Users tab and add the company domain user.

Step 3: Grant the new user Project Collection Administrator access in the Security tab of the organization settings.
Step 4: In a private browser window, log on to the Azure DevOps organization with the company domain user you added. The company user must log on to Azure DevOps at least once before being made the owner of the organization in the next step.
Step 5: While logged on with the Microsoft account that owns the Azure DevOps organization, change the ownership of the organization to the company domain user on the Overview page of the organization settings.
Step 6: Log on to the Azure DevOps organization with the company domain user. Go to organization settings –> Azure Active Directory tab. Click Connect directory, select your company Azure AD, and click Connect.
Your Azure DevOps organization is now connected to your company’s Azure AD. Sign out, close all browsers, and sign in again to start using the Azure DevOps organization.
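If you prefer scripting step 2 instead of using the Users tab, the Azure DevOps CLI extension provides an `az devops user add` command. The organization URL and email below are hypothetical placeholders, and the sketch only composes and prints the command, since running it needs an authenticated `az devops` session:

```shell
#!/bin/sh
# Hypothetical organization and company-domain user -- replace with your own.
ORG_URL="https://dev.azure.com/your-org"
USER_EMAIL="user@yourcompany.com"

# 'express' corresponds to a Basic license; 'stakeholder' is the free tier.
CMD="az devops user add --email-id ${USER_EMAIL} --license-type express --org ${ORG_URL}"
echo "$CMD"
```

After adding the user this way, the access-level and security settings in steps 3 to 5 still need to be applied.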


Disconnecting from Company Azure AD
If you wish to disconnect the Azure DevOps organization from the company’s Azure AD, click Disconnect directory in the Azure Active Directory tab of the Organization settings page. You must be logged on as the organization owner to perform the disconnection.

In the popup dialog provide the organization name and disconnect.
Once disconnected, you have to sign out and sign in again to use the Azure DevOps organization. Afterwards you can even transfer the ownership from the company account back to a personal Microsoft account if required.