Chaminda's DevOps Journey with MSFT

Trigger Build Based on Label/Tag – VSTS/TFS


You might want to trigger a build for a previously applied Team Foundation Version Control (TFVC) label, or for a tag if you are using Git version control. Git tags are nicely supported in the TFS/VSTS web interface, but TFVC labels are still only manageable with the Source Control Explorer in Visual Studio. There is a user voice request to enable a web experience for TFVC labels, which you can also vote for. Let's explore how to trigger a build based on a tag or label with VSTS/TFS.


Git

If you are using Git version control you can use the tag as shown below.

Tag in commit

Selecting the tag when queuing a build

Getting the source by tag for the build

Build done for the tag
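If the tag does not exist yet, it can be created and pushed from any Git client first; the tag name below is just an example.

git tag v1.0                # create a tag pointing at the current commit
git push origin v1.0        # push the tag so it is visible to VSTS/TFS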

TFVC

TFVC labels are not supported in the Code tab of the web interface, but they are supported in Visual Studio.

When queuing a build for code in TFVC, the label should be prefixed with 'L'.

For the label "TFVCSampleLabel", a build is queued as shown below.

Build done for the label
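If you prefer to script the queuing instead of using the web interface, a minimal PowerShell sketch against the build REST API is shown below; the collection URL, team project, definition id and PAT are placeholders, and the label is passed as the source version with the 'L' prefix (for a Git tag, the source branch would be refs/tags/yourtag instead).

# Sketch: queue a build for a TFVC label via the REST API (placeholders throughout).
$pat = 'xxxxxxxxxxxxxxxxxxxx'
$header = @{Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))}
$uri = 'https://yourtfs.com/tfs/yourcollection/YourTeamProject/_apis/build/builds?api-version=2.0'
$body = @{ definition = @{ id = 12 }; sourceVersion = 'LTFVCSampleLabel' } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $uri -Headers $header -ContentType 'application/json' -Body $body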


Deploying Two Deployment Group Targets on Same Machine


Deployment groups are the recommended approach for setting up release management based automated deployments with VSTS. Can two team projects in the same VSTS account, each with its own deployment group, target the same machine? This was a question in a forum, and the steps below describe how to achieve it.


If the registration script provided by VSTS is executed as is, the first deployment group target gets overridden when the second one is registered. So with the default PowerShell script execution, the machine ends up attached to only one deployment group.

You can use the steps below to register multiple deployment group targets on an Azure VM or any single machine.

1. Copy the target registration PowerShell script to Notepad. Then replace

$env:SystemDrive\'vstsagent'

with a local path of your preference for the agent, for example

'g:\vstsagentprojX'

This will install the agent for the project X deployment group in g:\vstsagentprojX.


2. Replace

--agent $env:COMPUTERNAME

in the script with a project-specific name, for example

--agent chamindac.projx


3. Then register the agent for the project X deployment group; it will be configured in the specified folder,


and with a unique service name, since you have provided a project-specific name for the agent.


4. Follow similar steps for the other project: change its PowerShell script to replace

$env:SystemDrive\'vstsagent'

with a project-specific local folder, say

'g:\vstsagentprojQ'

Then change --agent $env:COMPUTERNAME to a project-specific name such as

--agent chamindac.projq

Then run the script to create the deployment group agent for the second project.


This will create two agent services with unique names on the same machine.

You should now have two targets online, one in each team project's deployment group.

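To recap, the sketch below shows only the per-project changes against the VSTS-generated registration script; the folder paths, agent names, deployment group and project names are the example values used above, and the authentication arguments generated by VSTS should be kept as they are.

# Sketch only - the real script is generated by VSTS; only the folder and agent name change per project.
$agentFolder = 'g:\vstsagentprojX'        # instead of $env:SystemDrive\'vstsagent'
mkdir $agentFolder; cd $agentFolder
# ... agent package download/extract as in the generated script ...
.\config.cmd --deploymentgroup --agent 'chamindac.projx' --runasservice `
  --projectname 'ProjectX' --deploymentgroupname 'YourDeploymentGroup' `
  --url 'https://youraccount.visualstudio.com' # plus the auth/PAT arguments from the generated script
# Repeat for the second project with 'g:\vstsagentprojQ' and 'chamindac.projq'.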

Resolving “The SDK 'Microsoft.NET.Sdk.Web' specified could not be found”

While using .NET Core 2.0 you may run into errors whose messages are not clear. "The SDK 'Microsoft.NET.Sdk.Web' specified could not be found." is one such error, which appears when trying to load .NET Core 2.0 web projects. Let's understand the reason behind this error in order to resolve it.


Error:
When the solution was loaded in Visual Studio 2017, it did not show any message, but it did not load the .NET Core 2.0 web projects. When trying to reload the project by right-clicking on it, the error message below was shown.

---------------------------
Microsoft Visual Studio
---------------------------
The SDK 'Microsoft.NET.Sdk.Web' specified could not be found.  D:\Users\chamindac\Documents\Source\Repos\bla\bla\src\APIs\xxx.csproj
---------------------------
OK  
---------------------------

The reason for this error is a mismatch with the .NET Core SDK installed on the machine. The machine had the latest .NET Core SDK installed, which was version 2.1.4.

But the global.json file stated that the required SDK version is 2.0.0.
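For reference, a global.json that pins the SDK to the version mentioned above looks like this:

{
  "sdk": {
    "version": "2.0.0"
  }
}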

How to resolve?
Changing global.json to version 2.1.4 fixed the issue, and the projects could be loaded and built. But this project was developed with a team, and the decision to change the SDK could not be made alone; the team was not ready to upgrade the SDK and take that risk.

The next approach was to find the exact SDK version, which was (and is) available at https://github.com/dotnet/core/blob/master/release-notes/download-archive.md.
After installing SDK 2.0.0, the issue was resolved in a way compatible with the rest of the team.
A similar issue was discussed in a previous post here.

What happens if the solution is opened in Visual Studio 2015?
Visual Studio 2015 is not compatible with .NET Core 2.0, but it gives a really misleading error, as shown below, if you try to reload a .NET Core 2.0 web project in VS 2015.

---------------------------
Microsoft Visual Studio
---------------------------
The default XML namespace of the project must be the MSBuild XML namespace. If the project is authored in the MSBuild 2003 format, please add xmlns="
http://schemas.microsoft.com/developer/msbuild/2003" to the <Project> element. If the project has been authored in the old 1.0 or 1.2 format, please convert it to MSBuild 2003 format.  :\Users\chamindac\Documents\Source\Repos\bla\bla\src\APIs\xxx.csproj
---------------------------
OK  
---------------------------

Check the below links to understand the .NET Core 2.0 requirements.





Cloning TFS 2018

How to clone TFS versions before TFS 2018 is explained in the posts Backing Up and Restoring Databases, Prepare Restored Databases, and Configuring the AT. Out of these steps, Backing Up and Restoring Databases is still valid for TFS 2018. But you do not have to perform Prepare Restored Databases, as it can be done while configuring the Application Tier of the cloned TFS instance. Let's look at the valid steps for cloning TFS 2018.

You have to perform Backing Up and Restoring Databases, but you cannot run TFSConfig PrepareClone on the restored databases, as it is not available in TFS 2018.

TFSConfig PrepareClone /SQLInstance:sqlsvrname /DatabaseName:Tfs_Configuration /notificationURL:http://tfs:8080/tfs

You may still run the TFSConfig ChangeServerID and TFSConfig RemapDBs commands on the restored databases, but this step is not required, as the AT configuration wizard will perform that for you.

TFSConfig ChangeServerID /SQLInstance:sqlsvrname /DatabaseName:Tfs_Configuration

TFSConfig RemapDBs /DatabaseName:sqlsvrname;Tfs_Configuration /SQLInstances:sqlsvrname /AnalysisInstance:sqlsvrname /AnalysisDatabaseName:Tfs_Analysis


Required Steps

Backing Up and Restoring Databases

Install TFS 2018 on the clone Application Tier machine, then run the configuration wizard.

  • Select the option to Configure Team Foundation Server.
  • Select the existing databases option.
  • Specify the SQL Server instance where the databases were restored and list the available databases.
  • Select the option to Clone a Deployment.
  • You will see that read-only options are selected for remapping DBs, changing the server ID and removing backup configuration, to prevent any harm to the existing TFS 2018 instance you are cloning.
  • Provide the account information for the service account.
  • Fill in the rest of the wizard as per your requirements; you can set up reporting services etc.

Your cloned instance will be configured successfully.

Finding Active Team Projects Based on Code CheckIns/Commits


When you have many team projects in your TFS instance or Team Services (VSTS) account, manually finding out which projects are active is not an easy task, as it means looking at each code repository in each project to determine whether it is being updated with code changes. The TFS/VSTS REST API can be used in this scenario to obtain the projects that have any code commits/check-ins from a given date, using the script available here.

The script goes through each team project available in a collection and checks the TFVC (Team Foundation Version Control) and Git repositories to see if any code changes were made from a given date. If such a commit is found, the project is added to an HTML file as a list item, as shown below. The project name can be clicked to go to the relevant project in TFS/VSTS, and the changeset link navigates to the changesets of the project.


To execute the script you should use the parameters below.

  • PAT – Personal Access Token (the token should be generated by a collection admin level user, as the requirement is to read all projects' data)
  • From Date – the date from which active projects should be filtered
  • Collection Url – the team project collection URL

.\GetActiveProjects.ps1 -token "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" -fromDate "12/31/2015" -collectionUri "https://yourtfs.com/tfs/yourcollection"

param(
[Parameter(Mandatory=$true)]
[string]$token,
[Parameter(Mandatory=$true)]
[string]$fromDate,
[Parameter(Mandatory=$true)]
[string]$collectionUri
)




$User=""

# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}"-f$User,$token)));
$header = @{Authorization=("Basic {0}"-f$base64AuthInfo)};


$reportName = 'ActiveTeamProjects.html'

$report = '<!DOCTYPE html><html><head>
<style>
li {font-family: Arial; font-size: 10pt;}
</style>
</head><body>'

$report = $report + '<h2><u><center>' + 'Active Team Project List' + '</center></u></h2>'
$report = $report + '<h4><u><center>' + 'for ' + $collectionUri + ' from ' + $fromDate + '</center></u></h4><ul>'
$report | Out-File -Force $reportName
$report = '';



$top=100;
$skip=0;

while($true)
{


$Uri = $collectionUri + '/_apis/projects?$top='+ $top + '&$skip='+ $skip + '&api-version=1.0'

$projects = Invoke-RestMethod -Method Get -ContentType application/json -Uri $Uri -Headers $header

$skip+=$top;

if($projects.count -le 0)
{
break;
}

foreach($project in $projects.value) {
$project

$sourceControlPath = ("$/" + $project.name)

# TFVC
# Get Changesets
$Uri = $collectionUri + '/_apis/tfvc/changesets?$top=1&orderby=id%20desc&searchCriteria.itemPath=' + $sourceControlPath + '&searchCriteria.fromDate=' + $fromDate + '&api-version=1.0'

$changesHistory = $null;

try{

$changesHistory = Invoke-RestMethod -Method Get -ContentType application/json -Uri $Uri -Headers $header

if ($changesHistory.count -ge 1)
{

$changesHistoryitem = $changesHistory.value.Item(0);


$report = $report + '<li> <a target="_blank" href="' + $collectionUri + '/'+ $project.Name + '">' + $project.Name + '</a> --> <a target="_blank" href="' + $collectionUri + '/'+ $project.Name + '/_versionControl/changesets">' + $changesHistoryitem.changesetId + '</a> '+ $changesHistoryitem.createdDate + '' + $changesHistoryitem.checkedInBy.displayName +' -- ' + $changesHistoryitem.comment + '--</li>'
$report | Out-File -Append -Force $reportName
$report = '';
}
}
catch{
Write-Warning $_.ErrorDetails.Message
}

#Git

$Uri = $collectionUri + '/' + $project.name +'/_apis/git/repositories?api-version=1.0'
$gitRepos = Invoke-RestMethod -Method Get -ContentType application/json -Uri $Uri -Headers $header

if ($gitRepos.count -ge 1)
{
foreach($gitRepo in $gitRepos.value)
{
#find commits
$Uri = $gitRepo.url + '/commits?$top=1&fromDate=' + $fromDate + '&api-version=1.0'
$commits = $null;

$commits = Invoke-RestMethod -Method Get -ContentType application/json -Uri $Uri -Headers $header

if($commits.count -ge 1)
{
$commitItem = $commits.value.Item(0);


$report = $report + '<li> <a target="_blank" href="' + $collectionUri + '/'+ $project.Name + '">' + $project.Name + '</a> --> <a target="_blank" href="' + $commitItem.remoteUrl + '">' + $commitItem.commitId + '</a> '+ $commitItem.committer.date + '' + $commitItem.committer.name +' -- ' + $commitItem.comment + '--</li>'
$report | Out-File -Append -Force $reportName
$report = '';
}

}
}

}

}

$report = $report + '</ul></body></html>'
$report | Out-File -Append -Force $reportName
$report = '';

Visual Source Safe (VSS) to TFS Migration


In modern software development, a source control mechanism alone does not add much value to the software delivery process. However, some organizations are still using Visual Source Safe (VSS) to manage their source code; it is an unsupported tool as of now, and they should consider moving their code bases to a different tool. As a fully DevOps-capable ALM tool, TFS is a good choice to move the code to from VSS. Let's look at the steps required to perform such a migration of VSS code, including history.


Ask all your teams to check in their source code to VSS.

The next step is taking a backup of VSS. Before doing that, ensure you are using Visual Source Safe 6.0 or later. To take the backup, go to the VSS machine and find the path where VSS is set up (usually c:\Program Files (x86)\Microsoft Visual Studio\VSS), then copy the files and folders named below.

  • data folder
  • temp folder
  • users folder
  • srcsafe ini file
  • users text file

Make sure to copy them to a single folder as shown below.

You should set up a machine with one of the below operating systems as per the documentation.

  • Windows 8
  • Windows Server 2012
  • Windows 7
  • Windows Server 2008 R2

Using Windows Server 2016 gave no trouble, so you can use that as well if needed. On this temporary upgrade computer, install an instance of SQL Server; SQL Server 2012 and 2014 were tested and work fine for this requirement. Make sure the intermediate machine has free disk space of at least two times your VSS backup size plus 5 GB.

Install Visual Source Safe 2005 on your intermediate machine.

Download the Visual Source Safe Upgrade Tool for Team Foundation Server and install it on the temporary machine you set up to do the upgrade work. Now copy the VSS backup you created previously to the intermediate machine. It is strongly advised not to use your VSS server or the target TFS server as the intermediate machine.

Open the upgrade tool and provide the backup folder path and the VSS admin user password. Then click List Available Projects, which will list all the projects in VSS, as shown below.

You might run into the below error when you click List Available Projects if your intermediate machine does not have Visual Source Safe 2005 installed.

---------------------------

VSS Upgrade Wizard

---------------------------

Unable to cast COM object of type 'Microsoft.VisualStudio.SourceSafe.Interop.VSSDatabaseClass' to interface type 'Microsoft.VisualStudio.SourceSafe.Interop.IVSSDatabase'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{2A0DE0EE-2E9F-11D0-9236-00AA00A1EB95}' failed due to the following error: No such interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE)).

---------------------------

OK 

---------------------------

In your target TFS (VSS migration was tested with TFS 2017.2 and works fine), create a new team project, or multiple projects if you want to move each VSS project to a different TFS team project. Select the required VSS project to migrate and click Next. In the next step add the TFS server and select the required team project as the target.


In the next step, select the option to migrate the full history if you want the TFS team project to have all the source control check-in history from VSS; otherwise you can migrate just the latest version of the code by selecting Tip. Since there is a difference between how code changes are identified in VSS and TFS, a TFS changeset might contain multiple changes from VSS. Generally, files updated by the same user within a few minutes with the same comment in VSS will be grouped into one changeset in TFS when the migration happens. Further information on this is available here.

Review all settings and run the readiness checks next.


Run the upgrade to perform the migration.

The details of any issues can be found by clicking Next and then clicking the link to open the log.

The issue in this particular instance was that some files were checked out in VSS, and that checkout status is not preserved in TFS. To avoid such warnings, make sure to check in all code to VSS before taking the backup for migration.

The upgrade report link shows the details of issues etc. in a web browser in a clearer format.

You can now find your source code in TFS with history. The comments are prefixed with the date and time of the original check-in in VSS. Where multiple changes are grouped into one changeset, the begin and end times of the change are added as the comment prefix, as highlighted below. This mechanism is used because the TFS changeset's created date is the day on which the migration happened.


Securing Build Definitions When Multiple Teams Work on a Single Team Project


Securing a build definition is quite straightforward when an organization uses multiple team projects in VSTS/TFS to handle the different applications they develop; build administration for each team project can easily be assigned to different individuals. However, some organizations use a single team project to manage all of their applications, dividing them into teams inside that team project. Let's look at the possibilities for securing each application team's builds in the single team project scenario.

Builds can be organized into folders in VSTS/TFS. This gives you the opportunity to create each application team's build definitions in the relevant folder for that team. Only for that folder should the team members be given access to edit build quality, view or queue a build. To do this, follow the steps below.

Create a folder for each of the application teams in your team project using the Manage Folders window in the build definitions page of VSTS/TFS.

Open the relevant build folder security page.


Add the team to the permissions, and you can see the team inherits permissions from the Contributors group of the team project (if you have created teams without inheriting permissions from the Contributors group, the inherited permissions will not be available).


Set the permissions below to explicitly Allow for the team, enabling them to view builds, edit build quality and queue builds.

Once you have completed this procedure for all the teams in the team project, go to the security options for all builds from the build definitions page. Then, for the Contributors VSTS group, set all permission options to "Not Set". This removes the permissions inherited via the Contributors group, so members of other teams can no longer view, queue or set build quality on a team's build folders.

Following the steps above, we managed to isolate the view, set build quality and queue builds permissions for the individual teams for their respective builds. But how can we apply build administration permissions isolated to each team? Let's have a look at that next.

No one should be added to the Build Administrators VSTS security group, except people who require administrative control over all build definitions.

Create a Build Admin VSTS group for each of the project teams and make each one a member of the relevant project team. For each team's build folder, grant permissions to the relevant team build admin group as "Allow" for all permissions, except "Override check-in validation by build" and "Update build information", which should only be allowed for service accounts (for more information refer here). The queue, set quality and view permissions are inherited by a team's admins from the team, since we are making the build admin VSTS group a member of the relevant team.

You can add the required build admin groups to the administrator role of the required build queues. You may keep the Contributors group in the user role for build queues shared by all teams. If you have a build queue specific to a single team, you can remove the Contributors group from that queue and just add the relevant team in the user role.

Another essential part of managing build definitions is the ability to create service endpoints to access external services via builds. For example, if you want to integrate with SonarQube to validate your source code quality with the builds, you need to create a service endpoint to connect to the SonarQube server or SonarCloud service. For this, endpoint creation and endpoint administration permissions need to be granted to the build administration VSTS groups you have created. One caveat is that this allows one team's build administrators to change other teams' endpoints; with the current permissions structure of VSTS/TFS there is no way to isolate endpoint administration and creation permissions to each team inside a team project.

VSTS Build, node-sass and HTTP 404


When you are building Node.js projects with VSTS you may encounter some strange errors which are very hard to diagnose due to misleading error messages. The node-sass HTTP 404 error is one such error, where the misleading message makes it really hard to fix the issue.

2018-05-28T02:44:19.0012193Z npm info lifecycle node-sass@3.13.1~install: node-sass@3.13.1
2018-05-28T02:44:19.0012339Z Cannot download "
https://github.com/sass/node-sass/releases/download/v3.13.1/win32-x64-57_binding.node":
2018-05-28T02:44:19.0012428Z
2018-05-28T02:44:19.0012528Z HTTP error 404 Not Found


2018-05-28T02:44:19.0012596Z
2018-05-28T02:44:19.0012713Z Hint: If github.com is not accessible in your location
2018-05-28T02:44:19.0012897Z       try setting a proxy via HTTP_PROXY, e.g.
2018-05-28T02:44:19.0012971Z
2018-05-28T02:44:19.0013081Z       export HTTP_PROXY=
http://example.com:1234
2018-05-28T02:44:19.0013160Z
2018-05-28T02:44:19.0013266Z or configure npm proxy via
2018-05-28T02:44:19.0013335Z
2018-05-28T02:44:19.0013442Z       npm config set proxy
http://example.com:8080
2018-05-28T02:44:19.0013573Z npm verb lifecycle node-sass@3.13.1~install: unsafe-perm in lifecycle true
2018-05-28T02:44:19.0014185Z npm verb lifecycle node-sass@3.13.1~install: PATH: C:\Program Files\nodejs\node_modules\npm\bin\node-gyp-bin;F:\agent\_work\32\s\node_modules\gulp-sass\node_modules\node-sass\node_modules\.bin;F:\agent\_work\32\s\node_modules\gulp-sass\node_modules\.bin;F:\agent\_work\32\s\node_modules\.bin;F:\agent\externals\git\cmd;C:\ProgramData\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\ProgramData\chocolatey\bin;C:\Users\svccor_phoenix\AppData\Roaming\npm;c:\tools\nuget;C:\Program Files\dotnet\;F:\agent\Microsoft.Data.Tools.Msbuild\lib\net46;C:\Program Files\Microsoft SQL Server\120\DTS\Binn\;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\;C:\Program Files\Microsoft SQL Server\120\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\ManagementStudio\;C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\;C:\Users\svccor_phoenix\AppData\Roaming\npm;c:\tools\ssdtbuild;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files (x86)\dotnet\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;
2018-05-28T02:44:19.0014766Z npm verb lifecycle node-sass@3.13.1~install: CWD: F:\agent\_work\32\s\node_modules\gulp-sass\node_modules\node-sass

2018-05-28T02:44:19.6231401Z 38604 info lifecycle node-sass@3.13.1~postinstall: Failed to exec postinstall script
2018-05-28T02:44:19.6231571Z 38605 verbose unlock done using C:\Users\svccor_phoenix\AppData\Roaming\npm-cache\_locks\staging-ceadb4797f66e355.lock for F:\agent\_work\32\s\node_modules\.staging
2018-05-28T02:44:19.6231735Z 38606 warn s No repository field.
2018-05-28T02:44:19.6231846Z 38607 warn s No license field.
2018-05-28T02:44:19.6231973Z 38608 verbose stack Error: node-sass@3.13.1 postinstall: `node scripts/build.js`
2018-05-28T02:44:19.6232111Z 38608 verbose stack Exit status 1
2018-05-28T02:44:19.6232266Z 38608 verbose stack     at EventEmitter.<anonymous> (C:\Program Files\nodejs\node_modules\npm\node_modules\npm-lifecycle\index.js:280:16)
2018-05-28T02:44:19.6232742Z 38608 verbose stack     at emitTwo (events.js:126:13)
2018-05-28T02:44:19.6232869Z 38608 verbose stack     at EventEmitter.emit (events.js:214:7)
2018-05-28T02:44:19.6233040Z 38608 verbose stack     at ChildProcess.<anonymous> (C:\Program Files\nodejs\node_modules\npm\node_modules\npm-lifecycle\lib\spawn.js:55:14)
2018-05-28T02:44:19.6233187Z 38608 verbose stack     at emitTwo (events.js:126:13)
2018-05-28T02:44:19.6233312Z 38608 verbose stack     at ChildProcess.emit (events.js:214:7)
2018-05-28T02:44:19.6233456Z 38608 verbose stack     at maybeClose (internal/child_process.js:925:16)
2018-05-28T02:44:19.6233598Z 38608 verbose stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:209:5)
2018-05-28T02:44:19.6233728Z 38609 verbose pkgid node-sass@3.13.1
2018-05-28T02:44:19.6233855Z 38610 verbose cwd F:\agent\_work\32\s
2018-05-28T02:44:19.6233975Z 38611 verbose Windows_NT 6.3.9600
2018-05-28T02:44:19.6234132Z 38612 verbose argv "C:\\Program Files\\nodejs\\node.exe""C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js""install"
2018-05-28T02:44:19.6234286Z 38613 verbose node v8.9.3
2018-05-28T02:44:19.6234393Z 38614 verbose npm  v5.5.1
2018-05-28T02:44:19.6234499Z 38615 error code ELIFECYCLE
2018-05-28T02:44:19.6234606Z 38616 error errno 1
2018-05-28T02:44:19.6234736Z 38617 error node-sass@3.13.1 postinstall: `node scripts/build.js`
2018-05-28T02:44:19.6234854Z 38617 error Exit status 1
2018-05-28T02:44:19.6234972Z 38618 error Failed at the node-sass@3.13.1 postinstall script.
2018-05-28T02:44:19.6235125Z 38618 error This is probably not a problem with npm. There is likely additional logging output above.
2018-05-28T02:44:19.6235254Z 38619 verbose exit [ 1, true ]

Investigating the issue for several hours, running the build and trying out the options in the links below did not help much.


Setting the npm install task to provide verbose information helped identify the possible cause of the error, which is linked to a missing Python installation.

2018-05-28T02:44:19.0239281Z gyp verb check python checking for Python executable "python2" in the PATH
2018-05-28T02:44:19.0239410Z gyp verb `which` failed Error: not found: python2
2018-05-28T02:44:19.0239557Z gyp verb `which` failed     at getNotFoundError (F:\agent\_work\32\s\node_modules\which\which.js:13:12)
2018-05-28T02:44:19.0239714Z gyp verb `which` failed     at F (F:\agent\_work\32\s\node_modules\which\which.js:68:19)
2018-05-28T02:44:19.0239859Z gyp verb `which` failed     at E (F:\agent\_work\32\s\node_modules\which\which.js:80:29)
2018-05-28T02:44:19.0240001Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\which\which.js:89:16
2018-05-28T02:44:19.0240149Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\index.js:42:5
2018-05-28T02:44:19.0240290Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\windows.js:36:5
2018-05-28T02:44:19.0240426Z gyp verb `which` failed     at FSReqWrap.oncomplete (fs.js:152:21)
2018-05-28T02:44:19.0240561Z gyp verb `which` failed  python2 { Error: not found: python2
2018-05-28T02:44:19.0240770Z gyp verb `which` failed     at getNotFoundError (F:\agent\_work\32\s\node_modules\which\which.js:13:12)
2018-05-28T02:44:19.0240927Z gyp verb `which` failed     at F (F:\agent\_work\32\s\node_modules\which\which.js:68:19)
2018-05-28T02:44:19.0241078Z gyp verb `which` failed     at E (F:\agent\_work\32\s\node_modules\which\which.js:80:29)
2018-05-28T02:44:19.0241221Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\which\which.js:89:16
2018-05-28T02:44:19.0241362Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\index.js:42:5
2018-05-28T02:44:19.0241508Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\windows.js:36:5
2018-05-28T02:44:19.0241643Z gyp verb `which` failed     at FSReqWrap.oncomplete (fs.js:152:21)
2018-05-28T02:44:19.0241950Z gyp verb `which` failed   stack: 'Error: not found: python2\n    at getNotFoundError (F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:13:12)\n    at F (F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:68:19)\n    at E (F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:80:29)\n    at F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:89:16\n    at F:\\agent\\_work\\32\\s\\node_modules\\isexe\\index.js:42:5\n    at F:\\agent\\_work\\32\\s\\node_modules\\isexe\\windows.js:36:5\n    at FSReqWrap.oncomplete (fs.js:152:21)',
2018-05-28T02:44:19.0242632Z gyp verb `which` failed   code: 'ENOENT' }
2018-05-28T02:44:19.0242771Z gyp verb check python checking for Python executable "python" in the PATH
2018-05-28T02:44:19.0242910Z gyp verb `which` failed Error: not found: python
2018-05-28T02:44:19.0243049Z gyp verb `which` failed     at getNotFoundError (F:\agent\_work\32\s\node_modules\which\which.js:13:12)
2018-05-28T02:44:19.0243198Z gyp verb `which` failed     at F (F:\agent\_work\32\s\node_modules\which\which.js:68:19)
2018-05-28T02:44:19.0243343Z gyp verb `which` failed     at E (F:\agent\_work\32\s\node_modules\which\which.js:80:29)
2018-05-28T02:44:19.0243500Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\which\which.js:89:16
2018-05-28T02:44:19.0243648Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\index.js:42:5
2018-05-28T02:44:19.0243789Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\windows.js:36:5
2018-05-28T02:44:19.0243933Z gyp verb `which` failed     at FSReqWrap.oncomplete (fs.js:152:21)
2018-05-28T02:44:19.0244060Z gyp verb `which` failed  python { Error: not found: python
2018-05-28T02:44:19.0244201Z gyp verb `which` failed     at getNotFoundError (F:\agent\_work\32\s\node_modules\which\which.js:13:12)
2018-05-28T02:44:19.0244356Z gyp verb `which` failed     at F (F:\agent\_work\32\s\node_modules\which\which.js:68:19)
2018-05-28T02:44:19.0244499Z gyp verb `which` failed     at E (F:\agent\_work\32\s\node_modules\which\which.js:80:29)
2018-05-28T02:44:19.0244641Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\which\which.js:89:16
2018-05-28T02:44:19.0244796Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\index.js:42:5
2018-05-28T02:44:19.0244944Z gyp verb `which` failed     at F:\agent\_work\32\s\node_modules\isexe\windows.js:36:5
2018-05-28T02:44:19.0245079Z gyp verb `which` failed     at FSReqWrap.oncomplete (fs.js:152:21)
2018-05-28T02:44:19.0245379Z gyp verb `which` failed   stack: 'Error: not found: python\n    at getNotFoundError (F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:13:12)\n    at F (F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:68:19)\n    at E (F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:80:29)\n    at F:\\agent\\_work\\32\\s\\node_modules\\which\\which.js:89:16\n    at F:\\agent\\_work\\32\\s\\node_modules\\isexe\\index.js:42:5\n    at F:\\agent\\_work\\32\\s\\node_modules\\isexe\\windows.js:36:5\n    at FSReqWrap.oncomplete (fs.js:152:21)',
2018-05-28T02:44:19.0245651Z gyp verb `which` failed   code: 'ENOENT' }
2018-05-28T02:44:19.0245779Z gyp verb could not find "python". checking python launcher
2018-05-28T02:44:19.0245997Z gyp verb could not find "python". guessing location
2018-05-28T02:44:19.0246134Z gyp verb ensuring that file exists: C:\Python27\python.exe
2018-05-28T02:44:19.0246273Z gyp ERR! configure error
2018-05-28T02:44:19.0246415Z gyp ERR! stack Error: Can't find Python executable "python", you can set the PYTHON env variable.
2018-05-28T02:44:19.0246574Z gyp ERR! stack     at PythonFinder.failNoPython (F:\agent\_work\32\s\node_modules\node-gyp\lib\configure.js:483:19)
2018-05-28T02:44:19.0246736Z gyp ERR! stack     at PythonFinder.<anonymous> (F:\agent\_work\32\s\node_modules\node-gyp\lib\configure.js:508:16)

Searching along those lines pointed to the GitHub issue here, and running the command below from a command prompt in the repo folder before firing "npm install" fixed the issue.

npm --add-python-to-path='true' --debug install --global windows-build-tools

But when executing with VSTS the error pops up again, because the VSTS agent runs as a different user from the one who executed the above command manually. This can be fixed by adding a command line step to the build which executes the same command before the npm install. Once this additional npm command line "npm --add-python-to-path='true' --debug install --global windows-build-tools" has run, it can be disabled in the build definition to save build time, as long as the same agent is used for the next build.


Using NuGet Packages as VSTS Release Artifact Source


If you deploy solutions built with VSTS/TFS using Octopus Deploy, you are used to packaging your build output as a NuGet package and using it in Octopus. Now you can use NuGet packages with VSTS release management for deployment as well. For this you have to have the Package Management feature enabled in VSTS. As VSTS builds and their artifacts are discarded after a configured time period, and the maximum retention time and number of builds is limited, keeping artifacts as NuGet packages is useful to keep your deployed artifacts for a longer period. Let's explore how to use NuGet packages for deployment in VSTS release management.


You have to create a package feed as the first step.

You can set the package feed visibility and enable use of public packages via this feed.


Once you have the package feed set up, you can use the NuGet task in your build with a push command to push the package built by the build to the internal package feed.
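For reference, the push performed by the NuGet task is roughly equivalent to the command below; the feed URL is a placeholder for your own account and feed, and the task takes care of authentication to the internal feed.

nuget.exe push "$(Build.ArtifactStagingDirectory)\*.nupkg" -Source "https://youraccount.pkgs.visualstudio.com/_packaging/YourFeed/nuget/v3/index.json" -ApiKey VSTS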

Then in your release definition, for artifacts, you can expand the artifact types as shown below.

This will give you a few more artifact types, including packages. Select the package feed and your package, then set the version to latest or to a preferred one.

You even have the option to set a continuous deployment trigger for when a new package version becomes available.

The package content will be automatically downloaded and extracted, and will be available to you just like normal artifact drop contents.

Packaging "Assemblies in GAC Installed with SDKs" in Build and Getting Deployed to Target Machine GAC

There can be projects that depend on assemblies in the Global Assembly Cache (GAC), installed perhaps by an internal company SDK, which is installed on developer machines and build servers. These assemblies should be packaged with the project and deployed to the GAC of target machines such as QA, staging and production as well. Since the code repo does not include such assemblies, on the build server it may be required to extract them from the GAC and package them with the builds. Let's see how we can get the assemblies in the GAC packaged and deployed to targets using VSTS build and release management.
There is a PowerShell module named Gac, developed by Lars Truijens, available in the PowerShell Gallery with its source on GitHub. This module has really useful commands which do not depend on the gacutil tool that comes with Visual Studio; gacutil is not useful in this scenario because, even though it could be used on build servers, it cannot be used on deployment targets, since we cannot install Visual Studio there.
You can use the PowerShell script below in a VSTS/TFS build to extract the required assemblies using assembly names (wildcards supported). The script makes sure the NuGet package provider exists and then installs the Gac PowerShell module, before using the GAC-related commands.
param (
[string]$AssemblyMask,
[string]$AssemblyTargetPath
)

Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force

Install-Module Gac -Force

md -Path $AssemblyTargetPath

$AssemblyMaskValues = $AssemblyMask.Split(';')

foreach ($AssemblyMaskValue in $AssemblyMaskValues)
{
Get-GacAssembly $AssemblyMaskValue | Get-GacAssemblyFile | Copy-Item -Destination $AssemblyTargetPath -Verbose
}

You can use the script as an inline PowerShell task as shown below. Pass the arguments for the assembly mask and the path to which the assemblies should be copied from the GAC. (The latest version of the out-of-the-box PowerShell task in VSTS does not have an Arguments field for inline scripts; you can use the extension from the VS Marketplace which has an inline PowerShell task that is really flexible.)
-AssemblyMask '$(DemoSDKAssemblyMask)' -AssemblyTargetPath '$(GacAssemblyPath)'
These variables can be configured as shown below. You can use a .nuspec file to make sure the assemblies are packaged into the NuGet package if you are using NuGet packages as the build artifact, or you can copy the assemblies to the build drop by putting them in the build staging directory.
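If you go the .nuspec route, a files element along the lines below can pull the extracted assemblies into the package; the GacAssemblies folder name is only an illustration of the path the build script copied the assemblies to.

<files>
  <file src="GacAssemblies\*.dll" target="GacAssemblies" />
</files>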
Then in your release definition you can use the script below to deploy the assemblies to the target machine's GAC. This script also makes sure the NuGet package provider exists and then installs the Gac PowerShell module before using the GAC-related commands.
param (
[string]$AssemblyTargetPath
)

Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force

Install-Module Gac -Force

$items = Get-ChildItem -Path $AssemblyTargetPath


foreach ($item in $items)
{
$fullAssemblyPath = Join-Path $AssemblyTargetPath $item.Name
Write-Host 'Adding to GAC ' $fullAssemblyPath

Add-GacAssembly -Path $fullAssemblyPath -Force -Verbose
}
In the release definition you have to provide the path on the target machine to which the GAC assemblies were downloaded and copied.
The assemblies will be deployed to the GAC on the target machine when the release is executed. If you are using Octopus Deploy, the same PowerShell script can be used to get the assemblies deployed to the GAC.

Building and Deploying Windows Services with VSTS/TFS


Generally Windows services are deployed by creating an MSI installer, and it is possible to deploy an MSI via VSTS/TFS release management using an extension available in the marketplace. However, packaging the Windows service without an MSI gives you more control over configuration at deployment time. Let's see how to package a Windows service and then get it deployed via VSTS/TFS release management.

The most important point is how to apply the configuration parameters for a Windows service at deployment time, depending on the target, and how to tokenize the configuration in the builds. A Windows service contains an app.config file, and to transform an app.config you can add, for example, an app.release.config with XDT transformation tags.

app.config

An app.config like the above in a Windows service can be paired with a transform file such as app.release.config.
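A minimal app.release.config sketch is shown below; the appSettings key and the __ServiceUrl__ token are hypothetical examples, with the token meant to be replaced at deployment time as described later.

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- Replace the value of the matching key with a token to be filled in at deployment time -->
    <add key="ServiceUrl" value="__ServiceUrl__" xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>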

To transform this sort of config in VSTS builds you can use the XDT Transform task that comes with a marketplace extension. You can use the syntax app.release.config => app.config to have the transformation applied by the build to the app.config which is getting packaged. If you have multiple config files this task can transform them all; you can add additional transform entries as new lines in the Transformations field.

You can build and package the Windows service project as a NuGet package and use it in the release as discussed in "Using NuGet Packages as VSTS Release Artifact Source", or you can use the build drop as is the default pattern in VSTS/TFS builds.

To replace the tokens applied in the config file by the transformation discussed above with the variable values set up in the release definition at deployment time, you can use the Replace Tokens task available in a marketplace extension.

To deploy the Windows service, a good extension with a few tasks is available in the marketplace. It has stop and start tasks as well as a create/update Windows service task. The stop service task is useful to stop the service before starting the deployment; it can be set to continue on error, so the initial deployment does not fail if the service does not exist yet or is already stopped.

You have to use the copy files task to copy the files from the extracted artifact drop location to the designated path of the Windows service after stopping the service. Then the create Windows service task can be used to update the service if it already exists, or to create it if it does not.

The advanced settings in the task let you set the service startup type and service account.

Then you can use the start Windows service task to start the service after deployment. You can group all these tasks into a task group and use it as a single task in multiple release definitions to enable Windows service deployments.

Integrating VSTS Package Feed to Octopus


Octopus Deploy is used by many organizations as their continuous deployment tool. You can send a package to the Octopus Deploy server from a VSTS build or release step using the "Push Packages to Octopus" task, which comes with the VSTS Marketplace extension for Octopus integration. However, if you prefer to use VSTS Package Management to store your packages, you can now consume them in Octopus by integrating with the VSTS Package Management feed. Let's look at how to set up Octopus to consume VSTS Package Management feed packages. (We discussed how to use a VSTS Package Management feed as an artifact source in VSTS Release Management in the previous post "Using NuGet Packages as VSTS Release Artifact Source".)


In Octopus you can add an external feed in the Library tab.

You have to obtain the package feed URL from VSTS for the integration with Octopus. To get the URL, go to the Packages tab under the Build and Release hub and select the required feed. Then click Connect to feed, which will show a pop-up window with the NuGet feed URL. This URL is needed to configure the connectivity with Octopus.

To authenticate access to the NuGet feed in VSTS you need to generate a PAT (Personal Access Token) with the packaging read scope.

Fill in the package feed URL and the PAT in the Octopus feed creation window. You can provide any value for the user name, but it should not be empty. Set retry attempts and intervals as appropriate and tick the option "Make use of the extended API", which is supported by VSTS package feeds. Click Save and Test to proceed.

In the next window that appears, click Test.

You can search for an exact package name.

This will retrieve the package details, including its version.

You can also search for part of a name and all matching packages will be found. This confirms that connectivity with the VSTS Package Management feed is established successfully, and you can use the packages from the feed in Octopus deployment tasks.

Securing Release Definitions When Multiple Teams Work on a Single Team Project


We explored "Securing Build Definitions When Multiple Teams Work on a Single Team Project" in a previous post. Now grouping release definitions into folders and applying permissions to isolate each team's release definitions is also possible in VSTS. As discussed in that post, it is important to create a Build/Release admins VSTS permission group for each of the teams in the team project. Using the same admins group and the team, we can set up permissions for release definition folders. Let's look at the steps in detail.

The new Releases hub allows grouping with folders. You need to enable the preview feature to get access to the new Releases hub; in your VSTS profile menu click Preview features.

Then enable the New Releases Hub.

In the Build and Release hub, go to Releases* to view the new Releases hub.

You can open the menu for All folders and click New Folder to create a new folder to group your release definitions.

Provide a name and create the folder.

You can create child folders to build a tree structure if you need multiple levels of grouping.

Once the required folder structure is ready, you can move the existing release pipelines (definitions) to the new folders.

By default all release definitions are manageable by the Contributors group. You have to click the Security menu on All pipelines and set the Contributors group permissions to "Not Set" to prevent all contributors from inheriting permissions to all release pipelines. If you want, the view releases and view release definition permissions can still be allowed for contributors.


Then, for a folder, you can set up permissions so that a given team's build admins can create new definitions or edit existing definitions in the folder, and can manage and approve deployments etc.


You might want to create multiple groups within a team, such as release approvers for a particular environment. Using the permissions and folders in the new Releases hub allows you to effectively control permissions as per your team's needs and isolate each team's releases from one another. However, there is still no good way to isolate each team's service endpoints within a single team project as of now.


Controlling Octopus Releases with VSTS Release Management


You may be using Octopus Deploy for your deployment automation pipeline while your builds and work items are managed in VSTS. It is a good idea to manage the Octopus release pipeline via VSTS release management, so that you can use automated test execution and capture of test results, and easily generate release notes from VSTS work items, using the feature-rich tasks and test result views in VSTS release management. Let's look at the important steps required to make VSTS release management successfully utilize your existing Octopus Deploy process steps.

The first step is to make all lifecycle phases except the first one optional in Octopus, so as to allow VSTS release management to decide when to trigger a given environment in a phase. Make sure to add phases and make each phase optional, even if a phase contains a single environment, as shown below. This is mandatory to allow VSTS release management to control the process flow of the deployment pipeline.

As the next step you need connectivity from VSTS and its deployment agents to the Octopus server. To add Octopus as a service endpoint in VSTS you need to create an API key in Octopus. Go to your profile in Octopus.

Click on Create New API Key.

Provide a name (purpose) for the key and click on Generate Key.

Make sure to copy and keep the API key in a safe location, as it will be required to configure the VSTS service endpoint for Octopus.

In the VSTS team project, go to Project Settings => Service Endpoints and click New Service Endpoint, selecting the Octopus Deploy endpoint type.

Provide the Octopus server URL and the API key from Octopus, along with a friendly name for the service endpoint.

In your build you can push your packages to the Octopus server, or push them to a VSTS Package Management feed and let Octopus use the packages from the VSTS feed as explained in the post "Integrating VSTS Package Feed to Octopus". Make sure to version the package with the build number, as this will be used in the release management steps to let Octopus know the correct package version that needs to be deployed.

To create a release and trigger a deployment from VSTS you can use the tasks that come with the Octopus extension available in the marketplace. These are the same tasks that can be used in a VSTS build for triggering a deployment in Octopus. The difference here is that instead of letting Octopus control the flow of the release through the pipeline, the VSTS release management pipeline controls the flow. As explained above, the Octopus environments should be set up with deployment optional to enable this control from the VSTS side. The VSTS pipeline should be set up to create the Octopus release in the first environment; from the second environment onwards only the deploy Octopus release task is used, to trigger the deployment to the different environments defined in Octopus.

You should connect the build which pushes the NuGet packages to Octopus, or to the VSTS Package Management feed, to the release definition. Since the build does not have a published artifact drop, there will be a warning saying there are no artifacts in the latest version; this can be safely ignored.

In the agent phase in VSTS, make sure to set the option not to download any artifacts, as this is not required since Octopus is doing the deployment.

In the first environment of the release pipeline in VSTS you should have the create Octopus release task, which will trigger a deployment to the initial environment. If Octopus is set up not to trigger a deployment once the release is created, you can add a second task to trigger the deployment, as explained next. In the create Octopus release task, make sure to provide the VSTS release name variable as the release number, so that it can be used in subsequent deployment trigger tasks to locate the correct release created in Octopus. Select the Octopus service endpoint created earlier and select the Octopus project. It is not required to define the channel, but there is no harm in defining it, as long as you select the correct channel in Octopus and the environments you plan to control with VSTS release management are included in that channel.

Then make sure to define the tenant if you have configured tenant-based deployments for the Octopus project. The most important setting is providing the build number as the package version; you can do this by providing an additional argument to octo.exe: --packageVersion $(Build.BuildNumber). This makes the Octopus release get created with the build number as the package version. Select the first environment defined in Octopus as the environment to be deployed, and make sure to check Show deployment progress, which makes VSTS Release Management wait for the Octopus release to complete deployment to the target.
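For reference, the create Octopus release task ends up invoking octo.exe roughly as sketched below; the server URL, API key, project and environment names are placeholders.

octo.exe create-release --project "YourOctopusProject" --version $(Release.ReleaseName) --packageVersion $(Build.BuildNumber) --deployto "YourFirstEnvironment" --progress --server https://youroctopusserver/ --apiKey API-XXXXXXXXXXXXXXXX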

In the second or any subsequent environment in the VSTS release pipeline you just have to define a task to trigger a deployment of the Octopus release already created. For example, the next environment DEV set up in Octopus, as described earlier, can be set to deploy with the next environment in the VSTS release pipeline.

Make sure NOT to provide the package version in these subsequent deployment tasks, as Octopus would throw an error back to VSTS release management saying "Unrecognized command arguments: --packageVersion, 1.0.6"; for the deploy command the package version cannot be provided as an argument.

With this setup you can control the deployment from VSTS, and the Octopus deployments will be triggered as you approve each VSTS release management pipeline environment for deployment. The advantage of this approach is that you can utilize your existing Octopus deployments, while leveraging the enhanced capabilities of VSTS for test automation integration, reporting of test automation results, seamless integration of release notes with VSTS work items, and its rich set of integrations with many other services.


Finding Membership Information of a User/Group/Team in VSTS/TFS


Managing permissions in VSTS/TFS can sometimes become a nightmare, especially if you have many teams, groups etc. created in your account and in team projects. It is important to have a way to find the group membership of an individual or group, so that you can analyze where memberships are assigned in order to carry out maintenance or permission changes. Let's look at a command which can help you find that information quickly.


tfssecurity is a command line tool available if you have Visual Studio Team Explorer installed (default path C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer) or on an on-premises TFS Application Tier server. You can read documentation about the various uses of the tfssecurity command line here.

For our problem of finding which groups or teams a user is a member of in VSTS/TFS, we can use this command with the /imx switch, which lists all group memberships. To use this command you need the View collection-level information or View instance-level information permission set to Allow. You can use the Visual Studio developer command prompt in administrator mode, or a normal command prompt run as administrator, and change directory to the path where tfssecurity is located (or add it to the PATH variable), in order to use the command below.

tfssecurity /imx Identity [/collection:CollectionURL] [/server:ServerURL]

For VSTS you can provide the VSTS account URL as the collection URL parameter, and for the identity you use something like "user@domain.com" to pass the user identity you want to evaluate. It can even be a VSTS group or team name.

Using it with a VSTS user:

tfssecurity /imx "userx@domain.com" /collection:https://vstsaccountname.visualstudio.com

When you execute the command it will list the user details and the user's membership information in the account.

You can also execute this command with a VSTS/TFS team or group to see the member list of the group and which groups this group/team is a member of.

tfssecurity /imx "[teamproject]\ABC Team" /collection:https://vstsaccountname.visualstudio.com

It will display information about the group or team.

List of members

Membership in other groups


Solving OutOfMemoryException and Getting NUnit Tests on Visual Studio Test Explorer


It is fun to work with the latest tools and frameworks, but sometimes errors are a bit confusing and searching for a fix is not easy. One team worked with NUnit and .NET Framework 4.7.1 and complained that when compiling they get an out-of-memory exception: "NUnit Adapter 3.7.0.0: Test discovery starting Exception System.OutOfMemoryException, Exception converting mytest". Searching pointed to a few GitHub issues, but those workarounds were not applicable, as changing the .NET Framework back to 4.6.2 was not an option. The fix was really simple, but looking for a solution sometimes takes time. So let's explore the problem and the simple fix.


The problem, as described here, was that the tests could not be run; the Visual Studio used was 2017 with the latest update, 15.7.

Test Explorer in Visual Studio is empty.

Building with VSTS does not run the unit tests, saying no tests were found in the assemblies.


The fix is as simple as typing a few characters to change the version of the NUnit test adapter to 3.10.
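In practice this means updating the NUnit3TestAdapter NuGet package in the test project, for example from the Package Manager Console (adjust the exact 3.10.x version as needed):

Update-Package NUnit3TestAdapter -Version 3.10.0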

Once built, the tests now appear in VS Test Explorer.

The build in VSTS picks up and executes the tests. There is no need to specify a custom test adapter path for the VSTS build to make it work; it automatically picks it up from the packages in the build source directory.


Developing Azure Functions in Visual Studio and Creating a Deployment Pipeline Using VSTS


Serverless computing has become a hot topic these days. It lets you pay only for the actual time your code is running and the resources you consume, without any worries about infrastructure. Azure Functions lets you build functions that scale dynamically based on need, in a language of your choice. When you install the Visual Studio Azure development workload you get templates for developing Azure Functions. Let's look at how to create a simple "Hello World" Azure Function in Visual Studio and, most importantly, how to get your CI/CD pipeline ready within a couple of minutes.


You can create an Azure Function app project using Visual Studio 2017, then commit it to a VSTS Git repo.

Install the Continuous Delivery extension for Visual Studio 2017 from the Visual Studio Marketplace.


Click on the new icon that appears in the Visual Studio tray and click Configure Continuous Delivery to Azure.

In the dialog, provide your Microsoft account details and you will be able to see the team project Git repository. Select the branch to which you committed your Azure Function code. Select your Azure subscription and click the Edit button to provide additional information for the Azure Function app.

You can provide a new Azure resource group or select an existing one, and provide other details such as the storage account, hosting plan and hosting plan size.

Once you click OK, Visual Studio will connect to VSTS and Azure, generate a build definition and a release definition, and execute a build and release to get the Azure Function deployed.

The advantage of generating the build and deployment pipeline is that it helps you get started quickly. Let’s inspect the tasks set up in these definitions to understand how the steps are performed, so that you can make the enhancements needed to support real production scenarios, such as adding additional environments (say QA, Staging or Prod) and further configuration.

The NuGet Restore task will restore any NuGet packages used, and the Visual Studio Build step will build and publish the app with the arguments passed to MSBuild (shown below). The build will package the Function app as an MSDeploy package. You may want to package it as a NuGet package and push it to a package management feed as an enhancement. We can explore these options in a later post.

/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactstagingdirectory)\\"


The Visual Studio Test task will run any unit tests, and the Publish Symbols task will enable debugging by making .pdb files available. More information on publishing symbols for debugging can be found here. The executed build will create an MSDeploy package as shown below.

The continuous delivery pipeline also gets created automatically and will have a Dev environment set up for deployment using the build artifacts.

The Azure App Service deployment task is configured to deploy the Function App as shown below. The App Type is set to Function App, and the package from the build is used for deployment. The task allows you to use variable substitution and other options as further enhancements.

The service endpoint created to connect with the Azure subscription is scoped to the resource group picked in Visual Studio. You can inspect the service endpoint by clicking the Manage link next to the Azure Subscription field and then opening the new service endpoint for update.

You can verify the current connection and change its scope, or even define new service endpoints to target different Azure subscriptions depending on your target environment needs. For example, your production environment can be in a totally different Azure subscription. You can then clone the auto-generated environment in the release pipeline to create the next step in the pipeline.

The beauty of the Continuous Delivery extension for Visual Studio is that it creates your entire CI/CD pipeline for Azure Function apps and even executes the first release to get the app deployed to the first target environment. This enables you to get started really quickly and make further enhancements as and when needed.


The deployed function can be tested as shown below (this is just the default template function created by Visual Studio when creating a Function App project).
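If you want to smoke test the deployed HTTP-triggered function from the command line, here is a minimal PowerShell sketch; the function app URL, function name and function key below are placeholders for the values shown in the Azure portal, and the default template simply echoes the supplied name back in a greeting.

# Call the deployed HTTP-triggered function with a name query parameter
$functionUrl = 'https://yourfunctionapp.azurewebsites.net/api/HttpTriggerCSharp'
$functionKey = 'yourfunctionkey'
Invoke-RestMethod -Method Get -Uri "${functionUrl}?code=${functionKey}&name=Chaminda"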

Setting VSTS Release Variable Values At the Time of Creating a Release


VSTS builds have the option to set build variable values at the time of queuing a build. You have to select the “Settable at queue time” option for a build variable to enable its value to be set when queuing a build. However, this feature was not available for VSTS Release, and there was a user voice raised here requesting the ability to set release variables at the time of creating a release. As promised by MSFT in that user voice, this feature is now available for VSTS Release. Let’s look at how to use it with the VSTS web UI and with the VSTS REST API.

With VSTS Web UI

In your release definition you can now select the option “Settable at release time”.

This allows you to set the value of the variable at the time of creating the release.


With VSTS REST API

First you need to enable variables to be settable at release time, as shown above in the VSTS release definition. However, the VSTS REST API documentation here for creating a release does not show you how to supply the variables when creating a release.

You can use below json syntax to add variables to the REST API call request body to set the variable values at the time of creating a release.

"variables": {
     "Variable1Name": {
       "value": "Variable1Value"
     },
     "Variable2Name": {
       "value": "Variable2Value"
     }
   }image

Here is a sample PowerShell script calling VSTS REST API to create a release while setting variables (setting artifacts is omitted in this sample script but it is just a matter of adding it to the request body).

param(
    [Parameter(Mandatory=$true)]
    [string]$token,
    [Parameter(Mandatory=$true)]
    [string]$VSTSAccountName,
    [Parameter(Mandatory=$true)]
    [string]$teamProjectName,
    [Parameter(Mandatory=$true)]
    [string]$ReleaseDefId,
    [Parameter(Mandatory=$true)]
    [string]$TriggerDescription,
    [Parameter(Mandatory=$true)]
    [string]$octopusProject
)

$User = ""

# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $User, $token)))
$header = @{Authorization = ("Basic {0}" -f $base64AuthInfo)}

# Request body: release definition id, description and the variables that are settable at release time
$ReleaseMetadata = '{"definitionId": ' + $ReleaseDefId + ',"description": "' + $TriggerDescription + '","isDraft": false,"reason": "none","manualEnvironments": null,
"variables": {
    "OctopusProject": {
        "value": "' + $octopusProject + '"
    },
    "ToEnv": {
        "value": "AutoDumDum"
    }
}
}'

$Uri = 'https://' + $VSTSAccountName + '.vsrm.visualstudio.com/' + $teamProjectName + '/_apis/release/releases?api-version=4.1-preview.6'

# Create the release, passing the variable values in the request body
$ReleaseResponse = Invoke-RestMethod -Method Post -ContentType application/json -Uri $Uri -Body $ReleaseMetadata -Headers $header

Write-Host $ReleaseResponse
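A minimal invocation sketch; the script file name, account, project, release definition ID and Octopus project name below are placeholders.

.\CreateReleaseWithVariables.ps1 -token 'yourPAT' -VSTSAccountName 'youraccount' -teamProjectName 'yourteamproject' -ReleaseDefId '12' -TriggerDescription 'Triggered via REST API' -octopusProject 'YourOctopusProject'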

Values of repositoryType Parameter in VSTS REST API


The VSTS REST API is really useful for automating many aspects of the software development process, and it can be used to write handy utilities that obtain information quickly in appealing formats. However, sometimes you have to struggle a bit to find the correct values to pass for some parameters expected by the VSTS REST API, where the documentation does not provide much valid information. One such parameter is ‘repositoryType’ in the documentation found here for listing build definitions, which does not say what the expected values for the parameter are. The documentation here gives some hints on what the values could be, but does not state the values expected by the REST API.


Finding out whether any build definition exists for a given repo is possible using the REST API by listing build definitions filtered by repository ID. But if you do not provide the repository type, the REST API fails with the error message shown below.

Repository type is missing/invalid.

Here is a table of each repo type and the value expected for the ‘repositoryType’ parameter in the VSTS REST API.

Repository                                ‘repositoryType’ Parameter Value
VSTS/TFS Git                              TfsGit
TFVC (Team Foundation Version Control)    TfsVersionControl
GitHub                                    GitHub
GitHub Enterprise                         GitHubEnterprise
Subversion                                svn
Bitbucket Cloud                           Bitbucket
External Git                              Git
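As a quick sketch, here is how listing build definitions for a VSTS Git repo would look in PowerShell; the account, team project, repository GUID, PAT and api-version below are placeholders and may need adjusting for your TFS/VSTS level.

# List build definitions filtered by repository ID and repository type
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes((":{0}" -f 'yourPAT')))
$header = @{Authorization = ("Basic {0}" -f $base64AuthInfo)}

$uri = 'https://youraccount.visualstudio.com/yourteamproject/_apis/build/definitions' +
       '?repositoryId=your-repo-guid&repositoryType=TfsGit&api-version=4.1'

$definitions = Invoke-RestMethod -Method Get -Uri $uri -Headers $header
Write-Host "Found $($definitions.count) build definition(s) for the repo"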

Check if a Build Definition Exists for a Given Repo in VSTS/TFS


In scenarios where you have many teams working in the same team project, you might have multiple TFS Git repos within a single team project. When you want to create a build definition for a given repo, you might be wondering whether a build definition already exists for that repo. Opening each build definition to find out is a waste of time. Let’s look at a small utility script that can give you the information quickly using the VSTS/TFS REST API.


The PowerShell script I made available here in GitHub and in GitHub Gists will help you generate a quick HTML report of all repos and their build definition information. To use the script you need a PAT (Personal Access Token) created with build and code read scopes.

You can call the script as specified below for VSTS.

.\FindBuildsDefsForGivenRepo.ps1 -token 'yourPAT' -collectionUri 'https://youraccount.visualstudio.com' -teamProjectName 'yourteamprojectname' -repoName '*'

for TFS

.\FindBuildsDefsForGivenRepo.ps1 -token 'yourPAT' -collectionUri 'yourTFSProjectCollectionUrl' -teamProjectName 'yourteamprojectname' -repoName '*'

For the repo name parameter, the patterns below are supported.

  • *
  • somename*
  • *somename*

The script will look for any TFVC repo builds regardless of the repoName parameter value you supply.

It will filter the Git repos in a team project (there can be multiple Git repos in a single team project) using the repoName parameter value and sort the repos by name, then look for build definitions and print them to an HTML file in the script execution location.

The resultant HTML file, RepoBuildDefs.html, will be created in the same location as the script and will list the repos and build definitions, with clickable links to open the relevant repo or build summary page.

