Category: PowerShell

  • Mastering Task Group Versions in Azure DevOps

    In Azure DevOps (ADO) classic pipelines, you can encapsulate steps into collections called Task Groups. These groups can then be easily reused in Builds or Releases so that common or standard steps are shared across several pipelines.

    The issue is, if you just use the GUI to import a version of an existing Task Group, it just makes a new one with a slightly different name, typically appending ‘copy’ to the original name. You can revise a Task Group while editing it, but there are aspects that the GUI does not expose for editing.

    As one example, there is a typical PowerShell technique for printing variables: “$($_.Exception.Message)”. Inside the parentheses goes an expression that is evaluated before the string is printed. If you have this in a PowerShell step and save or revise, ADO assumes you want that as an input variable to the task group, and there is no way to remove these ‘extra’ variables from the Task Group in the GUI.
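    For illustration, here is a minimal, standalone example of the subexpression syntax in question (the path is just a placeholder):

    ```powershell
    # $( ) is PowerShell's subexpression operator: the expression inside is
    # evaluated before the double-quoted string is printed.
    try {
        Get-Item -Path 'C:\does\not\exist' -ErrorAction Stop
    }
    catch {
        Write-Output "Caught: $($_.Exception.Message)"
    }
    # Azure DevOps also uses $( ) as its macro syntax for pipeline variables,
    # so saving a task group containing this string adds an input variable
    # that the GUI then offers no way to remove.
    ```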

    So, I looked at the ADO REST API documentation, and saving a new version isn’t a standard documented operation. A search of the internet found one article on how to do this- https://medium.com/@tejasparmar99/azure-devops-task-group-version-upgrade-using-rest-apis-8524478364db. Updating a task group to a new version is a multi-step process when you use the GUI. The author figured out how those steps are replicated in PowerShell:

    1. Create a draft version of the task group by setting the version isTest value to true.
    2. Set the ‘parentDefinitionID’ property in the JSON definition of the draft to the ID of the original task group.
    3. Use the REST APIs to update the task group with the new version.

    This method allows you to update task groups to new versions while keeping the different versions intact.

    I took what they wrote and wrapped it in a little more logic. My function reads a list of the company-standard Task Group definitions from a JSON file. It connects to the ADO Server using a Personal Access Token, retrieves the existing task groups, and compares the definitions from the JSON file with the existing task groups in Azure DevOps Server. If a task group definition is new, it creates a new Task Group from the standard JSON definition file. If the Task Group exists, it uses the steps above to create an updated version of the Task Group with multiple REST API calls.

    The article showed me the way when the official documentation failed me, but it also left some details implied in the version-update process:

    1. Creates a draft task group by adding missing properties to the new task group object, such as instanceNameFormat, definitionType, iconUrl, runsOn, and version.
    2. Posts the draft task group to ADO using a POST request.
    3. Publishes the draft task group as a preview using a PUT request.
    4. Publishes the preview as a new version of the task group using a PATCH request.

    This may have come from my standard JSON files not including all fields. They contain enough to create a new Task Group, but more parameters are needed when making a draft Task Group. Then there are parameters that change state as you move from draft to preview to published.
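    Putting the pieces together, the whole flow can be sketched in PowerShell like this. This is only a sketch: the endpoint paths, api-version, and payload fields are assumptions based on the linked article and may vary by ADO Server version.

    ```powershell
    # $pat, $collectionUrl, $project, and $taskGroup are assumed to exist;
    # $taskGroup holds an existing task group retrieved by an earlier GET.
    $headers = @{
        Authorization  = 'Basic ' + [Convert]::ToBase64String(
            [Text.Encoding]::ASCII.GetBytes(":$pat"))
        'Content-Type' = 'application/json'
    }
    $base = "$collectionUrl/$project/_apis/distributedtask/taskgroups"

    # 1. Create a draft: mark the version as a test and point at the parent
    $taskGroup.version.isTest = $true
    $taskGroup | Add-Member -NotePropertyName parentDefinitionId `
        -NotePropertyValue $taskGroup.id -Force
    $draft = Invoke-RestMethod -Uri "${base}?api-version=6.0-preview.1" `
        -Method Post -Headers $headers -Body ($taskGroup | ConvertTo-Json -Depth 20)

    # 2. Promote the draft to a preview of the parent task group
    $draft.version.isTest = $false
    $preview = Invoke-RestMethod `
        -Uri "$base/$($draft.id)?parentTaskGroupId=$($taskGroup.id)&api-version=6.0-preview.1" `
        -Method Put -Headers $headers -Body ($draft | ConvertTo-Json -Depth 20)

    # 3. Publish the preview as the new version of the task group
    Invoke-RestMethod `
        -Uri "$base/$($taskGroup.id)?disablePriorVersions=false&api-version=6.0-preview.1" `
        -Method Patch -Headers $headers -Body ($preview | ConvertTo-Json -Depth 20)
    ```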

  • TFS/ADO REST API

    In my module, I have a number of functions that I wanted to share with other internal teams. The module material is a little more broadly helpful, as it’s still relevant. I feel like all the cool kids use GitHub and GitHub Actions, but someone else might be stuck using TFS repositories with Azure DevOps 2019 or greater.

    These are the functions I have to share. They are various helpers to read, create, or update tickets and builds using the REST API that comes with Azure DevOps.

    Here’s the list as it stands while writing this-

    • Get-ChangesetCountSinceBuild.ps1
    • Get-MenuSelection.ps1
    • Get-TFSAttachments.ps1
    • Get-TFSbuildList.ps1
    • Get-TFSbuildStatus.ps1
    • Get-TFSqueryDetails.ps1
    • Get-TFSrelatedTickets.ps1
    • Get-TFSticketDetails.ps1
    • New-TFSbuildQueue.ps1
    • New-TFSticket.ps1
    • New-TFSvariableGroup.ps1
    • Update-TFSticket.ps1

    There is an odd-ball in the list: Get-MenuSelection is a function that adds an arrow-key menu selection to the terminal. I got it from koupi.io, but that site has since been taken down. Glad I grabbed it while I could!

    The rest of the functions follow a pretty standard pattern: some parameters to create the connection to ADO (URL, Personal Access Token, and the API version) and perhaps something specific to that function. The connection details are assembled, and for the Get- functions a call is made to ADO for some information. In a New- function, the details are added to a $body variable and then pushed to ADO. If the call succeeds, the requested or new object is returned.
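    As a hedged sketch of that shared pattern (the function name, URL, and endpoint here are illustrative, not the actual module code):

    ```powershell
    function Get-TFSexample {
        param(
            [Parameter(Mandatory)][string]$CollectionUrl, # e.g. https://tfs.company.com/DefaultCollection
            [Parameter(Mandatory)][string]$PAT,           # Personal Access Token
            [string]$ApiVersion = '5.1'
        )
        # PATs go in a Basic auth header with an empty user name
        $headers = @{
            Authorization = 'Basic ' + [Convert]::ToBase64String(
                [Text.Encoding]::ASCII.GetBytes(":$PAT"))
        }
        $uri = "$CollectionUrl/_apis/projects?api-version=$ApiVersion"
        # Get- functions call ADO and return the requested object(s);
        # New- functions would build a $body and POST it instead
        $result = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
        return $result.value
    }
    ```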

    You can hit me up with questions on GitHub.

  • How to make a PowerShell module

    Assuming you have a few PowerShell functions that you would like to gather into one package to easily distribute, you will want to make a module. This will allow you to load your functions on a specific computer and then call the functions without any further effort on your part.

    At work, I use Azure DevOps so I wrote this to take my code and bundle the functions into a module, test the module is operational, and save the module as an artifact that can be downloaded. If you don’t want to use ADO, you can adapt the code as necessary.

    I didn’t figure this all out on my own. These instructions are an amalgamation of those listed in the links, simplified for my use.

    1. Make folder structure
      1. Add your functions and tests
    2. Make configuration file, from sample
    3. Copy build file, from GitHub
    4. Copy Pester tests, from GitHub
    5. Copy YAML file to define build process, from GitHub
    6. Create pipeline in Azure DevOps and output artifact
    7. Review code to download new artifact

    To hold the sample code, I have a GitHub repo with sample files-

    https://github.com/chinkes5/PowerShellModule

    Step 1

    First is to organize what you have. The folder structure and a few key file names will drive a lot of how the module is found and used. All the folders described are relative to the same root folder in your Azure DevOps repository.

    Each of your function files should contain a single function, with the file named the same as the function within. Those that are public, that is, functions for general consumption, should go in a folder named ‘Public’. If you have private functions that you don’t want generally exposed, you can make a ‘Private’ folder as well.

    Make a folder ‘Tests’. There will be a few standard tests to make sure the module can be built. If you have written any Pester tests for your functions, add them here as well. The expected format will be [function-name].test.ps1
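    A quick sketch of scaffolding that layout (the root folder and function file names are placeholders):

    ```powershell
    $root = '.\MyModule'
    # Folders the build expects, relative to the repository root
    New-Item -ItemType Directory -Path "$root\Public"  -Force | Out-Null
    New-Item -ItemType Directory -Path "$root\Private" -Force | Out-Null
    New-Item -ItemType Directory -Path "$root\Tests"   -Force | Out-Null
    # One function per file, file name matching the function inside:
    #   MyModule\Public\Get-Widget.ps1  ->  function Get-Widget { ... }
    ```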

    Step 2

    In the root folder, make a json configuration file called ‘module-config.json’. Here is a sample to follow:

    {
        "name": "My_PS_Module",
        "description": "PowerShell module for use by my team for standard operations",
        "Author": "John Chinkes",
        "CompanyName": "Your Company Name",
        "CopyrightStartYear": 2024,
        "GUID": "your-guid-value",
        "PowerShellVersion": "5.0"
      }

    The build will use this information to make the PSM1, PSD1, and NuSpec files.

    Step 3

    You will want a build file to assemble all the functions and create your manifest file. Use the following, calling it ‘build.ps1’:

    Here’s the file on GitHub- https://github.com/chinkes5/PowerShellModule/blob/main/build.ps1

    The values you put in the config file will be found and swapped in by this process. Also, the names of all the functions will be added. When you import the module, it reads the Public and Private folders and exposes only the public functions. To add new functions to your module, just re-run this build file and they will be added!
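    Under the hood, the generated PSM1 typically follows a common loader pattern along these lines (a sketch of the general technique, not the exact output of build.ps1):

    ```powershell
    # Dot-source every function file, then export only the public names
    $public  = @(Get-ChildItem -Path "$PSScriptRoot\Public\*.ps1"  -ErrorAction SilentlyContinue)
    $private = @(Get-ChildItem -Path "$PSScriptRoot\Private\*.ps1" -ErrorAction SilentlyContinue)
    foreach ($file in @($public + $private)) {
        . $file.FullName
    }
    Export-ModuleMember -Function $public.BaseName
    ```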

    If you want an explanation of what’s going on inside this file, see this post.

    Step 4

    Optionally, make Pester tests for your functions and the module. How to make Pester tests is outside the scope of this document but the following basic test is recommended:

    Here’s the file on GitHub- https://github.com/chinkes5/PowerShellModule/blob/main/test.ps1. It prints out several paths so you can see where it’s working and then runs Invoke-Pester on the ‘Tests’ folder. The output is an XML file that Azure DevOps will display with the build.

    Save this file as test.ps1 in the root of your module folder.
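    For reference, a minimal test.ps1 along those lines might look like this (a sketch using Pester v5 configuration syntax; the real file is in the linked repo):

    ```powershell
    # Show where we are working from
    Write-Output "Module root: $PSScriptRoot"
    # Run everything in the Tests folder and emit NUnit XML for ADO to display
    $config = New-PesterConfiguration
    $config.Run.Path = "$PSScriptRoot\Tests"
    $config.TestResult.Enabled = $true
    $config.TestResult.OutputFormat = 'NUnitXml'
    $config.TestResult.OutputPath = "$PSScriptRoot\testResults.xml"
    Invoke-Pester -Configuration $config
    ```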

    Step 5

    The output of the pipeline will not actually be a module, but a NuGet package that can be loaded on the destination computer.

    As noted at the beginning, I’m working with Azure DevOps as my CI/CD tool. Here is a sample YAML file to build the module. Please be careful with the spacing and formatting when you copy! You will want to swap in the path to your code in the repository. At the end of the file, swap in the name of your feed in Azure DevOps Artifacts.

    Here’s the file on GitHub- https://github.com/chinkes5/PowerShellModule/blob/main/module.yaml. Save this as Module.YAML in the root of your module directory.

    Step 6

    Create the pipeline to output a new module to your Azure DevOps Artifacts. You will want to check in all the code and structure you have created above.

    1. Sign in to Azure DevOps:
    2. Navigate to Pipelines:
      • In the left-hand menu, click on Pipelines.
      • Click on New pipeline.
    3. Select the Repository:
      • Choose the location of your code (e.g., Azure Repos Git, GitHub, etc.).
      • Select the repository where your YAML file is located.
    4. Configure the Pipeline:
      • In the Configure step, select Existing Azure Pipelines YAML file.
      • Choose the Module.YAML file from your repository.
    5. Review and Run:
      • Review the YAML file content.
      • Click on Save and run to start the pipeline.

    After the pipeline has run, you should be able to check for a new artifact with your module.

    Step 7

    You will need a PAT with permissions to read and download packages or set the permissions to allow for public consumption. Use this code to register your Azure DevOps Artifacts on a destination computer, swapping out your PAT, a name for your repo, and the exact URL to your DevOps Artifact:

    $patToken = "Your PAT goes here" | ConvertTo-SecureString -AsPlainText -Force 
    $credsAzureDevopsServices = New-Object System.Management.Automation.PSCredential("username", $patToken) 
    
    $srePackageSource = "Name you call your repo" 
    $psRepoList = Get-PSRepository
    if ($psRepoList.Name -notcontains $srePackageSource) {
        Write-Output "Registering PowerShell repository $srePackageSource..."
        $regParam = @{
            Name               = $srePackageSource
            SourceLocation     = "https://pkgs.dev.azure.com/[collection]/_packaging/youPathTo/nuget/v2"
            PublishLocation    = "https://pkgs.dev.azure.com/[collection]/_packaging/youPathTo/nuget/v2"
            InstallationPolicy = 'Trusted'
            Credential         = $credsAzureDevopsServices
            Verbose            = $true
        }
        Register-PSRepository @regParam
    }
    else {
        Write-Output "PowerShell repository $srePackageSource found."
    }
    
    $packageSourceList = Get-PackageSource
    if ($packageSourceList.Name -notcontains $srePackageSource) {
        Write-Output "Adding package source to be able to get $srePackageSource..."
        $pakParam = @{
            Name         = $srePackageSource
            Location     = "https://pkgs.dev.azure.com/[collection]/_packaging/youPathTo/nuget/v2"
            ProviderName = 'NuGet'
            Trusted      = $true
            SkipValidate = $true
            Credential   = $credsAzureDevopsServices
        }
        Register-PackageSource @pakParam
    }
    else {
        Write-Output "Package source of $srePackageSource found."
    }

    Then use PowerShellGet to find and download the module; note the reuse of the credential from the prior script. Swap in the name you used for your repo and the name of the module in this code:

    Write-Output "Finding Your Module..." 
    Find-Module -Name 'Name of Module' -Repository 'Name you call your repo' -Credential $credsAzureDevopsServices -Verbose
    Write-Output "Downloading Your Module..." 
    Save-Module -Name 'Name of Module' -Repository 'Name you call your repo' -Path ($env:PSModulePath -split ';')[1] -Credential $credsAzureDevopsServices -Verbose 
    Write-Output "Importing Your Module..." 
    Import-Module -Name 'Name of Module' -Force -Scope Global

    You now have a module and have downloaded it to a destination computer!