Policies in Azure Resource Manager for better conventions management

Conventions are a really good idea in general - they prevent the mess which otherwise appears sooner or later. While this is pretty obvious when working with a codebase (we have plenty of tools for that), it's still not so popular when you're managing your resources in the cloud. This post is going to encourage you to use conventions there and present possible use cases.

Why not roles?

In Azure you can also find the term RBAC, which stands for role-based access control. While you can find it useful to assign e.g. the admin role only to those few people "who know what they're doing", it's still user-centric. You can imagine a situation where you have different teams working on different projects. Different people have different opinions and experience, and their choices are dictated by their current point of view.

Now imagine giving them access to the production environment (I'm aware that normally such access is restricted - let's pretend we forgot about this rule). You can select who can do anything on production. What you cannot do is restrain them from provisioning a G5 VM for 5k euros per month. This is where policies come into play.

Creating and assigning a policy

You have multiple options when it comes to selecting a tool to manage your policies: the REST API, PowerShell or the Azure CLI. For me the easiest way to work with policies was to use the PowerShell cmdlets, but I strongly encourage you to select the tool which suits you best.

To make a policy effective you have to perform two steps - create it and assign it to a resource group. This is another great feature - you can have your policies predefined and attach them to different resource groups as you wish. It's also easy to automate if you wish.

To create and assign a policy you can use the following commands:

/
$policy = New-AzureRmPolicyDefinition -Name regionPolicyDefinition -Description "Allow only one region" -Policy '{    
  "if" : {
    "not" : {
      "field" : "location",
      "in" : ["northeurope"]
    }
  },
  "then" : {
    "effect" : "deny"
  }
}'

New-AzureRmPolicyAssignment -Name regionPolicyAssignment -PolicyDefinition $policy -Scope /subscriptions/{SubscriptionId}/resourceGroups/{ResourceGroup}

I used examples from this page and modified them slightly. What this code does can be summed up as "create a policy in the subscription and assign it to the resource group". Now let's try to create any kind of resource in this resource group outside the North Europe region. It turns out it's not so easy now:

When you go to the details of this error, you'll see something similar to the following:

/
{
  "error": {
    "code": "RequestDisallowedByPolicy",
    "message": "The resource action 'Microsoft.Network/virtualNetworks/write' is disallowed by one or more policies. Policy identifier(s): '[{\"policyDefintionId\":\"/subscriptions/{SubscriptionId}/providers/Microsoft.Authorization/policyDefinitions/regionPolicyDefinition/\",\"policyAssignmentId\":\"/subscriptions/{SubscriptionId}/resourceGroups/FunctionApp//providers/Microsoft.Authorization/policyAssignments/regionPolicyAssignment/\"}]'."
  }
}

Managing policies

When you take a look at the documentation of policies, you'll see the other operations available, like viewing already created policies or removing them. I strongly encourage you to read about them - there are many different properties which can be constrained, like the size of a VM or the storage SKU, or you can even ensure that storage blobs are encrypted. It's a really powerful tool and using it wisely can really ease operations and management.
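As an illustration of such a constraint, a policy rule limiting the allowed VM sizes could look roughly like the one below. Treat it as a sketch - the alias for the VM size and the list of allowed sizes are my assumptions, so verify them against the policy documentation for your API version:

```json
{
  "if": {
    "allOf": [
      { "field": "type", "equals": "Microsoft.Compute/virtualMachines" },
      {
        "not": {
          "field": "Microsoft.Compute/virtualMachines/sku.name",
          "in": ["Standard_A1", "Standard_A2"]
        }
      }
    ]
  },
  "then": {
    "effect": "deny"
  }
}
```

Assigned to a resource group the same way as the region policy above, this would reject any VM outside the listed sizes - which is exactly the G5-for-5k-euros scenario we wanted to prevent.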

Managing your git repository via REST API in VSTS

Let's say you'd like to store some results of a build or a release inside a repository. There could be any reason - easy access, versioning, or it being the only tool you have access to. By default VSTS doesn't provide any git-related steps which could be helpful in such a case. Fortunately, once more its REST API comes to the rescue, giving us an opportunity to fully manage repositories, including pushing multiple commits.

Before we start, take a look at the overview to understand what the general capabilities of this API are.

Making an initial commit

Once you have your git repository created in VSTS, you can either initialize it with a commit from the GUI or push an initial commit using the API. Take a look at the example from the documentation:

/
POST /_apis/git/repositories/{repository}/pushes?api-version={version}

and its body:

/
{
  "refUpdates": [
    {
      "name": "refs/heads/master",
      "oldObjectId": "0000000000000000000000000000000000000000"
    }
  ],
  "commits": [
    {
      "comment": "Initial commit.",
      "changes": [
        {
          "changeType": "add",
          "item": {
            "path": "/readme.md"
          },
          "newContent": {
            "content": "My first file!",
            "contentType": "rawtext"
          }
        }
      ]
    }
  ]
}

you'll see mostly self-descriptive properties building this JSON. The only one - oldObjectId - is not so obvious. Basically, it's the SHA-1 of the commit this commit is based on - since there are no commits yet, it's a string full of zeros.
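If you're scripting against this endpoint, the body above is easy to assemble programmatically. A minimal Python sketch (the helper name and its parameters are mine, not part of the VSTS API):

```python
import json

# SHA-1 of the parent commit; a brand-new branch has no parent,
# so the API expects forty zeros instead.
EMPTY_OBJECT_ID = "0" * 40

def build_push_body(path, content, comment,
                    old_object_id=EMPTY_OBJECT_ID,
                    branch="refs/heads/master",
                    change_type="add"):
    """Build the JSON body for POST .../pushes (illustrative helper)."""
    return {
        "refUpdates": [
            {"name": branch, "oldObjectId": old_object_id}
        ],
        "commits": [
            {
                "comment": comment,
                "changes": [
                    {
                        "changeType": change_type,
                        "item": {"path": path},
                        "newContent": {
                            "content": content,
                            "contentType": "rawtext",
                        },
                    }
                ],
            }
        ],
    }

body = build_push_body("/readme.md", "My first file!", "Initial commit.")
print(json.dumps(body, indent=2))
```

The defaults reproduce the initial-commit case; for later pushes you'd pass the real parent SHA-1 and `change_type="edit"`.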

Pushing data

Making an initial commit is a piece of cake. What if we'd like to update an existing file? The main issue here is finding the oldObjectId, which is required to actually make the request successful. Once more the API comes in handy - what we can do is fetch the list of all pushes and take the last one. Take a look at the signature from the documentation:

/
GET https://{instance}/DefaultCollection/_apis/git/repositories/{repository}/pushes?api-version={version}[&fromDate={dateTime}&toDate={dateTime}&pusherId={guid}&$skip={integer}&$top={integer}]

What is great about this request is the possibility to filter the data - we don't have to download all pushes, only those from a date interval, made by a specific pusher, or maybe only the top N. The response gives us a list of pushes ordered from the newest to the oldest. What is important here is to pass the includeRefUpdates=true parameter in the query string. This way we'll get the following additional property in the response:

/
{
  "repositoryId": "04baf35b-faec-4619-9e42-ce2d0ccafa4c",
  "name": "refs/heads/master",
  "oldObjectId": "0000000000000000000000000000000000000000",
  "newObjectId": "5e108508e2151f5513fffaf47f3377eb6e571b20"
}

and we're able to refer to the newObjectId property to make an update. Once we have it, we can once more use the endpoint that created the initial commit, with a slightly modified body:

/
{
  "refUpdates": [
    {
      "name": "refs/heads/master",
      "oldObjectId": "5e108508e2151f5513fffaf47f3377eb6e571b20"
    }
  ],
  "commits": [
    {
      "comment": "Added a few more items to the task list.",
      "changes": [
        {
          "changeType": "edit",
          "item": {
            "path": "/readme.md"
          },
          "newContent": {
            "content": "Modified readme file!",
            "contentType": "rawtext"
          }
        }
      ]
    }
  ]
}

Once we post this request, a new commit should be pushed and visible when you access the repository.
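Putting the two steps together, the "find the newest object id" part can also be scripted. The sketch below parses a trimmed pushes response shaped like the one shown earlier - the `value` and `refUpdates` field names match the documented response, but verify them against your API version, and the sample data is illustrative only:

```python
def latest_object_id(pushes_response):
    """Given the parsed JSON of GET .../pushes?includeRefUpdates=true,
    return the newObjectId of the most recent push (the response lists
    pushes from newest to oldest)."""
    return pushes_response["value"][0]["refUpdates"][0]["newObjectId"]

# Trimmed sample shaped like the response shown above.
sample = {
    "value": [
        {
            "refUpdates": [
                {
                    "name": "refs/heads/master",
                    "oldObjectId": "0000000000000000000000000000000000000000",
                    "newObjectId": "5e108508e2151f5513fffaf47f3377eb6e571b20",
                }
            ]
        }
    ]
}

# This id becomes the oldObjectId of the next push we send.
parent = latest_object_id(sample)
print(parent)
```

With `parent` in hand, you would plug it into the `refUpdates` section of the next push body and change `changeType` to `edit`, exactly as in the request above.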