Encrypting disks on an Azure VM

Due to all the new privacy and data-protection rules like GDPR, I see a lot of companies looking at using disk encryption on their servers. In this case, a customer asked me to enable Azure Disk Encryption on their Azure VMs. Azure Disk Encryption encrypts the disks of your Azure VM at rest using the well-known BitLocker technology. This is free, as in free beer: there is no extra charge for encrypting your disks in Azure. The cryptographic keys are stored in Azure Key Vault and are used to encrypt and decrypt the virtual disks attached to your VM. You retain control of these keys and can audit their use through the Key Vault. An Azure Active Directory service principal is used for issuing these cryptographic keys as VMs are powered on and off.

Process for encrypting a VM

The process for encrypting a VM is pretty straightforward:

  1. Create a cryptographic key in Azure Key Vault
  2. Configure the key to be usable for Azure Disk Encryption
  3. Create an Azure Active Directory service principal to be able to read the key
  4. Run the PowerShell command to encrypt your virtual disks, specifying the Azure Active Directory service principal and appropriate cryptographic key to be used
  5. The Azure AD service principal will request the required cryptographic key from Azure Key Vault
  6. The virtual disks are encrypted using the provided key

There are some requirements and limitations to keep in mind. Encryption can only be applied to VMs in the standard tier, and all resources (Key Vault, Storage Account and VM) should be in the same region. You can’t use Azure Disk Encryption for VMs created in the classic deployment model, and you can’t update the key on already encrypted VMs.

 

So, let’s code!

In the following examples, we’ll use PowerShell to enable Azure Disk Encryption on the ‘myVm’ virtual machine in the ‘myResourceGroup’ resource group. All resources are based in the West Europe region. Of course, before you start, make sure that you are running the latest AzureRM module in PowerShell.

First of all, we’ll set some variables and register the Azure Key Vault provider.

Next, we’ll create the key vault. I choose to use a specific key vault for storing my key, but you can use an existing key vault. Of course you can freely choose the name of your vault 🙂 Make sure to enable the key vault for disk encryption.

Next, we’ll create a software-protected key in the vault to use with Azure Disk Encryption. This is the key that will actually be used for encrypting and decrypting your disks. Again, pick your own name for the key.

When virtual disks are encrypted or decrypted, you specify an account to handle the authentication and exchanging of cryptographic keys from the key vault. This account, an Azure Active Directory service principal, allows the Azure platform to request the appropriate keys on behalf of the VM. We’ll create the service principal using the following code. When choosing your password, keep the Azure AD password policies and restrictions in mind.
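A sketch of what that could look like (the application name, URIs and password are placeholders; depending on your AzureRM version, the password may need to be a SecureString):

```powershell
# Create an Azure AD application and a service principal for disk encryption
$appName     = 'MyDiskEncryptionApp'
$appPassword = 'Choose-A-Strong-Password-Here!'   # mind the Azure AD password policies

$app = New-AzureRmADApplication -DisplayName $appName `
                                -HomePage 'https://mydiskencryptionapp' `
                                -IdentifierUris 'https://mydiskencryptionapp' `
                                -Password $appPassword
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId
```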

After creating the service principal, we have to set permissions on the key stored in the key vault to be read by the service principal.
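The service principal only needs to wrap keys and write secrets, so something like this is enough:

```powershell
# Allow the service principal to use the key and write secrets to the vault
Set-AzureRmKeyVaultAccessPolicy -VaultName $keyVaultName `
                                -ServicePrincipalName $app.ApplicationId `
                                -PermissionsToKeys 'WrapKey' `
                                -PermissionsToSecrets 'Set'
```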

So, now all prerequisites are met to start encrypting disks. We kick off the encryption process on our VM.
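A sketch of the command, using the variables set earlier:

```powershell
# Gather the vault and key details
$keyVault            = Get-AzureRmKeyVault -VaultName $keyVaultName -ResourceGroupName $rgName
$keyVaultUrl         = $keyVault.VaultUri
$keyVaultResourceId  = $keyVault.ResourceId
$keyEncryptionKeyUrl = (Get-AzureKeyVaultKey -VaultName $keyVaultName -Name $keyName).Key.kid

# Start the encryption on the VM
Set-AzureRmVMDiskEncryptionExtension -ResourceGroupName $rgName `
                                     -VMName $vmName `
                                     -AadClientID $app.ApplicationId `
                                     -AadClientSecret $appPassword `
                                     -DiskEncryptionKeyVaultUrl $keyVaultUrl `
                                     -DiskEncryptionKeyVaultId $keyVaultResourceId `
                                     -KeyEncryptionKeyUrl $keyEncryptionKeyUrl `
                                     -KeyEncryptionKeyVaultId $keyVaultResourceId
```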

You will be prompted to continue with the encryption. Once you accept, the encryption process starts and the VM will be restarted along the way.

Once the process has completed, you can check the status with the following command:
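Using the variables from before:

```powershell
Get-AzureRmVMDiskEncryptionStatus -ResourceGroupName $rgName -VMName $vmName
```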

The output will look something like this:
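Roughly like this (the exact values depend on the state of your VM):

```powershell
OsVolumeEncrypted    : Encrypted
DataVolumesEncrypted : Encrypted
```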

That’s it! You just encrypted your disks using Azure Disk Encryption. Want to know more about this feature? Check the documentation.

 

Getting Intune device config Powershell scripts via the Graph API

For a few months now, it has been possible to upload PowerShell scripts to Intune as part of your device configuration policies. These scripts are then pushed to the linked Windows devices and run either under the SYSTEM account or as the logged-on user.

While working on an Intune deployment, I wanted to check the PowerShell scripts that are currently in use, and found out you can’t do that through the portal. You can change the properties of the script and upload a new file, but you can’t view the current script.

Looking for a way to make the script visible, I started playing around with the Graph API, to see if we can do it via this route. Spoiler: we can! 🙂

First of all, we need to authenticate to the Graph API. There is some great example code on the Microsoft Graph GitHub pages that explains how to do this, so I won’t go into any detail here. The script block I use to authenticate results in an $authHeader hashtable that we can include in our REST calls to the Graph.

First, I set a few variables in my script that I can re-use in my calls:
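For example:

```powershell
# Graph API version and resource used in the calls below
$graphApiVersion = 'beta'
$graphResource   = 'deviceManagement/deviceManagementScripts'
```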

We need to use the beta version of the API, because the resources we need (deviceManagement/deviceManagementScripts) are not exposed in the current stable version.

So, let’s make our first call to the API to see what results we get back.
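A minimal version of that call, building on the variables above:

```powershell
$uri = "https://graph.microsoft.com/$graphApiVersion/$graphResource"
Invoke-RestMethod -Uri $uri -Headers $authHeader -Method Get
```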

We set the URI we want to call, including the API version and resource specified earlier. Next, we call this URI using the Invoke-RestMethod cmdlet, including the authentication header we retrieved in the beginning. We use the ‘GET’ method, because we want to retrieve data. Because we set the resource to deviceManagementScripts, the response will include the deviceManagementScripts currently in use.

The response is a PSObject with several properties. Of course, we are most interested in the ‘value’ property, as this holds the actual data we are looking for. So, we rewrite our line of code to get just the ‘value’.
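Like so:

```powershell
(Invoke-RestMethod -Uri $uri -Headers $authHeader -Method Get).value
```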

This returns the actual deviceManagementScripts that are currently in use.

In my case, this is only one script that apparently is used to redirect certain folders to OneDrive.

By referencing the id for this script in our API call, we can get more information on this particular object.
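Something like this, where the id is a placeholder for the id returned by the previous call:

```powershell
$scriptId     = '<id-from-the-previous-call>'
$uri          = "https://graph.microsoft.com/$graphApiVersion/$graphResource/$scriptId"
$singleScript = Invoke-RestMethod -Uri $uri -Headers $authHeader -Method Get
$singleScript
```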

Again, we use the ‘GET’ method to retrieve the information from the Graph API. Because we are referencing this single object, we no longer need to explicitly retrieve the ‘value’ property to get the actual data.

For readability I truncated the ‘scriptContent’ property shown here a bit, but as you can see we can retrieve all the information we have in the portal: the description, the runAsAccount (which can be either ‘user’ or ‘system’), whether a signature check is enforced, and of course the filename of the script and the actual content.

The content of the script is stored (and displayed) in a Base64-encoded string. To make this human readable, we need to decode it.
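A sketch of the decoding, using the object retrieved above:

```powershell
# Grab the Base64-encoded script content and decode it to a readable string
$script64      = $singleScript.scriptContent
$decodedscript = [System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($script64))
```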

First, we use the $script64 variable to store the Base64-encoded string. Next, we decode the Base64 string to UTF8 and store it in the $decodedscript variable.

When we now display this $decodedscript variable, we see the contents of the script!

Again, I truncated the output for readability here. Since we have this variable, we can store the contents in a file and thus save the script to our local hard drive.
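For example (the target path is of course up to you):

```powershell
$decodedscript | Out-File -FilePath 'C:\temp\DownloadedIntuneScript.ps1' -Encoding UTF8
```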

Pretty neat, right? But if we can use this to download a script, why shouldn’t we be able to upload a script this way? Let’s check. The documentation by Microsoft is pretty clear: we call the same deviceManagementScripts resource, but with a POST method instead of a GET. In this POST we need to include a JSON body with the details of the script we would like to set, including the actual content of the script in the same Base64 encoding we saw earlier.

So, let’s put the building blocks together. I’ve created a mind blowing Powershell-script that I stored as c:\temp\testscript.ps1

We get the content of this file and then encode it using Base64.
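For example:

```powershell
$UploadScript        = Get-Content -Path 'C:\temp\testscript.ps1' -Raw
$UploadScriptEncoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($UploadScript))
```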

So now we have the $UploadScriptEncoded variable, containing the Base64-encoded script. Next, we need to build the JSON body to include in the POST to the REST API. I do this by creating a hashtable with all the needed information and piping it to the ConvertTo-Json cmdlet.
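A sketch of that body (the display name, description and filename are just examples):

```powershell
$postbody = @{
    displayName           = 'My test script'
    description           = 'Uploaded through the Graph API'
    scriptContent         = $UploadScriptEncoded
    runAsAccount          = 'user'
    enforceSignatureCheck = $false
    fileName              = 'testscript.ps1'
} | ConvertTo-Json
```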

In the hashtable I specify the display name for the script and a short description, include the scriptContent we just encoded, specify that it should run in the user context and that we don’t want to check for a valid signature. Finally, we give the filename for the script that will be displayed in the Intune portal where the script is referenced.

To finish up, we call the API with the given parameters to do the actual uploading.
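Putting it together:

```powershell
$uri = "https://graph.microsoft.com/$graphApiVersion/$graphResource"
Invoke-RestMethod -Uri $uri -Headers $authHeader -Method Post -Body $postbody -ContentType 'application/json'
```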

We call the URI specified earlier, including our authentication header, with the POST method. We include the JSON we stored in $postbody as the body of the request, specifying that this is indeed JSON.

The response indicates that the file was uploaded and configured.

We can now check the Intune portal to double check if the script is there.

There you have it: using the Graph API you can do stuff you can’t do in the portal and automate many of the things you do in Intune. For example, you can place the PowerShell scripts you deploy using Intune in a repository in (for example) VSTS and create a build sequence that uses the Graph API to update the script in Intune every time you push to master. If you haven’t played around with the Graph API, now is the time to do so. The possibilities are endless.

Happy scripting!

 

Getting VPN logging from Azure

Today, I was working with a customer who had some issues with a VPN connection from Azure to his on-premises environment. After checking the configuration, everything seemed to be okay. I decided to run some logging on the VPN connection from the Azure side, to see if I could pinpoint the issue. For that, I usually use the following script, of course after logging in to Azure in PowerShell using Add-AzureAccount:
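A sketch of that script, using the classic Azure module and Out-GridView to do the prompting:

```powershell
# Prompt for the subscription, storage account and vNet to use
$subscription = Get-AzureSubscription | Out-GridView -Title 'Select your subscription' -OutputMode Single
Select-AzureSubscription -SubscriptionId $subscription.SubscriptionId

$storageAccount = Get-AzureStorageAccount | Out-GridView -Title 'Select your storage account' -OutputMode Single
$storageKey     = (Get-AzureStorageKey -StorageAccountName $storageAccount.StorageAccountName).Primary
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccount.StorageAccountName `
                                          -StorageAccountKey $storageKey

$vnet = Get-AzureVNetSite | Out-GridView -Title 'Select your vNet' -OutputMode Single

# Start 300 seconds of diagnostics logging on the vNet gateway
Start-AzureVNetGatewayDiagnostics -VNetName $vnet.Name `
                                  -StorageContext $storageContext `
                                  -CaptureDurationInSeconds 300
```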

This script prompts you to select your Azure subscription, storage account and vNet, and then starts 300 seconds of logging on the vNet gateway in that vNet, writing the output of these logs to the storage account you specified.

However, when the Start-AzureVNetGatewayDiagnostics cmdlet was run, I received an error that the StorageContext could not be read. After some Google searches, I found out I was running into this issue. As discussed there, uninstalling the Azure module and reinstalling the 3.8.0 version solved my issue and I could start the logging.

Once done, you can retrieve the logged data using the following code:
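For example:

```powershell
# Retrieve the URL of the diagnostics log and download its contents
$logUrl = (Get-AzureVNetGatewayDiagnostics -VNetName $vnet.Name).DiagnosticsUrl
Invoke-WebRequest -Uri $logUrl -OutFile 'C:\temp\vpndiagnostics.txt'
```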

In my case, it turned out there was a mismatch in the preshared key between the Azure side of the VPN connection and the on-premises firewall.

If you want to retrieve the PSKs of all VPNs in a specific vNet, you can do that using PowerShell.
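A sketch, again using the classic Azure module:

```powershell
$subscription = Get-AzureSubscription | Out-GridView -Title 'Select your subscription' -OutputMode Single
Select-AzureSubscription -SubscriptionId $subscription.SubscriptionId

$vnetName = Read-Host -Prompt 'Enter the vNet name'

# Get the pre-shared key for every connection on this vNet gateway
Get-AzureVNetConnection -VNetName $vnetName | ForEach-Object {
    Get-AzureVNetGatewayKey -VNetName $vnetName -LocalNetworkSiteName $_.LocalNetworkSiteName
}
```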

This prompts for your subscription and vNet name, and outputs the PSKs of all VPNs in that vNet.

Azure AD Group Based Licensing

When working with larger Office 365 and/or EM+S deployments, one of the pains for me has always been the automation of license assignment. You can provision users in Azure AD (and thus in Office 365) automatically using Azure AD Connect, and you could even add some magic with PowerShell scripts to assign a license to these users based on OU or group membership in your on-prem AD. The removal of these licenses takes even more scripting, where you would need to compare your on-prem group membership with the active licenses and remove licenses when needed.

About two weeks ago, Microsoft announced something to fill this gap: Azure AD Group based licensing. Currently, this feature is available in public preview. While in preview, you’ll need a paid Azure AD subscription to use the feature, like the one included in EM+S. Just a plain Office 365 tenant won’t be sufficient. When the feature hits GA, this prerequisite will be lifted.

Using the feature is as simple as it sounds. From the (new) Azure portal, you navigate to the Azure AD resource and click the ‘licensing’ tab. From there, you can navigate to all the licenses that are available in your environment.

By clicking through to one of the available licenses, you get the option to assign this license to a group.

All Microsoft Online services that require user-based licensing are supported and are displayed in this pane.

When selecting the ‘groups’ option, you can search for and select the group you would like to assign this license to. This group can live solely in Azure AD, or can be a security group synced from your on-prem AD.

 

You can specify which parts of the license you would like to assign, so you can create a pretty fine-grained solution using various groups to enable or disable specific features.

All in all, this is a pretty neat solution to provide some ease in managing user licenses for Microsoft Online services.

There is some good documentation on this feature on the Microsoft website. Of course, as this is a feature that’s currently in public preview, there are some known issues and limitations, which are also pretty well documented.

For me, I can’t wait to get this feature implemented at some of the larger tenants I manage, to ease the administrative tasks of on- and offboarding users.

Backup your Windows 10 machine to Azure

Cloud first, mobile first, right? But what do I do with the data that lives on my machine, but that I don’t want to (or cannot) sync with OneDrive? I need to back up that data!

Well, you can. The Azure team recently announced the possibility of backing up your Windows 10 machine to Azure!

Sounds cool? I sure do think so. So let’s check that out. 🙂

First of all, we sign in to the Azure Management Portal to make sure we have a place to back up that important file and folder data. Backup data is stored in a Backup Vault, so we navigate to new -> data services -> recovery services -> backup vault and hit the ‘quick create’ button.


The only things we need to specify are the name for the vault, the location to store the data and the subscription we wish to use for billing.


Remember: Location, Location, Location! In Azure terms, North Europe is the Dublin datacenter, while West Europe is the Amsterdam datacenter. All those maps showing that Amsterdam is more north than Dublin are wrong, people! 😉

Anyway, we hit the ‘create vault’ button and the vault will be created. This can take some time; you can of course check the status by monitoring the notifications at the bottom of the portal.


Once the vault has been created, the portal will tell us so.


We can find the newly created vault on the ‘recovery services’ tab in the Azure portal.

Right after you have created the vault, you can check the storage redundancy options in the configuration. This would be the best moment to do so. Once an item has been registered to the vault, the storage redundancy option is locked and you can no longer modify it!

You should evaluate your business needs when choosing the best replication setting. If you plan to use Azure as your primary backup endpoint, you should consider using the geo-redundant option. That way, you’ll have six copies of your data: it’s replicated three times in the primary storage region, and three times in a secondary region. That way your data will be durable in two separate regions.

The locally redundant storage option maintains just the three copies in the primary region, within a single facility. That way, LRS protects your data from normal hardware failure, but not from the failure of an entire Azure facility. If you are not using Azure as your primary backup endpoint, you can go with the LRS option to save on the cost of the storage.

Once we have decided on the way to replicate our data and chosen the corresponding setting, we can go ahead and download the vault credentials. These will be used to authenticate the backup source to the backup vault.

The vault credential file is downloaded through a secure channel from the Azure portal. To do this, go to the ‘quick start’ view of the backup vault you just created.


There, you can click the ‘download vault credentials’ link to download the vault credential file. The file will be generated using a combination of the vault name and the current date.



Now, it’s time to set up the agent on each of the machines you want to protect. You can download the agent from the same window that you used to download the vault credentials file.


Once the agent has finished downloading, simply run it to start the installation.

First, we have to specify the necessary file paths. You could just ‘next’ this page, but if you’re going to back up a lot of data or have a small system partition, it could be wise to change the cache path.


After setting proxy settings (when necessary) and checking some prerequisites, we can start the installation.

Once the installation has completed, click the ‘proceed to registration’ button to continue with setting up the configuration.


Click the ‘browse’ button and navigate to the vault credentials file you downloaded earlier.


The file will be validated and we can continue to the next step.


Your backup will be encrypted using a passphrase. You can type your own passphrase, or have the software generate one for you. The passphrase will be stored in the location that you specify so that you can retrieve it later. It’s wise to store this passphrase in a separate location, for example on a USB thumb drive. Without the passphrase, you won’t be able to get access to your data!


Once done, the server will be registered and you can launch the agent.


In the agent, we start with setting up a new backup job. Click ‘schedule backup’ in the actions pane.


The wizard is nice and easy. Select the files or folders you want backed up, and specify any exclusions if you’d like.


Specify when to perform a backup, with a maximum of three times a day.


Next, specify the retention policy used for your data.


Finally, set the initial backup type. In my case, I’ll do it automatically using my broadband connection.

If you have lots of data to back up, you might want to seed the backup by shipping a disk to the nearest Azure datacenter.

And that’s it!


Once the process completes, go back to the actions pane and click the ‘backup now’ button to complete the initial seeding.


This will start the initial backup. You can close this window if you want; it will run in the background.


In the agent, the ‘jobs’ tab will keep you informed of the progress.



All done!


That’s it! Your precious data is now protected by a backup to Azure. In one of my next blog posts, I will run through the steps of restoring your data from the backup when needed. Stay tuned!