Document Azure subscription with PowerShell

Posted: 12th April 2016 by Anders Bengtsson in Azure, Scripts

I would like to share an idea around documentation for an Azure subscription, and hopefully get some ideas and feedback about it. What I see at customers is that documenting which resources are deployed to Azure is a challenge. It is also a challenge to easily get an overview of configuration and settings. Fortunately, with Azure PowerShell we can easily get information about all resources in Azure. I have built an example script that exports some settings from Azure and writes them to a Word document.

The example script will export information about Virtual Machines, Network Interfaces and Network Security Groups (NSG). If you look in the script you can see examples of reading data from Azure and writing it to the Word document. You could of course read any data from your Azure environment and document it in Word. A benefit of a script is that you can schedule it to run at intervals, so you always have up-to-date documentation of all your Azure resources.
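To give an idea of the pattern such a script can use, here is a minimal sketch that writes a list of virtual machines to a Word document. It assumes the AzureRM module, an already signed-in session (Login-AzureRmAccount) and Word installed locally; the file path and the properties written are just examples.

# Assumes an existing Azure session and Word installed on the machine
$word = New-Object -ComObject Word.Application
$doc  = $word.Documents.Add()
$sel  = $word.Selection

$sel.TypeText("Azure virtual machines")
$sel.TypeParagraph()

# One line per VM; network interfaces and NSGs can be added the same way
Get-AzureRmVM | ForEach-Object {
    $sel.TypeText("$($_.Name) - $($_.ResourceGroupName) - $($_.HardwareProfile.VmSize)")
    $sel.TypeParagraph()
}

$file = "C:\Temp\AzureDocumentation.docx"
$doc.SaveAs([ref]$file)
$word.Quit()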

Another thing I was testing was building Visio drawings with PowerShell. To do this I used this PowerShell module. The idea with this example is to read data from Azure and then draw a picture. In my example I included virtual machines and their related storage accounts and networks.
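If you just want to see the mechanics, Visio can also be driven directly through COM without any extra module. This is a rough sketch of that approach rather than the module mentioned above; the shape coordinates and the file path are made up for the example.

# Assumes Visio is installed locally; coordinates and file path are placeholders
$visio = New-Object -ComObject Visio.Application
$doc   = $visio.Documents.Add("")
$page  = $visio.ActivePage

# Draw one rectangle per VM; storage and network shapes plus connectors can be added the same way
$y = 10
Get-AzureRmVM | ForEach-Object {
    $shape = $page.DrawRectangle(1, $y, 3, $y - 0.5)
    $shape.Text = $_.Name
    $y = $y - 1
}

$doc.SaveAs("C:\Temp\AzureOverview.vsdx")
$visio.Quit()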

Download my example PowerShell scripts here.

 

Note that this is provided “AS-IS” with no warranties at all. This is not a production ready solution for your production environment, just an idea and an example.

In this post I would like to share some ideas around monitoring Azure Backup Server and backup jobs with Microsoft Operations Management Suite (OMS). OMS comes with a solution for Azure Backup. With this solution I can see that the Azure Backup vault protects three servers and is using a total of X GB. If I click on “3 registered servers” I can see that these three servers are my Azure Backup Servers. The machines that are being protected by the backup servers are not shown. As a backup administrator you often need to know more than the number of backup servers and the used space. In this blog post I will show you how to collect and visualize that information with OMS :)

The first thing to do is to install the OMS agent on the Azure Backup Server. Once the agent installation has completed successfully, it is time to configure OMS to collect DPM events. Add the DPM Backup Events, DPM Alerts and CloudBackup event logs under Settings/Data. But before any events are written to these event logs, Azure Backup Server needs to be configured to publish backup events and alerts. This configuration is done in the Microsoft Azure Backup console, in the Management workspace under Options.

Once backup-related events start coming into OMS it is time to configure filters to visualize what we want to see. The following filter will get all successful backup jobs. Event ID 33222 indicates a successful backup job and event ID 33223 a failed backup job.

Type=Event EventLevelName=information EventID = 33222 TimeGenerated>NOW-8HOURS | sort Computer

But as you can see in the figure, all values in the Computer column are the Azure Backup Server. I would like to see which data source was protected and also on which server. To do this, you can use custom fields in OMS. With custom fields we can extract data from the event and index it as new fields.

In the next two figures I have extracted the protected server and the data source from the 33222 and 33223 events, from the ParameterXML parameters. As you can see, we now have one column for the protected server and one column for the data source. We could combine this into one filter, showing both failed and successful jobs, but I think it is better with two filters when we start using them in My Dashboard.

We could also run a query like this to get all machines and their latest successful backup.

It can also be interesting to have a filter that shows all protected servers that don't have a successful backup for the last X hours. In my lab environment I have some events from before I extracted fields from the events, as you can see below.

Once we have saved our filters to favorites we can use them in My Dashboard. We now have a quick overview of our backup jobs on the Azure Backup Server. Of course you can add a number of filters to get more information onto your dashboard.

Tao (@MrTaoYang), Stan (@StanZhelyazkov), Pete (@pzerger) and I have been working on a project for the last few weeks. We wanted to bring a learning resource for the MS Operations Management Suite to the community that is complete, comprehensive, concise…and free (as in beer). While we finish final editing passes over the next couple of weeks, we wanted to share an early copy of the book so you can start digging in while we finish our work!

Description: This preview release of “Inside the Microsoft Operations Management Suite” is an end-to-end deep dive into the full range of Microsoft OMS features and functionality, complete with downloadable sample scripts (on Github). The chapter list in this edition is shown below:

  • Chapter 1: Introduction and Onboarding
  • Chapter 2: Searching and Presenting OMS Data
  • Chapter 3: Alert Management
  • Chapter 4: Configuration Assessment and Change Tracking
  • Chapter 5: Working with Performance Data
  • Chapter 6: Process Automation and Desired State Configuration
  • Chapter 7: Backup and Disaster Recovery
  • Chapter 8: Security Configuration and Event Analysis
  • Chapter 9: Analyzing Network Data
  • Chapter 10: Accessing OMS Data Programmatically
  • Chapter 11: Custom MP Authoring
  • Chapter 12: Cross Platform Management and Automation

Download your copy here!

SharePoint Online as frontend for Azure Automation

Posted: 21st December 2015 by Anders Bengtsson in Azure

Back in the Orchestrator days we had the Service Manager self-service portal that we could use to submit items that triggered runbooks in Orchestrator. The integration between Service Manager and Orchestrator worked great and the self-service portal brought a lot of value to automation scenarios. But times change, and now we have a new executor in Azure Automation :)

The challenge is that there is no connector between Azure Automation and Service Manager or any other portal. In this blog post we will look at how we can use SharePoint Online as a frontend to Azure Automation. The process for a new request will be:

  1. The user submits a new item in a SharePoint list
  2. A SharePoint workflow triggers an Azure Automation runbook
  3. Azure Automation does its magic
  4. Azure Automation updates the list item in SharePoint
  5. The user sees the result in the SharePoint list

Setting up SharePoint

  1. Sign in to SharePoint Online
  2. Add a custom list, click on the Add list tile
  3. Download and install SharePoint Designer on your workstation
  4. Once you have installed SharePoint Designer, click on Edit List in the List toolbox; SharePoint Designer will start and load your SharePoint site
  5. In SharePoint Designer, click Edit List columns
  6. In the Edit List view, use Add New Column and Column Settings to configure the list as you need it. In my example I have a list with a number of fields that are needed to create a new service account in Active Directory. I have also added a column named Result that Azure Automation uses to write back the result from the runbook. There is also a column named Azure Automation Status that is used to report back the response when submitting the job to Azure Automation. The SP workflow and this column will be created automatically when we connect a workflow to the list.
  7. When the list looks the way you want it, click SAVE, go back to SharePoint and refresh the page

The list is now created. You can click New Item in the list view and submit new items. You can click Edit this view and add the ID column. The runbook will use the ID field to keep track of which list item to work with.

Setting up Azure Automation

The next step is to set up the Azure Automation runbook and configure the webhook. More general information about webhooks can be found here.

  1. The first thing we need to do is configure Azure Automation with a SharePoint Online module. Tao Yang has a good blog post about this. Tao's blog post is about importing the module into SMA, which you should not try to do :) instead, only follow Tao's steps to build the ZIP file. You can download his ZIP file and then add the two DLL files that he also links to.
  2. Once you have the complete ZIP file, browse to your automation account in Azure Automation and click on Assets and then Modules
  3. On the Modules page, click Add a module, and upload the ZIP file. Remember that if you are planning to use a Hybrid worker the module must be installed on all hybrid workers too
  4. After the module is imported you need to set up a connection to your SharePoint site. Remember that the service account used cannot be configured with two-factor authentication, and the account also has to have permissions on the SharePoint site.
  5. I have put together an example runbook for this scenario, which can be found here. It will first show/output all data that comes from the webhook. It will then connect to SharePoint and get the current list and list item. At the end of the runbook an account is created and a hash table is built to write the result back into SharePoint. Either use my example runbook or build a new runbook; a simplified sketch of the pattern is shown after this list.
  6. The next step, after the runbook is in place, is to create a webhook: click on the runbook, click Webhook and add a new webhook. Remember to copy the webhook URL before clicking Create
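A minimal sketch of such a webhook-triggered PowerShell runbook, assuming the SharePoint workflow posts the list fields as JSON in the request body. The field names and the commented-out SharePoint cmdlet are placeholders; use the columns from your own list and the cmdlets from the module you imported.

param (
    [object]$WebhookData
)

if ($WebhookData -ne $null) {
    # Show everything that comes from the webhook (useful while testing)
    Write-Output $WebhookData.RequestHeader
    Write-Output $WebhookData.RequestBody

    # Parse the JSON body posted by the SharePoint workflow; field names are examples
    $item        = ConvertFrom-Json -InputObject $WebhookData.RequestBody
    $accountName = $item.AccountName
    $listItemId  = $item.ID

    # ... connect to SharePoint, create the service account ...

    # Write the result back to the list item, for example (placeholder cmdlet):
    # Update-SPListItem -ListItemID $listItemId -Fields @{ Result = "Account $accountName created" }
}
else {
    Write-Error "This runbook is designed to be started from a webhook"
}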

Configure the SharePoint workflow

It is now time to configure the SharePoint workflow that will trigger the runbook when a new list item is created.

  1. Open SharePoint Designer and load your SharePoint site
  2. In SharePoint Designer, click Lists and Libraries on the left side, then click on your list
  3. On the right side on SharePoint Designer, click New… next to Workflows
  4. Name the new workflow; in my example it is Workflow 0003. Use SharePoint 2013 Workflow as the platform type
  5. When the workflow is created, configure it to start automatically when a new item in the list is created
  6. Click Edit Workflow to start building the new workflow
  7. The workflow, when complete, should look like this
  8. The first step is to build a Dictionary, mapping list fields to variables
  9. The next step is to call the runbook webhook; paste in the webhook URL. Remember to change the request type to HTTP POST

  10. The last step in the workflow posts the response from Azure Automation back to a column in the list. When the workflow does the HTTP POST to trigger the runbook, a message is sent back; that is the message you write back to the column. This acts as a simple log of whether the job was submitted to Azure Automation successfully
  11. When all steps in the workflow are configured, click SAVE and then PUBLISH in SharePoint Designer.

Testing 1-2-3

We have now built a list in SharePoint and a workflow in SharePoint that will invoke a runbook. The runbook performs some action, in this example it creates an AD account, and sends the result back to the list in SharePoint.

We fill in the list item. I guess that with a bit more SharePoint skills it would be possible to hide the two last fields when filling in the information, as those two fields are only used to store status.

After a couple of seconds we can see that the workflow has run (Stage 1) and that there is a response from Azure Automation from triggering the webhook (Accepted).

In Azure Automation we can see that the job has completed. We can see a lot of information in the output from the runbook. In a production runbook you might want to trim down all the extra code and output :)

If we go back to SharePoint and refresh, we can see there is a result saying the account already exists in AD and no new account has been created. If we submit a new item with a request for another account it works, and the new account is created :)

If you would like to add approval steps to your solution, read more here.

OMS black belt Jakob also has great ideas about using SharePoint Online that I recommend; read it here

Moving a VHD between storage accounts

Posted: 16th November 2015 by Anders Bengtsson in Azure

Moving a virtual machine (VM) between storage accounts sounds like an easy task, but it can still be a bit complicated :) In this blog post I will show how this can be done with AzCopy. AzCopy is a command-line utility designed for high-performance uploading, downloading, and copying of data to and from Microsoft Azure Blob, File, and Table storage. Read more about AzCopy and download it here. When using AzCopy you need to know the key for your storage account. This key can be found in the Azure portal, both the preview portal and the classic portal.

When you are copying a VHD file for a VM you need to know the storage account URL and the name of the VHD file. This can be found:

  • Classic portal. Click Virtual Machines, click Disks and look at the Location column. The first part of the Location URL is the name of the storage account.
  • Preview portal. Click a virtual machine, click All settings, click Disks, click on the disk and look at the VHD location. The first part of the Location URL is the name of the storage account.
  • If your target is a v2 storage account, managed in the preview portal, and you need to know the URL of the storage account, click the storage account, click All settings, Access Keys and then look at the Connection String. In the connection string you will find BlobEndpoint; this is the URL to use as the destination.

In this scenario I will move a VHD located at https://psws5770083082067456747.blob.core.windows.net/vhds/geekcloud-geekcloud-dc01-2013-11-22.vhd to a new storage account in v2, https://contosowelr001.blob.core.windows.net/

  1. Start a Command Prompt
  2. Go to the C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy folder
  3. Run the following command to start the copy job (a worked example for this scenario follows the list). In the image I have covered the two keys, but I think you still get the command :) AzCopy.exe /source:<The https:// URL to your source storage account and container> /Dest:<The https:// URL to your destination storage account and container> /sourcekey:<The primary key for the source storage account> /destkey:<The primary key for the target storage account> /pattern:<file name>
  4. Wait…
  5. When the copy job is completed, verify that it was successful
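For the scenario above, the command would look roughly like this. The destination container name (vhds) is an assumption, and the keys are your own storage account keys.

AzCopy.exe /source:https://psws5770083082067456747.blob.core.windows.net/vhds /Dest:https://contosowelr001.blob.core.windows.net/vhds /sourcekey:<source storage account key> /destkey:<destination storage account key> /pattern:geekcloud-geekcloud-dc01-2013-11-22.vhd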

If you want to list the files you have in an Azure storage account, you can use PowerShell. In this example we first set up a storage account context and then list the files in each container in the storage account.

# Create a storage account context using the account name and key
$blob_account = New-AzureStorageContext -StorageAccountName contosowelr001 -StorageAccountKey xl6DMUmnbKvr -Protocol https
# List the blobs in every container in the storage account
Get-AzureStorageContainer -Context $blob_account | ForEach-Object {
    $container = $_.Name
    Get-AzureStorageBlob -Container $container -Context $blob_account | Select-Object Name
}

You can also see the files in the Azure portal, under the storage account/blobs/containers/

Using Custom Fields in OMS

Posted: 9th November 2015 by Anders Bengtsson in Microsoft Operations Management Suite

Last week I was working with custom fields in Operations Management Suite (OMS). I would like to share an example of where custom fields can be used. In this example we have a custom application named Contoso Invoicing Software that writes events (event ID 580) to the Application event log. These events are collected by OMS, but the challenge is that the event description is not a searchable field; in the first example the event description is “User profile can't be loaded”. The goal is to have the event description as a filter on the left side of the Log Explorer.

With custom fields we can tell OMS to index the event description and show it as a filter. Click next to the field, in this example the ParameterXML field, and select Extract fields from…

Select the text you want OMS to learn, and input a name for the new field.

Click Extract and you will see a message that OMS is now learning

Once the learning process is complete you will see a summary on the right side and can click Save Extraction.

You have now taught OMS to extract the data from events with ID 580 in the Application log. This works only on new events, not on events that were already collected. Once new data has arrived in OMS you can filter and group based on the new custom field :)

New Book on Microsoft Azure IaaS

Posted: 19th September 2015 by Anders Bengtsson in Books and courses

Pete Zerger, John McCabe and I are working on a book, Microsoft Azure IaaS: Integration, Optimization and Automation. The book is part of the Inside the Microsoft Cloud series. The idea of the book is to provide technical details of the latest Azure features and capabilities for IT pros. The book is written by IT pros for IT pros, with a lot of real-world experience. The book should be available for purchase later this year; however, some chapters are available for download earlier. The first chapter that is available for download, by Veeam®, is the networking chapter, read it here :)

“Run As” with Azure Automation Hybrid Worker

Posted: 30th June 2015 by Anders Bengtsson in Azure

Runbooks in Azure Automation cannot access resources in your local data center since they run in the Azure cloud. The Hybrid Runbook Worker feature of Azure Automation allows you to run runbooks on machines located in your data center in order to manage local resources. The runbooks are stored and managed in Azure Automation and then delivered to one or more on-premise machines where they are run. Source.

By default the runbook will run in the context of the local system account on the Hybrid Runbook Worker. This might be a challenge, as the computer account is seldom assigned any permissions, even if that is possible. The scenario I was working on included the creation of a service account in an on-premises domain. My first idea was to change the service account for the Microsoft Monitoring Agent service, but that did not work out very well :(

My next idea was to have the Azure Automation runbook open a remote session to a server, using a domain account as credentials. The runbook shown in this blog post is an example of how to do a remote session from within an Azure Automation runbook. The runbook uses two input parameters, first name and last name. The runbook picks up the account (SKYNET Super User) that will be used to remotely connect to a domain controller. The account is stored encrypted in Azure Automation as an asset. The inline script session returns the name of the new user account, which is also returned as output (Write-Output) from the runbook.
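As a rough sketch of that pattern (not the exact runbook linked above), a workflow runbook could look something like this. The credential asset name matches the one mentioned above, while the domain controller name, the account naming convention and the New-ADUser parameters are assumptions for the example.

workflow New-ServiceAccount
{
    param (
        [Parameter(Mandatory=$true)][string]$FirstName,
        [Parameter(Mandatory=$true)][string]$LastName
    )

    # Credential asset stored encrypted in Azure Automation
    $cred = Get-AutomationPSCredential -Name 'SKYNET Super User'

    # Run the account creation remotely on a domain controller (DC01 is a placeholder)
    $newAccount = InlineScript {
        $fn = $Using:FirstName
        $ln = $Using:LastName
        $name = "svc-$fn$ln"
        # Created disabled here to avoid handling a password in the sketch
        New-ADUser -Name $name -SamAccountName $name -Enabled $false
        $name
    } -PSComputerName 'DC01' -PSCredential $cred

    # Return the new account name as runbook output
    Write-Output $newAccount
}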

I start the runbook from the Azure Portal and input the two parameters. A short while later I can see that the job has completed and the output from the runbook. I can also see the new account in Active Directory.

 

Note that this is provided “AS-IS” with no warranties at all. This is not a production ready management pack or solution for your production environment, just an idea and an example.

Network Security Groups – Getting an overview

Posted: 20th June 2015 by Anders Bengtsson in Azure

I was working with Network Security Groups (NSG) earlier this week. The environment included multiple VNETs, subnets and NSGs, with associations on both VMs and subnets. It quickly became complicated to keep track of what had been configured, the associations and the NSG rules. Therefore I created a PowerShell script that generates an HTML-based report that gives me an overview. I thought I should share this with the community, even if it is a “quick-hack-with-badly-written-code” :)

To run the script, start Azure PowerShell and set up a connection to your Azure subscription. Also make sure you have a C:\TEMP folder. The script will export your Azure network configuration and read it. The script will also query your Azure subscription for information, for example virtual machines with an NSG associated. The HTML file will be named C:\temp\net.htm.
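The building blocks the script relies on look roughly like this; a minimal sketch assuming the classic (ASM) Azure PowerShell cmdlets, with the exact property names on the returned objects possibly differing slightly between module versions.

# Export the network configuration (VNETs, subnets, NSG associations) and load it as XML
Get-AzureVNetConfig -ExportToFile C:\TEMP\netcfg.xml | Out-Null
[xml]$netcfg = Get-Content C:\TEMP\netcfg.xml

# Collect every NSG rule, tagged with the NSG it belongs to
$rules = Get-AzureNetworkSecurityGroup | ForEach-Object {
    $nsgName = $_.Name
    (Get-AzureNetworkSecurityGroup -Name $nsgName -Detailed).Rules |
        Select-Object @{n='NSG';e={$nsgName}}, Name, Priority, Action, SourceAddressPrefix, DestinationAddressPrefix
}

# Turn the result into a simple HTML report
$rules | ConvertTo-Html -Title "NSG overview" | Out-File C:\temp\net.htm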

The first part of the HTML page is an overview of VNETs, subnets, address prefixes and associated NSGs. The second part of the HTML page is an overview of the NSG rules in use. In this example I have three NSGs in use, two associated with subnets and one associated with two virtual machines. There is also one NSG that is not in use at all.

Download the script List NSGv2.

 

 

Note that this is provided “AS-IS” with no warranties at all. This is not a production ready management pack or solution for your production environment, just an idea and an example.

Building your first Azure Resource Group template for IaaS

Posted: 2nd May 2015 by Anders Bengtsson in Azure

At the BUILD conference last week Microsoft announced the public preview for template-based deployments of Compute, Networking and Storage, using the Azure Resource Manager. With this new template feature we can build complex models of services and deploy them in an easy way. If you have been building large PowerShell scripts for deployment before, you should take a look at this, as it is easier to work with. In this blog post I want to show you how to get started building a Resource Manager template for a new virtual machine.

To build the template I am using Visual Studio. If you are an IT pro like me, anything involving Visual Studio is often a bit scary, but in this case it is really easy :) On top of Visual Studio you need to install the latest version of the Azure SDK and the latest Visual Studio updates. I installed both the Azure SDK and Visual Studio with default settings.

  1. Start Visual Studio and select to create a new project
  2. Select to create an Azure Resource Group project
  3. Name the new project, for example Contoso Single Server Template, and click OK
  4. On the Select Azure Template page, select Windows Virtual Machine and click OK. This template will deploy a Windows VM with a couple of different options. It is a good foundation to build on.
  5. Visual Studio has now generated three files for you, and this is all you need to get started. You now have a working template that you can start using directly. There is a configuration file (WindowsVirtualMachine.json) that includes all the settings and details of what you want to deploy to Azure Resource Manager, and there is a parameter file (WindowsVirtualMachine.param.dev.json) that includes all user-defined values that the configuration file needs. There is also a PowerShell script (Deploy-AzureResourceGroup.ps1) that is used by Visual Studio to deploy your template. You can also see AzCopy.exe in the Tools folder. AzCopy is used by PowerShell to copy files to a storage account container, if your template includes files and custom code to deploy.
  6. You can now test the template by right-clicking Contoso Single Server Template and selecting Deploy, New Deployment. Connect to your Azure subscription and configure all parameters. Then click Deploy.

  7. You can now see that the deployment has started in the Output window in Visual Studio. You can also log on to the Azure Preview portal and follow the deployment in your new resource group. Once the deployment is done, you can see the result in the Output window in Visual Studio.

    We have now deployed one instance of the new resource group template, including a virtual machine. Let’s say we want to add one more virtual machine to the template.

  8. In Visual Studio, Solution Explorer, select WindowsVirtualMachine.json and then click on the JSON Outline tab
  9. To add a virtual machine, right-click resources and select Add Resource
  10. In the Add Resource wizard, select Windows Virtual Machine. Input VM02 as Name and select the already existing storage account and virtual network. Click Add
  11. If you look in the JSON Outline window, at parameters, you can see that Visual Studio just added a number of parameters for VM02
  12. To update the already existing deployment with the new virtual machine, right-click Contoso Single Server Template and select Deploy
  13. Fill in the parameters for VM02 and then click Deploy
  14. You can watch the deployment of the second VM both from Visual Studio and the Azure Preview Portal.

We have now built a template that deploys two virtual machines. As you noticed, when we did the second deployment we needed to input the VM admin user name and password twice. Let's look at how we can use the same parameters for both virtual machines.

  1. In Visual Studio, in Solution Explorer, click the WindowsVirtualMachine.json file
  2. On the JSON Outline tab, click VM02 and you will see that the code for VM02 is highlighted; scroll to osProfile
  3. As we want to use the same parameters for admin credentials on VM02 as on the first VM, replace the VM02 parameters with the parameters used for the first VM
  4. Delete the VM02AdminUserName and VM02AdminPassword parameters in the JSON Outline tab
  5. In Visual Studio, in Solution Explorer, click the WindowsVirtualMachine.param.dev.json file
  6. In the param.dev.json file, remove the VM02AdminUserName parameter
  7. Now, do a new deployment and verify you can connect to both new virtual machines with the admin credentials

When you are done with your template you can right-click Contoso Single Server Template and choose Build (in the same place as Deploy). This will build your solution and you can then copy it from the project folder, by default C:\Users\<username>\Documents\Visual Studio 2013\Projects\Contoso Single Server Template\Contoso Single Server Template. In the Scripts folder you will find the PowerShell script that you can use to deploy the template. In the Templates folder are the two JSON files. To deploy an instance of the template you can run the PowerShell script from Azure PowerShell.
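If you prefer to call the Resource Manager cmdlets directly instead of the generated Deploy-AzureResourceGroup.ps1 script, a deployment can look roughly like this. This sketch assumes the newer AzureRM module (with the Azure PowerShell version that was current when this post was written, the cmdlet names lacked the Rm part), and the resource group name and location are just examples.

# Create (or reuse) a resource group and deploy the template into it
New-AzureRmResourceGroup -Name ContosoSingleServer -Location "West Europe"

New-AzureRmResourceGroupDeployment -ResourceGroupName ContosoSingleServer `
    -TemplateFile .\Templates\WindowsVirtualMachine.json `
    -TemplateParameterFile .\Templates\WindowsVirtualMachine.param.dev.json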

Another way to deploy with the template is to paste the template code into Template Deployment in the Azure Portal. A third way to deploy using the template is to upload it to a storage account and then feed it directly into the Template Deployment feature. A tricky part with that is that you have to escape all the special characters, for example

https://contoso11.blob.core.windows.net/scripts/WindowsVirtualMachine.json

Becomes

https://ms.portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fcontoso11.blob.core.windows.net%2Fscripts%2FWindowsVirtualMachine.json
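The escaping itself is easy to do from PowerShell, so you do not have to build the encoded string by hand. A small sketch, using the same template URL and portal link as above:

# URL-encode the template URI so it can be appended to the portal's Template Deployment link
$templateUri = "https://contoso11.blob.core.windows.net/scripts/WindowsVirtualMachine.json"
$encoded     = [uri]::EscapeDataString($templateUri)
"https://ms.portal.azure.com/#create/Microsoft.Template/uri/$encoded"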

In this blog post we have discussed how we can use Visual Studio to build resource group templates for IaaS resources. We can then deploy resources with the template from both the Azure Portal and from Azure PowerShell.