
Category Archives: Scripts

Contoso.se

Welcome to contoso.se! My name is Anders Bengtsson and this is my blog about Azure infrastructure and system management. I am a senior engineer on the FastTrack for Azure team, part of Azure Engineering, at Microsoft. Contoso.se has two main purposes: first as a platform to share information with the community, and second as a notebook for myself.

Everything you read here is my own personal opinion and any code is provided "AS-IS" with no warranties.


Update Service Map groups with PowerShell

Service Map automatically discovers application components on Windows and Linux systems and maps the communication between services. With Service Map, you can view your servers in the way that you think of them: as interconnected systems that deliver critical services. Service Map shows connections between servers, processes, inbound and outbound connection latency, and ports across any TCP-connected architecture, with no configuration required other than the installation of an agent. Machine Groups allow you to see maps centered around a set of servers, not just one, so you can see all the members of a multi-tier application or server cluster in one map. (Source: Microsoft Docs)

A common question is how to update machine groups in Service Map automatically. Last week my colleague Jose Moreno and I worked with Service Map and investigated how to automate machine group updates. The result was a couple of PowerShell examples showing how to create and maintain machine groups with PowerShell. You can find all the examples on Jose's GitHub page. With these scripts we can now use a source, for example Active Directory groups, to set up and update machine groups in Service Map.
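To give an idea of what the scripts do, here is a minimal sketch that creates a machine group through the Service Map REST API. The endpoint, api-version, payload shape and the $token variable are my assumptions based on the Service Map REST documentation; see Jose's GitHub page for the complete, tested scripts.

# Sketch only: create a Service Map machine group via the REST API (assumed endpoint/payload)
$subId = "<subscription id>"
$rg    = "<resource group>"
$ws    = "<workspace name>"
$uri   = "https://management.azure.com/subscriptions/$subId/resourceGroups/$rg" +
         "/providers/Microsoft.OperationalInsights/workspaces/$ws" +
         "/features/serviceMap/machineGroups?api-version=2015-11-01-preview"
# Group definition: a display name plus references to member machines
$body = @{
    kind       = "machineGroup"
    properties = @{
        displayName = "SQL Servers (from AD group)"
        machines    = @(
            @{ kind = "ref:machinewithhints"; id = "<machine resource id>" }
        )
    }
} | ConvertTo-Json -Depth 5
# $token = a valid ARM bearer token for the subscription (acquisition not shown)
Invoke-RestMethod -Method Post -Uri $uri -Body $body -ContentType "application/json" `
    -Headers @{ Authorization = "Bearer $token" }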

Process OMS Log Analytics data with Azure Automation

Log Analytics in OMS provides a rich set of features for processing data, for example custom fields. But there are scenarios where the current feature set is not enough.

In this scenario, we have a custom log file that logs messages from an application. From time to time the log file contains information about the number of files in an application queue. We would like to display the number of files in the queue as a graph in OMS. Custom Fields will not work in this scenario, as the log entries come in many different formats; OMS cannot figure out the structure of the log entries when not all of them follow the same structure. OMS doesn't support custom fields based on a subquery of the custom log entries, which otherwise could have been a solution.

The example in this blog post ships the data to Azure Automation, processes it, and sends it back to Log Analytics in a suitable format. This can be done in two different ways:

  • 1 – Configure an alert rule in Log Analytics to send data to Azure Automation. Azure Automation processes the data and sends it back to OMS as a new custom log
  • 2 – Azure Automation connects to Log Analytics and queries the data on a schedule. Azure Automation processes the data and sends it back to OMS as a new custom log

It is important to remember that events in Log Analytics don't have an ID. Whichever solution we choose, we must build it so that all data is guaranteed to be processed. If there is an interruption between Log Analytics and Azure Automation, it is difficult to track which events have already been processed.

One thing to note is that Log Analytics and Azure Automation display time differently. It seems that Azure Automation uses UTC when displaying time properties of the events, while the Log Analytics portal (the OMS portal) uses the local time zone (in my example UTC+2). This can be a bit tricky.
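For example, an event stamped 10:00 UTC in Azure Automation shows as 12:00 in the OMS portal in my example time zone:

# The same timestamp as Azure Automation shows it (UTC) and as the portal shows it (UTC+2)
$utcTime    = [datetime]::Parse("2017-06-01T10:00:00Z").ToUniversalTime()
$portalTime = $utcTime.AddHours(2)   # 2017-06-01 12:00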

1 – An Alert Rule pushes data to Azure Automation

In this example we need to configure both Azure Automation and Log Analytics. The data flow will be:

  • An event is inserted into Log Analytics
  • The event triggers an Alert Rule in Log Analytics, which triggers an Azure Automation runbook
  • Azure Automation gets the data from the webhook and processes it
  • Azure Automation sends the data back to Log Analytics as a new custom log

To configure this in Log Analytics and Azure Automation, follow these steps:

  1. In Azure Automation, import the AzureRM.OperationalInsights PowerShell module. This can be done from the Azure Automation account module gallery. More information about the module here
  2. Create a new connection of type OMSWorkSpace in the Azure Automation account
  3. Import the example runbook, download from WebHookDataFromOMS
  4. In the runbook, update the OMSConnection name, in the example named OMS-GeekPlayGround
  5. In the runbook, update how the data is split and what data you would like to send back to OMS. In the example I send back Computer, TimeGenerated and Files to Log Analytics (see the sketch after these steps)
  6. Publish the runbook
  7. In Log Analytics, configure an Alert Rule to trigger the runbook
  8. Done!
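As a minimal sketch of what such a runbook can look like (this is not the complete WebHookDataFromOMS runbook; the payload shape, log format and the send step are assumptions on my part):

# Webhook-triggered runbook sketch: parse the alert payload and rebuild records
param (
    [object]$WebhookData
)
# The alert rule delivers the matching events as JSON in the request body
$payload = ConvertFrom-Json $WebhookData.RequestBody
foreach ($result in $payload.SearchResults.value) {
    # Assumed log line format: "... queue contains 42 files"
    if ($result.RawData -match "queue contains (\d+) files") {
        $record = [PSCustomObject]@{
            Computer      = $result.Computer
            TimeGenerated = $result.TimeGenerated
            Files         = [double]$Matches[1]
        }
        # Convert to JSON and send back to Log Analytics as a new custom log,
        # for example with Tao Yang's OMSDataInjection module (not shown here)
        $json = ConvertTo-Json -InputObject @($record)
    }
}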

2 – Azure Automation queries Log Analytics

In this example we don't need to configure anything on the Log Analytics side. Instead, all configuration is done on the Azure Automation side. The data flow will be:

  • Events are inserted into Log Analytics
  • Azure Automation queries Log Analytics on a schedule
  • Azure Automation gets the data and processes it
  • Azure Automation sends the data back to Log Analytics as a new custom log

To configure this in Azure Automation, follow these steps:

  1. Import Tao Yang's OMSDataInjection PS module into your Azure Automation account. Navigate to the PS Gallery and click Deploy to Azure Automation
  2. Import the AzureRM.OperationalInsights PowerShell module. This can be done from the Azure Automation account module gallery. More information about the module here.
  3. Create a new connection of type OMSWorkSpace in the Azure Automation account
  4. Verify that there is a connection to the Azure subscription that contains the Azure Automation account. In my example the connection is named “AzureRunAsConnection”
  5. Import the runbook, download here, GetOMSDataAndSendOMSData in TXT format
  6. In the runbook, update the OMSConnection name, in the example named OMS-GeekPlayGround
  7. In the runbook, update the Azure connection name, in the example named AzureRunAsConnection
  8. In the runbook, update the OMS workspace name, in the example named geekplayground
  9. In the runbook, update the Azure resource group name, in the example named “automationresgrp”
  10. In the runbook, update the Log Analytics query that Azure Automation runs to get the data, in the example “Type=ContosoTestApp_CL queue”. Also update $StartDateAndTime with the correct start time. In the example Azure Automation collects data from the last hour (now minus one hour); see the sketch after these steps
  11. In the runbook, update how the data is split and what data you would like to send back to OMS. In the example I send back Computer, TimeGenerated and Files to Log Analytics.
  12. Configure a schedule to execute the runbook at suitable intervals.
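As a minimal sketch of the query part (step 10), assuming the AzureRM.OperationalInsights module and the example names above:

# Query Log Analytics for the last hour of custom log entries
$StartDateAndTime = (Get-Date).ToUniversalTime().AddHours(-1)   # now minus one hour
$EndDateAndTime   = (Get-Date).ToUniversalTime()
$result = Get-AzureRmOperationalInsightsSearchResults `
    -ResourceGroupName "automationresgrp" `
    -WorkspaceName "geekplayground" `
    -Query "Type=ContosoTestApp_CL queue" `
    -Start $StartDateAndTime `
    -End $EndDateAndTime
# Each entry in .Value is one event as a JSON document
foreach ($entry in $result.Value) {
    $event = $entry | ConvertFrom-Json
    # ...split $event.RawData here and send Computer, TimeGenerated and Files back
}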

Both solutions send back the number of files in the queue to Log Analytics as a double data type. One of the benefits of building a custom PowerShell object and converting it to JSON before submitting it to Log Analytics is that you can easily control the data type. If you simply submit the data, the data type will be detected automatically, but sometimes the automatic data type is not what you expect. With the custom PS object you can control it. Thanks to Stan for this tip. The data will be stored twice in Log Analytics: the raw data and the processed data from Azure Automation.
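For example, casting the property when building the PS object controls the JSON type (illustrative values):

# Files is cast to double, so the JSON value is numeric instead of a string
$record = [PSCustomObject]@{
    Computer = "LND-SRV-1535"
    Files    = [double]42
}
$json = ConvertTo-Json -InputObject @($record)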

Disclaimer: Cloud is a very fast-moving target. It means that by the time you’re reading this post, everything described here could have changed completely.
Note that this is provided “AS-IS” with no warranties at all. This is not a production ready solution for your production environment, just an idea and an example.

Upload VHD and create new VM with managed disk

In this post I would like to share the scripts and steps I used to create a new Azure VM with managed disks based on an uploaded Hyper-V VHD file. There are a number of things to do before uploading the VHD to Azure. Microsoft Docs has a good checklist here with steps for preparing a Windows VM to be uploaded to Azure. Some of the most important things to keep in mind are that the disk must have a fixed size, be in VHD format, and the VM must be generation 1. It is also recommended to enable RDP (😊) and install the Azure VM Agent. The overall steps are:

  1. Create an Azure storage account
  2. Prepare the server according to the link above
  3. Export the disk in VHD format with fixed size
  4. Build a new VM with the exported disk. This is not required, but can be a good way to verify that the exported VHD file works before it is uploaded to Azure
  5. Upload the VHD file
  6. Create a new VM based on the VHD file
  7. Connect to the new VM and verify everything works
  8. Delete the uploaded VHD file

The following figure shows the storage account configuration where the VHD file is stored.

The following image shows the upload process of the VHD file… the last image shows the creation of the new VM.

The script I used to upload the VHD file:

 

Login-AzureRmAccount -SubscriptionId 9asd3a0-aaaaa-4a1c-85ea-4d11111110be5
$localfolderpath = "C:\Export"
$vhdfilename = "LND-SRV-1535-c.vhd"
$rgName = "migration-rg"
$storageaccount = "https://migration004.blob.core.windows.net/upload/"
$localpath = $localfolderpath + "\" + $vhdfilename
$urlOfUploadedImageVhd = $storageaccount + $vhdfilename
# Upload the VHD as a page blob to the storage account
Add-AzureRmVhd -ResourceGroupName $rgName -Destination $urlOfUploadedImageVhd -LocalFilePath $localpath -OverWrite

 

The script I used to create the new VM. Note that the script connects the new VM to the first subnet on a VNET called CONTOSO-VNET-PRODUCTION. Also note that the size of the VM is set to Standard_A2.

 

Login-AzureRmAccount -SubscriptionId 9asd3a0-aaaaa-4a1c-85ea-4d11111110be5
$vmName = "LND-SRV-1535"
$location = "West Europe"
$rgName = "migration-rg"
$vhdfile = "LND-SRV-1535-c"
$vhdsize = "25"
$sourceVHD = "https://migration004.blob.core.windows.net/upload/" + $vhdfile + ".vhd"
### Create new managed disk based on the uploaded VHD file
$manageddisk = New-AzureRmDisk -DiskName $vhdfile -Disk (New-AzureRmDiskConfig -AccountType StandardLRS -Location $location -CreateOption Import -SourceUri $sourceVHD -OsType Windows -DiskSizeGB $vhdsize) -ResourceGroupName $rgName
### Set VM Size
$vmConfig = New-AzureRmVMConfig -VMName $vmName -VMSize "Standard_A2"
### Get network for new VM
$vnet = Get-AzureRMVirtualNetwork -Name CONTOSO-VNET-PRODUCTION -ResourceGroupName CONTOSO-RG-NETWORKING
$ipName = $vmName + "-pip"
$pip = New-AzureRmPublicIpAddress -Name $ipName -ResourceGroupName $rgName -Location $location -AllocationMethod Dynamic
$nicName = $vmName + "-nic1"
$nic = New-AzureRmNetworkInterface -Name $nicName -ResourceGroupName $rgName -Location $location -SubnetId $vnet.Subnets[0].Id -PublicIpAddressId $pip.Id
$vm = Add-AzureRmVMNetworkInterface -VM $vmConfig -Id $nic.Id
### Set disk
$vm = Set-AzureRmVMOSDisk -Name $manageddisk.Name -ManagedDiskId $manageddisk.Id -CreateOption Attach -vm $vm -Windows
### Create the new VM
New-AzureRmVM -ResourceGroupName $rgName -Location $location -VM $vm
Once the second script has completed, the resource group contains a VM, a public IP address, a network adapter and the managed disk. In this example the resource group also contains the storage account.

Monitor a Minecraft server with OMS (including moonshine perf counters)

From time to time I play Minecraft with friends. As a former SCOM geek I have of course configured monitoring for this server 🙂 The server in this blog post is a Windows server, but most of the examples work the same way for a Minecraft server running on Linux. On the Minecraft server there are two kinds of resources I would like to monitor: server performance and the Minecraft logs.

The first part, server performance, is easy to solve. I installed the OMS agent on the server and enabled Windows performance monitoring for processor, memory, disk queue and network traffic. Those are all out of the box OMS features.

For Minecraft there is a log file, %Minecraft%\logs\latest.log, that Minecraft uses to log everything around “the world” running on the server. In this log file you can see players joining, players disconnecting, and some player activity like achievements or a player dying. You can also use the log file to see if the server is running and if the world is ready. In OMS, under Settings/Data/Custom Logs, you can configure OMS to collect data from this log file. Note the name of the custom log, as it is the type you use to search for these events. In my example I have set up a custom log named WinMinecraftLog_CL (_CL is added automatically). More info about configuring custom logs here.

We can use Log Search to review the collected data (Type=WinMinecraftLog_CL) from the log file. Custom Fields can be used to add a new searchable field for the log severity; in this example OMS extracts WARN and INFO and stores it as WinMinecraftLogSeverity_CF. More information about custom fields here.

Another interesting thing to monitor on a Minecraft server is the number of connected players. Unfortunately the Minecraft server doesn’t have a performance counter for this, or an easy way to read it from the server. But you can count the number of connections on the Minecraft port (default port 25565) 🙂 I have created a PowerShell script that counts the number of connections and writes it as a new performance counter on the local server. The script also counts the number of unique players that have logged on to the server (the number of files in the %Minecraft%\world\playerdata folder) and writes that as a performance counter too. The script can be downloaded here, WritePerfData. Thanks to Michael Repperger for the perf counter example.
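The core of the script looks roughly like the sketch below (paths and counter names are my assumptions; the downloadable script is the complete version). Note that creating a performance counter category requires an elevated session.

# Count established TCP connections on the Minecraft port = connected players
$connected = @(Get-NetTCPConnection -LocalPort 25565 -State Established -ErrorAction SilentlyContinue).Count
# One file per player that has ever joined the world = unique players
$unique = @(Get-ChildItem "C:\Minecraft\world\playerdata" -File).Count
# Create the custom counter category once (requires administrator rights)
$category = "Minecraft"
if (-not [System.Diagnostics.PerformanceCounterCategory]::Exists($category)) {
    $counters = New-Object System.Diagnostics.CounterCreationDataCollection
    foreach ($name in "Connected Players", "Unique Players") {
        $data = New-Object System.Diagnostics.CounterCreationData
        $data.CounterName = $name
        $data.CounterType = [System.Diagnostics.PerformanceCounterType]::NumberOfItems32
        [void]$counters.Add($data)
    }
    [void][System.Diagnostics.PerformanceCounterCategory]::Create($category, "Minecraft server counters",
        [System.Diagnostics.PerformanceCounterCategoryType]::SingleInstance, $counters)
}
# Write the current values to the (writable) counters
(New-Object System.Diagnostics.PerformanceCounter($category, "Connected Players", $false)).RawValue = $connected
(New-Object System.Diagnostics.PerformanceCounter($category, "Unique Players", $false)).RawValue = $unique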

These two performance counters can then be collected by OMS as Windows Performance counters

Once all the data is collected, both Minecraft-specific data and server data, OMS View Designer can be used to build a Minecraft dashboard (more info about View Designer here). The dashboard gives us an overview of the Minecraft server, from both a performance and a Minecraft perspective. The example dashboard also includes a list of events from the log file, showing if there are a lot of warning events in the log. Each tile in the dashboard links to an OMS Log Search that can be used to drill deeper into the data.

A next step could be to index and measure more fun world-specific numbers, for example achievements and the most dangerous monster in the Minecraft world 🙂 On the server there is a folder, %Minecraft%\world\stats, with numbers about each player in the world, for example the number of trees cut down or blocks built; these could also be fun numbers to collect 🙂
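A hypothetical starting point for that could be to parse the per-player stats files; the path and the property name are assumptions and the format varies between Minecraft versions:

# Read each player's stats file (JSON) and pick out a counter
Get-ChildItem "C:\Minecraft\world\stats" -Filter *.json | ForEach-Object {
    $stats = Get-Content $_.FullName -Raw | ConvertFrom-Json
    [PSCustomObject]@{
        Player     = $_.BaseName
        MobsKilled = $stats.'stat.mobKills'   # example property name
    }
}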

 

Disclaimer: Cloud is a very fast-moving target. It means that by the time you’re reading this post, everything described here could have changed completely.

Document Azure subscription with PowerShell

I would like to share an idea about documentation for Azure subscriptions, and hopefully get some ideas and feedback on it. What I see at customers is that documenting which resources are deployed to Azure is a challenge. It is also a challenge to get an easy overview of configuration and settings. Fortunately, with Azure PowerShell we can easily get information about all resources in Azure. I have built an example script that exports some settings from Azure and writes them to a Word document.

The example script exports information about Virtual Machines, Network Interfaces and Network Security Groups (NSG). If you look in the script you can see examples of reading data from Azure and writing it to the Word document. You could of course read any data from your Azure environment and document it in Word. A benefit of using a script is that you can schedule it to run at intervals, so you always have up-to-date documentation of all your Azure resources.
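As an illustration of the approach, here is a minimal sketch of the Word part using COM automation against a locally installed Word; the output path and the exported properties are my own example choices, and the downloadable script covers more resource types.

# Create a Word document and write one line per virtual machine
$word = New-Object -ComObject Word.Application
$doc  = $word.Documents.Add()
$sel  = $word.Selection
$sel.Style = "Heading 1"
$sel.TypeText("Virtual Machines")
$sel.TypeParagraph()
$sel.Style = "Normal"
foreach ($vm in Get-AzureRmVM) {
    $sel.TypeText("$($vm.Name) - $($vm.HardwareProfile.VmSize) - $($vm.Location)")
    $sel.TypeParagraph()
}
$doc.SaveAs([ref]"C:\Temp\AzureDocumentation.docx")
$word.Quit()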

Another thing I tested was building Visio drawings with PowerShell. To do this I used this PowerShell module. The idea in this example is to read data from Azure and then draw a picture. In my example I included virtual machines and their related storage accounts and networks.

Download my example PowerShell scripts here.

 

Note that this is provided “AS-IS” with no warranties at all. This is not a production ready solution for your production environment, just an idea and an example.

Add network card to Azure VM

This is a script I wrote when Pete and I were preparing our System Center Universe session a couple of days ago. The scenario is that you have a VM running in Azure with one network card, and now you want to add another network card. It is only possible to add a network card when creating the VM. This script will delete the current VM, keep the disk, create a new VM with two network cards, and attach the disk again.

Things that could be added to this script include handling of VM size. Depending on the VM size you can have a different number of network cards; for example, with 8 CPU cores you can have 4 network cards. It would be good if the script could handle that, for example with a guard like the sketch below. It would also be good if the script could handle multiple disks on the original VM. That might show up in vNext 🙂
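A simple way to handle the size check could be a lookup table like this sketch; the values are examples only, so verify the NIC limits for the sizes you use.

# Hypothetical guard: map classic instance sizes to their maximum NIC count
$maxNics = @{ "Large" = 2; "ExtraLarge" = 4 }
if (-not $maxNics.ContainsKey($InstanceSize) -or $maxNics[$InstanceSize] -lt 2) {
    throw "Instance size '$InstanceSize' does not support an extra network card"
}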

$VMName = "SCU001"
$ServiceName = "SCU001"
$InstanceSize = "ExtraLarge"
$PrimarySubNet = "Subnet-1"
$SecondarySubnet = "Subnet-2"
$VNET = "vnet001"

# Get the current disk
$disk = Get-AzureDisk | where {$_.AttachedTo -like "*RoleName: $VMName*"}

### Shutdown the VM
Get-AzureVM -Name $VMName -ServiceName $ServiceName | Stop-AzureVM -Force

# Remove the VM but keep the disk
Remove-AzureVM -Name $VMName -ServiceName $ServiceName

# Deploy a new VM with the old disk
$vmconf = New-AzureVMConfig -Name $VMName -InstanceSize $InstanceSize -DiskName $disk.DiskName |
Set-AzureSubnet -SubnetNames $PrimarySubNet |
Add-AzureNetworkInterfaceConfig -name "Ethernet2" -SubnetName $SecondarySubnet |
Add-AzureEndpoint -Protocol tcp -LocalPort 3389 -PublicPort 3389 -Name "Remote Desktop"

New-AzureVM -ServiceName $ServiceName -VNetName $VNET -VMs $vmconf

List all containers in all storage accounts

This script lists all containers in all storage accounts in your Azure subscription. It is handy when you are looking for a container or a blob endpoint.

$accounts = Get-AzureStorageAccount
Foreach ($account in $accounts)
{
    # Build a storage context with the account's primary key and list its containers
    $saKey = Get-AzureStorageKey -StorageAccountName $account.StorageAccountName
    $ctx = New-AzureStorageContext -StorageAccountName $account.StorageAccountName -StorageAccountKey $saKey.Primary
    Get-AzureStorageContainer -Context $ctx
}

 


 

Scaling Azure VM with Azure Automation, with help of Azure SQL

I am running a number of virtual machines in Microsoft Azure; some of them run just a couple of hours and some I keep running 24/7. Often I don’t need them during the night, but some I still want to keep online to make sure database sync jobs run. Previously I used a couple of different PowerShell scripts that start and stop virtual machines, but now I have built a new solution and want to share the idea with you. This new solution involves two components besides the virtual machines: Azure Automation and Azure SQL. Azure Automation (currently in preview) is an automation engine that you can buy as a service in Microsoft Azure. If you have been working with Service Management Automation (SMA) you will recognize the feature. You author runbooks in PowerShell workflow and execute them in Microsoft Azure, the same way as with SMA, except that you don’t have to manage the SMA infrastructure. Azure SQL is another cool cloud service, a relational database-as-a-service.

Figure 1 – Azure Automation

Figure 2 – Azure SQL Database

In Azure Automation I have created two runbooks (VM-MorningStart and VM-EveningStop), one that runs every morning and one that runs every evening. These runbooks read configuration from the Azure SQL database about how the virtual machines should be configured during night and day. Azure Automation has a schedule feature that can trigger these runbooks once every day, shown in figure 3.

Figure 3 – Schedule to invoke runbook

Both runbooks are built the same way. They look up servers in the database and then compare the current settings with the settings in the database. The database has one column for daytime server settings and one for nighttime server settings. If the current setting does not match the database, the virtual machine is shut down, re-configured, and restarted. I use Azure AD to connect to the Azure subscription according to this blog post, and I use SQL authentication to log on to the SQL server. If a virtual machine is set to size 0 in the database, the machine is left turned off during the night.

workflow VM-EveningStop
{
    ### SETUP AZURE CONNECTION
    $Cred = Get-AutomationPSCredential -Name 'azureautomation@domainnameAD.onmicrosoft.com'
    Add-AzureAccount -Credential $Cred
    ### Connect to SQL
    InlineScript {
        $UserName = "azuresqluser"
        $UserPassword = Get-AutomationVariable -Name 'Azure SQL user password'
        $ServerName = "sqlservername.database.windows.net"
        $Database = "GoodNightServers"
        $MasterDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $MasterDatabaseConnection.ConnectionString = "Server = $ServerName; Database = $Database; User ID = $UserName; Password = $UserPassword;"
        $MasterDatabaseConnection.Open()
        $MasterDatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $MasterDatabaseCommand.Connection = $MasterDatabaseConnection
        $MasterDatabaseCommand.CommandText = "SELECT * FROM Servers"
        $MasterDbResult = $MasterDatabaseCommand.ExecuteReader()
        # Loop over all servers in the database
        if ($MasterDbResult.HasRows)
        {
            while ($MasterDbResult.Read())
            {
                Select-AzureSubscription -SubscriptionName "Windows Azure MSDN - Visual Studio Ultimate"
                # Column 1 = VM name, column 2 = evening size ("0" = leave turned off)
                $AzureVM = Get-AzureVM | Where-Object {$_.InstanceName -eq $MasterDbResult[1]}
                Get-AzureVM -ServiceName $AzureVM.ServiceName | Stop-AzureVM -Force
                If ($MasterDbResult[2] -eq "0") {
                    # Size 0: keep the VM shut down during the night
                }
                Else {
                    Get-AzureVM -ServiceName $AzureVM.ServiceName | Set-AzureVMSize $MasterDbResult[2] | Update-AzureVM
                    Get-AzureVM -ServiceName $AzureVM.ServiceName | Start-AzureVM
                }
            }
        }
    }
}

The database is designed according to the following figure (AzureSQL).

Summary: Azure Automation runs one runbook in the morning to start up and scale up virtual machines, and one runbook in the evening to scale down and shut down virtual machines. All configuration is stored in an Azure SQL database.

Note that this is provided “AS-IS” with no warranties at all. This is not a production ready management pack or solution for your production environment, just an idea and an example.

Copy or rename an SMA runbook

You may have noticed that in the Windows Azure portal there is no way to copy or rename an SMA runbook. But it is of course possible with a small PowerShell script. The script in this blog post copies a source SMA runbook and stores all its settings in a new SMA runbook. The script also imports the new runbook and deletes the old one. You can comment out (#) the last part if you just want to copy your source runbook and not delete it. If you keep the last part, which deletes the source runbook, the result of the script is a rename of the runbook.


$path = "C:\temp"
$targetrunbookname = "newdebugexample"
$sourcerunbookname = "debugexample"
$WebServiceEndpoint = "https://wap01"
##
## Get source runbook
##
$sourcesettings = Get-SmaRunbook -WebServiceEndpoint $WebServiceEndpoint -Name $sourcerunbookname
$source = Get-SmaRunbookDefinition -WebServiceEndpoint $WebServiceEndpoint -name $sourcerunbookname -Type Published
##
## Create the new runbook as a file with source workflow and replace workflow name
##
New-item $path\$targetrunbookname.ps1 -type file
add-content -path $path\$targetrunbookname.ps1 $source.content
$word = "workflow $sourcerunbookname"
$replacement = "workflow $targetrunbookname"
$text = get-content $path\$targetrunbookname.ps1
$newText = $text -replace $word,$replacement
$newText > $path\$targetrunbookname.ps1
##
## Import new runbook to SMA and set runbook configuration
##
Import-SMArunbook -WebServiceEndpoint $WebServiceEndpoint -Path $path\$targetrunbookname.ps1 -Tags $sourcesettings.Tags
Set-SmaRunbookConfiguration -WebServiceEndpoint $WebServiceEndpoint -Name $targetrunbookname -LogDebug $sourcesettings.LogDebug -LogVerbose $sourcesettings.LogVerbose -LogProgress $sourcesettings.LogProgress -Description $sourcesettings.Description
##
## Delete sourcerunbook
##
Remove-SmaRunbook -WebServiceEndpoint $WebServiceEndpoint -name $sourcerunbookname -Confirm:$false

Note that this is provided “AS-IS” with no warranties at all. This is not a production ready solution, just an idea and an example.

Deploy a new service instance with PowerShell in VMM 2012

I read on TechNet that Microsoft recommends using service templates in VMM 2012 even for a single server; instead of using a VM template, we should use a service template to deploy a new virtual machine. In Orchestrator there is an activity in the Virtual Machine Manager (VMM) integration pack that can create a new virtual machine from a template, but there is no activity to create a new instance from a service template. So I copy/paste “wrote” a script that deploys a new instance of a service, based on a service template. This is a very basic and simple example script that you can use as a foundation. The script asks for five input parameters:

  • CloudName = Target cloud in VMM for the new service
  • SvcName = Name of the new service. As I only deploy a single server, I use the computer name as the service name too
  • ComputerANDvmName = The name that will be used both for the virtual machine and for the computer name
  • SvcTemplateName = The service template to use
  • Description = A description that will be added to the new service instance

Param(
    [parameter(Mandatory=$true)]
    $CloudName,
    [parameter(Mandatory=$true)]
    $SvcName,
    [parameter(Mandatory=$true)]
    $ComputerANDvmName,
    [parameter(Mandatory=$true)]
    $SvcTemplateName,
    [parameter(Mandatory=$true)]
    $Description)
Import-Module 'C:\Program Files\Microsoft System Center 2012\Virtual Machine Manager\bin\psModules\virtualmachinemanager\virtualmachinemanager.psd1'

$cloud = Get-SCCloud -Name $CloudName
$SvcTemplate = Get-SCServiceTemplate -Name $SvcTemplateName
$SvcConfig = New-SCServiceConfiguration -ServiceTemplate $SvcTemplate -Name $SvcName -Cloud $cloud -Description $Description
$WinSrvtierConfig = Get-SCComputerTierConfiguration -ServiceConfiguration $SvcConfig | where { $_.Name -like "Windows*" }
$vmConfig = Get-SCVMConfiguration -ComputerTierConfiguration $WinSrvtierConfig
Set-SCVMConfiguration -VMConfiguration $vmConfig -Name $ComputerANDvmName -ComputerName $ComputerANDvmName
Update-SCServiceConfiguration -ServiceConfiguration $SvcConfig
$newSvcInstance = New-SCService -ServiceConfiguration $SvcConfig

 

The script first creates and configures a service deployment configuration. The service deployment configuration is an object, stored in the VMM library, that describes the new service instance; it is not a running instance. The last two lines in the script pick up that service deployment configuration and deploy it. All settings of the new service instance are stored in the service deployment configuration. In my service template, named “Contoso Small”, I have a tier named “Windows Server 2008R2 Enterprise – Machine Tier 1”; that is why the script searches for a tier with a name like “Windows*”.

Once we have a PowerShell script, we can easily use it from a runbook in Orchestrator to deploy new instances of services.
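For example, assuming the script is saved as Deploy-ServiceInstance.ps1 (a hypothetical file name), a runbook could invoke it like this:

# Example invocation with illustrative values
.\Deploy-ServiceInstance.ps1 -CloudName "Contoso Cloud" `
    -SvcName "LND-SRV-042" `
    -ComputerANDvmName "LND-SRV-042" `
    -SvcTemplateName "Contoso Small" `
    -Description "Deployed from an Orchestrator runbook"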

 

Note that the script is provided “AS-IS” with no warranties.