
Category Archives: Opalis


Welcome to contoso.se! My name is Anders Bengtsson and this is my blog about Microsoft infrastructure and system management. I am a principal engineer in the FastTrack for Azure team, part of Azure CXP, at Microsoft. Contoso.se has two main purposes, first as a platform to share information with the community and the second as a notebook for myself.

Everything you read here is my own personal opinion and any code is provided "AS-IS" with no warranties.

Anders Bengtsson

MVP awarded 2007, 2008, 2009, 2010

My Books
Service Manager Unleashed
Orchestrator 2012 Unleashed
Inside the Microsoft Operations Management Suite


Export of Policies

There are many scenarios where you need to export and import Opalis policies, for example between your Opalis development environment and your Opalis production environment. There are actually two places in Opalis Integration Server where you can do an export: on a policy tab and on a folder. In this blog post we will look at both.

If you select Export on a folder you will get the Export dialog box. The Export dialog box allows you to

  • Specify the file location, where to store the export file
  • Configure whether you want to export sub-folders too
  • Configure whether you want to export global settings (for example counters, variables and computer groups)
  • Configure whether you want to export the global configuration (Options > Configuration in the Opalis Integration Server client)

If you export a folder, including a sub-folder, a total of two policies, with all default settings in the export dialog box, you get a complete export including the folder structure, global settings, connection settings and the policies from the folder and sub-folder. As you can see in the picture, the export also includes global settings that the two policies don't use. If you already have the global settings you can choose whether to override them or keep them as they are.

If you export from a policy tab you will not get the same result as when you right-click a folder and export. This will only give you the policy that you exported and the global settings it uses. But you will still get the global settings folder structure of the source environment. In the picture above I will get the 1., 1.1, 1.2 and 2. folders under Variables, but as I exported policy 1.2 there will be nothing in the folders except for the 1.2 folder.

If, for example, folder 2. already exists under Variables you get a question asking whether you want to overwrite it or create a new one. If you select create new, you will get a new folder named "2. (1)". If you select override it will merge the variables to import with the ones already in the folder. You can step through all items and select overwrite or not, for example if you only want to overwrite a couple of variables. If you create a new folder during import the policy will be updated to use the new variable, even if one with the same name already exists.

In both export scenarios the export file contains data that you do not necessarily need or want to import into the target environment. To avoid overwriting a policy, connection configuration or setting by mistake you can clean up the export file between export and import. There are a couple of tools from Microsoft partners that can do this, but you can also use an extra Opalis installation. It can be a single server running Opalis where you import the export file, modify it and export to a new file. You then use the new export file to import into the target Opalis environment. In my sandbox that machine is named mrWolf, after the efficient clean-up expert "the Wolf" in Pulp Fiction 🙂 This Opalis environment is kept completely empty by default, so it is very easy to see what data was imported when I run an import. When the clean-up is done, it is equally easy to see what will be exported, as everything in there will be included.

Please note that this is provided “as is” with no warranties at all.


Delete a Client in Opalis

If you want to delete a client from your Opalis environment you can use Opalis Deployment Manager. But if the client machine is offline the uninstall wizard will end with an error saying "Failed to uninstall the Client". The client will still be listed in Opalis Deployment Manager and you can't delete it. The resolution, to delete an offline client, is to try to deploy the console to the same machine again. If you run the "Deploy new client" wizard against the machine that is offline, the setup will fail (of course), but one of the first steps in the deployment wizard is to uninstall and delete the console. The result is that the client is no longer listed in Opalis Deployment Manager.

If you later re-deploy Opalis Integration Server to the workstation and select only a limited number of integration packs, the workstation will still have all the integration packs it had with the old console installation.

If you start the console after it has been deleted from Opalis Deployment Manager, it can still connect to the Opalis server, and Opalis Deployment Manager will be updated to include the workstation under Clients again.

Get Old Operations Manager Alerts with Opalis

I read a question on the forum about the "Get Alert" object in Opalis: it doesn't support relative dates. That is correct, and a bit sad too; it would be really nice to be able to say "now minus 7 days" as we can in the Operations Manager reporting console, for example.

But there is of course a solution to this 🙂 You can start with a Format Date/Time object in your workflow that will generate the relative date for you. The output can then be used as input to the Get Alerts object.

The Format Date/Time object takes a variable as input; the variable is the current time in yyyy-MM-dd h:m:s format. The Format Date/Time object then re-formats the time and adjusts the output date by minus 7 days.
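The relative-date logic that the Format Date/Time object performs can be sketched in Python for illustration (the function name and sample timestamp are mine, not part of the Opalis object):

```python
# Sketch of the "now minus 7 days" calculation that the Format Date/Time
# object performs: take the current time and shift the date back N days.
from datetime import datetime, timedelta

def relative_date(now: datetime, days_back: int = 7) -> str:
    """Return 'now minus N days' in a yyyy-MM-dd h:m:s style string."""
    shifted = now - timedelta(days=days_back)
    return shifted.strftime("%Y-%m-%d %H:%M:%S")

print(relative_date(datetime(2011, 2, 11, 9, 30, 0)))  # 2011-02-04 09:30:00
```

The resulting string is what you would feed into the Get Alerts object as the date filter.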

There is a junction object in the policy too. This is used to make sure the following objects only run once, regardless of the data provided by previous objects. Otherwise "Send Platform Event" and "Delete temp file" would run once for every alert the Get Alert object returns. Instead I use an Append File object to write all alerts to a temporary text file. On the other side of the junction object I pick up all the data again with the Get Lines object, so the rest of the policy will only run once. You can download my example here, OldAlerts.

Please note that this is provided “as is” with no warranties at all.

Start Maintenance Mode with Opalis

In this post I want to show you an example of how you can start Operations Manager maintenance mode from Opalis. Operations Manager maintenance mode is used to prevent alerts and notifications from objects that are under maintenance. In maintenance mode, alerts, notifications, rules, monitors, automatic responses, state changes, and new alerts are suppressed at the agent. By design, Operations Manager 2007 monitors that the agent is functioning correctly even if the computer is in maintenance mode. If the Health Service and the Health Service Watcher for the agent are not in maintenance mode when you reboot the machine, you will get alerts about heartbeat failure and failure to connect to the computer. This example will put the Windows computer, the Health Service and the Health Service Watcher into maintenance mode.

The policy contains a number of objects:
1. Custom Start
2. Start maintenance mode for a Windows computer
3. Start maintenance mode for a Health Service
4. Query the Operations Manager database to get the computer GUID
5. Start maintenance mode for a Health Service Watcher
6. Generate a platform event including a summary

The Start Maintenance Mode object puts a monitored object in Operations Manager into maintenance mode. You can use the object to browse for an object.

To put the Health Service Watcher into maintenance mode we need the GUID of the machine. The other two "start maintenance mode" objects are a bit easier, as we can input the server FQDN. To get the server GUID we run a query against the OperationsManager database.

As you can see in the picture, the database query returns a bit more than the GUID. To filter out everything except the GUID we will use two of the data manipulation functions that Opalis has.

We first split the result from the database query into two parts at the ";". Then from the second part, in this example {B3278151-9AC8-5B3B-8924-5F1F7CE27DE7}, we use the MID function and tell Opalis to get 36 characters starting at position 2. The result when we run this is three maintenance modes, as shown in the picture below.
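The split and MID steps are equivalent to this Python sketch; the part of the sample row before the ";" is made up, but the GUID is the one from the text:

```python
# The query result looks like "<something>;{GUID}". Split at ";", then take
# 36 characters starting at position 2 of the second part, which is the
# GUID without its surrounding braces.
row = "SRV01.contoso.local;{B3278151-9AC8-5B3B-8924-5F1F7CE27DE7}"

second_part = row.split(";")[1]   # "{B3278151-...}"
guid = second_part[1:1 + 36]      # MID: start at position 2, length 36

print(guid)  # B3278151-9AC8-5B3B-8924-5F1F7CE27DE7
```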

Because Operations Manager 2007 polls maintenance mode settings only once every 5 minutes, there can be a delay in an object's scheduled removal from maintenance mode. You can download my example here, 20110211_MM. Remember that you need to edit the Query Database object to configure which account to use when querying the database.

Please note that this is provided “as is” with no warranties at all.

How-to run the Opalis Integration Server Client from a workstation machine (Win7)

In this post I will show you how to deploy the Opalis Integration Server Client to your Windows 7 workstation, and then how you can control what the Windows 7 workstation user can see and do in Opalis.

In this example I am running a newly installed Windows 7 Enterprise 64-bit with the Windows firewall enabled. The Windows 7 box is a member of the same domain as Opalis. I will give the "Server Team" permissions to work with policies under their own folder, nothing else. In Active Directory I have created a security group named GRP-OPALIS-ServerTeam. My test user, Otto, is not a local administrator of the Windows 7 box and User Account Control (UAC) is running with default settings.

By default the Opalis Integration Server system is configured to allow only members of the local Administrators group on the Management Server computer to view, modify and manage Folders, Policies, Computer Groups, Variables, Counters and Schedules. However, these permissions can be changed on a per-object basis, or inherited from a parent object, using Access Control Lists (ACLs), much like Windows NTFS permissions. Read more on TechNet.

We will start by deploying the Opalis Integration Server Client to the client machine. The Windows 7 machine is named W702. We can do a push installation of the client with Opalis Deployment Manager (for manual installation please see KB2022962). Start Deployment Manager on your Opalis management server, then in the navigation pane, right-click Client and select "Deploy new client".

Follow the "Welcome to the Client Deployment Wizard". The Opalis Integration Server Client is installed via SMB/CIFS. TCP ports 135, 139 and 445, plus the RPC dynamic ports, must be accessible on the target computer from the Opalis management server. We can configure RPC dynamic port allocation, see KB154596, but in most scenarios it is easier to create a firewall rule that allows all traffic between the Opalis management server and the client during the installation. The installation runs under the account that is running Deployment Manager; that account needs local administrator permissions on the target machine.

If Otto Eriksson, a member of the GRP-OPALIS-ServerTeam group, logs on to the Windows 7 machine now and starts the Opalis Client, he will first see an error saying that Opalis can't connect to an Opalis server on the local machine. But after that, if he goes to Actions > Connect and inputs the correct Opalis server name, he will see

The cause of this is missing DCOM permissions on the Opalis Integration Server Management Server computer. What we need to do is documented in KB2022966. In this scenario I used the GRP-OPALIS-ServerTeam group when I modified the permissions, but we could of course use one general group for all Opalis client users.

If we now try to connect we will see a new error. The cause is that we have not granted the user access to the policy structure; more info in KB2023582.

Start the Opalis Integration Server Client with an Opalis administration account. Start by giving the "Server Team" permissions according to KB2023582: read permissions on the default Policies folder. We could of course use a general Opalis Users security group here too. Then create a folder for our "Server Team".

If we right-click the new Server Team folder and select permissions from the menu, we can modify the permissions of this folder. In this example I will add a group named “GRP-OPALIS-ServerTeam” and give it full permissions to this object and all child objects. This is a security group in Active Directory that contains all users of the Server Team.

If we now start the Opalis Integration Server Client on the Windows 7 box it will show us the policy structure, and members of the Server Team can create policies under the "Server Team" folder.

But if they try to do something else, like delete another folder or create a global setting, they will get an "Access is denied" error. If we need to give the group more permissions, for example to some other object, we can right-click it, go into Permissions and do the same as we did with the Server Team folder.

If you add a user to multiple security groups the user will get access to everything that each group has access to, just like when you are working with NTFS permissions on a file server.

My colleague Jeffrey Fanjoy made an interesting point about folders, permissions and teams/silos in Opalis:

Another element to add is that often the biggest gain achieved through Opalis is the ability to automate processes across these silos so it may not be in the best interests of the organization to try and silo the use of the product through folders but take a step back and look at the business process being serviced by the various IT services and then leverage Opalis to automate the required IT processes to facilitate effective cross-silo service delivery. Then everybody can pat themselves on the back for how well they work together!

Update: If you are running Windows XP and want to connect to Opalis running on Server 2008 R2 you might need the 969442 hotfix too. Otherwise you can get an error saying "A security package specific error occurred".

Please note that this is provided “as is” with no warranties at all.

Deploying servers and software with Opalis, Service Manager and Virtual Machine Manager

In my sandbox I often need to install a new server or a new server application, for example SQL Server, IIS or Operations Manager. Unfortunately I have never really learned how to use Configuration Manager to do that, deploying software and operating systems. But lucky for me, I have learned a bit about Virtual Machine Manager (VMM), Service Manager (SCSM) and Opalis. In this post I will describe how I deploy servers and apps in my sandbox.

The picture below shows the policy I have for deploying new virtual servers. It includes a number of objects that first decide which VM host to use and then deploy the server. It all starts with a change request in Service Manager. I will try to explain each step in the policy.

  1. The first object is from the Service Manager IP. I have built my own activity class in Service Manager, a deploy server activity. The new activity management pack includes a form where I can input VM name, description and server type. Server type is a way to set which hardware to use, for example my DC machines will only get 1 GB RAM. When an instance of this class gets the status "IN PROGRESS", Opalis will pick it up and start the policy. I have a blog post here about writing a new custom activity class and integrating it with Opalis; it also includes steps on how to use the new activity in a change request template.

  2. The following three objects are WMI Query objects. They query each of my Hyper-V hosts for free memory. I want to deploy the new server to the host with the most free RAM. Each of the WMI Query objects has a link to an Operations Manager object that will generate an alert if the policy can't query a machine for free RAM.
  3. The next item is a Run .NET Script object. It runs a PowerShell script. This script compares each of the hosts and returns the host with the most free RAM. I guess I could solve it with a number of Compare Value objects, but it is easier with a script, and it is only one object in the policy. As you can see in the beginning of the script I use DIV to get the free memory value in GB.
  4. The "Map Published Data" object, Map Disk, is used to map which disk to use and which template to use. Depending on which Hyper-V server has the most free RAM, I want to use different logical disks. This object also maps the server type from the Service Manager change request to a template in Virtual Machine Manager. The template controls what kind of hardware the virtual machine will get. For example, if the server type is DC it will use a template with only 1 GB RAM.
  5. The "Create VM From Template" object connects to Virtual Machine Manager and creates the new virtual machine. I noticed that when you work with this object and want to test the connection to VMM, it does that from the machine where you run the console, not from the action server where you will run the policy. In my case my workstation did not have WS-Management installed, so the connection test failed. Instead I did that part from the Opalis server.

  6. The two "Update Activity" objects update the change request in Service Manager: one if the machine can't be created and one if it worked out fine. As you can see in the picture I have updated the status of the change request and also added some info about where the virtual machine was created and which template was used. All steps to update a change request from Opalis are in this post.
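The comparison the PowerShell script performs in step 3 boils down to something like the following sketch (in Python for illustration; the host names and memory values are made up, and I assume the WMI queries return free memory in KB, as Win32_OperatingSystem's FreePhysicalMemory does):

```python
# Pick the Hyper-V host with the most free RAM; the values would come from
# the three WMI queries. FreePhysicalMemory is reported in KB.
free_kb = {
    "HYPERV01": 8_388_608,   # ~8 GB free
    "HYPERV02": 12_582_912,  # ~12 GB free
    "HYPERV03": 4_194_304,   # ~4 GB free
}

best_host = max(free_kb, key=free_kb.get)
free_gb = free_kb[best_host] // (1024 * 1024)  # integer DIV to get whole GB

print(best_host, free_gb)  # HYPERV02 12
```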

Now that you have an easy way to create a new virtual machine, the next step would be to install something on it, for example SQL Server. This can also be done with Opalis 🙂 I have built a library of policies that each install a small part of a complete server. For example I have a policy to add the .NET Framework feature on a server.

I have another policy to do an unattended install of SQL Server. This policy first creates C:\temp, if the folder doesn't exist. It then copies an INI file from a network file share. The INI file is the "answer" file for the SQL installation. It contains all the settings that you normally input manually during SQL installation. The policy then runs the SQL setup, "D:\setup /ConfigurationFile=C:\temp\SQL2008R2_OPSMGR.ini /QUIET". It is pretty easy to get the SQL unattend file: when you run the SQL setup wizard, on the last page you will see a path to a configuration file. You can copy that file directly from that path. The only thing I added to the INI file was IAcceptSQLServerLicenseTerms="TRUE".
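For illustration, a heavily trimmed answer file might look like the fragment below. The real file generated by the setup wizard is much longer, and the feature list, instance name and accounts are placeholders for whatever you selected in the wizard:

```ini
; Trimmed sketch of a SQL Server 2008 R2 ConfigurationFile.ini (placeholder values).
[SQLSERVER2008]
ACTION="Install"
FEATURES=SQLENGINE,FULLTEXT
INSTANCENAME="MSSQLSERVER"
SQLSYSADMINACCOUNTS="CONTOSO\Administrator"
QUIET="True"
IAcceptSQLServerLicenseTerms="TRUE"
```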

When you have created your own library of policies that install and prepare machines, you can create different "master workflows" that will create a new machine, install the application and configure the server for you, for example a "master policy" for installing a new Operations Manager server. This policy will then trigger the other policies. I use it almost every day in the sandbox and it saves me a lot of time. The policy below is a "master policy" for a new SQL server. It starts with a change request in Service Manager; it creates a new machine, runs server setup to add for example the .NET Framework feature, and then it installs SQL Server. It finishes by updating the change request. I have a notification in Service Manager that will notify me by e-mail when the machine is ready. If you want to take a look at the policy for the new machine, you can download it here: 6.1 New Virtual Machine

Please note that this is provided "as is" with no warranties at all. Update: There is an extension to the VMM IP that you can download here.

Deploy OPSMGR agent to untrusted zones with Opalis

When the agent is located in a domain separate from the domain where the Operations Manager management server is located, and no two-way trust exists between the two AD forests, certificates must be used so that authentication can take place between the agent and the management server. A gateway server could also be included in a solution to a scenario like that. To configure an agent to authenticate with a certificate there are a number of steps to carry out. I have a couple of blog posts about that here, here and here. As you can see it is a pretty complicated process, and it is easy to miss a step or configure something the wrong way. A solution to that could be an Opalis workflow. Opalis will then carry out all the steps for you, and in the same way every time. In this blog post I will show you a workflow like that.

As you can see in the picture, the workflow is divided into a number of policies. When you are building larger and more complex workflows it is good practice to break them down into smaller parts. You can then also call the different parts from different policies and re-use your policies in different scenarios. I tried to put all the info that I will change often in variables, for example the domain name, shared folder path and CA name. It is much easier to change one variable than to change the configuration of 10 objects. The following list will give you an overview of each policy in the workflow. Note that only the variables starting with 4.X are used by this workflow.

All the variables

  • 4.1 is the main policy, the one that triggers the other ones. It starts by creating a sub-folder in a shared folder. This folder is used for all kinds of file transfer between the management server, the CA, Opalis and the agent. The 4.1 policy also includes two objects at the end that delete the temporary folders on all machines that have been involved.
  • 4.2 is used to verify name resolution between the Opalis server and the agent.
  • 4.3 is used to install the CA root certificate on the agent. I presuppose that the root CA is already trusted by the Operations Manager management server. The policy also presupposes that the root CA certificate is in the shared folder.
  • 4.4 generates a certificate request file and copies it to the shared folder. The file is generated on the agent. The shared folder is a folder on the network that all involved machines can access. It is important to make sure all the involved accounts have read and write permissions to this folder.
  • 4.5 copies the certificate request file from the shared folder to the CA. It submits the request and receives a certificate (.CER). The certificate is then copied over to the shared folder. This step presupposes that the CA auto-approves the certificate. I didn't want to include any manual steps, so an auto-approving CA was a must. You can configure your CA to only auto-approve based on the templates used, more info about that here.
  • 4.6 copies the certificate from the shared folder to the agent. It then adds the certificate to the local certificate store.
  • 4.7 copies the agent files from the shared folder to the agent, installs the agent and verifies that the Operations Manager agent service is running on the machine.
  • 4.8 configures Operations Manager to use the certificate and restarts the Operations Manager service.

This is the shared folder before deploying any agents. The folder includes a sub-folder with the agent installation files; in my example the AMD64 folder is renamed to Agent. The shared folder also includes the CA root certificate and a PowerShell script. The PowerShell script is used in policy 4.8. It includes one line:

Get-ChildItem cert:\LocalMachine\My | where-object {$_.Issuer -eq "CN=skynet-DC01-CA, DC=skynet, DC=local"} | foreach {$_.SerialNumber} | out-file C:\temp_scom\cert.txt

The PowerShell command gets the serial number of the agent certificate. We will need to write this to the registry of the machine so the Operations Manager agent knows which certificate to use. As you can see, the command lists all certificates issued by a specified CA, skynet-DC01-CA. It then writes the serial number to C:\temp_scom\cert.txt. If you have multiple certificates installed from the CA you will need to add a couple of criteria to filter out the correct certificate.

The workflow includes a total of eight policies. We will now go into each one of them a bit deeper.

The 4.2 policy simply verifies that the Opalis machine can get an IP address for the target machine. If this is not working, nothing else in the workflow will work. It is always a good idea to start by checking all dependencies in your workflow before you start changing anything. An idea could also be to add more tests, for example that all involved accounts can write to the correct machines and folders.

The 4.3 policy starts with the creation of a new folder on the agent, the target machine. This folder, by default C:\temp_scom, will be used as a temporary area for all files the workflow copies or generates. The second object is a file copy object. It is the root CA certificate that is being copied from the shared folder on the network to the agent. The last two objects first insert the certificate into the store and then add it as a Trusted Publisher. Note that some of the "run program" or "run command" objects will run until they time out and are stopped; that will generate a warning but the policy will continue.

The 4.4 policy generates a certificate request file on the agent. It does this by first writing an INF file and then using Certreq to create a new request from that .inf file. The policy then copies the request file (the .req file) over to the shared folder.
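The INF file itself is not shown in the post; a minimal request file for an Operations Manager agent certificate typically looks something like the fragment below. The subject is a placeholder for your agent's FQDN, and the key usage OIDs are the standard server and client authentication ones that OpsMgr certificates need:

```ini
; Minimal certreq INF sketch for an OpsMgr agent certificate (placeholder values).
[NewRequest]
; Subject must match the agent FQDN
Subject="CN=agent01.contoso.com"
Exportable=TRUE
KeyLength=2048
MachineKeySet=TRUE

[EnhancedKeyUsageExtension]
; Server Authentication
OID=1.3.6.1.5.5.7.3.1
; Client Authentication
OID=1.3.6.1.5.5.7.3.2
```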

The 4.5 policy starts by creating a temporary folder on the CA. It then copies the certificate request from the shared folder to the temporary folder. Then, with the CertReq command, the certificate request is submitted to the CA. As I have configured the CA to auto-approve requests, CertReq will also save the new certificate directly. The last object copies the new agent certificate to the shared folder.

The 4.6 policy copies the new agent certificate from the shared folder to the agent machine. It then adds the certificate to the local certificate store.

The 4.7 policy includes a number of steps. It starts by creating a folder on the target machine for the agent installation files, by default C:\temp_scom\agent. It then copies the agent installation files from the shared folder to the new temporary folder.

The 4.8 policy starts by copying the getserial.ps1 script from the shared folder to the agent. This script exports the serial number of the new agent certificate. The second object runs this PowerShell script. The next two steps read the serial number from the text file that the PowerShell script generated and write it as a platform notification. The next step adds the serial number to the registry in the correct order. The Operations Manager agent service is then restarted.
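The "correct order" matters because the agent expects the certificate serial number in the registry as a binary value with the byte order reversed (for OpsMgr 2007 this is the ChannelCertificateSerialNumber value under HKLM\SOFTWARE\Microsoft\Microsoft Operations Manager\3.0\Machine Settings; this is also what the MomCertImport tool does for you). The reversal itself is simple; here it is sketched in Python with a made-up serial number:

```python
# Reverse the byte order of a certificate serial number before writing it
# to the registry (made-up serial number for illustration).
serial_hex = "1A2B3C4D5E6F"

reversed_hex = bytes.fromhex(serial_hex)[::-1].hex().upper()
print(reversed_hex)  # 6F5E4D3C2B1A
```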

That was all the policies included in the workflow. Some minutes later the target machine will show up in Operations Manager. In most environments it will show up under Pending Management (configure this at Administration > Global Settings > Security) and an Operations Manager administrator needs to approve it. This blog post showed you one way to use Opalis together with Operations Manager when deploying agents to machines in untrusted environments; a task that can be pretty complicated and includes a lot of steps. With Opalis you simply input a target machine name and click Start 🙂

For ideas and info on how to build fault tolerance into your workflow, please read this post. It could also be an idea to add some more platform event objects, or write-to-logfile objects, to get some info out of the workflow. Make sure that you have an unrestricted execution policy on your target machine, so the getserial.ps1 script can run. Make sure no firewall is blocking the traffic and also that the target machine has PowerShell installed. Also, spend a couple of minutes making sure all involved accounts have read and write access to the shared folder. If you want to download the workflow, click 4 SCOM Agent 2.

Please note that this is provided “as is” with no warranties at all.

Generate demo objects with Opalis

I often build new sandboxes and test environments for different reasons. One thing all these lab environments have in common is that I never have any data in them, for example in Active Directory, in databases, in the Service Manager CMDB or in Exchange mailboxes. I have solved this with a number of Opalis workflows. For example I have one that generates a lot of data in a Service Manager CMDB, one that generates mail traffic and one for Active Directory.

The Active Directory policy is actually three policies: one for creating AD users, one for creating AD computers and one for generating data common to the two other policies.

When you are building larger and more complex policies it is good practice to break them down into smaller parts. You can then also call the different parts from different policies and re-use your policies in different scenarios. In this example I have a policy that generates the location data for both users and workstations. Instead of building the same functionality into two policies I have one, which the other two use. Another thing is to make the working path green; it gives you a good overview of how you want the policy to work. Red links can be failure or fail-over, the path your policy takes when something didn't turn out the way you were hoping.

I tried to put all the info that I will change often in variables, for example the domain name, AD paths and the number of AD objects I want to generate. It is much easier to change one variable, and configure 10 objects to use it, than to change 10 objects in the policy each time I copy the policy to a new environment.

The 1.1 (users) and 1.2 (workstations) policies work almost the same:

  1. Starts with a custom start.
  2. Resets a counter to 0. This counter will be used to count the number of objects we create.
  3. Generates a four-digit random number that will be used as the employment number for users and as a unique number for workstations. We will also use the first digit here to set the department of the user object.
  4. Triggers the 1.3 policy and then waits until it is done. The 1.3 policy will generate a location and a short name for the location. We use this as, for example, location and office for user objects and in the name of workstations.

  5. Generates a two-digit random number that will be used to map the first name.
  6. Generates a two-digit random number that will be used to map the last name.
  7. Maps first name, last name and department. I have used a list that I found on the Internet with the top 100 first names and top 100 last names; I think it was a Swedish list. The random numbers generated will be mapped to a first name and a last name. We will use the first digit of the four-digit random number to set the department of the user.

  8. The next step creates the user object. As you can see we re-use a lot of the digits we have generated, for example to set phone, postalCode and MobilePhone. We use the first name and last name in different combinations, and also the variables set under global settings in Opalis. A total of 16 attributes will be configured for each user object.
  9. The next step enables the user account in Active Directory.
  10. Sets the counter value to +1. We reset the counter to 0 in the beginning, and for each created object we add one (+1) to the value.
  11. We compare the counter value with the target value. The target value is set by a variable. If we don't have enough objects we follow the orange link and create more, otherwise we follow the green link and generate a platform event saying the policy is done.

There are a number of variables that you need to configure before you use the policy.

As you can see, the policy uses a lot of random data. For user objects there are 100 first names, 100 last names, 10 locations and a four-digit number used in a number of ways. That gives us more than 100,000 different user combinations. You can download the policy here, 1. Demo Data. Please note that it is provided “as is” with no warranties at all.
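The generation logic in steps 1–11 can be sketched in Python. The name lists, department names and attribute keys below are my own placeholders, not the actual lists from the policy:

```python
import random

# Placeholder lists; the real policy maps against top-100 name lists.
FIRST_NAMES = ["Anna", "Lars", "Eva", "Johan", "Maria",
               "Erik", "Karin", "Sofia", "Per", "Lena"]
LAST_NAMES = ["Andersson", "Johansson", "Karlsson", "Nilsson", "Eriksson",
              "Larsson", "Olsson", "Persson", "Svensson", "Gustafsson"]
DEPARTMENTS = ["Sales", "IT", "HR", "Finance", "Marketing",
               "Support", "Legal", "R&D", "Logistics", "Management"]

def generate_user():
    """Steps 3-7: random numbers mapped to names, department and employment number."""
    emp_no = random.randint(1000, 9999)           # four-digit employment number
    first = random.choice(FIRST_NAMES)            # stands in for the two-digit lookup
    last = random.choice(LAST_NAMES)
    dept = DEPARTMENTS[int(str(emp_no)[0]) - 1]   # first digit (1-9) picks the department
    return {"givenName": first, "sn": last,
            "department": dept, "employeeNumber": emp_no}

def generate_users(target):
    """Steps 2, 10 and 11: reset the counter, then loop until the target is reached."""
    users = []
    while len(users) < target:                    # the counter/compare loop
        users.append(generate_user())
    return users
```

In the real policy the loop is of course built with the counter object and the orange/green links, but the control flow is the same.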

Exchange mail flow check with Opalis and Operations Manager

The Exchange management pack includes a mail flow check, both between internal mail servers and across organizations. With the management pack we get good monitoring of all the Exchange components, and verification that mail servers can communicate with each other. The challenge I sometimes see is that you want to verify e-mail from client to client, through all mail servers, spam filters, firewalls and Internet connections, not only between mail servers.

One way to do that is with Opalis. Opalis can integrate with Operations Manager 2007 R2, and there is a community integration pack for Exchange. There are some legacy Exchange objects in Opalis too, but you don't want to use them; they exist only for backwards compatibility with old Opalis versions. The legacy category is hidden by default in the console so that new customers refrain from using those activities in fresh workflows. More info about legacy objects here.

The Exchange Opalis Extension can be downloaded here. This integration pack is built on Exchange Web Services and is compatible with Exchange 2007 SP1, Exchange 2010 and Exchange 2010 SP1. Please note that this integration pack is not from Microsoft and is not supported and tested in the same way as integration packs included in the product.

In this example the policy sends an e-mail every 15 minutes. It then waits 5 minutes and checks whether it has received an answer. The recipient of the e-mail should be an echo mailbox that simply sends a reply back. If a reply has been received, the policy empties the inbox and reports OK; if not, it generates an alert in Operations Manager.

  1. Every 15 minutes. Triggers the policy to run every 15 minutes
  2. Send Exchange mail. Sends an e-mail. The recipient needs to be a mailbox that simply sends a reply straight back
  3. Wait 5 minutes. There is a trigger delay on this link that waits five minutes; this is the time we give the mail environment to receive the reply
  4. Find mail. Checks the inbox for a new e-mail from the echo mailbox. There is always a risk that some other mail has been received, so the object makes sure it has the correct subject and from address
  5. Compare Values. Verifies that the “Find mail” object did find an e-mail. If not, it generates an alert in Operations Manager, as something is not working in the mail flow. If there is an e-mail, it returns true and the workflow moves on
  6. Empty Inbox Folder. We don’t want to fill up the test mailbox, therefore we clear the inbox
  7. Write OK to event log. We write a simple OK to the event log. It is always good to have a note that the policy ran and was working

This policy sends an e-mail just as a client would, to an e-mail address somewhere on the Internet. The recipient replies instantly, and the policy verifies that a reply is received within five minutes. The really nice thing with a mail flow check like this is that it verifies everything between the recipient mailbox and your “Opalis client” mailbox. When the policy returns OK, we know the external mail flow is really working.
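The decision logic in steps 4–7 can be sketched as plain Python functions. The message fields and return values here are my own simplification of what the Opalis objects do; the real policy raises the alert through the Operations Manager integration pack:

```python
def find_echo_reply(inbox, echo_address, expected_subject):
    """Step 4: other mail may have arrived, so match both sender and subject."""
    return [m for m in inbox
            if m["from"] == echo_address and m["subject"] == expected_subject]

def check_mail_flow(inbox, echo_address, expected_subject):
    """Steps 5-7: return 'OK' if the echo reply arrived, otherwise 'ALERT'
    (in the real policy an alert is generated in Operations Manager)."""
    if find_echo_reply(inbox, echo_address, expected_subject):
        inbox.clear()        # step 6: keep the test mailbox empty
        return "OK"          # step 7: write OK to the event log
    return "ALERT"
```

Matching on both sender and subject is the important part: the test mailbox may receive unrelated mail, and we only want to count the real echo reply.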

Service Manager and Opalis 6.3

In this post I will show you an idea of how you can connect Service Manager 2010 with Opalis. The demo scenario is very common: create a user account based on a change request. In this demo the change request is in Service Manager and the user is created in Active Directory.

Opalis 6.3 will include new Integration Packs for the four remaining System Center suite products: Configuration Manager, Data Protection Manager, Virtual Machine Manager and Service Manager. These four new packs, along with the existing Operations Manager pack, provide deep, dynamic integration within the datacenter. And of course you can go cross-platform and cross-application with the 26 other Integration Packs. Opalis 6.3 will be released in Q4 CY2010, in other words pretty soon 🙂 More info about 6.3 here. Keep in mind that the information in this article is based on a beta version of the Opalis IP for Service Manager and is subject to change.

We will extend Service Manager with a new activity for creating accounts in Active Directory. When Opalis finds an instance of the new activity in the “In Progress” state, it reads its properties and creates the account in Active Directory based on them. Opalis then updates the activity in Service Manager so the change can continue with its other activities. To get this to work we need to perform two major steps: extend Service Manager and build the workflow in Opalis. The Service Manager step has two parts: first author the new management pack with the Authoring Tool, then create templates in the Service Manager console.
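The overall flow can be sketched as pseudocode in Python. The function names and properties below are stand-ins for the Opalis activities and the contoso.createaccount class, not a real API:

```python
def process_create_account_activities(get_in_progress_activities,
                                      create_ad_user,
                                      update_activity):
    """Sketch of the workflow: for each contoso.createaccount activity found
    in the "In Progress" state, create the AD account from its properties,
    then write status and account info back to Service Manager.
    All three callables are hypothetical stand-ins for Opalis activities."""
    created = []
    for activity in get_in_progress_activities():
        account = create_ad_user(activity["First"],
                                 activity["Last"],
                                 activity["Department"])
        update_activity(activity["Id"],
                        status="Completed",
                        opalisinfo=f"Created account: {account}")
        created.append(account)
    return created
```

The key point is the round trip: Service Manager supplies the input through the activity's properties, and Opalis writes the result back into the same object so the change request can move on.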

Before we begin, please note that the purpose of this post is to give you an idea of what you can do with the new Service Manager integration pack. This is not a production-ready solution; for example, the Opalis workflow is very simple and has no error handling. Let's begin!

  1. Begin by starting the Authoring Tool for Service Manager. In the Management Pack Explorer pane, right-click Classes and select “Create other class…”
  2. In the Base Class window, select Activity as base class
  3. In the Create class window, input contoso.createaccount
  4. In the contoso.createaccount pane, delete the default Property_16 property
  5. Create the following properties in the contoso.createaccount class
    1. First
    2. Last
    3. Department
    4. Opalisinfo
  6. Select the Department property and then, in the details pane, change the data type to list
  7. In the Select a List window, click Create List…
  8. In the Create List window, input Departments as internal name and display name. Select your new list and click OK
  9. In the Management Pack Explorer pane, right-click Forms and select Create
  10. In the Base Class window, select contoso.createaccount
  11. In the Create Form window, input CreateAccountForm as internal name
  12. Design the form as in the picture below. Bind each text box to a property according to the list below
    1. First name = First
    2. Last name = Last
    3. Department = Department
  13. Add one extra label and bind it (binding path) to the OpalisInfo property, without any default value. Opalis will populate it later
  14. Save your new MP
  15. Switch to the Service Manager 2010 console and import the new management pack
  16. Navigate to Library/Lists and populate the Department list with your departments. Note that the list values are stored in the management pack, so if you edit the MP in the Authoring Tool and import it again you will get a blank department list. Instead, export the MP from the Service Manager console and import that into the Authoring Tool when you need to edit it.
  17. Navigate to Library/Templates and click “Create Template” in the tasks pane
  18. In the Create Template window, input “Create Account” as Name and Description, select “contoso.createaccount” as class and contoso.autoaccount as management pack. Click OK
  19. In the “Form Host” window, click OK to save
  20. Click “Create Template” in the tasks pane again
  21. Input “New User Account” as name, select “Change Request” as class and contoso.autoaccount as management pack. Click OK
  22. In the “Form Host” window, change area, priority and risk to suitable values
  23. Click the Activities tab
  24. On the Activities tab click Add, add the “Create Account” activity and click OK and save the new template.

Now we have built the new activity in Service Manager and a change template that uses it. The next step is to build a workflow in Opalis that picks up this activity and creates the account in Active Directory. The following picture shows how a very simple version of that workflow could look. Please note that this workflow is not production-ready and only for demo purposes. To get some ideas on how you could build your workflow, take a look at this post (Fault-tolerance in Opalis policies) and this post (Opalis creating accounts in Active Directory).

The following pictures show the configuration of my three workflow objects. The first object monitors for new objects of our new class (contoso.createaccount). When it finds an object, the workflow moves on to the Active Directory object, which creates the user account based on the information in the contoso.createaccount object. The Active Directory object could of course include a lot more information about the user, but for this example I only use department, first name and last name. When the user account is created, the Service Manager object updates the activity, both with a new status and with information about the account. The account information is written to the OpalisInfo property of the contoso.createaccount class; we added a blank text box in the form and bound it to that property, which is where the info will be displayed. As you can see in the picture below, the “get object” object queries Service Manager to get a list of your classes, so we can easily select our new class.

Note that we can use any property of the new class; Opalis queries Service Manager and shows us all the properties we created in the Authoring Tool. In the last step it updates the Service Manager object, in this scenario the activity.

I don’t use many of the settings available in Opalis when working with Active Directory and Service Manager. My workflow picks up the new instance of our activity class, creates an Active Directory user based on the information filled in on the change form, and then updates the form with information about the account, see below. One thing you might want to add is fault tolerance: what if the user name is already in use, or something else interrupts the workflow? See my post about Opalis failover for more ideas around that. You could also add more user properties to the change form, so more information is inserted into Active Directory directly.
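As an example of that fault tolerance, a user-name collision check could look like the sketch below. The first-initial-plus-last-name convention and the numeric suffix are my own assumptions, not something the integration pack prescribes:

```python
def unique_account_name(first, last, existing_names):
    """Build a logon name from first initial + last name; on collision,
    append an increasing number until the name is free."""
    base = (first[0] + last).lower()
    candidate = base
    suffix = 1
    while candidate in existing_names:
        candidate = f"{base}{suffix}"
        suffix += 1
    return candidate
```

In an Opalis policy the same idea would be a query against Active Directory followed by a link condition, retrying with a new name when the query finds an existing account.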

This post showed you a simple example of the integration between Opalis 6.3 and Service Manager 2010. The Opalis IP for Service Manager can be used in a number of ways to integrate systems and services to and from Service Manager. I guess you have already figured out a number of scenarios where you can use this integration. Good luck!